
5 Steps to Facebook Advertising – Whiteboard Friday

Posted by LaurenV

Facebook advertising has taken the marketing world by storm. But with so many advertising options available within Facebook, how do you know where to start the campaigns that will best support your goals and objectives?

In today’s Whiteboard Friday, Lauren Vacarello outlines the different ad types on Facebook and walks us through getting started with Facebook advertising. We’d love to hear your thoughts in the comments below! 

Video Transcription

“Hi, I’m Lauren Vacarello, the Senior Director of Online Marketing for Salesforce, and today we’re going to talk about 5 Steps to Facebook Advertising. In the next 5 or 10 minutes, we’ll talk about how to get started with advertising on Facebook.

So before we really dive into it, let’s talk about the different types of ads there are on Facebook. I’m sure everyone’s familiar with something that mildly resembles this with your news feed on Facebook. There are actually three main types of ads on Facebook. Everything else falls under those categories.

First, there’s going to be your marketplace ads. So your marketplace ads are the ads that you’re most familiar with. They’re the ads on the right-hand side of your Facebook feed. What makes these different from, say, a premium ad is both the cost of these ads and the different options that you have. With marketplace ads, all you’re really going to get is that little, tiny image and a little bit of copy to the right-hand side.

With your premium ads, sometimes they’ll show up on the right, but they’ll also show up in the center of your Facebook page as well, with one of the big examples of premium ads being sponsored stories. Now, sponsored stories are a newer ad type that Facebook is really testing out right now. So you’re going to see it as part of your Facebook feed. Some people are getting a little unhappy about seeing these larger ads in their feed, but Facebook is now a publicly traded company and they need to make money, and to do that they need to start selling more advertising.

The thing about premium ads that is really interesting is there are these different types of premium ads that you can actually have. So one of the types of premium ads is going to be video ads. That’s when you’ll see a video embedded into an ad unit, lots of different copy. If you see a poll, if you see someone pitching an event, those are all going to be part of premium ads.

One of the really cool things that Facebook is doing right now is something called custom audiences. Think of it this way. Say you have an email list and you have an email list of 10,000 potential customers. You can work with Facebook to build something called a custom audience. You give them these 10,000 email addresses. They’ll match it to people’s Facebook accounts, and now you’re able to build a really targeted ad campaign just to those 10,000 people that you may have in your lead nurturing program at the same time.

What also really differentiates premium ads from marketplace ads is the cost. Now, premium ads, you have to buy on a CPM basis. You also have to buy through a Facebook account team. Marketplace ads, super easy, self-service, you can sign up for them with a credit card. Anyone can advertise on marketplace ads.

Your premium ads are going to have about an $8.50 CPM. You have to talk to a person to buy them, and in a lot of instances you don’t actually have control over the ads; your Facebook account team is going to have to set them up for you.

Definite advantages: you get a much higher response rate with your video ads. You can do a lot more with polls and the sponsored stories in the center of the feed. You’re going to get the most interaction with those types of ads. But at the same time, the costs are going to be anywhere from two to four times as much as you’d pay using marketplace ads.

Then there’s also something that’s been getting a lot of press right now, which is FBX or Facebook retargeting. Facebook retargeting is really interesting, and Facebook is finally trying to start to monetize their audience base. They’ve been experimenting with new ad types. Facebook retargeting is Facebook saying, “I’m not going to try to come up with my own ad type. I’m going to take something that works for everybody else.” It’s been really, really profitable for them.

So think of it the same way that you think of retargeting. I think there’s been a recent Whiteboard Friday on retargeting. Very similar principles apply, but instead of just retargeting people who come to your website as they browse random places on the web, you’re retargeting them as they go onto their Facebook page. So really, really interesting possibilities there.

So now that you’ve had a quick primer on what the types of Facebook advertising are, let’s actually talk about how you’d use it and about getting started with this. Before you really do any type of Facebook advertising, before you do any advertising in general, it starts with identifying what your goals are. So I’m going to walk you through a scenario of let’s say we’re going to sell SEOmoz to small businesses using Facebook advertising.

The first thing you want to do is identify your goals. In this situation, we want to sell SEOmoz to small businesses. Let’s figure out exactly what we want to do with those goals. Are we trying to get new people that we’ve never spoken to? Are we trying to nurture existing people in the SEOmoz database, or are we trying to go after existing SEOmoz customers to get them to buy a larger, more expensive product?

So start by figuring out exactly what your goals are. Also identify do you want them to buy after seeing that ad, or do you want the ad to be part of a brand awareness play, where you’re just trying to introduce your product to them and then eventually get them to buy? Start by identifying what your goals are.

So we’re going to identify our goals. Let’s say we want to sell SEOmoz to new people. Perfect. So who are we trying to sell this to? Set your targets. You have to know your audience with this. Facebook is amazing when it comes to targeting capabilities. A lot of behavioral targeting on the web right now is all assumptions people make: because I go to ESPN and Golf Digest and Harvard Business Review, I must be the CEO of a company, and I must be a man in between these ages.

What’s really cool about Facebook is it’s not assumptions that people are making based on what actions someone may or may not be taking on the Internet. We self-identify on social media really, really well. We tell people who we are and what we’re interested in. We tell them what school we went to. We tell them our jobs. We tell them who we’re friends with. Because we know all of that information, it’s really easy to target these people as a marketer. You’re not guessing what people are interested in. They’re actually telling you what they’re interested in, and you can see what they’re talking about, and that gives Facebook incredible targeting options. Not even using Facebook retargeting, but just through their marketplace ads and their premium ads, you have a lot of really great options.

So let’s set our targets for this scenario. Let’s say SEOmoz wants to target small businesses to buy the SEOmoz product. But not just small businesses, who in small businesses? Do we want to target the marketing team? Do we want to target CEOs? Do we want to target CMOs? Do we also want to think a little bit differently and target based on what people are interested in? Great targets in this scenario.

Facebook does something really cool and lets you target the friends of your fans. So we’re going to make the assumption if I’m a fan of SEOmoz and I’m friends with 500 people on Facebook, the average number of friends that someone on Facebook has, I believe, is 500. I think most people watching this probably have more than 500 friends. So be happy you have a lot of Facebook friends, but great for SEOmoz. We’re going to target my friends and fans on Facebook because we’re going to make the assumption, if I’m a fan of SEOmoz, there’s a good chance we might want to sell to my friends, because I’m friends with like-minded people. So the first target is going to be friends of fans.

You might be asking yourself, “Lauren, why aren’t we targeting SEOmoz fans if we want to sell SEOmoz to more people?” Well, maybe a lot of SEOmoz fans are already customers, so you’re going to need to find that out. Whether it’s for your business, SEOmoz or even for Salesforce, it’s really important to know who your fans actually are. It’s as simple as doing a quick SurveyMonkey survey just to ask your fans who they are and if they’re already a customer.

So let’s say we target friends of fans, we target fans. Now, small business itself is a really big audience on Facebook. Tens of millions of people fall into the small business audience, so you might want to narrow that down a little bit more. Facebook gives you a lot of different clusters, which is really just a cluster of different keywords and different interests to build a larger group of people. So maybe it’s small business, maybe we want to think differently. Think about who these people, who your target buyer actually thinks about, who they care about. Maybe it’s small business, maybe it’s people interested in marketing or in the Internet.

The Internet is a target audience on Facebook, because I am a fan of the Internet. Think about all of your targets and build out each individual line for your targets. Think of it the same way if anyone’s run a paid search campaign or a display campaign. You have your account. You have campaigns, and you have ad groups. Think of this similar to how you’re going to think of your ad groups, because if you’re doing a marketplace ad, you want to have a different type of ad for each one of these different target audiences, really similar to how you’re going to have a different paid search ad for a different ad group.

So if you’re Salesforce and we’re advertising CRM and we’re also advertising customer support application, it’s going to be two different ad copies. Really similar, if you’re advertising to friends of fans and to people that are identified as interested in marketing, you need different ad copy. So start by having different ad copy. Perfect.

The next part, you need to determine what content you’re going to use, and you need to post that content. So two different options here. Let’s start with marketplace ads. You get this little 50 x 50 image, and you get about 135 characters over here in the ad. You can treat this really similar to how you treat, say, paid search or display and come up with different ad copies, and work with the different stakeholders within your organization to post these.

But if you’re going to do some of the premium ads and you’ve got this big, sponsored story over here, this is what gets really interesting. With sponsored story, it’s going to be something you post on your company’s Facebook page. So think of this as your company’s Facebook page. You post something, you want to get that piece of content in front of a lot of people.

Let’s say I am going to draw Roger really poorly right now. Let’s say we’ve got a great post. This is the SEOmoz page. We’ve got this great post with Roger talking about all the new features that SEOmoz is coming out with. So now we say, “Okay, here’s this piece of content. This piece of content needs to almost serve two masters.” We need to make all of our fans and followers happy and show them this content, but let’s try to take this content and get it outside of the SEOmoz audience.

So now we want to take this piece of content, and we want to get people in the small business segment to care about this. So we can’t get a new piece of copy if we’re using a sponsored story. We have to take an existing story on the SEOmoz page, but we want to get it in front of small businesses, people in marketing, fans, friends of fans. So you build all of your different targets, and you choose to sponsor this story. But with this, you have less control.

So two options, but the great thing about Facebook, honestly, you want to do both. It’s not an if-then. You can use your marketplace ads to really customize and put the custom content in front of all these audiences. But with sponsored stories, you’re taking up more real estate. You’re in the center of the page, so it’s a really good way to attract people’s attention. If you are using friends of fans and advertising to friends of fans and any of these people comment, it’s going to have that little bit of extra engagement over there. So try sponsored stories and look into marketplace ads as well.

If you’re using marketplace ads, the Facebook interface is still being developed, so it’s a little bit hard to clone ads. It’s not super easy to use. So you can end up – I will do the Salesforce pitch – using a tool like Salesforce Marketing Cloud, because it lets you clone these ads. So instead of making 50 different ads for 50 different targets, you take one ad, you clone it and make changes. It just helps you move a little bit more quickly. There are lots of different options for doing that as well.
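The clone-and-tweak workflow above is easy to picture in code. This is a minimal sketch with hypothetical field names (not actual Facebook API fields): one base ad gets deep-copied per target so each audience has its own independently editable copy.

```python
import copy

# Hypothetical base ad definition; the field names are illustrative.
base_ad = {
    "title": "Try SEOmoz Free for 30 Days",
    "body": "Grow your small business with SEO software.",
    "image": "roger_50x50.png",
    "target": None,
}

targets = ["friends of fans", "fans", "small business", "marketing", "the Internet"]

def clone_for_targets(base, targets):
    """Clone one base ad per target audience. deepcopy keeps each
    clone independent, so editing one never touches the others."""
    ads = []
    for t in targets:
        ad = copy.deepcopy(base)
        ad["target"] = t
        ads.append(ad)
    return ads

ads = clone_for_targets(base_ad, targets)
# Then tweak individual copies, e.g. different copy for the marketing audience:
ads[3]["body"] = "SEO software built for marketers."
```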

We figure out our content and we post our content. So for sponsored stories, we put a great piece of content front and center on the page. We promote it to all of our different fans and followers, but know that it’s going to show up in the center of their feed as an option. Also build your marketplace ad campaign where you’ve got individual ads for each target.

Here’s an interesting trick that people don’t think about, and it’s my one, major tip for everyone. If you are doing premium content and you do have this sponsored story over here, say you have content that you don’t necessarily want your fans and followers to see. Say you really just want to post your great piece of content to advertise to people in marketing. You’re okay if your fans and followers see it, but it’s not really for them, it’s really advertising content. You could actually backdate that story so that your fans and followers don’t really see this unless they scroll all the way back, which some of your fans and followers might. But backdate your content so that this doesn’t show up on your main company page, but you can still use it in advertising to some of your different audiences. So that’s my one tip for you.

Once you determine and post your content, so let’s say we’re going to do sponsored stories and marketplace ads. Perfect. Then the next and most important step is testing and optimization. No one’s going to get everything right on their first try, and that’s okay. You’re not supposed to get everything right. You just need to move really, really quickly, and the larger your budget, the more quickly you should be able to test and optimize.

Now, with Facebook ads, we find that the ads actually burn out pretty quickly. So if you’re using a sponsored story and you’re advertising to the same audience, first of all, make sure you set up frequency caps when you work with your Facebook team, because you don’t want to show the same person an ad 15 times in the middle of their feed. That really does get kind of irritating, and the effectiveness of your ads starts to decrease. So you don’t want to do that. So set frequency caps and also start to rotate your content, especially on the marketplace ads on the right-hand rail. Start to rotate out the image. Start to rotate out your copy.

Depending on how much advertising you’re doing and the size of your budget, it could be as little as two days. You might be able to get away with a week or two. But as long as you’re monitoring results, you’ll start to see performance over here. Let’s say you’re tracking leads. You’re tracking leads, so leads start to go up, and then they start to peak and fall off as impressions go up. You want to find that point where your impressions are going up but your performance is dropping. Once you reach that point, it’s time to switch out your ad because people have seen it. Anyone who’s going to respond to it already has. So make sure you know when to pull your ad copy, and that’s really reporting, doing your analysis and the whole time, what you should be doing is testing and optimizing.
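That burn-out point — impressions still climbing while response falls off — can be detected with a simple rule. A minimal sketch, assuming you track daily impressions and leads per ad; the 50%-of-peak cutoff is an illustrative choice, not a Facebook rule:

```python
def burned_out(daily_impressions, daily_leads, threshold=0.5):
    """Flag an ad as burned out once its daily lead rate
    (leads per impression) falls below `threshold` times the
    best rate seen so far."""
    peak_rate = 0.0
    for imps, leads in zip(daily_impressions, daily_leads):
        if imps == 0:
            continue
        rate = leads / imps
        peak_rate = max(peak_rate, rate)
        if peak_rate > 0 and rate < threshold * peak_rate:
            return True
    return False

# Leads peak, then fall off even as daily impressions keep climbing:
imps = [1000, 2000, 3000, 4000, 5000]
leads = [20, 50, 60, 30, 10]
```

Once `burned_out` fires, that is the signal to swap in fresh copy or a new image, since anyone who was going to respond to the old ad already has.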

Think of it the way you think of paid search. You’re advertising to friends and fans. You don’t want to just give them one ad. You want to rotate through different titles, different images, different copy, different offers to see what’s really going to perform best, same way you’re going to do this with, let’s say, a paid search campaign or a display advertising campaign. So make sure you test and optimize. Let’s say lots of testing. We like testing.

Then this is the thing that’s really different with Facebook advertising. We’re going to think all the way back – and everyone will make fun of me for making this statement – think about when TV first came out, because I was alive when TV first came out. It’s a joke. Picture when TV first came out, and people all go home and they’re sitting and watching their three channels on television. But the networks had to make money, so they started putting on commercials.

People hated commercials on television. People hated it, and they would complain about commercials interrupting television. But you have to have commercials on television or else television couldn’t exist in the early days of TV, because they needed a way to make money. Now we have cable, and you have to pay for channels, which is a whole different model. But when TV first came out, people hated commercials. Even when email first came out, even email now, a lot of people really hated getting emails and were complaining about emails. It became a little easier with email because there was the option to unsubscribe.

Now think about Facebook. We’ve spent years on Facebook without really having to deal with a lot of advertising. Ads in the right-hand rail came first, and ads in the middle of your feed are really just coming out. So people in general are very taken aback by this. They’re not sure what to do about ads in their feed, and not everyone’s going to like having ads in their feed, the same way not everyone likes receiving an email from a potential company they’re going to buy from, the same way people hated commercials when television first came out.

The biggest difference with this is, because you can comment on this, you’ll start to see how much people don’t like advertising. It won’t necessarily be about your product. It will be that they don’t like that Facebook is offering advertising. Facebook has to make money. They’re a publicly traded company, and they’re going to try to figure out different ways they can make money. But you need to know that this might impact your ads.

If you run a TV commercial and someone doesn’t like it, there’s not a lot they can do about it. If you send an email and someone doesn’t like it, they can unsubscribe. If you run an ad and it shows up in someone’s feed and they don’t like it, they can write a comment about it. If you’re doing a lot of advertising, you might start to see a lot of negative comments.

What’s really fun about these is you’ll also get a lot of likes. You’ll get a lot of shares, and if the content is engaging and people care about the content, if you’re advertising to SMBs or people in marketing and you’re giving them content that tells them how to help their business and really gives them useful information, you’ll get a lot of likes, and you’ll get a lot of shares and people will be really happy with it. But you can’t make everyone happy, and some people will just start to leave negative comments.

As a company, before you really launch this type of campaign, you need to think about what you’re going to do about those negative comments. Are you going to engage with them? Are you going to respond? What’s your threshold? What are you comfortable with? If people start having a lot of negative comments about your ad, do you pull the ad? Do you try to talk to all of these people? Do you say, “You know what? I don’t care about negative comments?” What’s your threshold?

So figure out your engagement strategy and how you’re going to monitor it before you launch the campaign. Otherwise, you’ll launch all of this, you’ll run this sponsored story, and hopefully you’ll make tons and tons of money from all of the advertising that you’re doing. You’ll sell shoes, you’ll sell Moz licenses, you’ll be really successful, but then you’ll see all these negative comments, and suddenly maybe you won’t be as successful, and maybe all those negative comments are a bigger deal for your business than the potential upside.

So think about that first and know what you’re willing to deal with, what you’re comfortable with, and then what your engagement and response strategy is before you launch. If you do that and you start off with goals, you go after those targets, you optimize, because maybe some of these targets don’t work. Maybe people who like the Internet don’t like SEOmoz. It’s weird. You think they would, but what if they don’t? You have to know which lines you’re willing to get rid of, which comes from the optimization piece. If someone’s not happy, if you’ve got a great community team that’s really engaging, you can start turning some of those negative comments into real sales opportunities.

So that’s my how to get started with Facebook advertising. A few quick steps to follow, but for anyone who’s looking to try it, you can sign up for marketplace ads with a credit card, give it a shot – a lot of times they offer a little bit of free Facebook money – and see what happens and see what works for your business.

Thank you, everybody. I am Lauren Vacarello. Take care.”

Video transcription by

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


How to Live Tweet Like a Pro

Posted by RuthBurr

Those of you who follow me on Twitter have probably noticed that I live-tweet the conferences I go to. Extensively. Some people love it, some people hate it – but if you want to start live-tweeting for yourself, here are some things to keep in mind.

Why I Live Tweet

I started live tweeting events a couple of years ago, when I realized that I was spending as much time and effort tweeting out the most relevant points of the session I was in as I spent taking notes – plus, the notes I took were less relevant than my tweets, since I was only tweeting out the best parts!

Once I committed to live tweeting conferences, I got a lot of great, positive feedback about it from other attendees, so I kept on going. I’ve also gotten the bulk of my followers through live tweeting; it can be a great way to build your personal brand at conferences and get increased visibility with attendees and speakers alike. Live tweeting doesn’t just build your brand among attendees of the conference, either. People who are trying to follow along at home via the conference hash tag are often even bigger fans of quality live tweets.

There’s a noticeable uptick in people who read my name badge and say “oh, you’re Ruth Burr!” at the end of a conference compared to the beginning (when they usually just say “nice to meet you”).

@ruthburr Cheers for all the tweets, they are better than my notes, and much neater ;)

— Kingsland Linassi (@kingslandlinass) March 15, 2013

A big thanks for @ruthburr for live tweeting at #LinkLoveAppreciate it! :)

— Dennis Seymour (@denseymour) March 15, 2013

So that’s nice.

Why You Might Not Want to Live Tweet

A few caveats before we get in to the nitty-gritty of quality live Twitter coverage:

You will lose followers. When I’m covering a conference, I’m tweeting multiple times per minute, all day. That can really blow up someone’s Twitter feed. I usually encourage my followers to mute me or the conference hash tag if they don’t want to be inundated, but some people just choose to unfollow – and some of those people don’t re-follow after the conference is over.

Here are my daily follow/unfollow numbers from the last 60 days, courtesy of Followerwonk:

Live tweeting impact on followers

As you can see, I get the most new followers on days I’m live tweeting, but I get the most unfollows on those days as well. With the 31 followers I lost during SearchFest, my 54 new followers start to look a lot more like 23. I’m still at a net gain of followers, but if you’re not prepared to (permanently) lose some followers (especially those who aren’t in the search industry), live tweeting may not be for you.

It takes a ton of energy. Conferences can already be really draining, between the late nights, the “always on” networking conversations and the stress of trying to still get some work done while you’re there. Live tweeting takes a surprising amount of energy: the bulk of your focus needs to be on the session, not on the session + your work email + your slides for later in the day + Facebook. Tweeting live also means that even if a session is really boring or not at all useful to you, you can’t take a nice relaxing mental break and zone out or work on something more important.

You’re reporting the news, not making it. That’s something that can get lost in translation through retweets and replies. You’re going to get clarifying questions and dissenting opinions about things you didn’t even say (or necessarily agree with). No matter how many times you say “I didn’t say it, Duane Forrester did. I’d suggest asking him if you need more information,” some people are still going to get hung up on the idea that you’re the one advocating a particular position. It can get sticky.

You’ll probably get rate limited. I usually end up unable to tweet for at least an hour per conference, because the Twitter API has blocked me for tweeting too many times in too short a period.

So! Caveats firmly in place, let’s talk about:

How to Provide Value via Live Tweets

  • Provide as much context as you can. Take this tweet from SearchFest:

    Agility: Kinect was for games 1st, ppl hacked it, MSFT provided an SDK for ppl to build what they want @melcarson #searchfest

    — Ruth Burr (@ruthburr) February 22, 2013

    Just adding the word “Agility” to the beginning of the tweet puts the entire factoid into the context in which Mel was using it. That makes the tweet easier to read and understand outside of the context of other conference tweets. Which brings me to:

  • Think about the retweet. Each piece of information you tweet needs to be able to survive on its own, independent of the tweets that preceded or followed it. When you get retweeted, the new audience viewing that tweet may not have seen your other tweets on the topic: make sure that tweet will make sense to them, too.
  • Numbers are gold. When someone cites a statistic in their talk, tweeting the specific numbers they mention really increases the relevance of your tweet.

    Sites that regularly post content w/video have 200-300% more new visitors and 2x time on page – key signs of relevance @thetoddhartley #SMX

    — Ruth Burr (@ruthburr) March 12, 2013

  • Don’t try to live tweet anecdotes. Speakers will often use illustrative examples in their talks, whether they’re passing anecdotes or full-on case studies. These can be extremely hard to live tweet. Remember to stick to the rules above. It’s OK to sum up a two-minute anecdote or case study into one or two tweets that are focused on the point.
  • Capture as many URLs as you can. If someone includes a link on a slide, I’ll usually type that out first and then write the tweet context around it, in case they change the slide before I can write it down (this is especially important with links). Want to go above and beyond? If someone mentions a great article but doesn’t include the link, Google the piece and provide the link yourself. That way you’re adding extra value with your tweets.
  • Give shout-outs. Any time someone mentions a tool, tweet that out. If you know that company’s Twitter handle, include them with an @ mention. Do the same for people. People love hearing about new tools to use, and businesses and individuals alike love hearing they got a shout-out in a presentation. Doing this also gets you on the radar of people who might not even be following the conference.
  • Watch the conference hash tag. In addition to tweeting out the session you’re attending, keep an eye on the tweets coming out of other sessions. When you see a juicy, highly-retweetable tweet come out, retweet it! Now you’re providing information on other sessions, too. Speaking of which:
  • Use the conference hash tag and speaker handles. I usually end each conference tweet with the speaker’s twitter handle and the conference hash tag. It helps mitigate the “I don’t make the news, I just report it” factor I mentioned earlier, plus it’s important to give credit to where credit’s due. Most of the time I’ll just copy the speaker handle and hash tag from my first tweet and then paste them at the end of each tweet (be careful there aren’t any typos when you copy, though – I spent half of Marty Weintraub’s MozCon session accidentally tweeting him as @aimcear instead of @aimclear).
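The copy-and-paste habit in the last point above can be sketched as a tiny helper that appends the speaker handle and hash tag for you, and trims the body so the result fits Twitter’s limit (140 characters under the 2013-era rules this post was written for). The function name and example strings are illustrative:

```python
def with_suffix(text, handle, hashtag, limit=140):
    """Append the speaker handle and conference hashtag to a tweet,
    trimming the body with an ellipsis if the result would exceed
    the character limit."""
    suffix = f" {handle} {hashtag}"
    room = limit - len(suffix)
    if len(text) > room:
        text = text[: room - 1].rstrip() + "…"
    return text + suffix

tweet = with_suffix(
    "Numbers are gold: tweet the specific stats a speaker cites",
    "@ruthburr", "#searchfest",
)
```

Pre-building the suffix once per session also avoids the copy-paste typo problem (no more @aimcear).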

One tool I’ll often use for live-tweeting conferences is TweetChat. It allows you to track just the tweets coming from one hash tag, and will automatically add the tag to the end of every tweet you post from the tool.

Other than that, I don’t use many tools for live tweeting – I’m usually just using the Twitter app for Mac. I use keyboard shortcuts for “new tweet” and “post tweet” to save a bit of time.

The last thing you’ll really need to be able to live tweet a full conference is the ability to type very fast, with few mistakes, and without looking at your hands or, necessarily, the screen. I don’t have any good recommendations for tools/programs to use to learn to type faster; I learned to type really fast by getting in a lot of arguments with people over instant messenger in high school and college, so you could try that. If anybody has any suggestions for programs to hone your typing skills, I’d love to see them in the comments!

Happy live tweeting everybody!



Barnacle Reviews on Google+ Local

Posted by David Mihm

Since Google+ Local was released last May, it’s safe to say that everyone in the local search community — business owners and agencies alike — has been waiting with bated breath for the launch of Google’s rumored “Business Builder” dashboard. For whatever reason, it still isn’t out yet, but while you’re waiting, there’s no reason you can’t take advantage of the most underrated feature of Google+: the ability to interact on Google+ as a business page. And in particular, to leave reviews of other businesses as your business page.

Why leave reviews as a page?

Business owners, if this concept doesn’t immediately make sense to you, think of it like this: you probably go to networking events with your local chamber of commerce, Rotary club, or your industry trade group all the time. When you go to these events, you’re likely wearing your “business owner” hat, rather than your “weekend warrior” or “soccer mom” hat.

That’s essentially what this feature allows you to do: network socially with your “business owner” hat on, rather than your personal hat. Just like you would refer business to other business owners you trust and admire in these networking environments, the idea behind page-to-page recommendations on social networking sites works the same way.

Facebook gave its page users this functionality years ago, and many of you are likely accustomed to leaving comments on other Facebook pages and generally interacting with their community as their page rather than an individual profile. You may not have known, though, that you can do the same thing on Google+.

Why “Barnacle” reviews?

As far as I know, Search Influence‘s Will Scott was the pioneer of this concept in local search, which he defined as:

“Attaching oneself to a large fixed object and waiting for the customers to float by in the current.”

As most of you would probably admit, it’s hard work to optimize a local business website/Plus page/etc. So why not leverage pages that are already visible in your markets for your own visibility? That’s the idea behind Barnacle SEO.

Will’s original concept applied to link building to prominent Internet Yellow Pages profiles like Yelp business pages or Yahoo Local listings to increase the rankings of those profiles. As Facebook became more popular, he also applied the idea to Facebook conversations on popular pages in a given community (such as the home of your local newspaper or major/minor league sports team).

The problem is that with Facebook’s Timeline interface, comments and conversations drop “below the fold” awfully quickly, especially on popular pages with lots of conversations.

The results on Google+ Local pages, when done well, can yield much “stickier” results.

Getting started: using Google+ as your page

This part is pretty easy. Simply go to and log in with the Google Account under which you claimed your page. At the top right-hand side, you’ll see a dropdown that shows the pages on which you’re an admin. Simply select the name of your page. Google will then take you to that page, and when it does, you should see the icon of the page show up at the top right-hand side (rather than your personal profile photo).

You’re now using Google+ as your business!

Getting your feet wet: reviewing friendly businesses

Going back to the Rotary club analogy, you probably already have a network of existing businesses that you refer friends and clients to in the offline world — pay it forward and put your speech about why you would refer people to them out there for the entire Internet to see.

Chances are, when they Google themselves, they’ll see your business’ review right at the top of the list and might even leave YOU a review once they notice it.

Here’s an example of this in action with my friend Mike Ramsey’s business. You’ll see, because he doesn’t have that many reviews for his newspaper site, my face-for-radio shows up publicly right at the top of his list.

Kicking it up a notch: finding popular businesses

OK, that was simple enough. But most of your friends aren’t likely to run tremendously popular businesses that are getting a lot of traffic from search, let alone organic activity on Google+. You want to identify who the most popular businesses are in your market. You probably have some idea of what they are already, but here are some algorithmically-influenced ways to find them.

1) Perform a search for “things to do” in your market

Google is showing more and more of these carousel-style results for these searches every day. The businesses and points of interest shown in this carousel tend to be the ones that get the most visibility on Google+.

2) See what businesses Google recommends at

Visit and see who Google shows to the left of the map — both in text and image format. Again, these are likely to be popular businesses with lots of visibility on Google’s local products.

3) See where top reviewers are going 

Hat tip to my previously-mentioned friend Mike Ramsey of Nifty Marketing whose team authored this excellent piece earlier this week about how to find top reviewers on Google+ Local. Just follow the instructions in that post, and you’ll get a screen like this. Chances are, most of the places visited by top reviewers are pretty popular.

4) See what places are popular on Foursquare

Visit and see what businesses are mentioned when you search for “best nearby.” These places are going to have a lot of visibility among techies, which is good for a variety of reasons that I won’t go into in this post.

Finishing things off: reviewing those businesses

So, the final step in the process is to leave a review of those top businesses. I don’t have any earth-shattering tips for best practices when it comes to actually leaving a review, but I will point out that the more effort you put into leaving a killer review, the more likely it is that effort will be rewarded.  

Why is that? Google+ sorts reviews by “Most Helpful” by default. This means that the better your review is, the more likely it is to have staying power over time — which is the whole point of this exercise. You want people to gain real value from your review and have a positive experience when they see your brand for the first time.  

Just like no one wants to talk to an incessant glad-hander or self-promoter at a networking event, no one wants to read reviews that talk about how great their own business is. Just imagine that you’re talking to people face-to-face at one of these events, except instead of a 1:1 interaction, it’s more like a 1:100 or a 1:1000 interaction.  

Note that my business’ review, though I left it over two weeks ago and haven’t asked anyone to mark it as helpful, is still ranking second out of all reviews. Imagine the permanent “stickiness” of a review marked as helpful by even a handful of Google+ users.


Obviously, this technique works best for retail- or hospitality industry businesses, who are probably referring their guests to top attractions anyway, and are most likely to get traffic from out-of-town guests in the process of planning their trips.

But my guess is that (especially) in larger markets, even in-town residents are likely to do “recovery” searches on popular destinations — where Google is increasingly pushing searchers towards Knowledge Graph results and popular reviews from prominent Google+ users.  Make sure your business (or your clients’ businesses) have a chance to gain this “barnacle” visibility.

In the comments, I’d love to hear if anyone has used this technique on their own, or on behalf of their clients, and what the results have been!



SEO Finds In Your Server Log

Posted by timresnik

I am a huge Portland Trail Blazers fan, and in the early 2000s, my favorite player was Rasheed Wallace. He was a lightning-rod of a player, and fans either loved or hated him. He led the league in technical fouls nearly every year he was a Blazer, mostly because he never thought he committed any sort of foul. Many of those technicals came when the opposing player missed a free-throw attempt and ‘Sheed’ passionately screamed his mantra: “BALL DON’T LIE.”

‘Sheed’ asserts that a basketball has metaphysical powers that act as a system of checks and balances for the integrity of the game. While this is debatable (OK, probably not true), there is a parallel to technical SEO: marketers and developers often commit SEO fouls when architecting a site or creating content, but implicitly deny that anything is wrong. 
As SEOs, we use all sorts of tools to glean insight into technical issues that may be hurting us: web analytics, crawl diagnostics, and Google and Bing Webmaster Tools. All of these tools are useful, but there are undoubtedly holes in the data. There is only one true record of how search engine crawlers, such as Googlebot, process your website: your web server logs. As I am sure Rasheed Wallace would agree, logs are a powerful source of oft-underutilized data that helps keep the integrity of your site’s crawl by search engines in check. 
A server log is a detailed record of every action performed by a particular server. In the case of a web server, you can get a lot of useful information. In fact, back in the day before free analytics (like Google Analytics) existed, it was common to just parse and review your web logs with software like AWStats.
I initially planned on writing a single post on this subject, but as I got going I realized that there was a lot of ground to cover. Instead, I will break it into 2 parts, each highlighting different problems that can be found in your web server logs:
  1. This post: how to retrieve and parse a log file, and identifying problems based on your server’s response code (404, 302, 500, etc.).
  2. The next post: identifying duplicate content, encouraging efficient crawling, reviewing trends, and looking for patterns and a few bonus non-SEO related tips. 

Step #1: Fetching a log file

Web server logs come in many different formats, and the retrieval method depends on the type of server your site runs on. Apache and Microsoft IIS are two of the most common. The examples in this post will be based on an Apache log file from SEOmoz. 
If you work in a company with a Sys Admin, be really nice and ask him/her for a log file with a day’s worth of data and the fields that are listed below. I’d recommend keeping the size of the file below 1 GB, as the log file parser you’re using might choke on anything larger. If you have to generate the file on your own, the method for doing so depends on how your site is hosted. Some hosting services store them in your home directory in a folder called /logs and will drop a compressed log file in that folder on a daily basis. You’ll want to make sure it includes the following columns:
  • Host: you will use this to filter out internal traffic. In SEOmoz’s case, RogerBot spends a lot of time crawling the site and needed to be removed for our analysis. 
  • Date: if you are analyzing multiple days this will allow you to analyze search engine crawl rate trends by day. 
  • Page/File: this will tell you which directory and file is being crawled and can help pinpoint endemic issues in certain sections or with types of content.
  • Response code: knowing the response of the server — the page loaded fine (200), was not found (404), the server was down (503) — provides invaluable insight into inefficiencies that the crawlers may be running into.
  • Referrers: while this isn’t necessarily useful for analyzing search bots, it is very valuable for other traffic analysis.
  • User Agent: this field will tell you which search engine made the request and without this field, a crawl analysis cannot be performed.
Apache log files by default are returned without User Agent or Referrer — this is known as a “common” log file. You will need to request a “combined” log file. Make your Sys Admin’s job a little easier (and maybe even impress) and request the following format:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" combined
For Apache 1.3 you just need: CustomLog log/access_log combined
For those who need to manually pull the logs, you will need to create a directive in the httpd.conf file with one of the above. A lot more detail here on this subject.  
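To make the combined format concrete, here is a minimal Python sketch of parsing one such line. The sample line and field names are illustrative, not SEOmoz’s actual tooling:

```python
import re

# Regex for the Apache "combined" log format requested above.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of log fields, or None if the line doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

sample = ('66.249.66.1 - - [10/Jan/2013:06:25:14 -0800] '
          '"GET /blog HTTP/1.1" 200 5123 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

fields = parse_line(sample)
print(fields['host'], fields['status'])
```

Each named group maps directly to one of the columns listed above, so a few lines like this are enough to spot-check that your Sys Admin delivered the right format.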

Step #2: Parsing a log file

You probably now have a compressed log file like ‘mylogfile.gz’ and it’s time to start digging in. There are myriad software products, free and paid, to analyze and/or parse log files. My main criteria for picking one include: the ability to view the raw data, the ability to filter prior to parsing, and the ability to export to CSV. I landed on Web Log Explorer, and it has worked for me for several years. I will use it along with Excel for this demonstration. I’ve used AWStats for basic analysis, but found that it does not offer the level of control and flexibility that I need. I’m sure there are several more out there that will get the job done. 
The first step is to import your file into your parsing software. Most web log parsers will accept various formats and have a simple wizard to guide you through the import. With the first pass of the analysis, I like to see all the data and do not apply any filters. At this point, you can do one of two things: prep the data in the parse and export for analysis in Excel, or do the majority of the analysis in the parser itself. I like doing the analysis in Excel in order to create a model for trending (I’ll get into this in the follow-up post). If you want to do a quick analysis of your logs, using the parser software is a good option. 
Import Wizard: make sure to include the parameters in the URL string. As I will demonstrate in later posts, this will help us find problematic crawl paths and potential sources for duplicate content.
You can choose to filter the data using some basic regex before it is parsed. For example, if you only wanted to analyze traffic to a particular section of your site you could do something like: 
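As an illustration of that kind of pre-filter (the regex and sample lines are made up), the same idea in Python might look like:

```python
import re

# Keep only requests to one section of the site (here /blog)
# before handing lines to the parser.
SECTION = re.compile(r'"(?:GET|POST) /blog')

lines = [
    '1.2.3.4 - - [10/Jan/2013:06:25:14 -0800] "GET /blog/post-1 HTTP/1.1" 200 512 "-" "Googlebot"',
    '1.2.3.4 - - [10/Jan/2013:06:26:02 -0800] "GET /tools HTTP/1.1" 200 512 "-" "Googlebot"',
]

blog_only = [line for line in lines if SECTION.search(line)]
print(len(blog_only))  # 1
```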
Once you have your data loaded into the log parser, export all spider requests and include all response codes:
Once you have exported the file to CSV and opened in Excel, here are some steps and examples to get the data ready for pivoting into analysis and action: 
1. Page/File: in our analysis we will try to expose directories that could be problematic, so we want to isolate the directory from the file. The formula I use to do this in Excel looks something like this: 
=IF(ISNUMBER(SEARCH("/",C29,2)),MID(C29,(SEARCH("/",C29)),(SEARCH("/",C29,(SEARCH("/",C29)+1)))-(SEARCH("/",C29))),"no directory")
2. User Agent: in order to limit our analysis to the search engines we care about, we need to search this field for specific bots. In this example, I’m including Googlebot, Googlebot-Image, BingBot, Yahoo, Yandex, and Baidu. 
Formula (yeah, it’s U-G-L-Y)
=IF(ISNUMBER(SEARCH("googlebot-image",H29)),"GoogleBot-Image", IF(ISNUMBER(SEARCH("googlebot",H29)),"GoogleBot",IF(ISNUMBER(SEARCH("bing",H29)),"BingBot",IF(ISNUMBER(SEARCH("Yahoo",H29)),"Yahoo", IF(ISNUMBER(SEARCH("yandex",H29)),"Yandex",IF(ISNUMBER(SEARCH("baidu",H29)),"Baidu", "other"))))))
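If you would rather do these two steps outside Excel, here is a rough Python sketch of the same transformations. The function names and bot list are my own, not part of the original workflow:

```python
def top_directory(path):
    """Return '/dir/' for '/dir/file', or 'no directory' for root-level files."""
    parts = path.split("?")[0].strip("/").split("/")
    return "/" + parts[0] + "/" if len(parts) > 1 else "no directory"

BOTS = [
    ("googlebot-image", "GoogleBot-Image"),  # must come before plain "googlebot"
    ("googlebot", "GoogleBot"),
    ("bing", "BingBot"),
    ("yahoo", "Yahoo"),
    ("yandex", "Yandex"),
    ("baidu", "Baidu"),
]

def classify_bot(user_agent):
    """Map a raw User-Agent string to a bot label, like the Excel formula."""
    ua = user_agent.lower()
    for needle, label in BOTS:
        if needle in ua:
            return label
    return "other"

print(top_directory("/blog/seo-finds"))                       # /blog/
print(classify_bot("Mozilla/5.0 (compatible; bingbot/2.0)"))  # BingBot
```

Note that, just as in the Excel formula, the more specific “googlebot-image” check has to run before the plain “googlebot” check.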
Your log file is now ready for some analysis and should look something like this:
Let’s take a breather, shall we?

Step # 3: Uncover server and response code errors

The quickest way to suss out issues that search engines are having with the crawl of your site is to look at the server response codes that are being served. Too many 404s (page not found) can mean that precious crawl resources are being wasted. Massive 302 redirects can point to link equity dead-ends in your site architecture. While Google Webmaster Tools provides some information on such errors, they do not provide a complete picture: LOGS DON’T LIE.
The first step to the analysis is to generate a pivot table from your log data. Our goal here is to isolate the spiders along with the response codes that are being served. Select all of your data and go to ‘Data>Pivot Table.’
On the most basic level, let’s see who is crawling SEOmoz on this particular day:
There are no definitive conclusions that we can make from this data, but there are a few things that should be noted for further analysis. First, BingBot is crawling the site at roughly an 80% higher rate than GoogleBot. Why? Second, ‘other’ bots account for nearly half of the crawls. Did we miss something in our search of the User Agent field? As for the latter, we can see from a quick glance that most of the ‘other’ traffic is RogerBot — we’ll exclude it. 
Next, let’s have a look at server codes for the engines that we care most about.
I’ve highlighted the areas we will want to take a closer look at. Overall, the ratio of good to bad looks healthy, but since we live by the mantra that “every little bit helps,” let’s try to figure out what’s going on. 
1. Why is Bing crawling the site at 2x the rate of Google? We should investigate whether Bing is crawling inefficiently (and whether there is anything we can do to help it along), or whether Google is not crawling as deeply as Bing (and whether there is anything we can do to encourage a deeper crawl). 
By isolating the pages that were successfully served (200s) to BingBot, the potential culprit is immediately apparent. Nearly 60,000 of the 100,000 pages that BingBot crawled successfully were user login redirects from a comment link. 
The problem: SEOmoz is architected in such a way that if a comment link is requested and JavaScript is not enabled, it will serve a redirect (returned as a 200 by the server) to an error page. With nearly 60% of Bing’s crawl being wasted on such dead-ends, it is important that SEOmoz block the engines from crawling these URLs. 
The solution: add rel=’nofollow’ to all comment and reply-to-comment links. Typically, the ideal method for telling an engine not to crawl something is a directive in the robots.txt file. Unfortunately, that won’t work in this scenario because the URL is served via JavaScript after the click. 
GoogleBot is dealing with the comment links better than Bing and avoiding them altogether. However, Google is successfully crawling a handful of links that are login redirects. Take a quick look at the robots.txt file and you will see that this directory should probably be blocked. 
2. The number of 302s being served to Google and Bing is acceptable, but it doesn’t hurt to review them in case there are better ways of dealing with some of the edge cases. For the most part, SEOmoz is using 302s for defunct blog category architecture that redirects the user to the main blog page. They are also being used for private message pages (/message), and a robots.txt directive should exclude these pages from being crawled at all. 
3. Some of the most valuable data that you can get from your server logs are links that are being crawled that resolve in a 404. SEOmoz has done a good job managing these errors and does not have an alarming level of 404s. A quick way to identify potential problems is to isolate 404s by directory. This can be done by running a pivot table with “Directory” as your row label and count of “Directory” in your value field. You’ll get something like:
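The same tally can be sketched in Python with a `Counter`; the rows here are hypothetical (directory, response code) pairs exported from the parser:

```python
from collections import Counter

# Hypothetical parsed rows: (directory, response code) pairs.
rows = [
    ("/comments/", 404), ("/comments/", 404), ("/comments/", 404),
    ("/blog/", 404), ("/comments/", 404), ("/blog/", 200),
]

# Count 404s per directory, most frequent first.
not_found_by_dir = Counter(d for d, code in rows if code == 404)
for directory, count in not_found_by_dir.most_common():
    print(directory, count)
```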
The problem: the main issue here is that 90% of the 404s are in one directory, /comments. Given the issues with BingBot and the JavaScript-driven redirect mentioned above, this doesn’t really come as a surprise. 
The solution: the good news is that since we are already using rel=’nofollow’ on the comment links these 404s should also be taken care of. 


Google and Bing Webmaster Tools provide you with information on crawl errors, but in many cases they limit the data. As SEOs, we should use every source of data that is available. After all, there is only one source of data that you can truly rely on: your own. 
And for your viewing pleasure, here’s a bonus clip for reading the whole post.




Back to the Future: Forecasting Your Organic Traffic

Posted by Dan Peskin

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.

Great Scott! I am finally back again for another spectacularly lengthy post, rich with wonderful titles, and this time – statistical goodness. It just so happens, that in my past short-lived career, I was a Forecast Analyst (not this kind). So today class, we will be learning about the importance of forecasting organic traffic and how you can get started. Let’s begin our journey.

I just put this here because it looks really cool.

Forecasting is Your Density. I Mean, Your Destiny

Why should I forecast? Besides the obvious answer (it’s f-ing cool to predict the future), there are a number of benefits for both you and your company.

Forecasting adds value in both an agency and in-house setting. It provides a more accurate way to set goals and plan for the future, which can be applied to client projects, internal projects, or overall team/dept. strategy.

Forecasting creates accountability for your team. It allows you to continually set goals based on projections and monitor performance through forecast accuracy (Keep in mind that exceeding goals is not necessarily a good thing, which is why forecast accuracy is important. We will discuss this more later).

Forecasting teaches you about inefficiencies in your team, process, and strategy. The more you segment your forecast, the deeper you can dive into finding the root of the inaccuracies in your projections. And the more granular you get, the more accurate your forecast, so you will see that segmentation is a function of accuracy (assuming you continually work to improve it).

Forecasting is money. This is the most important concept of forecasting, and probably the point where you decided that you will read the rest of this article.

The fact that you can improve inefficiencies in your process and strategy through forecasting means you can effectively increase ROI. Every hour and resource allocated to a strategy that doesn’t deliver results can be reallocated to something that proves to be a more stable source of increased organic traffic. So finding out what strategies consistently deliver the results you expect means you’re investing money into resources that have a higher probability of delivering you a larger ROI.

Furthermore, providing accurate projections, whether it’s to a CFO, manager, or client, gives the reviewer a more compelling reason to invest in the work that backs the forecast. Basically, if you want a bigger budget to work with, forecast the potential outcome of that bigger budget and sell it. Sell it well.

Okay. Flux Capacitor, Fluxing. Forecast, Forecasting?

Contraption that I have no clue what it does

I am going to make the assumption that everyone’s DeLorean is in the shop, so how do we forecast our organic traffic?

There are four main factors to account for in an organic traffic forecast: historical trends, growth, seasonality, and events. Historical data is always the best place to start when creating your forecast. You will want to have as many historical data points as possible, but the accuracy of the data should come first.

Determining the Accuracy of the Data

Once you have your historical data set, start analyzing it for outliers. An outlier to a forecast is what Biff is to George McFly, something you need to punch in the face and then make wash your car 20 years in the future. Well something like that.

The quick way to find outliers is to simply graph your data and look for spikes in the graph. Each spike is associated with a data point, which is your outlier, whether it spikes up or down. This way does leave room for error, as the determination of outliers is based on your judgment and not statistical significance.

The long way is much more fun and requires a bit of math. I’ll provide some formula refreshers along the way.

Calculating the mean and the standard deviation of your historical data is the first step.


Mean: x̄ = Σxᵢ / n

Standard deviation: s = √( Σ(xᵢ − x̄)² / (n − 1) )

Looking at the standard deviation can immediately tell you whether you have outliers or not. The standard deviation tells you how close your data falls near the average or mean, so the lower the standard deviation, the closer the data points are to each other.

You can go a step further and set a rule by calculating the coefficient of variation (COV). As a general rule, if your COV is less than 1, the variance in your data is low and there is a good probability that you don’t need to adjust any data points.

Coefficient of Variation: COV = s / x̄ (the standard deviation divided by the mean)

If all the signs point to you having significant outliers, you will now need to determine which data points those are. A simple way to do this is calculate how many standard deviations away from the mean your data point is.

Unfortunately, there is no clear cut rule to qualify an outlier with deviations from the mean. This is due to the fact that every data set is distributed differently. However, I would suggest starting with any data point that is more than one deviation from the mean.
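As a quick sketch of these rules of thumb (the visit numbers are invented), in Python:

```python
import statistics

# Monthly organic visits (made-up numbers).
visits = [100000, 102000, 98000, 150000, 101000, 99000]

mean = statistics.mean(visits)
sd = statistics.stdev(visits)   # sample standard deviation
cov = sd / mean                 # coefficient of variation

# Flag any month more than one standard deviation from the mean.
outliers = [v for v in visits if abs(v - mean) > sd]
print(outliers)  # the 150,000 spike is the only outlier
```

Here COV is well under 1, yet the one-deviation rule still surfaces the November-style spike, which is exactly why it pays to check individual points rather than rely on COV alone.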

Making your decision about whether outliers exist takes time and practice. These general rules of thumb can help you figure it out, but it really relies on your ability to interpret the data and be able to understand how each data point affects your forecast. You have the inside knowledge about your website, your equations and graphs don’t. So put that to use and start making your adjustments to your data accordingly.

Adjusting Outliers

Ask yourself one question: should we account for this spike? Having spikes or outliers is normal; whether you need to do anything about them is what you should be asking yourself now. You want to use that inside knowledge of yours to determine why the spike occurred, whether it will happen again, and ultimately whether it should be accounted for in your future forecast.

Organic Search Traffic Graph

In the case that you don’t want to account for an outlier, you will need to accurately adjust it down or up to the number it would have been without the event that caused the anomaly.

For example, let’s say you launched a super original infographic about the Olympics in July last year that brought your site an additional 2,000 visits that month. You may not want to account for this as it will not be a recurring event or maybe it fails to bring qualified organic traffic to the site (if the infographic traffic doesn’t convert, then your revenue forecast will be inaccurate). So the resulting action would be to adjust the July data point down 2,000 visits.

On the flipside, what if your retail electronics website has a huge positive spike in November due to Black Friday? You should expect that rise in traffic to continue this November and account for it in your forecast. The resulting action here is to simply leave the outlier alone and let the forecast do its business. (This is also an example of seasonality, which I will talk about more later.)

Base Forecast

When creating your forecast, you want to create a base for it before you start incorporating additional factors. The base forecast is usually a flat forecast, or a line straight down the middle of your charted data. In terms of numbers, this can simply mean using the mean for every data point. The line down the middle of the data follows the trend of the graph, so this would be the equivalent of the average but accounting for slope too. Excel provides a formula which actually does this for you:

=FORECAST(x, known_y's, known_x's)

Given the historical data, Excel will output a forecast based on that data and the slope from the starting point to the end point. Depending on your data, your base forecast could be where you stop, or where you begin developing a more accurate forecast.
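If you want to sanity-check Excel’s output, FORECAST is just simple linear regression, which can be re-implemented in a few lines. This is a sketch of the same math, not Excel’s actual code:

```python
def forecast(x, known_ys, known_xs):
    """Linear-regression forecast, equivalent to Excel's FORECAST."""
    n = len(known_xs)
    mean_x = sum(known_xs) / n
    mean_y = sum(known_ys) / n
    slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(known_xs, known_ys))
             / sum((xi - mean_x) ** 2 for xi in known_xs))
    intercept = mean_y - slope * mean_x
    return intercept + slope * x

# e.g. six months of visits (in thousands), forecasting month 7
visits = [100, 110, 120, 130, 140, 150]
print(forecast(7, visits, [1, 2, 3, 4, 5, 6]))  # 160.0
```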

Now how do you improve your forecast? It’s a simple idea: account for anything and everything the data might not be able to account for. You don’t need to go overboard here; I would draw the line well before you start forecasting the decrease in productivity on Fridays due to beer o’clock. I suggest accounting for three key factors and accounting for them well: growth, seasonality, and events.


Growth

You have to have growth. If you aren’t planning to grow anytime soon, then this is going to be a really depressing forecast. Including growth can be as simple as adding 5% month over month, based on a higher-level estimate from management, or as detailed as estimating incremental search traffic by keyword from significant ranking increases. Either way, the important part is being able to back your estimates with good data and knowing where to look for it. With organic traffic, growth can come from a number of sources, but here are a couple of key components to consider:

Are you launching new products?

New product being built by Doc Brown

New products mean new pages, and depending on your domain’s authority and your internal linking structure, you can see an influx of organic traffic. If you have analyzed the performance of newly launched pages, you should be able to estimate on average what percentage of search traffic from relevant and target keywords they can bring over time.

Using Google Webmaster Tools CTR data and the Adwords Tool for search volume is your best bet to acquire the data you need for this estimate. You can then apply this estimate to search volumes for the keywords that are relevant to each new product page and determine the additional growth in organic traffic that new product lines will bring.

Tip: Make sure to consider your link building strategies when analyzing past product page data. If you built links to these pages over the analyzed time period, then you should plan on doing the same for the new product pages.
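As a rough back-of-the-envelope sketch of the arithmetic described above (all numbers are made up):

```python
# Expected organic visits to a new product page:
# keyword search volume x expected CTR at the expected ranking position.
monthly_search_volume = 5000   # hypothetical figure from the Adwords tool
expected_ctr = 0.04            # hypothetical average CTR from GWT data

expected_monthly_visits = monthly_search_volume * expected_ctr
print(expected_monthly_visits)  # 200.0
```

Summing this per-page estimate across every planned product page gives the growth number to add to the forecast.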

What ongoing SEO efforts are increasing?

Did you get a link building budget increase? Are you retargeting several key pages on your website? These things can easily be factored in, as long as you have consistent data to back them up. Consistency in strategy is truly an asset, especially in the SEO world. With the frequency of algorithm updates, people tend to shift strategies fairly quickly. However, if you are consistent, you can quantify the results of your strategy, use them to improve your strategy, and understand its effects on the applied domain.

The general idea here is that if you know historically the effect of certain actions on a domain, then you can predict how relative changes to the domain will affect the future (given there are no drastic algorithm updates).

Let’s take a simple example. Say you build 10 links to a domain per month, and the average Page Authority is 30 and the Domain Authority is 50 for the targeted pages and domain when you started. Over time, you see your organic traffic increase by 20% for the pages you targeted in this campaign. So if your budget increases and allows you to apply the same campaign to other pages on the website, you can estimate an increase in organic traffic of 20% to those pages.

This example assumes the new target pages have:

  • Target keywords with similar search volumes
  • Similar authority prior to the campaign start
  • Similar existing traffic and ranking metrics
  • Similar competition

While this may be a lot to assume, it serves the purpose of the example. These are the things that will need to be considered, and these are the types of campaigns that should be invested in from an SEO standpoint. When you find a strategy that works, repeat it and control the factors as much as possible. This will provide an outcome that is the least likely to diverge from expected results.


Seasonality

To incorporate seasonality into an organic traffic forecast, you will need to create seasonal indices for each month of the year. A seasonal index is an index of how that month’s expected value relates to the average expected value. So in this case, it would be how each month’s organic traffic compares with the average or mean monthly organic traffic.

So let’s say your average organic traffic is 100,000 visitors per month and your adjusted traffic for last November was 150,000 visitors; then your index for November is 1.5. In your forecast, you simply multiply the base value for a given month by that month’s index.
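Using the numbers from the example above (the base forecast value is a made-up addition), the index calculation is just:

```python
# Seasonal index: month's adjusted traffic relative to the monthly mean.
avg_monthly_visits = 100000   # mean monthly organic traffic
november_visits = 150000      # adjusted historical traffic for November

index_nov = november_visits / avg_monthly_visits   # 1.5

# Apply the index to a hypothetical base forecast for November.
base_forecast_nov = 105000
seasonal_forecast_nov = base_forecast_nov * index_nov
print(seasonal_forecast_nov)  # 157500.0
```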

To calculate these seasonal indices, you need data of course. Using adjusted historical data is the best solution, if you know that it reflects the seasonality of the website’s traffic well.

Remember all that seasonal search volume data the Adwords tool provides? That can actually be put to practical use! So if you haven’t already, you should probably get with the times and download the Adwords API excel plugin from SEOgadget (if you have API access). This can make gathering seasonal data for a large set of keywords quick and easy.

What you can do here is gather data for all the keywords that drive your organic traffic, aggregate it, and see whether the trends in search volume align with the seasonality you are observing in your adjusted historical data. If there is a major discrepancy between the two, you may need to dig deeper into why, or shy away from accounting for seasonality in your forecast.


Events

This one should be straightforward. If you have big events coming up, find a way to estimate their impact on your organic traffic. Events can be anything from a yearly sale, to a big piece of content being pushed out, to a planned feature on a big media site.

All you have to do here is determine the expected increase in traffic from each event you have planned. This all goes back to digging into your historical data. What typically happens when you have a sale? What’s the change in traffic when you launch a huge content piece? If you can get an estimate of this, just add it to the corresponding month when the event will take place.
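Layering planned events onto a monthly forecast is just an addition per affected month. A short sketch, where the month keys, event lift, and traffic figures are all hypothetical estimates of the kind you would derive from historical data:

```python
# Add each event's estimated lift to the month in which it occurs.
# All figures below are hypothetical.

forecast = {"2013-06": 110_000, "2013-07": 112_000, "2013-08": 115_000}
event_lifts = {"2013-07": 8_000}   # e.g. a yearly sale, estimated from history

adjusted = {month: visits + event_lifts.get(month, 0)
            for month, visits in forecast.items()}
# adjusted["2013-07"] == 120000; other months are unchanged
```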

Once you have this covered, you should have the last piece to a good looking forecast. Now it’s time to put it to the test.

Forecast Accuracy

So you have looked into your crystal ball and finally made your predictions, but what do you do now? Well the process of forecasting is a cycle and you now need to measure the accuracy of your predictions. Once you have the actuals to compare to your forecast, you can measure your forecast accuracy and use this to determine whether your current forecasting model is working.

There is a basic formula you can use to compare your forecast to your actual results, which is the mean absolute percent error (MAPE):

MAPE = (100 / n) * Σ |A_t − F_t| / A_t, where A_t is the actual value in period t, F_t is the forecast value for that period, and n is the number of periods.

This formula requires you to calculate the mean of the absolute percent error for each time period, giving you your forecast accuracy for the total given forecast period.

Additionally, if your forecast accuracy is low, you will want to analyze the error for individual periods. Looking at the percent error month by month will allow you to pinpoint where the largest error in your forecast lies and help you determine the root of the problem.
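The MAPE calculation, with the per-period errors kept so a bad month can be pinpointed, can be sketched in Python (the actual and forecast figures are illustrative):

```python
# MAPE: mean of the absolute percent errors across all periods.
# Keeping the per-period errors lets you spot the worst month.

def percent_errors(actuals, forecasts):
    return [abs(a - f) * 100 / a for a, f in zip(actuals, forecasts)]

def mape(actuals, forecasts):
    errors = percent_errors(actuals, forecasts)
    return sum(errors) / len(errors)

actuals   = [100_000, 120_000, 150_000]  # hypothetical monthly actuals
forecasts = [ 90_000, 126_000, 150_000]  # the corresponding forecasts

print(percent_errors(actuals, forecasts))  # [10.0, 5.0, 0.0]
print(mape(actuals, forecasts))            # 5.0
```

Here the first month carries the largest error (10%), so that is where you would start digging.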

Keep in mind that accuracy is crucial if organic traffic is a major source of product revenue for your business. This is where exceeding expectations can be a bad thing: if you exceed your forecast, you can end up with stock-outs on products and a loss of potential revenue.

Consider the typical online consumer: do you think they will wait to purchase your product on your site if they can find it somewhere else? Online shoppers want immediate results, so making sure you can fulfill their orders makes for better customer service and fewer bounces on product pages (which, as we know, can affect rankings).

Google Results for Vizio 19in

Walmart Vizio TV

Top result for this query is out of stock, which will not help maintain that position in the long term.

Now this doesn’t mean you should over-forecast; there is a price to pay at both ends of the spectrum. Inflating your forecast means you could bring in excess inventory tied to those inflated expectations. That adds unnecessary inventory expenses, such as increased storage costs, and ties up cash flow until the excess product is sold. Depending on product life cycles, continuing this practice can leave you with an abundance of obsolete product and serious financial problems.

So once you have measured your forecast against actuals and considered the above, you can repeat the process more accurately and refine your forecast! Well, this concludes our crash course in forecasting and how to apply it to organic traffic. So what are you waiting for? Start forecasting!

Oh and here is a little treat to get you started.

Are you telling me you built a time machine…in Excel?

Well no, Excel can’t help you time travel, but it can help you forecast. The way I see it, if you’re gonna build a forecast in Excel, why not do it in style?

I decided that your brain has probably gone to mush by now, so I am going to help you on your way to forecasting until the end of days. I am providing a stylish little Excel template that has several features, but I warn you: it doesn’t do all the work.

It’s nothing too spectacular, but this template will put you on your way to analyzing your historical data and building your forecast. Forecasting isn’t an exact science, so naturally you need to do some work and make the call on what needs to be added to or subtracted from the data.

What this Excel template provides:

  • Lets you plug in the last two years of monthly organic traffic data and see a number of statistical calculations that allow you to quickly analyze your historical data.
  • Provides the frequency distribution of your data.
  • Highlights the data points that are more than a standard deviation from the mean.
  • Provides the metrics we discussed (mean, growth rate, standard deviation, etc.).

Oh wait there’s more?

The expression on your face right now.

Yes. Yes. Yes. This simple tool will graph your historical and forecast data, provide you with a base forecast, and a place to easily add anything you need to account for in the forecast. Lastly, for those who don’t have revenue data tied to Analytics, it provides you with a place to add your AOV and Average Conversion Rate to estimate future organic revenue as well. Now go have some fun with it.
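The revenue estimate the template makes from AOV and conversion rate is simply forecast traffic × average conversion rate × average order value. A quick sketch with hypothetical numbers:

```python
# Estimated organic revenue from a traffic forecast.
# All three inputs below are hypothetical.

forecast_visits = 50_000
avg_conversion_rate = 0.02   # 2% of visits convert
avg_order_value = 85.0       # average order value (AOV), in dollars

estimated_revenue = forecast_visits * avg_conversion_rate * avg_order_value
print(estimated_revenue)  # approximately $85,000
```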


Obviously we can’t cover everything you need to know about forecasting in a single blog post, from either a strategic or a mathematical standpoint. So let me know what you think, what I missed, or whether there are any points or tools that you think the typical marketer should add to their skillset and spend some time learning.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Read More

Personalization and SEO – Whiteboard Friday

Posted by randfish

Personalization usage data and user data give marketers deep insights into their users’ interests and actions. But how can you make the most out of these complex data sets to better serve your SEO campaigns?

In this week’s Whiteboard Friday, Rand takes us through the intricate world of personalization and how it affects SEO. We’d love to hear your thoughts and tips in the comments below! 

Video Transcription

“Howdy, SEOmoz fans. Welcome to another edition of Whiteboard Friday. This week I’m wearing a hoodie and a T-shirt, so it must be informal. I want to take you in a casual fashion into the topic of personalization user data and usage data, and these are complex topics. This Whiteboard Friday will not be able to cover all of the different areas that user and usage data and personalization touch on. But what I do hope to do is expose you to some of these ideas, give you some actionable insights, and then allow you guys to take some of those things away, and we can point to some other references. There are lots of folks who have done a good job in the search world of digging in deep on some of these other topics.

Let’s start by talking about some of the direct impacts that personalization usage data have. Of course, by personalization usage data I mean the areas where Google is showing you or other users specific things based on your usage activities, where they are leveraging usage data, broad usage data, for many users to come up with different changes to these types of search results, and where they’re leveraging user personalization on a macro level, taking the aggregate of those things and creating new types of results, re-ranking things and adding snippets. I’ll talk about each of those.

In these direct impacts, one of the most important ones to think about is location awareness. This is particularly important obviously if you’re serving a local area, but you should be aware that location biases a lot of searches that may not have intended to be local simply by virtue of their geography. If you’re at a point, if I’m here in downtown Seattle, there is location awareness that affects the results ordering. I can perform searches, for example for Coffee Works, and I will get these Seattle Coffee Works results.

Perhaps if I was in Portland, Oregon and they had a Coffee Works in Portland, I would be getting those Coffee Works results. Usage history also gives Google hints about your location, meaning that even if you’re searching on your smartphone or searching on your laptop, and you said, “Don’t share my location,” Google and Bing will still try to figure this out, and they’ll try to figure it out by looking at your search history. They’ll say to themselves, “Hey, it looks like this user has previously done searches for Madison Markets, Seattle Trader Joe’s, used our maps to get directions from Capitol Hill to Queen Anne. I can guess, based on that usage data, that you are in Seattle, and I will try and give you personalized results that essentially are tied to the location where I think you’re at.”

A fascinating example of this is I was searching on my desktop computer last night, which I have not made it location aware specifically, but I did a search for a particular arena in Dublin, which is where the DMX Conference, that I’m going to in a couple days and speaking at, is going to be held. Then I started typing in the name of the hotel I was at, and it’s a brand name hotel. What do you know? That location came up, the Dublin location of the brand hotel, even though that hotel has locations all over the world. How do they know? They know because I just performed a search that was related to Dublin, Ireland, and therefore they’re thinking, oh yeah, that’s probably where he’s looking for this hotel information as well. Very, very smart usage history based personalization.

Do be aware search suggest is also affected directly by personalization types of results. If you are doing a search that is going to be biased by some element of personalization, either your search history or your location, those kinds of things, auto-suggest will come up with those same biases as the rankings might.

Next, I want to talk about the semantics of how you perform queries and what you’re seeking can affect your search as well. Search history is an important bias here, right? Basically, if I’ve been doing searches for jewelry, gemstones, wedding rings, those kinds of things, and I do a search for ruby, Google and Bing are pretty smart. They can realize, based on that history, that I probably mean ruby the stone, not Ruby the programming language. Likewise, if I’ve just done searches for Python, Pearl and Java, they might interpret that to mean, “Aha, this person is most likely, when they’re searching for Ruby, looking for the programming language.” This makes it very hard if you’re a software engineer who’s trying to look for gemstones, by the way. As you know, the ruby gem is not just a gem. It’s also part of the programming protocol.

This gets very interesting. Even seemingly unrelated searches and behavior can modify the results, and I think this is Google showing their strength in pattern matching and machine learning. They essentially have interpreted, for example, as disparate things as me performing searches around the SEO world and them interpreting that to mean that I’m a technical person, and therefore as I do searches related to Ruby or Python, they don’t think the snake or the gemstone. They think the programming language Python or the programming language Ruby, which is pretty interesting, connecting up what is essentially a marketing discipline, SEO a technical marketing discipline, and connecting up those programming languages. Very, very interesting. That can modify your results as well.

Your social connections. So social connections was a page that existed on Google until last year. In my opinion, it was a very important page and a frustrating page that they’ve now removed. The social connections page would show, based on the account you were inside of, all your contacts and how Google connected you to them and how they might influence your search results.

For example, it would say that my Gmail account, which I don’t actually use, is connected to Danny Sullivan because Rand has emailed Danny Sullivan from that account, and therefore we have these accounts that Danny Sullivan has connected to Google in one way or another. In fact, his Facebook account and several other accounts were connected through his Quora account, because Quora OAuths into those and Google has an agreement, an auth system, with Quora. You could see, wow, Google is exposing things that Danny Sullivan has shared on Facebook to me, not directly through Facebook, but through this protocol that they’ve got with Quora. That’s fascinating. Those social connections can influence the content you’re seeing, can influence the rankings where you see those things. So you may have never seen them before, they may have changed the rankings themselves, and they can also influence the snippets that you’re seeing.

For example, when I see something that Danny Sullivan has Plus One’d or shared on Google+, or I see something that Dharmesh Shah, for example, has shared on Twitter, it will actually say, “Your friend, Dharmesh, shared this,” or “Your friend, Danny Sullivan, shared this,” or “Danny Sullivan shared this.” Then you can hover on that person and see some contact information about them. So fascinating ways that social connections are being used.

Big take-aways here, if you are a business and you’re thinking about doing marketing and SEO, you have to be aware that these changes are taking place. It’s not productive or valuable to get frustrated that not everyone is seeing the same auto-suggest results, the same results in the same order. You just have to be aware that, hey, if we’re going to be in a location, that location could be biasing for us or against us, especially if you’re not there or if something else is taking your place.

If people are performing searches related to topics that might have more than one meaning, you have to make sure your audience is well tapped into, that they’re aware of your products, that you’re getting more content out there that they might be searching for, and that you’re building a bigger brand. Those things will certainly help. A lot of the offline branding kinds of things actually help considerably with this type of stuff.

Of course, make sure your audience is sharing so that the content reaches the audience connected to them, even if they’re not your direct customers; this is why social media strategy is so much about not just reaching people who might buy from you, but all the people who might influence them. Remember that social connections will influence results in this way. Right now, Google+ is the most powerful and most direct way to do this, but certainly there are others as well, as the now-removed social connections page helped show us.

What about some indirect impacts? There are actually a few of these that are worth mentioning as well. One of those indirect impacts that I think is very important is that you can see re-ranking of results, not just based on your usage, but this can happen or may happen, not for certain, but may happen based on patterns that the engines detect. If they’re seeing that a large number of people are suddenly switching away from searching ruby the gemstone to Ruby the language, they might bias this by saying, “You know what, by default, we’re going to show more results or more results higher up about Ruby the programming language.”

If they’re seeing, boy a lot of people in a lot of geographies, not just Seattle, when they perform a Coffee Works search, are actually looking for Seattle Coffee Works, because that brand has built itself up so strongly, you know what, we’re going to start showing the Seattle Coffee Works location over the other ones because of the pattern matching that we’re seeing. That pattern matching can be a very powerful thing, which is another great reason to build a great brand, have a lot of users, and get a lot of people around your product, your services, and your company.

Social shares, particularly what we’ve heard from the search engines, Bing’s been a little more transparent about this than Google has, but what Bing has basically said is that with social shares, the trustworthiness, the quality, and the quantity of those shares may impact the rankings, too. This is not just on an individual basis. So they’re not just saying, “Oh well, Danny Sullivan shared this thing with Rand, and so now we’re going to show it to Rand.” They’re saying, “Boy, lots of people shared this particular result around this topic. Maybe we should be ranking that higher even though it doesn’t have the classic signals.” Those might be things like keywords, links, and all the other things, anchor text and other things that they’re using the ranking algorithm. They might say, “Hey the social shares are such a powerful element here, and we’re seeing so much of a pattern around this, that we’re going to start re-ranking results based on that.” Another great reason to get involved in social, even if you’re just doing SEO.

Auto-suggest can be your friend. It can also be your enemy. But when you do a search today, Elijah and I just tried this, and do a search for Whiteboard space, they will fill in some links for you – paint, online, information. Then I did the same search on my phone, and what do you think? Whiteboard Friday was the second or third result there, meaning, they’ve seen that I’ve done searches around SEOmoz before and around SEO in general. So they’re thinking, “Aha. You, Rand, you’re a person who probably is interested in Whiteboard Friday, even though you haven’t done that search before on this particular phone.” I got a new phone recently.

That usage data and personalization is affecting how auto-suggest is suggesting or search suggest is working. Auto-suggest, by the way, is also location aware and location biased. For example, if you were to perform this search, whiteboard space, in Seattle, you probably would have a higher likelihood of getting Friday than in, let’s say, Hong Kong, where Whiteboard Friday is not as popular generally. I know we have Hong Kong fans, and I appreciate you guys, of course. But those types of search suggests are based on the searches that are performed in a local region, and to the degree that Google or Bing can do it, they will bias those based on that, so you should be aware.

For example, if lots and lots of people in a particular location, and I have done this at conferences, it’s actually really fun to ask the audience, “Hey, would everyone please perform this particular search,” and then you look the next day, and that’s the suggested search even though it hadn’t been performed previously. They’re looking at, “Oh, this is trending in this particular region.” This was a conference in Portland, Oregon, where I tried this, a blogging conference, and it was really fun to see the next day that those results were popping up in that fashion.

Search queries. The search queries that you perform, but not just the ones that you perform, but the search queries as a whole, kind of in an indirect, amalgamated, pattern matching way, may also be used to form those topic models and co-occurrences or brand associations that we’ve discussed before, which can have an impact on how search results work and how SEO works. Meaning that, if lots of people start connecting up the phrase SEOmoz with SEO or SEOmoz with inbound marketing, or those kinds of things, you might well see that Google is actually ranking pages on that domain, on SEOmoz’s domain, higher for those keywords because they’ve built an association.

Search queries, along with content, are one of the big ways that they put those topics together and try to figure out, “Oh yeah, look, it seems like people have a strong association with GE and washer/dryers, or with Leica and cameras or with the Gap and clothing.” Therefore, when people perform those types of searches, we might want to surface those brands more frequently. You can see this in particular when you perform a lot of ecommerce-related searches and particular brands come up. If you do a search for outdoor clothing and things like Columbia Sportswear and REI and those types of brands are popping up as a suggestion, you get a strong sense of the types of connections that Google might build based on these things.

All right, everyone. I hope you’ve enjoyed this edition of Whiteboard Friday. I hope you have lots of great comments, and I would love to jump in there with you and suggestions on how you people can dig deeper. We will see you again next week.”

Video transcription by

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Read More

Announcing the Just-Discovered Links Report

Posted by The_Tela

Hey everyone, I’m Tela. I head up data planning at SEOmoz, working on our indexes, our Mozscape API, and other really fun technical and data-focused products. This is actually my first post on the blog, and I get to announce a brand new feature – fun!

One of the challenges inbound marketers face is knowing when a new link has surfaced. Today, we’re thrilled to announce a new feature in Open Site Explorer that helps you discover new links within an hour of them going up on the web: the Just-Discovered Links report.

This report helps you capitalize on links while they’re still fresh, see how your content is resonating through social channels, gauge the overall sentiment of the links being shared, get a head start on outreach campaigns, and scope out which links your competitors are getting. Just-Discovered Links is in beta, and you can find it in Open Site Explorer as a new tab on the right. Ready to learn more? Let’s go!

What is the Just-Discovered Links report?

This report is driven by a new SEOmoz index that is independent from the Mozscape index, and is populated with URLs that are shared on Twitter. This means that if you would like to have a URL included in the index, just tweet it through any Twitter account.

One note: the crawlers respect robots.txt and politeness rules, which can prevent some URLs from being indexed. Also, we won’t index URLs that return a 500 status code.

search results

Who is it for?

Our toolsets and data sources are expanding to support a wider set of inbound marketing activities, but we designed Just-Discovered Links with link builders in mind.

Getting started

You can search Just-Discovered Links through the main search box on Open Site Explorer. Enter a domain, subdomain, or specific URL just as you would when using the Inbound Links report. Then select the Just-Discovered Links beta tab. The report gives PRO members up to 10,000 links with anchor text and the destination URL, as well as Domain Authority and Page Authority metrics.

One important note on Page Authority: we will generally not have a Page Authority score available for new URLs, and will show [No data] in this case. So, when you see [No data], it generally indicates a link on a new page.

You can also filter the results using many of the same filter drop-downs you are used to using in other reports in Open Site Explorer. These include followed and no-followed links, and 301s; as well as internal or external links, and links to specific pages or subdomains. Note: We recommend you start searches using the default “pages on this root domain” query, and refine your search from there.

How does it work?

When a link is tweeted, we crawl that URL within minutes. We also crawl all of the links on the page that was tweeted. These URLs, their anchor text, and their metadata (such as nofollow, redirect, and more) are stored and indexed. It may take up to an hour for links to be retrieved, crawled, and indexed.

We were able to build this feature rapidly by reusing much of the technology stack from Fresh Web Explorer. The indexes and implementation are a little different, but the underlying technology is the same. Dan Lecocq, the lead engineer on both projects, recently wrote an excellent post explaining the crawling and indexing infrastructure we use for Fresh Web explorer.

There are a few notable differences: we don’t use a crawl scheduler because we just index tweeted URLs as they come in. That’s how we are able to include URLs quickly. Also, unlike Fresh Web Explorer, the Just-Discovered Links report is focused exclusively on anchor text and URLs, so we don’t do any de-chroming as that would mean excluding some links that could be valuable.

How is it different?


Freshness

Freshness of data continues to be a top priority when we design new products. We have traditionally released indexes on a timeframe of weeks. With this report, we have a new link index that is updated in about an hour. From weeks to an hour – wow! We’ll be providing additional details in the future on what this means.

URL coverage

This index includes valuable links that may be high-quality and topically relevant to your site or specific URL but are new, and thus have a low Page Authority score. This means they may not be included in the Mozscape index until they have been established and earned their own links. With this new index, we expect to uncover high-quality links significantly faster than they would appear in Mozscape.

I want to clarify that we are not injecting URLs from the Just-Discovered Links report into our Mozscape index. We will be able to do this in the future, but we want to gather customer feedback and understand usage before connecting these two indexes. So for now, the indexes are completely separate.

How big is the index?

We have seeded the index and are adding new URLs as they are shared, but we don’t yet have a full 30 days’ worth of data in the index. We project that the index will include between 250 million and 300 million URLs when full. We keep adding data, and will be at full capacity within the next week.

How long will URLs stay in the index?

We are keeping URLs in the index for 30 days. After that, URLs will fall out of the index and not appear in the Just-Discovered Links report. However, you can tweet the URL and it will be included again.

How long does it take to index a URL?

We are able to crawl and include URLs in the live index within an hour of being shared on Twitter. You may see URLs appear in the report more quickly, but generally you can expect it to take about an hour.

Why did you choose Twitter as a data source?

About 10% of tweets include URLs, and many Twitter users share links as a primary activity. However, we would like to include other data sources that are of value. I’d love to hear from folks in the comments below on data sources they would like to see us consider for inclusion in this report.

How much data can I get?

The Just-Discovered Links report has the same usage limits as the Inbound Links report in Open Site Explorer. PRO customers can retrieve 10,000 results per day, community members can get 20 results, and guests can see the first five results.

What is “UTC” in the Date Crawled column?

We report time in UTC (Coordinated Universal Time). This format will be familiar to our European customers, but may be less familiar to customers in the States. UTC is ahead of Eastern Standard Time, so US customers will see links whose timestamps appear to be in the future – but this is really just a time zone issue. We can discover links quickly, but we can’t predict links before they happen. Yet, anyways :)
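If you want to convert a UTC timestamp from the Date Crawled column into a local US time zone programmatically, Python’s standard library handles it (the timestamp below is made up for illustration):

```python
# Convert a UTC "Date Crawled" timestamp into US Eastern time.
# The example timestamp is hypothetical.

from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

crawled_utc = datetime(2013, 5, 1, 18, 30, tzinfo=ZoneInfo("UTC"))
crawled_eastern = crawled_utc.astimezone(ZoneInfo("America/New_York"))

print(crawled_eastern.strftime("%Y-%m-%d %H:%M %Z"))  # 2013-05-01 14:30 EDT
```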

CSV export

You can export a CSV with the results of your Just-Discovered Links search. The CSV export is limited to 5,000 links for now; we plan to increase this to 10,000 rows in the near future. We need to re-tool some of Open Site Explorer’s data storage infrastructure before we can offer larger exports, and don’t have an exact ETA for this addition quite yet.

export search results

This is a beta release

We wanted to roll this out quickly so we can gather feedback from our customers on how they use this data, and on overall features. We have a survey where you can make suggestions for improving the feature and leave feedback. However, please keep in mind the fact that this is a beta when deciding how to use this data as part of your workflow. We may make changes based on feedback we get that result in changes to the reports.

Top four ways to use Just-Discovered Links

Quick outreach is critical for link building. The Just-Discovered Links report helps you find link opportunities within a short time of being shared, increasing the likelihood that you’ll be able to earn short-term link-building wins and build a relationship with long-term value. Here are four ways to use the recency of these links to help your SEO efforts:

  1. Link building: Download the CSV and sort by anchor text to focus on the keywords you are interested in. Are there any no-followed links you could get switched to followed? Sort new links by Domain Authority to prioritize your efforts.
  2. Competitor research: See links to your competitors as they stream in. Filter out internal links to understand their link building strategy. See where they are getting followed and no-followed links. You can also identify low-quality link sources that you may want to avoid. Filter by internal links for your competitors to identify issues with their information architecture. Are lots of their shared links 301s? Are they no-following internal links on a regular basis?
  3. Your broken links: The CSV export shows the HTTP status code for each link. Use this to find 404 links to your site and reach out to get them changed to a working URL.
  4. Competitor broken links: Find broken links going to your competitors’ sites. Reach out and have them link to your site instead.
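Several of the CSV workflows above can be sketched with Python’s standard csv module. The column names here are hypothetical stand-ins, so check the header row of your actual export:

```python
# Working a links CSV export: find broken (404) links and sort the rest
# by Domain Authority. Column names and rows are hypothetical examples.

import csv
import io

csv_export = io.StringIO(
    "URL,Anchor Text,HTTP Status,Domain Authority\n"
    "http://a.example/post,widgets,200,62\n"
    "http://b.example/old,widgets,404,45\n"
)
rows = list(csv.DictReader(csv_export))

# Broken links worth an outreach email:
broken = [r for r in rows if r["HTTP Status"] == "404"]

# Highest-authority sources first, to prioritize effort:
by_authority = sorted(rows, key=lambda r: int(r["Domain Authority"]),
                      reverse=True)
```

In practice you would open the downloaded file with `csv.DictReader` directly instead of the in-memory string used here.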

what you can do with Just-Discovered Links

Ready to find some links?

We’ve been releasing new versions of our Mozscape index about every two weeks. An index that is continuously updated within an hour is new for us, too, and we’re still learning how this can make a positive impact on your workflow. Just as with the release of Fresh Web Explorer, we would love to get feedback from you on how you use this report, as well as any issues that you uncover so we can address them quickly.

The report is live and ready to use now. Head on over to Open Site Explorer’s new Just-Discovered Links tab and get started!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Read More

Black Hat or White Hat SEO? It’s Time to Ask Better Questions

Posted by Dr. Pete

Since the Wild West days of the late 1990s, SEOs have been grouped into two camps – the “black hats” and the “white hats”. Over time, these distinctions have become little more than caricatures, cartoon villains and heroes that only exist in our individual imaginations, usually embellished to suit our marketing agendas. Even when grounded in specific tactics, “black” vs. “white” is a lot like “conservative” vs. “liberal” – the definition changes with the year and every person you ask, and that definition almost always comes loaded with assumptions and judgments.

Unfortunately, too many business owners still choose their SEOs based on the hat they wear, even when that hat only comes out on sales calls. So, I’d like to ask some better questions. There are real strategic and tactical differences behind what we often think of as “white” and “black” hat SEO, and those differences are what you need to understand to make the right choices for your own business.

This Isn’t About Ethics

While we generally think of ourselves as “white hat” here at SEOmoz, I’m going to put ethics aside temporarily for this post. I will assume that, when we say “black hat”, we’re not talking about outright illegal behavior (like hacking into someone’s site). We’re talking about willfully violating Google’s rules to improve your ranking. While I do believe there are ethical implications to cheating the system and harming search quality, this post is intended to be an honest look at the real choices you face when choosing an SEO path.

(1) High-Value or Low-Value?

The first question is – are you going to pursue “high-value” or “low-value” tactics? I don’t want to replace one hopelessly vague duality with another one, so let me define my terms. By “value”, I mean the value that these tactics provide to site and search visitors. We sometimes call low-value tactics “spam”. It’s not usually illegal and it’s not always even unethical (depending on your point of view), but it’s always done specifically for SEO purposes.

Here’s an example – linking all of your client’s sites back to your own site with keyword-loaded footer links. I wouldn’t call this unethical, but it doesn’t add value and, frankly, it’s just too easy. Google knows this, and they naturally devalue those links now (in extreme cases, they might even penalize the target site).

Ironically, “low-value” tactics often appeal to people who are trying to gain ground as cheaply as possible. In practice, people tend to underestimate the time these tactics take and overestimate the return on investment. Low-value tactics tend to fade quickly. As Google gets more aggressive, low-value tactics are also getting riskier (see Question #2).

There’s a more fundamental problem, though, in my opinion – low-value tactics don’t build anything toward the future. Once they fail, and they usually do, you have to start over and chase some new low-value tactic. Here’s an example – let’s say you get links back from all of your clients in low-value footer text, and your one-way link network looks something like this:

Simple link network - SEO only

Link “juice” is flowing, and all signs are green. Then, one day, Google pulls the plug on this particular low-value tactic. What are you left with?

Failed link network

You’re not left with much, because these links never had real value beyond SEO. Imagine, though, that those links carried not only authority (in green), but traffic (in blue):

Simple link network - SEO + Traffic

Now, let’s say Google changes the rules, and you lose the ranking power of those links. The links still have value, because they’re still carrying visitors to your site:

Partially failed link network

The picture may not look exactly the same, and the traffic quantity and quality have changed, but you’re not dead in the water. I know I’m oversimplifying this, but I just want to make the point perfectly clear.  If you play the game purely for SEO, and you lose, you lose everything. If you build something of value that actually attracts visitors and then the rules change, you’ve still built something.

(2) High-Risk or Low-Risk?

The second question you need to ask yourself is: How much risk are you willing to accept? Don’t just smile and nod and tell me about how you’re a “risk-taker” – I’ve heard plenty of people tell their SEO companies to “Go for it!” only to be reduced to sobbing in the corner when their strategy crashed and burned months later. This is a time for brutal honesty. Can you live with the risk of a severe penalty, including being totally removed from the Google index?

High-risk SEO is like high-risk investing – yes, there can be high reward, if you know what you’re doing, but for every 1 winner at this game there are 99 companies that close their eyes, cover their ears, and whistle their way into disaster. If what you’re hearing from your SEO company sounds too good to be true, ask more questions.  As Paddy Moogan’s recent post pointed out, your risk is not someone else’s to take.

To make matters worse, I think that many so-called “black-hat” tactics, and even some gray-hat tactics, are much riskier than they used to be. There was a time when, if you played the game too hard, you got a slap on the wrist and had to start over. You’d be set back a few weeks, but you’d also have made a lot of money in the months leading up to that. I’m not saying it’s right, but let’s at least be honest about the past.

Fast-forward to 2013, and look at an update like Penguin – almost a year after the original Penguin, we’ve still heard very few public recovery stories. The ones I’ve heard in private have almost always involved a massive culling of links (the good with the bad, in many cases) and took months. That’s months with major revenue loss, and this is from big agencies that have resources and connections many businesses don’t have access to.

Even semi-innocent tactics have been hit hard. Fairly recently, you could spin out a bunch of city/state pages with a few long-tail keywords and do pretty well. Was it a high-quality tactic? No, but it’s hardly the essence of evil. Worst case, Google would start ignoring those pages, and you’d be out a few days of work. Then, along came Panda, and now your entire site can suffer for quality issues. The price of mistakes is getting higher, and Google is getting more punitive.

I’m not here to tell you what to do, but this is not just a “white-hat” sermon. I’ve studied Google’s movements a lot in the past year, and I sincerely believe that the risk of manipulative tactics has increased dramatically. I also believe that it’s only the beginning. So, if you’re going to play the game, make sure you can afford to lose.

It’s also important to understand that every tactic carries risk, especially if you fail to diversify. When I hear a company say “Our clients are never affected by updates, because we only use Google-approved methods!”, then I know that company has only been in business for six months. Sooner or later, white-hat or black-hat, the rules will change. You could be sparkling white and still get hit by things like paid inclusion, SERP layout changes, SERP feature changes, etc. The time for SEO hubris is over.

(3) Short-Term or Long-Term?

Finally, I think you have to consider whether you’re in this for the long haul or just trying to make a short-term play. For example, let’s say you’re building an affiliate site to sell accessories for the Samsung Galaxy S4 (which was just announced while I was writing this post). The smartphone market moves fast, and as an affiliate in this space, you’re facing a few realities:

  • You probably don’t have a lot of money to invest up-front
  • You need to get your traffic rolling quickly
  • Your peak opportunity may only last 6-12 months

Again, I’m not making a moral judgment, but this is a very different kind of business situation, practically speaking. You may not have time to build epic content or spend six months building up a social following, and the consequences of getting burned a year from now may be fairly small. So, if you know your business is short-term, you can take risks that other people can’t.

The problem, I think, is that too many long-term businesses think this way: “I can’t afford to spend money”, “I don’t have time to get moving”, “I need results now!” So, you dive into low-value tactics to get moving quickly and cheaply. Even if you never get smacked down by Google, the reality is that these tactics tend to be short-lived – they fade or burn out, and you’ve got to start again. So, you’re constantly in a cycle of chasing the next low-value trend.

This may be attractive at first, to get out the gate, but over time I think it’s a losing proposition. If you never build anything that lasts, you’re always stuck making repairs. If you invest early, those investments tend to pay out, and you can build on them. I’ve seen this so many times with content over the past few years – I invest in a piece that doesn’t quite live up to my immediate expectations (traffic-wise, social-wise, etc.), and I’m about to throw in the towel, when weeks or months later, it takes off and just keeps running. Once it’s running, you get to go along for the ride. Without that investment, you’re always pushing.

So, What’s Right for You?

I can’t tell you how to run your business. I just want you to ask yourself (and your vendors) the hard questions. Are “low-value” tactics actually saving you money? How much risk are you really willing to take? Is your #1 priority to get up and running quickly and cheaply, or are you trying to build a real, long-term business? If the best your provider can do is show you their hat, and they can’t help you answer these questions, then move on – it doesn’t matter what color that hat is.


Discover your International Online Potential

Posted by Aleyda Solis

One of the major advantages of having a web-based business presence is the opportunity to reach a global audience, eliminating many of the restrictions and costs that a “physical” international presence might have. Nonetheless, in my day-to-day experience I’ve found that there is still a lack of vision of the opportunity to target international markets.

Ask yourself: when was the last time you checked how many visitors were coming to your site from other countries? Even if you have a small or mid-sized business, do you frequently check what percentage of your current conversions comes from countries and languages other than yours?

Besides being an International SEO, I consider myself a cultural broker: I’m a Nicaraguan living in Madrid. I speak English and French in addition to my native language, Spanish. I love to travel, and work (and pleasure, too) has taken me to places like Argentina, Costa Rica, El Salvador, Turkey, Tunisia, Montenegro, and Russia (on top of more common destinations such as the UK, US, France, Italy, Ireland, The Netherlands, Switzerland, etc.). I’ve had Nicaraguan, Argentinian, Dutch, Spanish, and German bosses in the past, and now I have an American one.

I’ve also worked in the past as an SEO for:

  • A Dutch-owned online marketing agency in Spain with clients from all over Europe
  • A Spanish-owned vertical web portal targeting eight Latin American and European countries
  • An online marketing provider for Spanish small businesses owned by a French group
  • A Russian company targeting the European market

Currently, I work for an American online marketing agency targeting international clients. As you can see, the “international” component has been a common characteristic in my personal and professional life, and I cannot understand how there’s still a lack of vision and openness toward international activities, which ultimately means lost opportunities for businesses and a less rich, less competitive market that ends up hurting the audience as well.

Unfortunately, this frequently happens because of misconceptions about expanding internationally. I want to share and clarify three of the most common misconceptions I encounter in my everyday work.

Misconception 1: I'm already in the most profitable market so I don't care about the rest

I’m not telling you to leave your current market (and lose your current profits), but to take others into consideration. At the beginning, it will be only to assess the opportunities there, so really, you don’t have anything to lose. I also know that we all tend to feel like we’re already in the “center of everything,” and a couple of World Maps from different countries are the best proof of it:

The World map according to our perception

According to a recent eMarketer study, B2C E-commerce sales will grow 18.3% to $1.298 trillion worldwide and Asia-Pacific will surpass North America to become the world’s No. 1 market:

B2C Ecommerce Sales Share Worldwide by Region

Additionally, in the same study we can see how Asia-Pacific and Western Europe both have more digital buyers (Internet users who buy goods online) than North America:

Worldwide Digital Buyers

As you can see, nowadays no one is really in the “center.” There’s plenty of globally distributed potential out there, and the fastest-growing markets are countries like China. Wake up! This means more exciting possibilities for your business internationally.

Misconception 2: Local Businesses don't need to have an International Online Presence

You don’t need to be a large international corporation, an E-commerce business, or a completely online based business to benefit from a website version in other languages, or one targeting other countries.

Although from a business perspective it can be more straightforward for these types of sites to identify international potential, there are also many kinds of local businesses that have an international audience, or that can benefit from an international online presence because their target market is abroad or comes from abroad. For example:

  • Language schools: such as Spanish language schools in Spain or Latin America targeting US, German, or UK students
  • Summer camps: like international summer camps in Switzerland targeting children from abroad
  • Centrally located hostels and apartment rentals: in tourist or central areas that are attractive to visitors
  • Traditional restaurants and bars: that usually have tourists as clients 
  • Volunteering organizations: looking to attract volunteers from abroad
  • Gift and flower shops: which might also suit customers abroad sending gifts locally
  • Traditional art and crafts shops: that look to sell typical local goods to foreigners 
  • Traditional food and drinks shops: like cured ham factories or wineries in Spain looking to sell their products abroad  

Need additional incentive? Check out a mobile search engine result page for a local query in Spanish for “restaurantes en brooklyn” (restaurants in Brooklyn), which in English would usually be dominated by Google Maps results:

Local SERP for Spanish Query

There’s a huge opportunity, indeed. You can definitely achieve additional benefit by targeting an international audience even if you are not a big company or based internationally!

Misconception 3: Expanding Internationally is Expensive

It’s true that expanding your site presence internationally might cost more than your local language version: deploying the web platform on a new ccTLD (or in a subdirectory if you’re targeting a language rather than a country), localizing (not only translating) the content, having native language support to expand your content and social media marketing strategies (which also need to take local audience behavior into consideration, using the criteria I’ve previously shared in this post), and supporting your outreach and community management efforts in the other language.

Nonetheless, this doesn’t mean that expanding your site internationally can’t be beneficial for you. When you do complete research to identify the potential organic traffic and conversions from each language and country, and validate that potential from the start, the expected revenue should surpass the costs of your international web presence:

International SEO: Revenue vs Costs

With this information, you will be able to calculate the expected return on investment of your international presence (as well as of the international SEO process):

International SEO ROI
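To make that calculation concrete, here’s a minimal sketch in Python. The cost categories mirror the ones mentioned above, but every figure is an invented, illustrative number, not a benchmark:

```python
def international_seo_roi(projected_revenue, costs):
    """ROI as a ratio: (projected revenue - total costs) / total costs."""
    total_costs = sum(costs.values())
    return (projected_revenue - total_costs) / total_costs

# Hypothetical first-year figures for one new country/language version
costs = {
    "platform setup (ccTLD or subdirectory)": 4000,
    "content localization (not just translation)": 6000,
    "native-language support, outreach, and community management": 5000,
}

roi = international_seo_roi(projected_revenue=30000, costs=costs)
print(f"Expected ROI: {roi:.0%}")  # (30000 - 15000) / 15000 = 100%
```

The point isn’t the arithmetic; it’s forcing an explicit revenue estimate before committing to the costs.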

I’ve seen too many situations where this type of initial assessment hasn’t been done, and as a result there are businesses that have ended up with many language or country site versions developed without any clear strategy. They don’t serve a business-related goal and are simply a “literal translation” of the main site version. Of course they’re not profitable! But that’s because the international web project wasn’t correctly developed.

Another common signal that an international site presence hasn’t been effectively planned or executed is when the site owner tells you that their UK site version has the exact same content as the US one, but they cannot afford to update it to make it unique and specifically target the UK audience.

If they cannot afford it, it means they’re currently not getting any (or enough) benefit from it: either because they don’t have a strategy behind it and this presence is not optimized, or because there’s not enough potential in this market and they weren’t able to identify that, since they didn’t do any research beforehand. It’s also our job to advise our clients effectively from the start, validate the potential benefit of any international development or SEO project, and warn them if, for some reason, there’s no potential.

Additionally, we can run pilot projects to test the market with just the most important product or service categories and targeted landing pages. As you can see, there’s no excuse for an international web presence to fail if it has been effectively planned, well developed, and optimized.

International SEO Potential

With a couple of very simple analysis steps that shouldn’t take much of your time, you can get an overview of the potential your business might have internationally:

Google Analytics International Traffic

Check your International traffic status

Go to the Audience > Demographics > Location & Language reports in Google Analytics to check the percentage of your website visitors who come from other countries and whose browsers are set to other languages.

Verify the volume and trends for the last couple of years for all of your traffic, as well as for organic traffic only, and compare them:

  • Is there a high or growing percentage of visitors coming from other countries? 
  • What’s the volume and trend of conversions and the conversion rate of visitors coming from other countries?
  • What’s the traffic source of visitors coming from other countries? Direct, organic, referrals?
  • Which are the keywords and pages attracting this international traffic?
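As a rough illustration of the kind of numbers these reports give you, here’s a sketch that computes the international share of sessions and per-country conversion rates from a hypothetical Google Analytics country export. The countries and figures are made up for the example:

```python
# Hypothetical export from the Location report: (country, sessions, conversions)
rows = [
    ("United States", 42000, 630),
    ("United Kingdom", 6100, 85),
    ("Mexico", 3900, 70),
    ("Germany", 2400, 21),
]
home_country = "United States"

total_sessions = sum(sessions for _, sessions, _ in rows)
intl = [r for r in rows if r[0] != home_country]
intl_sessions = sum(sessions for _, sessions, _ in intl)

print(f"International share of sessions: {intl_sessions / total_sessions:.1%}")
for country, sessions, conversions in intl:
    print(f"  {country}: {sessions} sessions, {conversions / sessions:.2%} conversion rate")
```

Even a quick table like this answers the first two questions above: how big the international share is, and whether those visitors actually convert.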

Do you have a bit more time? If so, go to Google Webmaster Tools to validate the visibility you’re already getting in Google search result pages from other countries, along with the impressions and clicks for your queries and pages.

International Search Queries

This is just a starting point that will help you prioritize the international markets where you already have activity, which might be the easiest ones to start with.

Nonetheless, if the numbers are not high, it doesn’t mean you don’t have potential; it may just mean that your efforts have been highly targeted to your current audience and haven’t had much international impact until now, so you will likely need to work harder at the beginning.

International Keyword Research with Google AdWords Keyword Tool

Identify your International Organic potential

Prioritize the countries you have already identified as having higher traffic activity on your website, and do a quick keyword research for each one of them by selecting the desired location and language from the Advanced Options and Filters in Google’s Keyword Tool.

You can use the keywords identified in the previous analysis that are already bringing you visibility and traffic from these countries and languages. If you didn’t identify any keywords, and the country you need to research is non-English speaking (or speaks a language other than yours), then the best option at this stage is to take the keywords in your current language and use Google Translate to quickly translate them into the desired one for this initial, quick validation. (It’s important to note that this is ok only for this first pass, since the translated keywords will likely contain errors and miss opportunities. You can run a complete international SEO research process without speaking the language, given the right process and local language support, as I’ve described in this post.)

Use the exact match type (to get more “realistic” data for each specific keyword) and check:

  • What’s the local monthly search volume for the relevant keywords in each of the countries and languages?
  • Are there more suggested keyword ideas with a high level of search volume?

Refine and expand the research according to the suggestions you get for them.
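To keep this first pass organized, the volume check can be sketched in a few lines. The keywords, market codes, and 1,000-searches threshold below are hypothetical stand-ins for whatever your keyword tool exports:

```python
# Hypothetical exact-match local monthly search volumes per market
volumes = {
    ("es-ES", "hoteles baratos madrid"): 14800,
    ("es-ES", "hotel centro madrid"): 8100,
    ("es-MX", "hoteles baratos madrid"): 720,
    ("de-DE", "günstige hotels madrid"): 2900,
}
MIN_VOLUME = 1000  # ignore keywords below this for the first pass

# Group the keywords that clear the threshold by market
by_market = {}
for (market, keyword), volume in volumes.items():
    if volume >= MIN_VOLUME:
        by_market.setdefault(market, []).append((keyword, volume))

for market, keywords in sorted(by_market.items()):
    total = sum(v for _, v in keywords)
    print(f"{market}: {total} total monthly searches across {len(keywords)} keywords")
```

Markets that survive the filter with healthy totals are the candidates for deeper research.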

Do you have a bit more time? If so, go to SEMRush or Search Metrics Essentials (which support many countries) to identify more keyword opportunities:

Additional Keywords Ideas from SEMRush

Is there high search volume potential for the verified countries and languages? If so, congratulations! This is great news.

It’s time then to develop full international SEO research to understand, validate, and plan your strategy, and to verify your potential costs, revenue, and ROI, taking into consideration all the necessary aspects, from business and language to technical capacity, restrictions, and requirements.

To do this, take a look at and follow the step-by-step guide I published some weeks ago:

How to start your international web presence

International SEO Doubts? Let me know in the comments!

Images under Creative Commons taken from Flickr.


Announcing the March Mozscape Index!

Posted by carinoverturf

It’s that time again – the latest Mozscape index is now live! Data is now refreshed across all the SEOmoz applications – Open Site Explorer, the Mozbar, PRO campaigns, and the Mozscape API.

This index finished up in just 13 days, thanks again to all the improvements our Big Data Processing team has been implementing to make our Mozscape processing pipeline more efficient. The team continues to build out our virtual private cloud in Virginia as well as tweak, tune, and improve the time it takes to process 82 billion URLs.

We’ve been saying we’re close to releasing our first index created on our own hardware – and now we really are! Stay tuned for a deep dive blog post into why and how we built our own private cloud.

This index was kicked off the first week of March, so data in this index will span from late January through February, with a large percentage of crawl data from the last half of February.

Here are the metrics for this latest index:

  • 83,122,215,182 (83 billion) URLs
  • 12,140,091,376 (12.1 billion) Subdomains
  • 141,967,157 (142 million) Root Domains
  • 801,586,268,337 (802 billion) Links
  • Followed vs. Nofollowed

    • 2.21% of all links found were nofollowed
    • 55.23% of nofollowed links are internal
    • 44.77% are external
  • Rel Canonical – 15.70% of all pages now employ a rel=canonical tag
  • The average page has 74 links on it

    • 63.56 internal links on average
    • 10.65 external links on average
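The per-page figures above are easy to sanity-check: the internal and external averages round to the reported total, and the nofollow split sums to 100%. A quick check in Python (the figures are copied straight from the list above):

```python
# Per-page link averages from the March index
avg_internal, avg_external, reported_total = 63.56, 10.65, 74
assert round(avg_internal + avg_external) == reported_total  # 74.21 rounds to 74

# Nofollowed links split into internal vs. external shares
nofollow_internal, nofollow_external = 55.23, 44.77
assert abs(nofollow_internal + nofollow_external - 100.0) < 1e-6

print("per-page link metrics are internally consistent")
```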

And the following correlations with Google’s US search results:

  • Page Authority – 0.35
  • Domain Authority – 0.19
  • MozRank – 0.24
  • Linking Root Domains – 0.30
  • Total Links – 0.25
  • External Links – 0.29
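These figures are rank (Spearman) correlations between each metric and ranking position, averaged across many result sets, at least as Moz has described its methodology in past ranking-factor studies; treat that characterization as my assumption here. For readers who want to experiment with the idea, here’s a small, self-contained Spearman implementation (rank both series, giving ties their average rank, then take the Pearson correlation of the ranks); the query data is invented:

```python
def average_ranks(values):
    """Rank values from 1..n, giving tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # extend j over any run of tied values
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy example: Page Authority values for the top 5 results of one query
positions = [1, 2, 3, 4, 5]
page_authority = [72, 65, 68, 50, 41]
print(f"Spearman correlation: {spearman(page_authority, positions):.2f}")
```

Note the sign: correlating a metric against position number comes out negative when higher metric values sit in better (lower-numbered) positions; reported figures conventionally flip this so that “higher metric, higher ranking” reads as positive.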

Crawl histogram for the March Mozscape index

We always love to hear your thoughts! And remember, if you’re ever curious about when Mozscape next updates, you can check the calendar here. We also maintain a list of previous index updates with metrics here.
