Google’s Disavow Tool: Is This the Answer to the Penguin Update?

Finally, it has arrived…on October 17, Google released their long-awaited disavow tool. If you have been affected by the Penguin updates, you may have been waiting with anticipation for this to arrive. But is it the answer to our dreams – will it really get our sites back and ranking again? Let's find out.

What is the Google Disavow Links Tool?

For those who don't know, the Google Penguin updates targeted websites with poor-quality links or those that overused the same keyword in their anchor text. Both of these, we suspect, were what happened with our sites. We had used blogging networks in the past, and they only generate poor-quality links. In the early days we also tended to overuse the same anchor text a little too much when backlinking.

Okay, so fair enough, we did a few no-no's in the eyes of Google and we got slapped on the wrist for it. However, there were two problems with those Penguin updates:

1. They made it very difficult for anyone who was affected to fix their websites. Just try emailing hundreds or even thousands of website owners to ask them to remove your links from their sites.

2. They made it too easy for a competitor to get someone else's site penalized by simply generating hundreds of poor-quality links to it. This got everyone riled up because, although Google denied it, we all knew that it was possible, and it was happening. Many webmasters were even testing it on their own sites to see if they could get them penalized, and yes, they could.

So, because of the number of complaints they received from irate webmasters, Google listened, and they have now provided a disavow tool that allows us to ask Google to ignore specific backlinks to our websites.

The Official Google Word

Here is a video from Matt Cutts (Google Engineer) explaining how the disavow tool works.

[youtube width="535" height="355"]http://youtu.be/393nmCYFRtA[/youtube]

So what Matt is saying in the video above is that you only need to upload a text file with a list of the websites you want disavowed and, lo and behold (after a few weeks), those sites will be ignored. Simple, right? Well, if it is so simple, why does Matt Cutts reiterate a number of times that not many people should need to use this tool? It's almost like he wants to scare us away from using it. Why?

Just take a look at what Google has to say about it on their Webmaster Tools blog:

“This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.”

I love the last sentence: "In most cases, Google can assess which links to trust without additional guidance…". If that's the case, why do we even need such a tool? More on that later.

How Does the Disavow Tool Work?

These are the steps for using the disavow tool:


Step 1: Assess the Links

The first thing you need to do is find the sites that are linking to you so you can determine which ones you want to disavow. You can do this using your own backlink tool, or you can use Google Webmaster Tools (instructions below). A short script after these steps shows one way to turn that download into a draft disavow file.

1. Login to Webmaster Tools.

2. Click on the site you want to assess.

3. Click on Traffic in the sidebar menu.

4. Click on Links to Your Site.

5. Click the More link under the section titled 'Who links the most'.

6. Click Download this table.
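If the downloaded table is long, a little scripting can save a lot of copying and pasting. Here is a minimal sketch (ours, not Google's) that reads the export and writes every linking domain out in the disavow format. It assumes the export is a CSV whose first column lists the linking domains, and the file names below are just placeholders, so adjust them to match your own download. Delete the lines for any sites you want to keep before you go anywhere near the upload step.

```python
# Minimal sketch: turn the 'Who links the most' export into a draft disavow file.
# Assumes the export is a CSV whose first column holds the linking domains.
import csv

linking_domains = []
with open("who_links_the_most.csv", newline="", encoding="utf-8") as f:  # placeholder filename
    reader = csv.reader(f)
    next(reader, None)  # skip the header row
    for row in reader:
        if row and row[0].strip():
            linking_domains.append(row[0].strip())

# Write every domain in the disavow format; edit this file by hand afterwards
# and remove the lines for any sites whose links you want to keep.
with open("disavow.txt", "w", encoding="utf-8") as out:
    for domain in linking_domains:
        out.write(f"domain:{domain}\n")

print(f"Wrote {len(linking_domains)} domains to disavow.txt for review")
```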


Step 2: Create a Text File

This step involves creating a text file that lists all of the domains and URLs that you want to disavow.

1. Open up Notepad or another text-based editor and add one entry per line in the following formats:

  • To ignore an entire domain use:
    • domain:thespammywebsite.com
  • To ignore a single page from a domain use:
    • http://spammerswebsite.com/page-1.htm

So your text file might look like this:

domain:thespammywebsite.com
http://spammerswebsite.com/page-1.htm

2. Save the file to your computer.
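Worth knowing: the disavow file also accepts comment lines that start with a # character, which Google ignores. That makes it easy to keep notes on why each entry is there, for example:

# contacted the site owner to ask for removal - no response
domain:thespammywebsite.com
# one spammy page rather than the whole site
http://spammerswebsite.com/page-1.htm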


Step 3: Upload List of Links

1. Go to the Disavow Links Tool page.

2. Login to Webmaster Tools if you haven’t already done so.

3. You will now see the Disavow Links box.

4. Select the website from the drop-down box and click the Disavow Links button.

5. You will see a big warning message from Google. Just click the Disavow Links button. Again you will see another warning message from Google…hmm, why do they need to keep warning us?

6. Click Choose File, find the text file you just created and click Submit. And you're done!


 

Why Do We Need This Tool in the First Place?

Although I appreciate this tool being made available, I am left wondering why we even need it. The tool lets us tell Google to ignore spammy or poor-quality links, but why can't Google just ignore them to begin with? Google can spot a spammy site, can't it? I mean, if they don't know the difference between a poor-quality site and a high-quality site by now, then they really need to reassess the system.

So what it essentially means is that you, the site owner, need to determine what you consider to be poor-quality links. But how do you do that? Do you look at the PageRank of the page? Is that really an indication of the quality of the site?

Take a look at the PageRank of the site you are looking at right now. Do you see it? It's a big fat zero. Why? Because in Google's infinite wisdom they decided that the site doesn't meet their quality guidelines. I have no idea what that means, because we have never bothered to actively backlink to the site and we can't see any other reason for the penalty. (At one point we started to backlink, but we stopped pretty quickly and didn't bother after that.)

We have asked them what the exact problem is, but we keep getting back the same old email in Google Webmaster Tools, which tells us absolutely nothing:

We’ve reviewed your site and we believe that some or all of your pages still violate our quality guidelines.

In order to preserve the quality of our search engine, pages from http://www.affiliateblogonline.com/ may not appear or may not rank as highly in Google’s search results, or may otherwise be considered to be less trustworthy than sites which follow the quality guidelines.

If you wish to be reconsidered again, please correct or remove all pages that are outside our quality guidelines.

Funnily enough, the sites that we have backlinked to aggressively in the past received no such message in Google Webmaster Tools.

We have given up trying and just continue to add content. It’s a bit hard to change something when you have no idea what the problem actually is.

So figuring out which sites are considered poor quality, spammers' sites, or sites that don't meet Google's guidelines is really a subjective thing. Who knows if a site we tell Google to disavow is actually okay or not?

Have We Used the Disavow Tool?

Not yet, but we will. Whenever something new is released by Google, we always like to let the dust settle before taking action. The other problem we have is finding the time, but that's another story.

We want to get our sites ranking like they used to so it will be interesting to see if the disavow tool actually works.

Think Pages Not Sites

In a recent email that we received, the person spoke about how they had received 1000 visitors to their review site but complained that their conversions were terrible. They were wondering what they should do next.

I immediately suspected that this person was giving me stats for his whole site and not for any specific individual page. Those 1000 total site visitors are more than likely going to all sorts of different pages on the site. Some may be going to the home page, some to article pages, some to the About Us page, some to a review page and so on.

What I really wanted to know was how many people were actually landing on his review pages, because those are the real money-making pages. Now, if only 100 people a month out of the 1000 are going to a review page, then you can see how different this scenario becomes…100 visitors would equate to maybe 1 or 2 sales a month. Nowhere near enough sales to give up your day job.
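To put rough numbers on it (the conversion rate below is an assumed figure, not something from his stats), here is the same back-of-the-envelope calculation written out:

```python
# Why page-level stats matter more than site-wide totals (illustrative numbers).
site_visitors = 1000          # total monthly visitors to the whole site
review_page_visitors = 100    # visitors actually landing on the review page
conversion_rate = 0.015       # assumed: 1.5% of review-page visitors buy

print(f"Realistic estimate: {review_page_visitors * conversion_rate:.1f} sales per month")   # ~1-2
print(f"Misleading site-wide figure: {site_visitors * conversion_rate:.1f} sales per month") # ~15
```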

There is a tendency for people with websites to look at them as a complete entity when doing any sort of analysis. In fact, most people focus on their site as a whole in most respects and neglect their individual pages completely. They keep adding page after page of content, but each page becomes lost and forgotten as more and more content is added.

Don't get me wrong, we all need to add content to our sites on a regular basis because Google likes fresh content, but one of the most important things we need to do if we want to see some level of success is to really focus on just a few of those pages.

When you delve into your stats and look at them in terms of your site as a whole, you get a skewed view of how your site is actually performing. So instead of looking at the total number of visitors to your site, look at the total number of visitors to your most important pages. These are often your money-making pages, like review pages, sales pages, product pages and so on. Important pages can also include the home page and portal pages which direct people through to your sales pages. In other words, you are looking to focus on pages that are likely to convert to a sale.

It was our very first mentor, James Martell, who told us to 'think pages, not sites', and this is exactly what we do. We only look at the important pages when checking our stats. This is why it is often so difficult for us to answer questions like, 'How much traffic did you lose in the last Google update?', because to answer this we would have to go in and check each page – a time-wasting exercise in our view. We know we have been hit when we see our income go down.

Thinking about individual pages instead of thinking about your site as a whole forces you to focus on your goal…and focusing is where the magic happens!

So, the goal with all of this is to find those pages on your site that have the potential to make you money. Now, if you are anything like us, there may be hundreds of pages on your site that can potentially make you money, and this is where many people come unstuck. The more pages and the more sites you have, the more difficult it becomes to focus on just one single website, and even more difficult to focus on only a couple of pages.

We know that feeling well. We have over 25 sites, and even though we have our plan in place we still find that every now and then something happens (someone leaves a comment or we are updating plugins) and we will suddenly be working on a site we shouldn't be. The thing is, all of our sites have the potential to do extremely well and this keeps us wanting to work on them. But we know that ultimately this doesn't work because we are then spreading ourselves too thin. When this happens, we stop and have a quick discussion to remind ourselves that we need to stick to the plan, and we get back on track again.

When we are doing things the right way, we only focus on one site. And when we focus on only one site, we pick a couple of pages from that site (usually two review pages) and concentrate on those plus the home page only. So that's three pages in total. We also add regular content to keep Google happy. Usually this is in the form of how-to articles and similar, but it is generally only one article a week.

In a nutshell, those two pages, plus the home page, now become our focus for that site. Now you might be wondering what focusing on those pages actually entails. Here are the basics:

1. Backlinking

This is where we spend the majority of our time when focusing on our pages. We backlink to the two review pages and the home page. This usually means sending out guest articles that include three links: two to the review pages and one to the home page. However, because Google is getting pretty picky, we mix up the links and will randomly link to other pages on our site every now and then.

2. Reworking the Content

We will look at each of the pages we are focusing on and determine whether it can be reworked. This usually involves doing more research on the product to create a more helpful review, or it might mean removing content that is unnecessary. We want the review to be the best it can possibly be because ultimately it is the quality of the review that will affect our conversion rate. The better the review, the better the sales.

A lot of people fear making changes to a page because they are worried that as soon as they do, their ranking might change in Google. Yes, this can happen, but personally we haven't really noticed any major ranking changes when we have changed the content on a page. However, just to be sure, we take a look at the page, and if it is getting traffic and sales that we are happy with, then we don't touch it. If instead it is getting traffic but not many sales, then we change it. There is no point having a ton of traffic to a page if the sales aren't there.

3. Add Helpful Content

Adding regular helpful, good quality content is useful for a number of reasons:

  • it increases traffic
  • it keeps Google happy
  • it keeps your readers happy
  • other sites may link to it

You can add as much content as you like to your site – the more the better – but you don't want to go crazy with it because, firstly, you can quickly burn yourself out and, secondly, you could also run out of ideas really quickly. Go for quality rather than quantity. Notice on this site, for instance, that we only add content once or twice a month. The reason for that is we'd rather write something of value than write something just because we feel we have to.

That’s it!

That’s all we really do with our sites. Sure, we might play around with them every now and then and change the themes, create new headers, find new plugins to upload or fix broken links, but the majority of our time with our niche sites is spent on those three things.

It's all about where you put your focus, and we've said it time and time again that working on too much at one time just doesn't work.

 

How to Clean Your Site after the Google Penguin Update

If you read our previous blog post you will know that we were hit by the latest Google Penguin update. We suspect it was because we had used blogging networks in the past, although on further analysis it looks like it could also relate to our early link building strategies, where we focused on backlinking using the same keywords over and over again. Or it could even be the number of keywords we have on those pages.

Honestly, it’s just too difficult to say what the exact reason might be – we’ve tried so many things over the years (before we knew better) and I guess it was going to catch up with us in the end. That’s not to say we did anything black hat, but these days even what was once accepted as white hat is becoming unethical in the eyes of Google.

You'll see a lot of posts on the net about Penguin and how to get through it, but really, when it comes down to it, nobody has all the answers because nobody really knows exactly what Google does. I've heard so many conflicting theories – some say it's all about the backlinks, while others say it's nothing to do with backlinks and it's all about the on-page SEO. If they had all the answers, they wouldn't be online writing blog posts about it – they'd be off on their own private island somewhere, logging in only to check their bank balance.

We of course don't have all the answers either. That doesn't sound very comforting, does it, but it's all just part of the wonderful world of internet marketing. However, we can at the very least get our information straight from the horse's mouth…Google themselves, instead of relying on theories that may or may not be correct.

Now this is easier said than done of course because Google isn’t particularly forthcoming about what they do. They tend to be very vague about things and we can only get snippets of information from them if at all. But sometimes those little snippets are enough. So below you will only find information based on actual Google data.

But First…Why This Update Didn’t Really Work

I'm all for Google cleaning up their search results and only getting the high-quality sites ranking. I really want to see this happen, but at this point, with all the Panda and Penguin updates, nothing has really changed. Sure, some sites have moved up and some down, but we still see scraper sites ranked before legitimate sites, sites with thin content ranked before quality sites, and sites full of spammy ads and nothing much else beating out well-developed content sites.

I really don't think this update did the trick. And as always, there are plenty of people who were innocently hit. Google's goal with this update was to target over-optimization, both on page and off. Too many spammy links to your site and you were penalized; too many keywords on the page and you were penalized. Unfortunately, however, that is all they looked at. They still didn't look at the content. Just because a page has a lot of keywords and a lot of spammy links pointing to it doesn't mean the content is bad. It might be fantastic content, but Google doesn't know that…their search engine still isn't sophisticated enough to compare pages for quality and usefulness. So penalizing a page simply for spammy links just doesn't work, in my opinion.

But there's no point whingeing about it. You could spend hours on forums and blogs commenting on how bad this all is, depending on your situation. The only way to get around it is either to comply with Google or to move on to some other form of traffic generation.

Where to Start

In order to clean up our sites and get them ranking again, we need to start from the very beginning with Google's Webmaster Guidelines. The reason we need to start here is that Google explicitly stated that the Penguin change "decrease(s) rankings for sites that we believe are violating Google's existing 'quality guidelines'".

So when they refer to those 'quality guidelines' they are talking about their Google Webmaster Guidelines. According to Google, these guidelines will help them "find, index and rank your site". There are three sections to these guidelines, and I'll summarize the main points of each:

1. Design and Content Guidelines

  • ensure that each page on your site can be reached from at least one other page
  • use a site map to link to important pages
  • keep the links on a page to a ‘reasonable number’ – they don’t say what that is
  • create a useful, information-rich site
  • use words on your pages that people would use to find your site
  • use more text links than images for important links or content
  • ensure you use descriptive and accurate title tags and ALT attributes
  • check for broken links and correct HTML
  • for those using dynamic pages (i.e. the URL contains a '?' character), be aware that not all search engines can crawl these pages.

2. Technical Guidelines

  • Google recommends using the Lynx browser to view your site, since it will display it the way most search engines see it. If some content is missing, then search engines may have trouble viewing it. (I had a quick look at the Lynx site and it looks complicated, so I haven't really tried it yet. I did instead find a Lynx viewer where all you need to do is type in your URL and it does the same job.)
  • Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. (I have no idea what this means but some of you might.)
  • Ensure your web server supports the If-Modified-Since HTTP header, as this allows Google to tell whether your content has changed since they last crawled the site. (I simply typed 'If-Modified-Since HTTP header Hostgator' as my search query in Google to find out if our hosting company supports this. They do! You can do the same search with your hosting company or simply contact them directly and ask. There is also a small script after this list if you want to test it yourself.)
  • Ensure that advertisements do not affect search engine rankings. (I'm not quite sure what Google is getting at here, but I think it has something to do with paid links. In other words, if you are selling paid ads on your site then add the rel='nofollow' attribute to the links.)
  • If you use a content management system ensure that pages can be read by Google.
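If you'd rather test the If-Modified-Since behaviour yourself instead of searching for your host, here is a minimal sketch. It assumes you have Python and the third-party requests library installed, and the URL is just a placeholder for a page on your own site.

```python
# Minimal check of whether a server honours the If-Modified-Since header.
# Requires the third-party 'requests' library (pip install requests).
import requests

url = "http://www.example.com/"  # replace with a page on your own site

first = requests.get(url)
last_modified = first.headers.get("Last-Modified")

if not last_modified:
    print("No Last-Modified header returned, so this test is inconclusive.")
else:
    # Ask for the page again, but only if it changed since that date.
    second = requests.get(url, headers={"If-Modified-Since": last_modified})
    if second.status_code == 304:
        print("304 Not Modified returned - the server supports If-Modified-Since.")
    else:
        print(f"Got status {second.status_code} - the server may be ignoring the header.")
```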

3. Quality Guidelines

  • Make pages primarily for users, not for search engines.
  • Don’t deceive your users by using cloaking devices.
  • Avoid tricks intended to improve search engine rankings.
  • Don’t participate in link schemes designed to increase your search engine rankings.
  • Avoid linking to web spammers or ‘bad neighborhoods’ as your own ranking may be adversely affected. (Interesting that even linking to a poor quality site could affect your ranking).
  • Don't use unauthorized computer programs to submit pages, check rankings etc. (Not sure how you know what is authorized or not authorized. Google doesn't elaborate.)
  • Avoid hidden text or links.
  • Don’t use irrelevant keywords on your pages.
  • Don’t create multiple pages, domains or subdomains with duplicate content.
  • If you have an affiliate site, make sure your site adds value and provides unique and relevant content.

You can read the full version of the Google Webmaster Guidelines here.

As you can see, Google provides us with a basic overview of what they are after in their Webmaster Guidelines. It's worth going through each of them to see whether you comply.

In this respect, we are on our own, because unfortunately Google doesn't elaborate too much on anything. We might think we are complying with their guidelines, but how do we really know? They say, for instance, to 'keep your links on a page to a reasonable number'. What is 'reasonable'…who knows? You have to really dig around to find the answer, which I managed to do. It was in a post by Matt Cutts (Google Engineer) written in 2009, where he mentioned that a page should preferably hold fewer than 100 links. Of course, the post is old, so that could all be considered out of date by now and there could be a whole new number of links that we need to stay under on a page.
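If you're curious how close your own pages get to that figure, a rough count is easy enough to script. The sketch below uses only Python's standard library and counts every <a> tag with an href on a page; the URL is a placeholder for one of your own pages, and it won't catch links added by JavaScript.

```python
# Rough count of the links on a single page (standard library only).
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

url = "http://www.example.com/"  # replace with one of your own pages
html = urlopen(url).read().decode("utf-8", errors="ignore")

counter = LinkCounter()
counter.feed(html)
print(f"{url} contains {counter.count} links")
```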

The Next Step

Once you have gone through the Webmaster Guidelines and ensured that you comply with each of them…as best you can, the next step is to figure out what this Penguin update was all about. By doing this we can get a better insight into exactly what we need to do to clean up our sites. As we mentioned in the opening paragraphs, everyone has their own theory about what happened with this update, but we want to hear what Google has to say. Here is what we found:

1. The Official Word

On April 24, 2012 Google published a blog post indicating that an update was imminent and this one was going to “reduce webspam and promote high quality content”. In the post they provided a couple of examples of keyword stuffing, spun content and outgoing links on a page that lead to unrelated content.

Apart from that, they didn’t impart too much information so at this point we were left in the dark. Considering we don’t keyword stuff, use spun content or link out to unrelated pages, this blog post didn’t help in the slightest.

Source:  Another step to reward high quality sites

2. Matt Cutts Interview with Search Engine Land

On May 10, 2012 Danny Sullivan from Search Engine Land interviewed Matt Cutts (Google Engineer). In the interview, Matt Cutts said that the Penguin update was designed to be quite precise and to act against pages only when there was an extremely high confidence of spam being involved. He added, "The bottom line is, try to resolve what you can", and you will know if you have done the right thing the next time Penguin updates.

Again, we don't really have much to go on here, just a few snippets of information…clean up the spam and you may be back ranking again the next time Penguin updates…hmm, easier said than done when you don't really know what the problem is to begin with.

Plus, Matt Cutts' flippant statement "that you may need to start all over with a fresh site" if you can't recover from the Penguin update just shows how far removed he is from the rest of us. He obviously doesn't know the amount of work that goes into it all. And what about small business owners who have created a branded website? You can't tell me that they should start up a whole new website.

Source: Two Weeks In, Google Talks Penguin Update, Ways to Recover and Negative SEO

3. Google Updates Penguin Again

On May 26, 2012 Matt Cutts announced on his Twitter account that Google had pushed through a second Penguin data refresh. So if you weren't hit the first time, you could have been hit the second time. Alternatively, if you made positive changes to your site after the first update, you might have seen an increase in traffic.

Does that mean we will see monthly Penguin updates? Hopefully, because it will mean that anyone affected by the Penguin update will be able to make changes to their site and not have to hang out for months waiting for the next update.

4. Matt Cutts Interviewed at SMX (Search Marketing Expo)

On June 5, 2012 Matt Cutts spoke at the SMX conference in Seattle. He informed the audience that Google's definition of a penalty is something that is done manually – in other words, someone manually looks at the site and deems it to be bad. The Penguin update, however, was not a penalty but an algorithm change, which is why you cannot submit your site for reconsideration.

Matt also spoke about negative SEO and said that they are considering whether to create a system where you can disavow a link to your site. This would be fantastic, but to me that says yes, negative SEO exists. Why would they bother creating a system otherwise? And if you're wondering what negative SEO is, it is simply a way of killing a competitor's rankings by blasting their site with spammy backlinks.

One of the most interesting things to come out of this interview was when he was asked a question about wpmu.org, a reputable site that was hit badly by the Penguin update. The site went to the Sydney Morning Herald (an Australian newspaper), which in turn interviewed Matt Cutts about it.

In the SMX interview, Matt Cutts response was:

"They didn't rank as high after Penguin, they made their case, and I thought it was a good case. We were able to remove about 500,000 instances of links, and that helped them."

Now if you look at that response, Matt Cutts is effectively saying that it was their backlinks that caused the problem, and by removing 500,000 links their ranking improved. The site had created free WordPress themes and had added a link back to their website in the footer of each theme. This is what resulted in their penalty, as a lot of spammy sites used the theme.

Now this is all very nice for wpmu.org, who were able to get to the press first and turn their site into a high-profile case. But the rest of us are left with spammy links to our sites which in most cases are no fault of our own, and we have to figure out how to get them removed. Sure, we can email each and every site, but do you think a spammy scraper website owner is going to give a toss about removing a link on their site? The whole reason they have a spammy site is that they couldn't be bothered working on it to begin with. They just want to automate it all, sit back and never touch it again.

Another point Matt Cutts made that is of interest to us concerns affiliate links. He did say that they handle affiliate links okay, but if you are in any way worried about them, then add nofollow to them.

And just one more thing which is interesting to note: Matt did say that Google does not look at Google Analytics data in its rankings.

Source: You & A with Matt Cutts – SMX Advanced 2012
and Matt Cutts on Penalities vs Algorithm Changes, A Disavow-This-Link Tool and More

Now What?

That's about all I have been able to find so far on the Penguin update that comes straight from a Google representative. If you know of anything else, please let us know in the comments below.

From what I can tell from all of this, the Penguin update focused on two things:

1. Links
2. On page SEO

Of the two, I personally think that the links are the main factor. In other words, if you have spammy sites linking to you in quantity, then you are likely to be affected. If you have used blogging networks, or if you have paid someone to get hundreds of thousands of links to your site overnight, then you would likely have felt the effects of this update.

I also suspect that it could be the anchor text used in those links, so try to avoid using the same anchor text over and over when backlinking.

If you link out from your site to totally unrelated sites, this could also affect you. So if you accept guest articles, ensure the links in those articles go to related sites. We often get sent articles for our sites where the article might be about dogs, for instance, which would be suitable for our dog site, but the links in the article go to a credit card or insurance company. Don't accept these articles. Keep them related to your site topic.

If you think that spammy backlinks are your problem then go to your Google Webmaster Tools and take the following steps to view your backlinks:

1. Click Traffic from the menu sidebar.

2. Click Links to Your Site from the drop-down that appears.

You can assess each link, and if you deem the linking site to be spam you can always email the owner to see if they will remove it.

As for the on-page SEO, this has always been a problem for Google, but perhaps they are cracking down a little harder. In this case, I would simply check your reviews and articles, and if they sound unnatural to you when you read them aloud, then you are probably using too many keywords on the page. We've said for a long time now to just write naturally…throw in a few keywords to help Google find your pages, but don't overdo it.

I think Google has a love-hate relationship with SEO. They need it because it helps them find relevant pages on the net for their search engine, but at the same time it causes them all sorts of grief as webmasters attempt to use every SEO tip and trick in the book to manipulate their rankings.

What We Are Doing About It

We personally want to stick with Google but at the same time we want to focus on other forms of traffic. This was our goal before we went overseas and is something we are still looking into. But for the moment we want to get our sites back and ranking well in Google. Fortunately we weren’t hit too hard but it was enough to give us a jolt and get us focused again.

We started by doing nothing, and you may think that a little odd, but we have learned from years of experience that when it comes to Google updates you don't make any changes straight away if you have been hit. It's always best to leave things alone for a while because oftentimes you will find yourself ranking again.

After waiting a month or so we realized that our traffic wasn't going to improve, so we started to make some slight changes. Nothing major, just tweaks to a few pages on our sites. Fortunately we can take our time with this and not make too many drastic changes at once.

We tried those blogging networks at one point but gave up on them pretty quickly because we realized they didn’t work very well.  Plus we just didn’t feel comfortable about the sites the articles were going on. They just looked spammy to us and we wanted to be associated with quality wherever possible.

The other problem is spammy sites that link to us, and that is out of our control. We noticed that some sites add hundreds of links to us across their pages. I'm not sure why…we didn't ask them to. One would be enough, but they link to us from all sorts of pages in their 'Further Information' sections. This doesn't help us if their site is just a scraper site of sorts.

However, we will try to contact the major offenders to ask if they will remove the links, and see how we go.

We are hoping that Google implements their ‘disavow’ option which will allow us to reject a site that is linking to us. But who knows when this will be – Matt Cutts says ‘maybe a month, or two, or three’. We will see.

Yikes!! We’ve Been Hit By the Latest Google Penguin Update…and Other Boring News!

First of all, thank you to everyone who emailed wondering where we have been for the past couple of months…we appreciate your thoughts. We have actually been travelling the world or at least some of it…Las Vegas, Orlando, Paris, Barcelona, Prague and Vienna. It’s been an amazing trip and we had all sorts of good intentions to write blog posts along the way but neither of us felt inspired, at least for this blog. We wrote a number of posts for our travel blog but just couldn’t generate any ideas for this one.

Anyway, it’s all good because we arrived home yesterday after a very long day of flying and are feeling good about getting back into it all again.

During the trip, we received a few emails from people talking about the latest Google update. We had been oblivious to all of this while we were away, so we had no idea that an update had occurred. Mind you, we are usually oblivious to these things anyway, as we steer clear of forums and blogs that discuss them ad nauseam.

What is the Penguin Update?

For those who don't know, the Penguin update occurred on April 24, 2012, and the consensus is that this one was all about the quality of backlinks. From the research I did, if you do any of the following to get backlinks for your sites, then the latest Google update may have affected you:

  • getting links from blog networks
  • leaving comments using a keyword instead of your name
  • paying for backlinks
  • getting links from article marketing sites
  • keyword stuffing
  • using link schemes

Basically, Google is targeting any unnatural link building. This is what Google has to say about it:

We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites.

I agree wholeheartedly. It's something we have been pushing to our readers for a while now…focus less on SEO and more on building sites with quality content. Unfortunately, however, Google can sometimes live in la-la land, as they obviously have no idea what a small one-person website owner has to go through to get a site ranked. Just building a site and adding great content isn't enough when you are competing with millions of other website owners. We have some websites with fantastic original content that don't generate any traffic because we haven't actively backlinked to them. We have to generate backlinks somehow, because that is what Google uses to rank a website…it's a catch-22.

Is Guest Blogging Considered Spam?

Who knows what Google thinks of guest blogging, but if they have any sense then I can only hope they consider it completely above board. After all, when sending out a guest article you only want to target high-quality sites, and a high-quality site will only want to take a quality article and link to a quality website. To me, that is natural linking, so we are going to continue with it.

How far is Google going to go on this?

It seems we have had one update after another lately and it isn’t likely to stop in the near future. However, as we’ve said in the past, this is a risky business and you have to be prepared for the ups and the downs. It definitely keeps life interesting.

Have we been affected by the Penguin Update?

When these updates occur we normally steer clear of our stats and just wait to see how our sales are going. For us, this is the best indicator of how we are doing. We did notice a drop in income, but nothing unusual for this time of year. We normally have a dip in sales during these months anyway, and then it starts to increase again around August, so it didn't look like anything major.

Since arriving home however we have checked our Google Analytics account (not something we do very often) and noticed that yes, we had been hit by the Penguin update. On or around April 24 there was a drop in traffic for a number of our sites.

We had used blog networks in the past, so I can only think that this is the reason. We never used them long term, however, because they never seemed to work too well for us and we just weren't happy with the quality of the sites the links were going on. This may be why we haven't been hit as hard as we could have been.

We do have a decline in traffic, but when we look more closely at the individual pages we didn't do too badly. Sales are lower, but compared to last year the difference isn't huge, although it's a little difficult to tell at this point and we probably won't have a better idea until the end of the month.

What are we going to do about it?

At this point, we aren’t going to do anything. We are no longer using blog networks and haven’t been for a long time so we are just going to continue guest blogging and hope it all works out. We are never going to know what Google is thinking; all we can do is continue to do the right thing and provide quality content and look for quality backlinks.

Although we are not happy with the drop in traffic, we still like this update. It is going to cause stress for a lot of website owners but I think over time these sorts of updates are going to make it easier for us to rank. All of the wannabe get-rich-quick website owners will eventually disappear because they don’t want to spend the time creating a quality website. This will reduce the competition and hopefully we won’t have to work so hard. Of course, that could all just be pie in the sky thinking but I am hoping that is where it is all heading.

Were your sites affected by the Penguin Update?

Let us know in the comments below…