Organic Search / SEO Tips
Google's List Snippets
Wednesday, August 31, 2011
Last week Google released what they refer to as list snippets. We had been seeing some of these snippets leading up to the official announcement from Google, so we thought that we would mention them, as we can expect to see more snippets in Google's search results in the weeks and months to come.

Google List Snippets

Who:  Google
What:  Displaying list information in the description area of search results.
Where:  Google Search Results Pages
When:  Launched August 2011 (official announcement August 26, 2011)
Why:  To better reflect the content of pages in Google's search results, making it easier for searchers to find the most relevant results

Official announcement: http://insidesearch.blogspot.com/2011/08/new-snippets-for-list-pages.html

Example of List Snippet
A search for "pizza restaurants in Montreal" returns a list snippet for Yellowpages.ca; likewise, a list snippet for hotels.com appears when searching for "hotels in Las Vegas". [screenshots omitted]
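As an aside, list snippets appear to be drawn from the structured markup on the page itself rather than from the meta description. Here is a hedged sketch (the page, URLs, and restaurant names are made-up placeholders) of the kind of clean HTML list markup that would give Google discrete items to surface:

<h1>Pizza Restaurants in Montreal</h1>
<!-- a clearly structured list gives Google discrete items it can pull into a list snippet -->
<ol>
  <li><a href="/restaurants/pizzeria-one">Pizzeria One</a> - Downtown</li>
  <li><a href="/restaurants/pizzeria-two">Pizzeria Two</a> - Old Montreal</li>
  <li><a href="/restaurants/pizzeria-three">Pizzeria Three</a> - Plateau</li>
</ol>

The cleaner and more consistent the list structure, the easier it presumably is for Google to pull a few items into the snippet.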



SEO Implications

Of course, webmasters and site owners will want to know how this impacts their SEO efforts. Well, that remains to be seen, but there are a couple of questions that come to mind:
  1. How will this impact CTRs?
  2. Does this devalue the use of the meta description tag?
Let’s examine these questions in a little more detail.
 

How will this impact CTR?

Depending on what is being displayed in the list snippet, these snippets might either:

a) encourage the click-through by displaying related information that the searcher is looking for, enticing them to click through to the site, or

b) prevent the click if the user finds the information they are seeking (such as a phone number or address) right in the snippet. A potential solution is to control what is displayed in these list snippets:
  • To prevent Google from displaying snippets (and Instant Preview for your page), you can place the following tag in the <HEAD> section of your page: <meta name="googlebot" content="nosnippet"> (see the sketch below)
  • This tag will prevent previews from appearing in regular search results, but it will not affect previews on ads. http://www.google.com/support/webmasters/bin/answer.py?answer=35304
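To make that concrete, here is a minimal sketch of a page head using the tag (the meta tag itself is straight from Google's help page above; the title and page are placeholders):

<html>
<head>
  <title>Store Locations | Example.com</title>
  <!-- tells Googlebot not to show a snippet or Instant Preview for this page -->
  <meta name="googlebot" content="nosnippet">
</head>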

Does this devalue the use of the meta description tag?
 

It is important to remember what meta data is: data about data. Currently, search engines such as Google use meta descriptions to learn more about the topicality of a page. Quite often, Google will display the meta description in their search results. While this may change over time as rich snippets and list snippets show up, it is still recommended to include unique meta description tags on your site pages. Follow best practice, do not spam or abuse the meta description tag, and you should be fine.
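For what it's worth, here is a hedged example of a unique, page-specific meta description (the content is a made-up placeholder):

<!-- unique to this page; engines may display it as the SERP snippet -->
<meta name="description" content="Compare 25 pizza restaurants in Montreal by neighbourhood, price and rating, with menus and delivery info.">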
 
Google continues to attempt to improve their search results. Some of the enhancements that they test may work and become widely accepted. Others may not. I am in favor of rich snippets, depending on what information is being displayed. At the end of the day, if the snippet displayed helps me find what I am looking for, then I'm happy.

 

posted by Jody @ Wednesday, August 31, 2011  
Why Social & Site Engagement Ranking Metrics are Important
Tuesday, August 30, 2011
Can you imagine if Google were more of a complacent company, one that did not look to regularly improve the search experience for users? What would Google look like? Would it be reminiscent of Yahoo, or perhaps more like ASK? While many site owners and webmasters do not always agree with the results displayed by Google in their SERPs, you have to respect the mighty G for continuing to work on refining their results in the hope of delivering the best results possible for a given search query.

 
Take some of their recent algorithm updates, for example. Google Panda rocked what we saw in Google search results. For better or worse, many sites that lost rankings were replaced, at times, by other sites that were questionable matches for the query performed. I have been critical of Google's results for the past twelve to eighteen months (oddly enough, around the time when Google began rolling out Caffeine), but I can understand what their end goal is... to provide the best search results possible. This is not an easy job, people, and Google understands that. Perhaps that is why they are asking for help to identify scraped content.

 
The fact is that if Google did not care about the user, they would not worry about trying to improve the search results as regularly as they do. Google has a series of search teams, from engineers to quality managers, and they take their search results very seriously. If they didn't, why would they:
  • roll out a major infrastructure change (Google Caffeine)
  • place more emphasis on big brands (Google Vince Update)
  • increase the size of their index (Google Caffeine)
  • increase focus on local search (Google Places)
  • assist users with search queries (Google Instant)
  • focus on webspam (Google Farmer Update)
  • focus on low quality content/duplicate content (Google Panda Update)
  • release a Google Chrome site blocking extension
  • launch Google +1
  • be looking to identify scraper sites and duplicated content
It might be a fair comment to say that post-Caffeine Google is broken. However, a broken Google is still better than some of the alternatives out there when it comes to finding information. (Think of all of the endless low-quality directories or autoblogs out there.)

Google's ranking algorithms consist of hundreds of factors, but what we are seeing is that the weight of some of these factors is shifting. While link popularity continues to be a key ranking factor, you have to think that, with the ability to artificially inflate a site's link inventory, Google is modifying the weighting of this factor. It is no coincidence that Google has been communicating the importance of social and site engagement over the past couple of years. These types of metrics are where I see Google going, and there is evidence that Google is moving this route with their ranking algorithms.

Why Social & Site Engagement Ranking Metrics are Important

Google has often communicated the importance of site stickiness, citing metrics such as bounce rate or time spent as key metrics in monitoring the engagement success of a site. Factor in the social element, and all of a sudden we can see just how important social and site metrics become. Google has been watching what has been going on in the online space with regards to social activity. It is no secret that Google has at one time or another tried to acquire or form partnerships with social sites such as Facebook and Twitter. Nor is it a secret that engagement metrics are big on Google's list of ranking factors. That alone should be reason enough to pay attention to social engagement. Google is and will continue to use social engagement metrics as a ranking signal. Look for this to carry more weight moving forward. Social, unlike link popularity, cannot be as easily gamed. This also makes it appealing to Google as a ranking factor.

 
How important is social to Google? CEO Larry Page reportedly sent out an internal memo informing all employees that 25 percent of their bonus will be tied to the success or failure of Google's social products. How people interact with a site, either positively or negatively, can impact what Google will display in their search results. Google's launch of their +1 button (http://www.google.com/+1/button/) suggests just how important social and site engagement has become for Google.

Derek Gordon over at MediaPost's Search Insider shared a story about a recent meeting between Forbes reporters and Google sales representatives, in which the Google representative "suggested a strong correlation between search results and publishers putting the Google +1 button on their pages." The article goes on to suggest that:

Google is encouraging web publishers to start adding +1 buttons to their pages, and the message in this meeting was clear, 'Put a Plus One button on your pages or your search traffic will suffer.'

The fact that the Google +1 button impacts your site's visibility in the search results is reason enough to place it on your site pages. Keep in mind that this is one ranking signal of many used by Google, but it is a social/site engagement metric, a new ranking factor that previously did not exist. If Google's search results become richer based on the factoring in of social and site engagement metrics, you can expect the weightings of these metrics to increase. At the end of the day, we are seeing what could be a monumental shift in how Google ranks a particular webpage. While there are still hundreds of factors used in Google's ranking algorithms, expect Google to give more weight to engagement metrics and less to link popularity metrics.
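For those looking to add the button, the basic embed that Google documents looks roughly like this (placement is up to you, and the size attribute is optional):

<!-- load the +1 JavaScript once per page -->
<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
<!-- renders a +1 button wherever the tag appears -->
<g:plusone size="standard"></g:plusone>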

Additional Readings on Social Engagement

Google’s Focus on Quality and the PostRank Acquisition
http://www.sitepoint.com/googles-focus-on-quality-and-the-postrank-acquisition/

Official: Google Analytics Gets Social Engagement Reporting
http://searchengineland.com/official-google-analytics-gets-social-engagement-reporting-83707

Google’s addiction to speed, and how engagement metrics will shape the web.
http://blog.webdistortion.com/2011/05/11/googles-addiction-to-speed-and-how-engagement-metrics-are-shaping-the-web/

Does Google use data from social sites in ranking? (Matt Cutts Video)
http://www.youtube.com/watch?v=ofhwPC-5Ub4&feature=player_embedded

What is driving engagement on Google+?
http://blog.webdistortion.com/2011/07/09/what-is-driving-engagement-on-google/


 

posted by Jody @ Tuesday, August 30, 2011  
Ever Wonder How Google Comes Up with Ideas for Algorithm Updates?
Monday, August 29, 2011
Continuing with our theme on Google algorithm updates, we came across a nice post from the folks at Google on how they come up with their algorithm updates. Ever wonder how Google decides on an algorithm update? This post might shed some light on how Google works to ensure that their search engine continues to provide the best results possible.

It features a brief video with snippets from key Googlers, including Amit Singhal, Scott Huffman, and other Google engineers and analysts. Some key points that you may or may not have known:
  • Google made over 500 changes to their search algorithms last year; as Amit Singhal suggests, that equates to nearly two per day. How many other search engines do you know of that have this level of commitment to their search results?
  • Scott Huffman states that each potential change is analyzed very deeply to try and ensure that it is the right thing for users
  • the first step in improving their algorithm always starts with an idea, which may be based on a set of "motivating searches", which are searches that are not performing as well as Google would like.
    • key ranking engineers then look at what signals or data could be integrated into their algorithm to address these issues
    • all ideas are tested through scientific testing
    • they have "raters" which are external people that have been trained to judge whether one ranking is more relevant or higher quality than another by looking at side-by-side sets of results
    • they use the "Google Sandbox" to test this out on live users as well. 
    • all of the data is rolled up and further looked at by a search analyst
  • in 2010, Google ran over 20,000 different experiments
  • Google conducts launch decision meetings where the leadership of the search team further evaluates the data and makes a decision
  • Google considers whether these changes will help users just in the US or in various other geographies
Amit Singhal:  "...when you align Google's interests with users interests as we have aligned, good things happen..."

Great video on an interesting topic. It is easy to see how Google remains the premier leader in the Search space.



posted by Jody @ Monday, August 29, 2011  
The Top 10 Google Updates of All Time
Monday, August 22, 2011
Anyone involved in SEO and online marketing will no doubt be familiar with terms like "Google Dance", "Florida", "Vince", "Big Daddy" and, more recently, Panda. These are all terms that describe the ever-changing landscape of the Google SERP. Many of these terms are reflective of something that many site owners and webmasters have come to expect: the Google algorithm update.

Google makes updates to their algorithms all of the time. Depending on who you talk to, they make anywhere from 400-500 changes a year. Perhaps that is why they are still the dominant search engine in North America and throughout other markets across the globe.


Even though Google is the dominant player in Search, they are in danger of losing market share if their results continue to suffer. Over the past twelve to eighteen months we have seen a number of large algorithm changes that have dramatically altered the results that we see in Google. For better or worse Google continues to make regular updates to their algorithms.

We thought we would compile our list of the top ten Google algorithm updates of all time. This is our opinion, so you can agree or disagree, but these updates are the ones that we feel have had the greatest impact on Google search results and on Search in general. Algorithm updates can be great for some sites and not so great for others. We have discussed tools to diagnose the impact of Google algorithm updates, but sometimes you really need to ride out the storm when it comes to algorithmic changes. Many impatient webmasters or site owners simply cannot understand this. Algorithm updates have to run their course. Your tactical manipulation to try and rank number one for (insert key phrase here) will often fail miserably. The fact is you shouldn't be trying to game Google, and you shouldn't be entirely reliant on traffic from Google, 65% market share or not. However, being smart about your online strategy means that being present in Google should be a definite part of your online initiatives.

Make no mistake, Google will continue to make algorithm updates and changes. They have a difficult task ahead of them in trying to sort the world's information. If you are doing things correctly, your site may not be impacted by an algorithm update, but if you are employing questionable tactics, you might be in for some hard times.


Top 10 Google Algorithm Updates of All-Time

#10.  Google's Jagger Update - October 2005 - Google continued to make a series of algorithm updates surrounding link farms, low-quality links, reciprocal links, and the ever discussion-worthy paid links. The Jagger Update also looked at some canonical issues and, according to SEJ, consisted of three elements: "The first was to deal with manipulative link-network schemes, sites generated with scraped content and other forms of SE-Spam. The second was to allow and account for the inclusion a greater number of spiderable documents and file types. The third was to allow and account for new methods of site acquisition beyond the use of the spider Googlebot."

#9.  Google Austin Update - January 2004 - in the aftermath of the Florida Update, Google's subsequent algorithm update, "Austin", continued to focus on spam tactics such as link farms, white-on-white/invisible text, and over-optimization of meta tags (keyword stuffing). Again a number of sites were impacted, but in the end Google took a huge step toward cleaning up its index.

#8.  Google Brandy Update - February 2004 - Ever notice how Google tends to have a lot of updates in the months of February and October? This particular update focused on Latent Semantic Indexing (LSI), anchor text relevance, and link neighborhoods. This update also placed less emphasis on tag optimization.

#7.  Google Vince Update - February 2009 - this update could easily rank higher, as I think that this update had a profound impact on what we now see in Google's search results. While Google downplayed this as a minor update, many others did not feel the same way, as big brands were seeing favorable results. In October 2010 this became even more evident as brands continued to see further preferential treatment. Where "pizza hut" previously had two rankings for a branded search, they now owned the SERP with as many as eight listings. Again Google attempted to downplay the update, suggesting that this was not a brand push. This has had a dramatic impact on directory-type sites that previously enjoyed rankings for various branded terms.

#6.  Google's Big Daddy Update - December 2005/January 2006 - more of an infrastructure update, Google's Big Daddy may have been one of the first major algorithm updates to look at duplicate content issues, as a large part of this update focused on redirects (301s/302s) and URL canonicalization. Google's Matt Cutts touched on Big Daddy here.

#5.  Google Universal Search Update - May 2007 - this was a Google update that I really liked, as it meant that we would be experiencing a richer search results page, with things such as news, images and video populating Google's search results. Gone were the ten blue links (for now). Other engines followed, and blended search was born. Danny Sullivan dubbed this Google 2.0, but the results page would begin a series of changes that would make it a so-called richer experience.

#4.  Google MayDay - April / May 2010 - this update focused on long-tail keyword traffic, as many webmasters noticed a drop in their long-tail rankings and associated traffic. This update was a preview of what we would see with the Google Panda Update. There were a lot of sites hit by MayDay and there was certainly a lot of discussion about the Google MayDay Update, but MayDay was only a sample of what was yet to come. MayDay began Google's quest for higher-quality sites to display in their search results. Google told Vanessa Fox "that it was a rankings change, not a crawling or indexing change, which seems to imply that sites getting less traffic still have their pages indexed, but some of those pages are no longer ranking as highly as before."

#3.  Google Caffeine - June 2010 - what Google began testing in August 2009 became a live update in June 2010. Caffeine was an infrastructure change that would allow Google to crawl the Web more efficiently, resulting in their index growing by billions, possibly even trillions, of pages. We are still experiencing the ramifications of Caffeine as Google continues to clean up their index. As SEOmoz suggests, "Caffeine not only boosted Google's raw speed, but integrated crawling and indexation much more tightly, resulting in (according to Google) a 50% fresher index." Google called Caffeine a more robust foundation that would allow them to provide a richer search index. Hmm, well, perhaps richer and more spammed, if you ask many in the industry.

#2.  Florida Update - November 2003 - this was a big one. I remember being asked my thoughts on the Florida Update as I was starting my career at Enquiro. I remember thinking that this was going to be huge; I just wasn't sure how much of an impact it would have on the Search industry and on how websites would perform in the search results. This was a good update, as tactics such as keyword stuffing were no longer acceptable, and as a result many sites dropped off the face of the search landscape. The Florida Update caused millions of pages and sites to be dropped from Google's results. Danny Sullivan had a good review of the Florida Update only weeks after it occurred. Looking back, it was an obvious effort to improve relevancy for the sites that were being listed. Or was it? A number of commercial sites suffered. These are sites that you would expect to see when doing generic "head-type" keyword searches. In the end, however, the Florida Update was necessary for the progress of the Google search engine.

#1.  Google Panda Update - February 2011 - ongoing - We have written numerous times about the Google Panda Update. This one made the top of our list based on the sheer volume of search results (and sites) that were impacted. When initially released, Google stated that this update would impact nearly 12% of Google's search results. Twelve per cent? Are you kidding me? And that was just the first phase of Panda. The Panda Update stems from a lot of the poor-quality results that were showing up in Google, really as a result of Google's infrastructure update, Caffeine. Google's index grew exponentially and, as a result, I'm not sure if Google was prepared for all of the spam results that would now be part of their index. With all of the low-quality, syndicated, and duplicated content out there, it was only a matter of time before Google cracked down on the quality of their search results.

I was in Las Vegas attending PubCon in November 2010, listening to Matt Cutts touch on webspam and the fact that Google would be taking measures to deal with a renewed interest in spam. A little over a month later, the algorithm updates began... starting with the "Farmer Update" and then leading into Panda. Google's search results will never be the same again. The thing with Panda was the high number of "innocent" sites that were caught in the crossfire. There were a lot of complaints; in fact, people are still complaining of the devastation that their sites and businesses have suffered as a result of Panda.

So what is the best way to deal with a Google algorithm update? Sometimes it simply means waiting and riding out the storm.

Related Resources

  • SEOmoz Google Algorithm Changes - http://www.seomoz.org/google-algorithm-change 
  • Barry Schwartz's Major Google Updates Documented - http://searchengineland.com/major-google-updates-documented-89031



posted by Jody @ Monday, August 22, 2011  
Google Sitelinks Update
Wednesday, August 17, 2011
Just a quick note to touch on the fact that Google has updated their sitelinks algorithm. I think that this is a pretty cool feature, providing that Google is displaying the sitelinks that are most important to your site. At this time we still have no direct way of controlling which sitelinks appear, but we can influence this slightly by blocking sitelinks that we do not want to show up via Google Webmaster Tools. (Google refers to this as "demoting" a sitelink.)

Sitelinks are links to other pages on your site that may appear under the search result for your site. Google generates these links automatically, and it should be noted that not all sites have sitelinks.
Sitelinks are often triggered by a branded search query.  Sitelinks are displayed for results when Google feels that they will be useful to the user.  However, there are a couple of reasons why Google may not display sitelinks for your site:
  • poor site architecture
  • if sitelinks for your site are not relevant for the user's query
Here is how the new sitelinks appear for Marketing Jive. [screenshot omitted]



3 Tips for Controlling Sitelinks
  1. Ensure that you use anchor text that's informative, compact, and avoids repetition (see the sketch below).
  2. Do the same for your image alt text.
  3. Demote irrelevant sitelinks via GWT. Demoting a URL for a sitelink tells Google that you don't consider this URL a good sitelink candidate for a specific page on your site. Google won't guarantee that these URLs will never appear as a sitelink, but in my experience they tend to be replaced with other URLs. You can demote up to 100 URLs, and demotions are effective for 90 days from your most recent visit to the Sitelinks page in Webmaster Tools.
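To illustrate tips 1 and 2, here is a hedged sketch of internal link and image markup (the URLs, labels, and filenames are placeholders) that gives Google informative, non-repetitive labels to draw sitelinks from:

<!-- informative, compact anchor text rather than "click here" -->
<a href="/services/seo-audits">SEO Audit Services</a>
<!-- descriptive alt text on meaningful images -->
<img src="/images/seo-audit-process.png" alt="Five-step SEO audit process">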
As a user, and depending on my search query, I like the new and improved sitelinks, especially if they help me access the information that I am looking for faster. According to Google:

"... we have made “a significant improvement to our algorithms by combining sitelink ranking with regular result ranking to yield a higher-quality list of links...this reduces link duplication and creates a better organized search results page.”

Sounds good to me.


posted by Jody @ Wednesday, August 17, 2011  
More on Organic Click-Through Rates in Google
Thursday, August 11, 2011

Quite often we get asked about click-through rates in Google, specifically organic click-through rates. Recently Slingshot SEO released their study on establishing Google click-through rates. The study makes reference to a study from Optify, as well as Marketing to a B2B Technical Buyer, which we released at Enquiro back in 2007. The results from the Slingshot study are significantly lower than other studies and numbers that we have seen. Establishing click-through rates for Google is not an easy task, as there are many factors that can impact the CTR of Google's search results. The single biggest factor that impacts establishing a somewhat accurate click-through rate report is the nature of the search query. The type of queries being used for these CTR case studies will result in different findings. For example, when we conducted the study at Enquiro in 2007, we found that for navigational searches, searchers rarely look beyond the first couple of results; with informational and transactional searches, there is scanning much further down the page. So, suffice it to say, the nature of the search query has a direct impact on the organic CTR.

One thing is for sure: what was true in 2007 holds true in 2011, and that is the importance of being on the first page of search results. Specifically, being in the top three to four results is key to obtaining clicks and organic search traffic. Regardless of these CTR studies, my experience over the years suggests that there is no more important position to be in than the number one or number two spot on a SERP. Being in the top spot is where you want to be, no question. One of the issues that I have with the reported numbers from the Slingshot SEO study is the low CTR of the top spot, for which they report a CTR of 18.25%. While the study acknowledges the fact that their findings are lower than previous studies, their top-spot CTR just seems extremely low. Again, it is important to stress that the nature of the search query and the results that are returned will have a direct impact on what people will click on.

Fact: when we did our original research studies, we found that the split between clicks on organic search results and paid results was about 70/30; realistically, it was probably closer to 80/20, meaning that 80% of the clicks were on organic search results. Today, even with the ever-changing SERP, it is safe to suggest that the 70/30 split holds true. Yes, the majority of searchers still prefer to click an organic search result. Even with blended search results populating some search queries, organic is the place to be... well, actually, having both a paid and an organic presence for a key phrase can be quite beneficial (but that's a post for another day). The fact is more people still click organic results than sponsored listings.

Looking at the CTR data from Slingshot, the top 10 organic results accounted for roughly 53-55% of the clicks. Where, then, did the other 45-47% of the clicks go? With the study we did at Enquiro, our findings suggested that about 73% of the clicks went to organic results, 19% went to sponsored results, and about 8% went to what we called "missing", implying that people either clicked back in their browser or clicked to another SERP. Have search behaviours changed that much in the past four or five years?

The Ever Changing Search Results Page
One thing that has definitely changed is the aesthetics of Google's SERP. At times it is reminiscent of an ASK.com home page (which I feel was ahead of its time), with blended results ranging from news listings to videos, blog results, local listings, multiple listings from the same domain, additional search option links, and more. Add in the fact that, since the roll out of Caffeine, Google's search results have been, shall we say, questionable, as some suggest that Google results are "broken", and the landscape of a Google SERP is not what it once was.

The Slingshot findings suggest that the impact of blended search results remains to be seen. Well, that is not entirely true, considering the fact that technically blended results could be classified as organic results, that is, results that you do not have to pay for to be listed within the SERP itself. Additional studies that we did at Enquiro suggest that blended results do have an impact on CTRs, or at least they can impact where the eyes focus, as a video result, for example, "chunks out" a portion of the page, drawing the eyes and potential clicks to listings above or below the rich media result. Or do blended results act as barriers on a SERP, similar to other barriers found on SERP pages? Again, we need to be aware that this can change depending on the nature of the search query.

I think that these CTR studies are great. We need more of them and with greater frequency. Organic click-through and Google click-through data is not as elusive as it once was. However, there are a lot of factors that come into play when establishing accurate CTR percentages. The fact remains that being in the top three spots of a Google search results page is preferred. Anything else is simply residual clicks and traffic. Obtaining the click is only part of the battle; engaging your site visitors after the click is a whole other issue that we could discuss at length.

Additional Resources on Organic Click-Through Rates

Organic Click Through Rate - Brand vs Intent vs Research keyphrases
Organic Click Through Rate Curve – Optify Study
Google Organic Click-Through Rates
Eye-Tracking Analysis of User Behavior in WWW Search
Redcardinal AOL data analysis
SEObook CTR post
SEO Scientist looks at Organic CTR and eyetracking studies
More on eyetracking studies



posted by Jody @ Thursday, August 11, 2011  
Google's Collateral Damage Infographic
Monday, August 08, 2011
Here's one of my favorite infographics on Google's ever-evolving algorithm.

Google's SEO Cat & Mouse.

Infographic by SEO Book



posted by Jody @ Monday, August 08, 2011  
3 Ways to Help Google Crawl Your Website
Sunday, August 07, 2011
You know SEO has been around long enough that most webmasters should have a good idea about how effectively their site is being crawled by the search engines. Of course, there are those site owners who continue to tinker with their sites in order to refresh their look and feel or to try and gain better placements within the search results. The fact is that it is all about relevancy (but that is a topic for a different day).

If you are trying to market your site on the vastness that is the Web, one of the first things you will want to put in place is the ability for the search engines to find and crawl your website's pages. If the search engines cannot find your content, how in the *bleep* do you expect to rank for a given keyword? You can optimize all you want, but if your site cannot be easily crawled, you are not going to have much success from an SEO and search traffic perspective. So what measures can you take to ensure that the search engines crawl your content?

Well, first and foremost, you need to have an idea of the pages that make up your site. As I wrote last week, having an inventory of your site pages is extremely important. How can you expect Google to crawl all of your site's important pages when you are not even sure how many important site pages you currently have? So take inventory of your site pages; even if you are a large e-commerce site, you should keep an inventory of your site URLs handy. At this point I think that it is important to make reference to what I thought was another excellent post from ex-Googler Vanessa Fox, with her piece on the Fetch as Googlebot functionality that can now be used to submit site URLs. Vanessa reminds us how Google finds your site pages through their discovery process, which mainly consists of the following:
  1. Through links
  2. RSS feeds
Google continues to have the best discovery of pages for their index... perhaps too good, if you consider some of the issues they have had since the roll out of Caffeine, with which Google's index grew exponentially. Regardless, Google has a massive number of pages in their index. The question is, do they have all of your pages indexed (at least the ones you want indexed), and if not, how can you ensure that your pages get indexed in Google?

Let's now discuss the purpose of this piece and that is to talk about ways of helping Google crawl your website.

3 Ways to Help Google Crawl your Website

Googlebot crawls the Web via links. If by chance your content is not linked to and is not well interlinked itself, Google may be missing key pieces of your site in their index. However, there are ways that you can help Google find your content. The following are three basic ways to help get Google to crawl your site.
  1. XML Sitemaps - this is probably the preferred method due to the control that you have. Although Google may still not crawl all URLs submitted, XML sitemaps definitely point Googlebot in the right direction. As Vanessa points out in her article, "The XML Sitemaps protocol enables you to submit a complete list of URLs to Google and Bing. The search engines don't guarantee that they'll crawl every URL submitted, but they do feed this list into their crawl scheduling system." More on Google and XML sitemaps here.
  2. Submit your site to Google - have a new site? You might want to kick it old school and submit your URL to Google. You will need a Google account, and you can go to https://www.google.com/webmasters/tools/submit-url?pli=1 and follow the instructions for submitting your URL.
  3. Through GWT's Fetch as Googlebot - ah yes, this is new as of early August. The Fetch as Googlebot option allows you to request indexing consideration for a page after you fetch it by clicking Submit to index. This new feature is ideal for situations when you launch a new set of pages on your site or have major updates. Using this option, you can submit up to 50 URLs a week. Again, see Vanessa's post for additional details.
If you are looking for mass submission of URLs to Google, you will want to create an XML sitemap (a minimal sketch appears after the list below). You will need multiple sitemaps if you have 50,000 or more URLs. Remember that the real purpose of XML sitemaps is not to submit URLs that Google already has in its index; it is designed for pages that Google may be having difficulty finding. Of course, if Google or the other search engines are having trouble finding your content, you may have larger concerns:
  • have you inadvertently blocked the engines from finding certain pages?
  • have you properly linked to new site pages?
  • is your site being penalized?
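As promised above, here is a minimal XML sitemap sketch following the sitemaps.org protocol (the URLs are placeholders; only the <loc> element is required for each URL):

<?xml version="1.0" encoding="UTF-8"?>
<!-- one <url> entry per page you want Google to discover -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-08-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/products/widgets</loc>
  </url>
</urlset>

If you pass the 50,000-URL limit mentioned above, split your URLs across multiple sitemap files and reference them from a sitemap index file.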
At the end of the day, anything that you can do to help the engines find your content is always a good thing. If you are a regular user of Google Webmaster Tools, I recommend that you try the Fetch as Googlebot functionality to try and get new pages crawled and indexed by Google. The three options above can be used by any site owner or webmaster, but there are certain times when one of the options is more purposeful than the others.



posted by Jody @ Sunday, August 07, 2011