Google Algorithm Changes & Updates 2010: Part Two
Monday, June 28, 2010
Last week, we wrote about all of the Google updates that have happened thus far in 2010.  A number of these updates have already had, and will continue to have, a direct impact on the SEO efforts of site owners.  SEO and organic search marketing is evolving and changing; it simply has to in order to keep up with the innovations of the search engines.  The search results are as dynamic as they have ever been.  Being present for a given key phrase is not as simple as it once was.  There is no question that the Google algorithm updates will continue to have a profound impact on which sites appear in the SERPs and when.

 Google Algorithm Changes & Updates 2010:  Part Two:  Dealing with Algorithm Updates & Search Engine Changes

So what does this mean for you as a site owner?  Well, a couple of things, or perhaps nothing.  We say nothing because if you have created a site that delivers fresh, useful and informative content, you should not be impacted greatly by all of the changes from Google.  Also, many times the best way to handle a search engine algorithm update is to simply do nothing and see how everything plays out.  Let's remind folks of some of the best tips for dealing with a search engine update:
  1. Don't Panic - unless you are leveraging some very questionable tactics, you may not need to panic just because an algorithm update is taking place.
  2. Keep your site well optimized - there are enough "crappy" sites out on the Web; yours does not have to be one of them.  Keep on top of on-page optimization and ensure that the topics of your site are clear and consistent.  Create and follow your SEO requirements.  Google actually does a pretty good job through Google Webmaster Tools of communicating issues with one's site.  Pay attention to this stuff and ensure that your site is well optimized.  If nothing else, ensure that the fundamentals are addressed.  What are the fundamentals, you ask?  Here are a few (a quick spot-check sketch appears after these tips):
    • Optimize title tags
    • Optimize meta descriptions
    • Optimize on-page headings/H1 tags
    • Optimize page copy
    • Leverage keyword-rich anchor text when interlinking site pages
  3. Keep an eye on the SERPs within the search engines - regularly perform spot checks for some of your important key phrases in the search engines.  Pay attention to the SERP landscape.  Are you seeing the same sites?  Are you seeing more video results appear?  More local listings?  More news releases?  Have you dropped from the prime real estate?
  4. Be Patient - quite often the engines will test major changes for months prior to releasing the update outright.  This can provide you with time to make any necessary changes if required.  However, if you have a well-optimized site that is effectively crawled by the engines, you should be OK.
  5. Keep Informed - if you have heard through the grapevine that there has been a major search engine update, check out some of the industry-related blogs and sites out there.  Here are a few that we'd recommend:
    • Official Google Webmaster Central Blog - http://googlewebmastercentral.blogspot.com/
    • Search Engine Land - http://searchengineland.com/
    • Search Engine Journal - http://www.searchenginejournal.com/
    • SEOmoz - http://www.seomoz.org/blog
    • Ask Enquiro - we post there as well - http://ask.enquiro.com
    • Marketing Jive - we've been known to share some great search engine marketing tips from time to time.  http://www.marketing-jive.com/
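To make the fundamentals concrete, here is a minimal spot-check sketch.  It assumes Python 3 with only the standard library, and the URL is a placeholder; it is a rough illustration of checking a page's title, meta description and H1 headings, not a full SEO audit tool.

    # A rough spot check of on-page fundamentals (title, meta description, H1s).
    # Assumes Python 3 standard library; https://www.example.com/ is a placeholder URL.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class FundamentalsParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.in_h1 = False
            self.title = ""
            self.h1s = []
            self.meta_description = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self.in_title = True
            elif tag == "h1":
                self.in_h1 = True
                self.h1s.append("")
            elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
                self.meta_description = attrs.get("content", "")

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False
            elif tag == "h1":
                self.in_h1 = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data
            elif self.in_h1 and self.h1s:
                self.h1s[-1] += data

    page = urlopen("https://www.example.com/").read().decode("utf-8", errors="replace")
    parser = FundamentalsParser()
    parser.feed(page)

    print("Title (%d chars):" % len(parser.title.strip()), parser.title.strip())
    print("Meta description present:", parser.meta_description is not None)
    print("H1 headings found:", len(parser.h1s))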
 What else can be done to deal with all of these Google updates and algo changes?  Here are some additional items to consider.
  1. Ensure that your site always has fresh content - Google is looking for the most relevant information to return to users based on a given search query.  Content isn't called king for nothing.  Content is and will always be key to gaining a greater presence in the online space.  Having said that, do not go out and create a bunch of garbage that can already be found on the Web.  Try to make your content unique and informative.
  2. Become educated about Search and share your knowledge - your online teams should be well versed in how search engines operate, but there is something to be said for keeping on top of major search engine developments and sharing this information throughout your organization.  Knowledge transfer is key to ensuring that everyone is on the same page, from copywriters to your technical and analytics teams.
  3. Keyword Research should never stop - think relevancy: what may have been relevant to your audience 10 years ago may not be as relevant today or next month.  Establish a process for conducting ongoing keyword research, which in turn will help ensure that your site always features some fresh content.  Messaging is important, as is the format it is delivered in (read: video, press releases, blog posts, tweets, etc.).  Apply relevant keywords to your copy when presented with the opportunity.  Factor in head, torso and long-tail terms as part of your keyword research strategy.
  4. Always ensure that your site is spiderable - always, always, always ensure that the search engines can find, crawl and index your content.  It never ceases to amaze me how many sites restrict important content from being crawled by the search engines.  Learn how to properly use a robots.txt file in conjunction with noindex and/or nofollow tags (see the sketch after this list).  Do not make it harder for the crawlers than it needs to be.
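As a quick illustration of the spiderability point, here is a small sketch that checks whether key URLs are even crawlable.  It assumes Python 3 with the standard library only, and the domain and paths are placeholders.

    # Sketch: confirm important URLs are crawlable before worrying about rankings.
    # Assumes Python 3 standard library; the example.com URLs are placeholders.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("https://www.example.com/robots.txt")
    rp.read()

    for url in ("https://www.example.com/", "https://www.example.com/products/widgets"):
        print(url, "-> crawlable by Googlebot:", rp.can_fetch("Googlebot", url))

    # Remember: robots.txt only controls crawling.  To keep an already-crawlable
    # page out of the index, use a meta robots "noindex" tag (or an X-Robots-Tag
    # header) on that page instead of blocking it in robots.txt.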
Google makes upwards of 500-600 changes to their algorithm every year.  While not every one of these changes will have a direct impact on your site, algorithm updates and changes do happen.  Pay attention to what the engines are doing and do not necessarily overreact.

We will conclude our series next week, with Google Algorithm Changes & Updates part three.


    posted by Jody @ Monday, June 28, 2010  
    Announcing WordStream's Free Negative Keyword Tool
    Wednesday, June 23, 2010


    WordStream this week announced the release of a new addition to its growing suite of free keyword tools:
     
    The Free Negative Keyword Tool - This tool works in a similar fashion to a conventional keyword suggestion tool, but provides suggestions for negative keywords—in other words, terms that your ads might be matched against that aren't really relevant to your campaign. The tool is designed to help search marketers quickly create lists of highly effective negative keywords for their PPC advertising efforts.
    The Free Negative Keyword Tool gives PPC marketers the power to:
    • Eliminate wasteful ad spend before it starts by proactively setting negative keywords and match types.
    • Save time and increase productivity by identifying whole clusters of related negative keyword candidates rather than single instances of negative terms.
    • Efficiently manage negative keyword research by removing relevant "positive" terms from your list of negatives and exporting your negative keyword list in an AdWords-ready format.
    The tool makes this possible by identifying a list of high-traffic keyword niches related to the offering you market. Next, you can designate which of these keyword niches are relevant to your specific business and which ones aren't. The Free Negative Keyword Tool saves any terms you mark as negative and proactively suggests additional terms that have a high likelihood of getting matched against the ad groups within your paid search accounts.
    Let's take a closer look at how the Free Negative Keyword Tool works. The first step is to enter a word or phrase into the search box. Let's say you run a computer business and you're creating a campaign for monitors:
    Just enter your term and click "Find Negative Keywords." The tool returns a list of negative keyword suggestions for your review.

    Review each term and click "yes" if it's relevant and "no" if it's not. Clicking "no" will set the term as a negative keyword. In this case, "heart rate," "blood pressure" and "baby" aren't relevant to your campaign, but "lcd," "computer," "flat," "widescreen" and so on are. When you set a term as negative, it shows up in the list on the right. (Setting a term as positive will remove it from the list.)


     You have some additional options as you work through the list:
    • Sort the keywords by estimated search volume
    • Set the match type for each term in your negative keyword list
    • Email the results to yourself in a spreadsheet
    • Upload the list directly to AdWords
    • Exclude terms that have resulted in conversions on your site
    • Load negative keyword contenders from your AdWords search query reports
    While the tool is integrated with AdWords, the email export feature makes it easy to use the tool to find negatives for Bing and Yahoo campaigns as well.
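    The underlying idea is easy to picture in code.  Here is a toy sketch (not WordStream's implementation; the keyword lists and yes/no choices are invented) of sorting suggestions into negatives and writing them out with AdWords-style match-type punctuation:

        # Toy illustration of building a negative keyword list with match types.
        # Not WordStream's code -- the suggestions and the choices below are invented.
        suggestions = ["lcd monitor", "widescreen monitor", "heart rate monitor",
                       "baby monitor", "blood pressure monitor"]
        marked_negative = {"heart rate monitor", "baby monitor", "blood pressure monitor"}

        def adwords_format(term, match_type):
            # AdWords punctuation: [term] = exact, "term" = phrase, bare term = broad.
            if match_type == "exact":
                return "[" + term + "]"
            if match_type == "phrase":
                return '"' + term + '"'
            return term

        negatives = [adwords_format(t, "phrase") for t in suggestions if t in marked_negative]
        keepers = [t for t in suggestions if t not in marked_negative]

        print("Negative keywords to upload:", negatives)
        print("Keywords kept for the campaign:", keepers)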

    This is another great free offering from WordStream to help you conduct more targeted, relevant keyword research for PPC.


    posted by Jody @ Wednesday, June 23, 2010  
    SEO and Website Hosting Webcast: What You Should Know
    Tuesday, June 22, 2010
    Here's a rundown of Search Marketing Now's latest webcast on SEO and Website Hosting: What You Should Know, featuring Vanessa Fox, Contributing Editor at Search Engine Land.

     
    In this webcast, Vanessa discussed everything from what to expect from your hosting services, to understanding which databases to use, to diagnosing hosting issues and their impact on SEO. She also touched on items such as which server Operating Systems are used and how hosting services handle load balancing and redirects. Vanessa also shared tips on:
    • How speed affects SEO
    • How hosting services manage parked domains
    • How hosting services manage aggressive bot traffic
    • What to look for in a good SEO-focused hosting service
    Here is our live blogging coverage of the webcast. 

    Vanessa started out by talking about managing bot traffic.
    • she talked about using Google Webmaster Tools to help diagnose Googlebot issues (see crawl errors)
    • if you have access to server logs, insight can be gained by looking at bot activity there as well
    • ask your web host how they handle search engines and search engine traffic (do they recommend blocking search engine bots?  yikes) - your hosts need to understand that Search is important
      • avoid answers such as using robots.txt to restrict crawling of your site
      • be cautious of slowing down crawl rates - not as much of your site will get indexed
      • is shared hosting the best option for you?
    • in GWT, use the "Fetch as Googlebot" option to see what Googlebot is seeing
    • use the translate tool in Google to see what Googlebot is seeing
    • some questions to ask include:
      • is your web host robust enough?
      • are you creating infinite crawls?
      • have you blocked login, registration and other non-indexable pages?
    Vanessa then mentioned that bots don't read English, citing a well-known humorous example from hilton.com/robots.txt.

    She then went on to discuss Page Speed.
    • she mentioned a couple of Firefox plugins that can be used to diagnose page speed issues
    • she discussed using crawl delays in robots.txt to slow crawl rates for other search engine crawlers such as Yahoo's or Bing's
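    For reference, here is a small sketch of what a Crawl-delay directive looks like and how to read it back.  It assumes Python 3.6+ with the standard library; the robots.txt content is an invented example, and note that Googlebot ignores Crawl-delay (Google's crawl rate is set in Webmaster Tools instead).

        # Sketch: Crawl-delay directives for non-Google crawlers, read back with the
        # standard library.  The robots.txt content below is an invented example.
        from urllib.robotparser import RobotFileParser

        robots_lines = [
            "User-agent: Slurp",
            "Crawl-delay: 10",
            "",
            "User-agent: bingbot",
            "Crawl-delay: 5",
            "",
            "User-agent: *",
            "Disallow: /cgi-bin/",
        ]

        rp = RobotFileParser()
        rp.parse(robots_lines)

        # Googlebot falls through to the "*" record, which sets no delay.
        for bot in ("Slurp", "bingbot", "Googlebot"):
            print(bot, "crawl delay:", rp.crawl_delay(bot))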
    Location Matters
    • depends on whether your company is international, but location does in fact play a role.  (Note: we've written a few posts on hosting sites with multiple geo locales)
    • the first thing Google does is check the TLD (i.e. .com vs. .ca vs. .uk)
      • IP Location
      • you can override this with Google Webmaster Tools (obviously Google only)
    • ask your host where the servers are located, as Google may be associating your site with the server location - keep them in the country that you are focused on
    Server Requirements
    • older versions of IIS had some SEO issues - IIS 7 is pretty good.  Ask which version of IIS is being used  (do a search for SEO IIS workarounds)
    • on the Apache side, htaccess does lots of cool stuff

    Load Balancing
    • are they doing this behind a proxy?  (avoid ww1 vs. www)
    • she used Hilton as an example - with unexpected user issues
    Security
    • pay attention to malware details via GWT
    • visit the Google Webmaster blog for additional tips
    Vanessa also talked about server log access, which technical resources can use to identify exactly what is being crawled on your site.  Is the site being crawled equally, or are only certain pages or categories being crawled?  Are there Session IDs being created that you didn't know about?

    Vanessa mentioned that if you have a thousand sites interlinked, Google will look at this suspiciously.



    Myths About Hosting

    • Class C Block of IPs - does not matter really
    Items to Consider
    • host load
    • server capacity - if the server is slow to respond, the search bots may slow their crawl
    • down time
    • status codes - during downtime, serve up a 503, not a 403
    • use 301 redirects as opposed to 302 redirects (a quick status-code check sketch follows this list)
    • ensure that your host does not block Googlebot as a capacity management strategy
    • ensure that your host maintains your site on a server in the country you serve
    • ensure that your host uses the latest version of IIS (if using IIS servers)
    • Apache and IIS are both fine for SEO (if used properly)
    • ensure that your host is robust enough to handle the load of your site
    • sharing IP addresses on C-Blocks does not impact SEO
    • use crawl efficiency best practices
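    Here is a small sketch of the kind of spot check those status-code items lend themselves to.  It assumes Python 3 with the standard library only; the host and paths are placeholders, and it is a quick diagnostic, not a monitoring tool.

        # Sketch: spot-check the status codes your host serves for a page and a redirect.
        # Assumes Python 3 standard library; the host and paths are placeholders.
        import http.client

        def status_of(host, path):
            conn = http.client.HTTPSConnection(host, timeout=10)
            conn.request("GET", path)
            resp = conn.getresponse()
            status, location = resp.status, resp.getheader("Location")
            conn.close()
            return status, location

        # During planned downtime this should answer 503 (temporary), never 403 (forbidden).
        print(status_of("www.example.com", "/"))

        # A moved page should answer 301 (permanent), not 302 (temporary).
        # http.client does not follow redirects, so the raw status is visible here.
        print(status_of("www.example.com", "/old-page"))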
    Vanessa shared a comment from Google's Matt Cutts:
    He was commenting on the misconception that having multiple sites hosted on the same IP address will in some way affect the PageRank of those sites.  There is no PageRank difference whatsoever between these two cases (virtual hosting vs. a dedicated IP).  Links to virtually hosted domains are treated the same as links to domains on dedicated IP addresses.
    There were some interesting and comical questions from the Q&A portion of the webcast.  Here are a couple of examples with condensed responses from Vanessa:

    Q:  My developer is recommending that we route Googlebot to a dedicated server that exactly mirrors the site presented to visitors.  Would we be penalized for this?
    Vanessa:  Cloaking will get you penalized.

    Q:  If a person owns different TLDs and 301 redirects them to a .com site with servers in the US can we rank in Google.fr for our .com/fr content?
    Vanessa:  France in particular likes the .fr extension.

    Q:  If I have a .com with English and Italian content, hosted in Italy, should I make an association?
    Vanessa: Be structured with the English content and the Italian content.  Avoid having both on the same page.

    Q:  If we've moved a site, how long should a 301 (redirect) be in place on the server side and on Webmaster Tools?
    Vanessa:  Forever.

    Q: shopping.com routes Googlebot to a dedicated server, why are they not being penalized?
    Vanessa:  No response, joking that she is not the right person to ask about this as she no longer works at Google.

    All in all, a great webcast with some good reminders and tips for SEO and hosting.


    posted by Jody @ Tuesday, June 22, 2010  
    Google Algorithm Changes & Updates 2010: Part One
    Monday, June 21, 2010
    2010 has been a great year in the evolution of Google's algorithm.  Dating back to the fall of 2009, when the first tests of Google Caffeine were being mentioned, many felt that this was just the start of a new era for Google and how they would be returning search results.  In fact, we could argue that the game of Search started to change back in June of 2007, when Google rolled out their Universal Search (read: blended search) as well as personalization.  Late last year, Google announced that they had begun rolling out personalized search to all.

    Google is not a company that does things by the book, nor are they a company that falls into complacency. I was at SES New York in the spring of 2009, and I remember a comment from a Googler who stated that last year (implying 2008 at the time) Google made around 450 changes to their algorithm.  This was something that I, like many, had suspected, but when you break it down, it told us that Google makes about 1.23 changes to their algorithms every day... every day!  Talk about continuous improvement.  The reason for the high frequency of tweaks and changes to the algorithm?  Google truly does want to provide the best results possible.  Currently, of the major search engines, Google does in fact return the best results out there, but as we all know, the results are not always perfect.  Having said that, Google continues to dominate in terms of market share, at least in North America, where Google currently sits at around 64% according to comScore data.

    The point is that there has been a lot of change with Google in 2010 thus far.  Let's take inventory of some of the changes Google has rolled out in 2010.

    Inventory of Google Algorithm & Related Updates for 2010
    1. August 2009:  Initial Roll-out of Google Caffeine - while Google described it as more of an infrastructure update than an algorithm update, this is where we saw some of the testing of Google Caffeine in action.  Google Caffeine would officially roll out in June 2010.  See item #11 below.
    2. Fall of 2009:  Google Strives to Make the Web Faster - page speed becomes a hot topic of discussion as real-time search begins to make its way into the Google search results.  Matt Cutts stated that ".. a lot of people within Google think that the Web should be fast..".  Optimizing for page speed can not only help your site perform well from a technical standpoint, but it can help improve bounce rates, and in the near future may actually help improve your rankings in Google.  Well, it appears that the future is here, as Google has officially factored page speed into their algorithms.
    3. December 2009: Real-Time Search Feature is Launched - clicking on "Latest results" or selecting "Latest" from the search options menu now offers a view of a full page of live tweets, blogs, news and other web content scrolling right on Google. 
      http://googleblog.blogspot.com/2009/12/relevance-meets-real-time-web.html
    4. January 2010: Answer Highlighting in Search Results - has Google turned into Ask Jeeves?  Well, no, but it's a similar concept: answer highlighting helps you get to information more quickly by seeking out and bolding the likely answer to your question right in the search results.
    5.  February 2010:  Refined Searches by Location - Google added the ability to refine your searches with the "Nearby" tool in the Search Options panel.  Another Google update to another piece of their Universal/blended search results.
    6. March 2010:  Google Search Feature launched to Provide Public Data - If you go to Google.com and type in [unemployment rate] or [population] followed by a U.S. state or county, you will see the most recent estimates.  In March of 2010, Google released this feature again in hopes of providing a richer Search experience for users. 
      http://googleblog.blogspot.com/2009/04/adding-search-power-to-public-data.html
    7. March 2010:  Google to Display Microdata in Descriptions - Google announced that microdata for rich snippets, as specified in HTML5, would now be included in the description of the search results if it is defined on the page.  http://googlewebmastercentral.blogspot.com/2010/03/microdata-support-for-rich-snippets.html
    8. April 2010:  Google Announces that Page Speed / Site Speed is Officially a Ranking Factor - there is a chance that slower sites would be affected by a decline in rankings in Google SERPs.  http://googlewebmastercentral.blogspot.com/2010/04/using-site-speed-in-web-search-ranking.html
    9. April 2010:  Google Updates Real-Time Search Results - zoom to any point in time and “replay” what people were saying publicly about a topic on Twitter. To try it out, click “Show options” on the search results page, then select “Updates.” The first page will show you the familiar latest and greatest short-form updates from a comprehensive set of sources, but now there’s a new chart at the top. In that chart, you can select the year, month or day, or click any point to view the tweets from that specific time period.  http://googleblog.blogspot.com/2010/04/replay-it-google-search-across-twitter.html
    10. May 2010:  Google MayDay Update:  Long Tail Algorithm Change - beginning on April 27, 2010, reports surfaced that some site owners were experiencing as much as a 50-90% decrease in traffic from long tail searches.  As Matt Cutts stated in late May, "... This is an algorithmic change in Google, looking for higher quality sites to surface for long tail queries. It went through vigorous testing and isn’t going to be rolled back."  Long tail traffic was impacted by the Google MayDay Update.  This was confirmed by Google as a rankings change and not a crawling or indexing change.
    11. June 2010:  Google's New Search Index:  Caffeine - on June 8th, Google announced that their new indexing system was official.  According to Google, "Caffeine provides 50 percent fresher results for web searches than our last index, and it's the largest collection of web content we've offered. Whether it's a news story, a blog or a forum post, you can now find links to relevant content much sooner after it is published than was possible ever before."  Caffeine was built with the future in mind, factoring in things such as real-time search and the need for timely search results.  Was this a reaction by Google to micro-blogging and social environments such as Twitter?  Perhaps, but the fact of the matter is that Google understood they needed to improve their search results as people search for accurate and timely information.  http://googleblog.blogspot.com/2010/06/our-new-search-index-caffeine.html
    There you have it: eleven distinct Google updates, many of them algorithm tweaks, that have transpired over recent months.  If there is one thing that you can appreciate about Google, it's their desire to evolve and leverage innovation in hopes of providing the world's information in an easy-to-find and timely manner.

    In part two, we will look at the impact of these Google updates on site owners and on SEO.


    posted by Jody @ Monday, June 21, 2010  
    Best SEO Post of 2010: Bill Slawski on Google's Reasonable Surfer
    Sunday, June 20, 2010
    The first six months of 2010 have been an exciting time in the Search space, specifically with Google.  Google continues to move at "Internet speed" and continues to be the dominant player in the Search space.  If you are a regular reader of Marketing Jive, you've probably noticed that this spring we have discussed interlinking strategies on numerous occasions.  This is for good reason, culminating with a patent that Google applied for in 2004 and that was granted on May 11, 2010.  Dubbed the "Google Reasonable Surfer" patent by Bill Slawski, the patent basically states that "...Not every link from a page in a link-based ranking system is equal, and a search engine might look at a wide range of factors to determine how much weight each link on a page may pass along."

    This has a direct impact on how site owners should and will link from within their own web pages as different features associated with links, and the pages they appear upon and point to, may determine how much value those links pass on to the recipient pages to which they link.  Bill Slawski is a pretty brilliant guy.  He has an uncanny ability to dissect Google patents and pull out the key items and share them in an unambiguous manner.  Probably the best SEO post of 2010 and possibly the past couple of years is Bill's thoughts on the Reasonable Surfer patent. 

    In the past, we have discussed what makes a good link a good link, but in Bill's piece he takes this to another level, identifying a multitude of factors that Google most likely uses when placing authority or value on a link from page A to page B.  (A toy scoring sketch follows the feature lists below.)

    Examples of features associated with a link might include:


    1. Font size of anchor text associated with the link;

    2. The position of the link (measured, for example, in a HTML list, in running text, above or below the first screenful viewed on an 800 X 600 browser display, side (top, bottom, left, right) of document, in a footer, in a sidebar, etc.);

    3. If the link is in a list, the position of the link in the list;

    4. Font color and/or other attributes of the link (e.g., italics, gray, same color as background, etc.);

    5. Number of words in anchor text of a link;

    6. Actual words in the anchor text of a link;

    7. How commercial the anchor text associated with a link might be;

    8. Type of link (e.g., text link, image link);

    9. If the link is an image link, what the aspect ratio of the image might be;

    10. The context of a few words before and/or after the link;

    11. A topical cluster with which the anchor text of the link is associated;

    12. Whether the link leads somewhere on the same host or domain;

    13. If the link leads to somewhere on the same domain,
    • whether the link URL is shorter than the referring URL; and/or
    • whether the link URL embeds another URL (e.g., for server-side redirection)

    Examples of features associated with a source document might include:

    1. The URL of the source document (or a portion of the URL of the source document);

    2. A web site associated with the source document;

    3. A number of links in the source document;

    4. The presence of other words in the source document;

    5. The presence of other words in a heading of the source document;

    6. A topical cluster with which the source document is associated; and/or

    7. A degree to which a topical cluster associated with the source document matches a topical cluster associated with anchor text of a link.

    Examples of features associated with a target document might include:

    1. The URL of the target document (or a portion of the URL of the target document);

    2. A web site associated with the target document;

    3. Whether the URL of the target document is on the same host as the URL of the source document;

    4. Whether the URL of the target document is associated with the same domain as the URL of the source document;

    5. Words in the URL of the target document; and/or

    6. The length of the URL of the target document
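    To make the idea concrete, here is a toy scoring sketch in Python.  The features and weights are invented for illustration only; they are not Google's, and the patent describes a model learned from user data rather than a hand-tuned formula like this.

        # Toy illustration of the "reasonable surfer" idea: not every link passes
        # equal value, so weight each link by features of how it appears on the page.
        # The features and weights below are invented -- they are not Google's.
        def link_score(features):
            score = 1.0
            if features.get("in_footer"):
                score *= 0.3      # footer and boilerplate links attract fewer clicks
            if features.get("same_color_as_background"):
                score *= 0.05     # effectively invisible links
            if features.get("in_main_content"):
                score *= 1.5      # prominent, in-content links
            if features.get("anchor_word_count", 0) > 10:
                score *= 0.7      # very long anchors look less like natural links
            return score

        example_links = [
            {"anchor": "seo tips", "in_main_content": True, "anchor_word_count": 2},
            {"anchor": "terms of service", "in_footer": True, "anchor_word_count": 3},
        ]
        for link in example_links:
            print(link["anchor"], "->", link_score(link))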

    Bill covers it all, not only within the post but within his responses to the comments on his post.  Read the post in its entirety here: Google's Reasonable Surfer: How the Value of a Link May Differ Based upon Link and Document Features and User Data

    I've thought about sharing some of my thoughts on Google's Reasonable Surfer patent, but since Bill's article is so in-depth, and since there has been some great coverage of the Reasonable Surfer patent, including an article by a team member of mine, Tina Kells, on Ask Enquiro, I'll save my comments for another time.  I will say this, however: Google is looking at signals other than links as well, which I feel isn't necessarily a bad thing.  I feel that perhaps there is sometimes too much weight placed on link popularity, and that items such as timeliness of information, personalization and user behavior are other factors that play a key role in which pages should show up in the search results.

    Google’s Reasonable Surfer Patent: Interlinking, Link Building and SEO Strategy Implications - Tina Kells - Ask Enquiro
    SEO Implications Of Google’s “Reasonable Surfer” Patent - Eric Enge via Search Engine Land


    posted by Jody @ Sunday, June 20, 2010  
    The Importance of Establishing SEO Requirements for Your Web Teams
    Friday, June 18, 2010
    As a search strategist, I wear many hats.  While I may specialize in organic strategies, I am more than an SEO.  In fact I'm not a big fan of the term SEO (Search Engine Optimization or Search Engine Optimizer).  I work very hard at developing strategies and leveraging white hat tactics for my clients.  Quite honestly, I've had great success.  However I do not just do SEO.  In addition to being a Search Strategist I also actively participate in the following roles and tasks:
    • Project management
    • Driving continuous improvement
    • Product development
    • Blogging
    • Managing and coaching SEO teams
    • Education and knowledge sharing
    • Delivering customer service
    • Establishing processes and requirements
    It is this last item that I wanted to touch on today, the importance of developing processes and establishing SEO requirements.  We've dealt with a number of clients who communicate to us that they have their own SEO requirements, yet when we ask to review them, what they often come back with is not what we would consider requirements.

    If you are fortunate enough to have an SEO or online marketing team, you should be working on establishing the following:
    • process documentation - to address the how to's of completing a task (i.e. keyword research)
    • best practice compilation/creation - to ensure that the tactics that you are considering are the ones that you should be using.  Compilation is important because it can prevent a lot of rework in the future.
    • SEO Requirements - this is more from an implementation perspective that can be used by your SEO team, your technical team and/or your content team.
    The importance of establishing SEO requirements cannot be overstated.  Having requirements in place is a great way to:
    • educate your teams
    • improve the implementation process
    • promote the need for continuous improvement
    So what type of SEO requirements should you establish?  That is a great question.  The answer is, it depends.  It depends on the size of your team, how far along you are in the search marketing maturity model, and what your online/SEO needs really are.  The fact is that without SEO requirements, your teams may be spinning their wheels in trying to progress from an online marketing perspective.  Furthermore, your team may be missing opportunities to improve your online marketing initiatives.

    5 Areas That Need SEO Requirements
    1. CMS Requirements - you should define all of the SEO fields that you need as part of your CMS. Here's a good example of some of the requirements that you might want to consider including with your CMS requirements (a small validation sketch appears after this list of areas).  Typically, SEO requirements for your CMS would include things like:
      • Ensuring that all pages can have a unique title tag of up to 70 characters
      • Authors should have the ability to populate and customize the meta description tags
      • Authors should have the ability to edit the h1 or other heading tags of pages
      • URLs, including dynamic ones should have the ability to be rewritten to be more user and search engine friendly
      • Authors should have the ability to insert hyperlinks within the body copy of page content

    2. On-Page SEO - establish requirements for those who will be assisting in the on-page optimization efforts of your site.  This includes having requirements for:
      • tagging (i.e. title tags and meta descriptions)
      • H1 or heading tags
      • Page Copy

    3. Content Requirements - this goes without saying, but you should create SEO requirements for your copywriters and include any necessary instruction with regards to legal compliance.  Remember, content is no longer just HTML text; content can be in the form of images, video, blog posts, tweets, press releases, etc.  Establishing clear requirements can ensure consistency and improve your content development process.
    4. Interlinking - establishing SEO requirements for interlinking is something that a number of sites we have seen simply do not have in place.  Part of the reason for this is that many sites do not have formal interlinking strategies.  Establishing requirements for interlinking does not have to be complicated.  It can be as simple as outlining the process for identifying anchor text to be used, where to incorporate the links on a given page, and which types of pages to either link from or link to.
    5. Technical Requirements - to address things such as page speed, image size, Flash, code clean up etc.  Most SEO or online marketing teams will have some form of technical requirements in place.  However, it is a good idea to revisit the technical requirements on a regular basis.  Technical requirements can cover off anything from:
      • Coding requirements
      • CSS standards
      • Tracking code implementation
      • XML sitemap specifications
      • Href link code specifications
      • and many more
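    As a small illustration of the CMS requirements above, here is a validation sketch.  The field names and limits are illustrative assumptions, not a standard; the point is simply that requirements like these are easy to enforce before a page is published.

        # Sketch: validate the SEO fields a CMS template exposes before publishing.
        # The field names and limits here are illustrative, not a standard.
        def validate_page(page):
            problems = []
            title = page.get("title", "")
            if not title:
                problems.append("missing title tag")
            elif len(title) > 70:
                problems.append("title is %d characters; keep it to roughly 70" % len(title))
            if not page.get("meta_description"):
                problems.append("missing meta description")
            if not page.get("h1"):
                problems.append("missing H1 heading")
            if "?" in page.get("url", "") or "=" in page.get("url", ""):
                problems.append("URL looks dynamic; consider rewriting it to be friendlier")
            return problems

        draft = {"title": "Widgets | Example Co.", "meta_description": "",
                 "h1": "Widgets", "url": "/products/widgets"}
        print(validate_page(draft))   # ['missing meta description']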
    If nothing else, SEO requirements are a great way to educate the various teams who work on your web properties about SEO and industry standards.  Clear requirements can make it easier for your technical teams to implement SEO recommendations so that your site is optimized to its fullest potential.  It might be time to take inventory of the requirements that you currently have in place.  If you do not currently have SEO requirements, it's never too late to start establishing them.


      posted by Jody @ Friday, June 18, 2010  
      SEO Tips: SEOmoz Interview with Google's Matt Cutts
      Thursday, June 10, 2010
      There was a great post over at SEOmoz with a Whiteboard Interview between Rand Fishkin and Google's Matt Cutts.  There were some great SEO topics discussed including:
      • Should Webmasters Use the 'If Modified Since' Header? - a good standard practice, but Matt notes that it won't necessarily help you get crawled faster.

      • Should Webmasters Use 503 Status Codes for Downtime? - 503s can help keep a page that's under construction or experiencing problems from getting crawled and indexed (a minimal sketch appears after this list)

      • Does the Number of Outbound Links from a Page Affect PageRank? - There's no need to hoard all of your link juice on your page and, in fact, there may be benefit to generously linking out...

      • Is a Trailing / Important in URL Structure? - Matt says he would slightly advocate for using a trailing slash simply because it clearly indicates that a URL is a folder and not a document.

      • Does Google Crawl from Multiple Geographic Locations? - Google basically crawls from one IP address range worldwide because (they) have one index worldwide. (They) don't build different indices, one for each country.
      • Is It a Bad Idea to Use Chain Redirects (e.g. 301-->301-->301)? - yes; ideally use no more than two hops. 
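      On the 503 point above, here is a minimal sketch of serving a 503 with a Retry-After header during maintenance.  It assumes Python 3's standard library http.server; the port and retry window are arbitrary, and a real site would normally do this at the web server or load balancer rather than in a script like this.

        # Sketch: answer every request with 503 plus Retry-After while a site is down
        # for maintenance, so crawlers come back later instead of indexing a broken page.
        # Assumes Python 3 standard library; port and retry window are arbitrary choices.
        from http.server import BaseHTTPRequestHandler, HTTPServer

        class MaintenanceHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                self.send_response(503)
                self.send_header("Retry-After", "3600")   # ask crawlers to retry in an hour
                self.send_header("Content-Type", "text/html")
                self.end_headers()
                self.wfile.write(b"<h1>Down for maintenance</h1>")

        if __name__ == "__main__":
            HTTPServer(("", 8080), MaintenanceHandler).serve_forever()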

      Check out the full video here:
      http://www.seomoz.org/blog/whiteboard-interview-googles-matt-cutts-on-redirects-trust-more


      posted by Jody @ Thursday, June 10, 2010  
      More Updates from WordStream: Embedded Versions of WordStream’s Keyword Tools
      Tuesday, June 08, 2010
      WordStream Inc., a provider of search marketing software and keyword tools, has made its suite of advanced keyword research and organization tools available to embed on any website for free.

      Using WordStream’s Free Keyword Tool widget generator, website owners can generate custom versions of WordStream’s keyword tools (including The Free Keyword Tool, The Free Keyword Niche Finder and The Free Keyword Grouper) to embed on their websites. You can easily match the tools to the look and feel of your own site and branding.

      Embedded versions of WordStream’s keyword tools offer a number of potential benefits for webmasters, bloggers and online business owners:

      • These custom tools are a great way to turn visitors into leads and members of your mailing list.
      • The widgets provide free access to WordStream's powerful keyword suggestion tools that will help you better market your business.
      • The widgets provide a reason for visitors to continually return to and spend more time on your site.
      Visitors to your website now can perform the same high-level keyword research and keyword organization and discover the same profitable and traffic-driving keywords that the WordStream keyword tools provide, but they never have to leave your site to get them. By adding these custom tools to your website, you can leverage the work of WordStream’s team of engineers to make your own website more useful and compelling to your audience.

      You can preview WordStream’s keyword tools that you can embed on your own website for free, here: http://www.wordstream.com/free-tool-embed-generator/


      To create custom versions of the WordStream keyword tools that you can embed on your own website, go to: http://www.wordstream.com/free-tool-widget-generator/ and follow the easy four-step process to obtain your embed code.


      posted by Jody @ Tuesday, June 08, 2010  
      Local Search Optimization: Single Best Optimization Tip
      Friday, June 04, 2010
      We continue to run our series of posts about the importance of content development and establishing a content development strategy.  For a business with multiple physical locations, as part of this process you should also be considering the idea of optimizing for local search.

      The Single Best Optimization Tip for Local Search Optimization

      Are you ready for this?  OK, well, it's not rocket science: if you are trying to optimize your website for local search, the single best item that you need to ensure you implement is creating a page for each and every location that you have.  It is content creation for local search to help target your local audiences.  It means having branch location pages for all of your stores.  If you do not think that there is any value to this, consider this post by Google's Matt Cutts, where Matt is basically, if indirectly, telling us how to optimize for local search.

      In the post Matt suggests the following:
      • If you want your store pages to be found, it’s best to have a unique, easily crawlable URL for each store.
      • each web page should have a unique URL
      • create an HTML sitemap that points to the web pages for your stores (a small generator sketch follows this list)
      • If you have a lot of stores, you could have a web page for each (say) state that links to all stores in that state.
      • Google does provide Google Places (formerly Google Local Business Center) where you can tell Google directly about your business, as do other search engines.
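      To illustrate the HTML sitemap suggestion above, here is a small generator sketch.  The store data and the URL pattern are invented; the point is simply one unique, crawlable URL per location, grouped by state.

        # Sketch: generate a simple HTML sitemap linking to one page per store,
        # grouped by state.  The store data and the /stores/<slug>/ URL pattern
        # are invented for illustration.
        stores = [
            {"name": "Seattle Downtown", "state": "WA", "slug": "seattle-downtown"},
            {"name": "Spokane Valley", "state": "WA", "slug": "spokane-valley"},
            {"name": "Portland Pearl District", "state": "OR", "slug": "portland-pearl"},
        ]

        by_state = {}
        for store in stores:
            by_state.setdefault(store["state"], []).append(store)

        lines = ["<h1>Our Store Locations</h1>"]
        for state in sorted(by_state):
            lines.append("<h2>%s</h2>" % state)
            lines.append("<ul>")
            for store in by_state[state]:
                # One unique, easily crawlable URL per store.
                lines.append('  <li><a href="/stores/%s/">%s</a></li>' % (store["slug"], store["name"]))
            lines.append("</ul>")

        print("\n".join(lines))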
      For the record, we have made this recommendation time and time again to many of our clients.  Those that have taken the time to build out store or office location pages have had the greatest success: they have improved traffic to their websites for geo-appended phrases and have increased their presence within blended and local search results as well as in the main index of engines such as Google.

      Everything that Matt stated is helpful in improving your presence within local search results.  I would add that you should ensure that you leverage on-page optimization practices when creating your local store pages to help the engines find and  rank this content accordingly.  This includes leveraging brand and geo-appended keywords in:
      • title tags
      • meta descriptions
      • URLs
      • on-page headings
      • in the anchor text linking to your store pages
      • in the page copy of your store pages
      This will help improve your pages' relevancy and authoritativeness for these terms.

      The post from Matt is an excellent one because it spells out something that, while extremely obvious, is often overlooked by many larger sites out there.  You need to have regional content if you want any chance of being found for local search queries or in the local search engines.

      The single best optimization technique for local search?  Content creation for your branch or store location pages.


      posted by Jody @ Friday, June 04, 2010  
      8 Cool SEO Posts from the First Week of June 2010
      Lots of great stuff being written about Search, social media and SEO this week. We would like to share some of our favorite SEO-related posts from the first couple of days of June.
      1.  B2B Search Marketing Success Is A 2-Part Equation - Search Engine Land

        The key to maximizing ROI is to have your pre- and post-click activities aligned; the right hand needs to know what the left hand is doing. If you are running ads that focus on one particular product, drive prospects to a page that specifically addresses that product  (not the home page and not an all-products page).

      2.  The Ultimate Beginners Guide to Personalized Search - Site Visibility
      3. How Do You Choose a Link Building Company? - Daily SEO Tip
      4. 52 Questions To Ask When Hiring A Social Media Company - Outspoken Media

        If you’re a business owner feeling like everyone around you is already fully immersed in social media and ‘getting it’, it’s intimidating to take those first steps or even ask for help. How do you hire a social media company when you’re not versed in it? What are you supposed to look for? What do you ask? How do you not embarrass yourself so they reject you before even knowing you?

      5. What signals are used in ranking other than PageRank? - YouTube


      6. Google & Site performance: The compilation answer album - Maile Ohye (a rough download-timing sketch follows this list)

        One of the ways we measure a page’s speed incorporates both download and render time — we pay attention to the time taken from the moment the user clicks on a link until just before that document’s body.onload() handler is called. This includes:

            * DNS resolution
            * network travel time
            * browser time to construct and render the DOM
            * time to parse and execute necessary javascript
            * and so on and so forth
      7. Link Building Tool Interview with Eric Ward - Vertical Measures
      8. comScore Releases April 2010 U.S. Online Video Rankings - comScore
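      As a rough companion to Maile's description in item 6, here is a server-side timing sketch.  It assumes Python 3 with the standard library and a placeholder URL; note that it only captures DNS, network and download time, while the click-to-body.onload measurement she describes also includes render time and needs browser tooling to capture.

        # Sketch: a rough server-side timing of how long a page takes to download.
        # This covers DNS + network + download only; render time and body.onload
        # require measurement in the browser.  The URL is a placeholder.
        import time
        from urllib.request import urlopen

        start = time.time()
        body = urlopen("https://www.example.com/", timeout=30).read()
        elapsed = time.time() - start
        print("Downloaded %d bytes in %.2f seconds" % (len(body), elapsed))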




      posted by Jody @ Friday, June 04, 2010  
      Content Development: 11 Must Reads from Marketing Jive
      Thursday, June 03, 2010
      How is your content development strategy working for you?  Have you noticed an increase in traffic from Search?  Has the number of pages that you have indexed in the major search engines changed significantly?  Has Google "MayDay" affected your traffic?  Need help developing your content development strategy?  Marketing Jive is pleased to present a compilation of our must reads on content development and writing for the Web.


      Marketing Jive's Top 11 Posts on Content Development

      1. Planning a Content Development Strategy - "To be successful in the hyper-competitive arena that is the Internet, you need to provide useful, fresh and informative content.  Content development is challenging - you are asking people and businesses to act as publishers – which more than likely they are not. Content production is also time-consuming. Therefore, the need to plan a content development strategy is a key element of any online marketing campaign. "
      2. 37 Things To Keep in Mind When Writing Web Content - "It doesn’t matter what type of site that you have, it doesn’t matter if your site is B2C or B2B, writing effective web content is not something you just do and push out into the online universe. Content development should be well thought out with consideration of who the intended audience is."
      3.  Content Development Strategy: Guiding Users to a Better Experience - "Content helps tell a story, your content helps build trust with your audience and it helps educate your audience on whatever it is that you are trying to communicate to them. Your content can be short in nature or longer, but it must deliver value to the audience."
      4.  Content Development Strategy: 10 Items to Incorporate into Your Requirements - "These requirements will allow you to efficiently produce the content that you need to reach your desired target audience.  Content is king is the digital space."
      5. The Importance of On-Page Optimization - "You simply will not be found in the prime real estate of the Search results without on-page optimization. If the searcher cannot find your page in the first place, the chance of a conversion happening disappears."
      6. B2B Content Development: Auditing Your Existing Content - "Content really does form the foundation of your website.  Failure to provide informative and useful content may mean that site visitors leave never to return again.  Even worse they leave to go to a competitor's site"
      7.  Web Content Development: Evaluation of All of the Touchpoints - "Each play a key role and while some of these roles can be combined or shared, you see the importance of ensuring that you have a central Web editor to ensure that the content you publish is accurate, reflective and relevant to the intended audience. "
      8. Duplicate Content: Thoughts from Google Webmaster Central - "Consolidating properties from duplicates into one representative URL often provides users with more accurate search results."
      9.  Content Development: The Importance of Brand Pages for e-Commerce Websites - "When it comes to e-commerce sites, informative content, or lack thereof can be the difference as to why a site visitor makes a purchase from you as opposed to your competitor. "
      10. Creating Unique Content: Do It Before Someone Else Does - "Blogging... video.... press releases.... articles, user reviews, the list goes on and on. All methods of creating fresh, unique content for your website. So why is it that so many site owners or site managers are hesitant to create new content for their websites?"
      11. User Generated Content: Google to Crack Down on Spammy UGC - "There is no question as to the value of user generated content.  However, it really should be monitored to ensure that spam commentary is not dominating the conversation."


      posted by Jody @ Thursday, June 03, 2010  
      Twitter Stats: From Timelines to Tweet Deconstruction
      Wednesday, June 02, 2010
      Here at Marketing Jive, we love lists.  So when we come across a great list, we like to direct our visitors to it, as we think that you will find value in the information that is shared within that list.  We came across one such list, identified, where else, but on Twitter.

      The list itself is entitled "The Ultimate List: 100+ Twitter Statistics" and comes from our friends at HubSpot.  Here is a sample of some of the statistics featured in the list.
      • The Journey of a Tweet
      • Most Retweetable Days and Times
      • Information Creation and Circulation Before and After Twitter
      • Facebook vs. Twitter
      • Twitter Facts and Figures
      • The Anatomy of a Tweet
      • Twitter on Paper
      The full list can be accessed here:
      http://blog.hubspot.com/blog/tabid/6307/bid/6050/The-Ultimate-List-100-Twitter-Statistics.aspx

      There are a lot of great tools out there to keep track of Twitter stats and such.  These tools can be used to help you measure just how you are doing on Twitter.


      posted by Jody @ Wednesday, June 02, 2010  
      Content Development Strategy: Guiding Users to a Better Experience
      Whether you are researching, gathering, writing, organizing, displaying or editing information for publication on web properties, content development and enhancement is the one thing that you should be focusing your efforts on.  In fact, what you produce in terms of content will have great influence on the popularity and profitability of your website.

      You Like Me, You Really Like Me

      Producing great content on your web properties can result in huge benefit to your business.  Just think about how people react to your content.  If people like your content, they may:
      • subscribe to it
      • syndicate it (good)
      • copy it (bad)
      • tweet about it
      • quote it
      • link to it
      • revisit it
      • bookmark it
      Conversely if your content is, shall we say, lackluster, people may ignore it or even worse never see it due to a lack of presence in the search results.  Definitely not the reaction or response that you want from potential clients or customers.  Content helps tell a story, your content helps build trust with your audience and it helps educate your audience on whatever it is that you are trying to communicate to them.  Your content can be short in nature or longer, but it must deliver value to the audience.  Your online content should tie into your offline messaging and vice versa.  In fact with regards to your messaging, it is not a matter of online vs. offline, it is all about integration and consistency.   It is about delivering the right content at the right time to the right person. 

      Your content development strategy should keep your target audience at the center.  You need to provide the right content to accomplish what I refer to as the ability to GUIDE your audience:
      1. Generate interest
      2. Uniquely communicate your subject matter
      3. Increase and acquire prolonged attention
      4. Demonstrate authority or expertise
      5. Earn respect of your audience
      Generate Interest

      A well established content strategy will allow you to plan out your content and map out the best ways to generate interest in your brand, your web property and your service offering.  The key to generating interest is just that, the content must provide value to someone.  In order to accomplish this it (the content) must be useful, easy to interpret and timely with regards to the user's intent. It also helps if your content is somewhat unique.

      Uniquely Communicate your Offering

      How many times have you visited an e-commerce site and checked out a product description only to visit another vendor site and see the exact same product description?  This does nothing to entice me to select one vendor over another.  Perhaps Maki, author of http://www.doshdosh.com/ blog said it best when he stated that "content differentiation helps you survive & succeed in a crowded space".  The Web indeed is a hyper-competitive environment.  Anytime that you have the opportunity to provide unique, informative content, do it.  Your audience will appreciate it.

      Increase Prolonged Attention

      What I am referring to here is engagement.  By establishing an effective content development strategy for your web properties, you improve your chances of increasing site "stickiness" and the chance of a repeat visit.  We know that people's attention spans are, for the most part, short.  As mentioned above, the Web is uber competitive and your content is only as good as your visitor interprets it to be.

      Demonstrating Authority & Expertise

      Not everyone is articulate, but when it comes to your Web content, your messaging should resonate with your visitors.  As a result, your content should be clear and concise and be presented in the manner that provides the best value to your audience.  Communicating your unique competitive advantage can go a long way in demonstrating your expertise.  Avoid using industry jargon.  The wise content developer will speak in the language of their audience and avoid marketing, sales and industry balderdash.

      Earning Respect of Your Audience

      From an SEO perspective, one of the greatest gifts that you can receive as a result of creating useful content is a link to your content from other websites or blogs, especially if they have found value in your content.  The more quality links that you have pointing back to your site, the more authoritative search engines such as Google will consider your content.  Earning the respect of the search engines (through their complex ranking algorithms) is part of earning the respect of your audience.  In addition, as you GUIDE users through your content, you can establish trust, which you can measure via your subscribers, repeat visitors, traffic from social networks, and community buzz, or simply with your monthly visitor traffic.  If your content is of value, you will experience increases in all of these metrics and your audience will continue to engage with your web properties.

      Guiding your audience to the information that they are looking for via well-established content will help build the relationship that you need in order to be successful in the online arena.  From an online perspective, that may mean providing user generated content to help establish trust in your product or service offering.  It may mean creating a "how to" video to communicate how to use features of your product.  From a B2B perspective, it may be as simple as providing reassurance to your audience to help reduce the amount of risk your prospect takes on when making their purchase decision.  Regardless of your situation, use your content to guide your users, your customers and your site visitors to a better online experience.


      posted by Jody @ Wednesday, June 02, 2010  
      Enquiro B2B Webinar: Managing B2B Leads for Sales Success
      Tuesday, June 01, 2010
      Just a reminder that Enquiro is pleased to be presenting the latest webinar in our B2B Expert Series: Managing B2B Leads for Sales Success.  This time out, we’ve invited one of America’s leading experts on B2B lead generation, Mac McIntosh, to speak to our audience.  President and principal consultant of B2B lead generation consulting firm Mac McIntosh Incorporated, and founding partner of B2B marketing automation services firm AquireB2B, McIntosh specializes in helping companies generate more leads and close more sales using the latest marketing strategies, tactics, technology and media.

      Mac, Gord Hotchkiss and Bill Barnes will be leading us through a series of Questions and Answers which are top of mind for marketers.

      The webinar takes place:
      Thursday June 10

      11:00am – 12:00pm (PST)
      Length: 30 minutes
      Register here: http://pages.enquiro.com/webinar24-managing-b2b-sales-leads.html



      Managing B2B Leads for Sales Success
      Can B2B Marketers increase revenues with tighter integration between their sales and marketing team functions?
      In this special live webinar, two leading industry experts will address this question and those below that are top of mind for many B2B Marketing Executives today.

      1. What technologies exist that will facilitate more collaboration between marketing and sales?
      2. Should social media be part of your demand generation strategies?

      3. What role does search marketing play in improving both the quality and quantity of your leads?

      4. How will demand generation look different in 2015?

      Join us for a conversation with Mac McIntosh (from The Business-to-Business Sales Lead ExpertsTM) and Gord Hotchkiss (author of The BuyerSphere Project) to discuss these and other existing realities that face B2B Marketers.
      This webinar is recommended for VP’s, Directors and Managers in Sales and/or Marketing roles.

      Register here: http://pages.enquiro.com/webinar24-managing-b2b-sales-leads.html


      posted by Jody @ Tuesday, June 01, 2010  