Setting Up An SEO Plan Of Attack For Your Blog or Website
Before you set out to dominate the search engines, it’s essential to review how you’re currently managing your SEO with a quick checkup. Your websites or blogs, regardless of their size or whether they’re dynamic or static, will benefit a great deal from removing the roadblocks that may impede your search engine rankings. You must also know exactly where your target keywords rank, and set up a strategic plan of attack to get fully optimized.
Removing SEO Roadblocks
In a perfect online world, every page on your site would be completely accessible to the search engine robots, indexed, and crawled as frequently as possible. The issue that most blogs, and websites in particular, encounter is that there can be significant roadblocks in the form of diagnostic issues that result in robot crawl errors. Removing or reducing these barriers will give the search engines easier access to your content, and should also help you refocus your SEO efforts.
Three Server Errors To Look For
1.) The first step is identifying the pages that are returning server errors. You should be able to find these at the bottom of your AWStats report. Look for any status codes in the 400 or 500 ranges; if there are any, start correcting those problems first.
2.) Find any pages that have broken links. You can do so with the free W3C Link Checker found here: http://validator.w3.org/checklink/. If there are broken links, either build the missing pages or use a 301 redirect to send users to a different destination. You’ll be surprised how well this technique works on your search engine rankings over time.
3.) Remove as much duplicate content from your site as possible. The areas most susceptible to duplicate content are:
– The title tags
– Meta descriptions as well as product descriptions
– Dynamic URLs
One tool that is particularly useful in helping you eliminate duplicate content is Xenu Link Sleuth. Running this application will not only report broken links, but its data can also be exported to check for possible duplicate content issues.
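If you prefer to script these checks yourself, the first and third steps above can be sketched in a few lines of Python. This is an illustrative sketch only: the log lines and page titles below are made-up sample data, and it assumes the common combined log format (the kind AWStats typically analyzes), where the status code is the ninth whitespace-separated field.

```python
from collections import defaultdict

def find_error_hits(log_lines):
    """Pick out requests whose HTTP status code is in the 400 or 500
    range, assuming combined-log-format lines where field 7 is the
    requested path and field 9 is the status code."""
    errors = []
    for line in log_lines:
        fields = line.split()
        if len(fields) > 8 and fields[8][:1] in ("4", "5"):
            errors.append((fields[6], fields[8]))  # (path, status)
    return errors

def find_duplicate_titles(pages):
    """Group page URLs by their <title> text; any group with more
    than one URL is a duplicate-title candidate."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical sample data for demonstration:
log = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /about HTTP/1.1" 200 512',
    '1.2.3.4 - - [10/Oct/2023:13:55:37 +0000] "GET /old-page HTTP/1.1" 404 128',
    '1.2.3.4 - - [10/Oct/2023:13:55:38 +0000] "GET /contact HTTP/1.1" 500 64',
]
pages = {
    "/red-widgets": "Widgets | Example Store",
    "/blue-widgets": "Widgets | Example Store",
    "/about": "About Us | Example Store",
}
print(find_error_hits(log))        # the 404 and 500 hits
print(find_duplicate_titles(pages))  # the two pages sharing a title
```

Each path the error report flags is a candidate for a fix or a 301 redirect, and each duplicate-title group is a candidate for a rewritten, unique title tag.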
Ensuring Website Relevancy In SEO
Websites can easily become irrelevant in SEO terms, especially when the content relies on edgy material, ‘greyhat’ tactics, or freewheeling content development strategies. One of the best ways to make sure your site stays relevant is to use Google’s Webmaster Tools.
Google will provide you with an ‘idea’ of what it considers the most commonly used and most relevant keywords on your site, just by entering your URL. You can then use this information to further optimize your content around those exact keywords. Also ensure that your site is focused on a particular niche ‘topic’ with a central theme.
What may be more important about Google’s Webmaster Tools is that it will reveal whether your content is actually relevant (according to Google) to your site and its underlying objectives. Another quick and easy option is to build a tag cloud for your site, or even for individual pages. Tag clouds help you see how prominent your targeted ‘keywords’ are on a particular page. One of the best tools to use is TagCrowd.com, which is free.
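At its core, a tag cloud is just a word-frequency count over a page’s text. Here is a minimal sketch of that idea in Python; the stop-word list and the sample page text are assumptions for illustration, not what TagCrowd.com actually uses.

```python
import re
from collections import Counter

# A tiny, assumed stop-word list; real tools use much longer ones.
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "for", "on", "that"}

def top_keywords(text, n=5):
    """Count word frequency the way a tag cloud does: lowercase the
    text, strip punctuation, drop stop words and very short words,
    and return the n most common words with their counts."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS and len(w) > 2)
    return counts.most_common(n)

# Hypothetical page copy for demonstration:
page_text = (
    "Organic gardening tips for beginners. Learn organic gardening "
    "basics, organic soil preparation, and gardening tools."
)
print(top_keywords(page_text, 3))
```

If your intended target keyword doesn’t show up near the top of this list for a page, that page probably isn’t telling the search engines what you think it is.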
Determining Your Existing Rankings
One of the biggest mistakes made is failing to recognize that SEO is a process. You must continuously and incrementally improve your sites over time in order to earn those high search engine rankings everyone craves.
If you’re unsure, or want to know exactly where your site ranks for a particular keyword, it may be time to set a formal plan. Set up a regular routine built around reviewing crawl errors, ranking reports, and relevancy assessments.
This will not only reduce the SEO work you need to do in the future, but should also improve your rankings in the short term. The problem is that whenever you make adjustments to your site, it’s extremely difficult to know whether the changes are having any effect. Are they hurting or helping your overall efforts?
Google’s Webmaster Tools will provide insight into how a ‘verified’ site currently ranks for a variety of queries through the Search Queries report. The report also reveals the number of impressions you’re receiving alongside the number of clicks, which is a good indicator of whether you need to improve the quality of the titles and descriptions on your site.
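The useful signal here is click-through rate: lots of impressions with few clicks suggests your title and description aren’t earning the click. A small sketch of that calculation, using made-up sample rows and arbitrary thresholds (the 2% CTR and 100-impression cutoffs are assumptions you would tune yourself):

```python
def low_ctr_queries(rows, ctr_threshold=0.02, min_impressions=100):
    """Flag queries with plenty of impressions but a low click-through
    rate. rows is a list of (query, impressions, clicks) tuples, as
    you might export from a search-queries report."""
    flagged = []
    for query, impressions, clicks in rows:
        if impressions >= min_impressions:
            ctr = clicks / impressions
            if ctr < ctr_threshold:
                flagged.append((query, round(ctr, 4)))
    return flagged

# Hypothetical report rows for demonstration:
rows = [
    ("blue widgets", 1500, 12),  # 0.8% CTR: lots of eyes, few clicks
    ("widget repair", 400, 30),  # 7.5% CTR: healthy
    ("rare widget", 40, 1),      # too few impressions to judge
]
print(low_ctr_queries(rows))
```

Any query this flags is a page whose title tag and meta description deserve a rewrite before you worry about chasing a higher ranking position.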
There are obviously alternatives to Google’s tools, and most if not all of these search marketing solutions offer ways to track rankings, or to cut down the time it takes to do so.
Hopefully you now have a better understanding of crawl errors, a renewed focus on improving relevance to your site’s central theme, and a commitment to determining your current rankings. Now it’s time to set up your plan of attack.