About Hotclutch

Location: Cape Town, SA
  1. Yes. It's normal to see duplicates for a while after installing SEO URLs. You may even experience a small drop in performance while both old and new URLs occur in the index. It takes a while for search engines to consolidate the change.
  2. No, there's no need to do that. If you have installed the addon correctly, then nothing further is needed. There will be a period where both old and new URLs are in the index; it can take many months for Google to clean up its index.
  3. Funny what this forum has degraded into. Post of least value gets liked on this forum.
  4. That article does not contradict what I am saying at all. "...Actually, I shouldn’t be framing these pages as “lucky,” because the reason they got to the Top10 in less than a year is most likely hard work and great knowledge of SEO, not luck..." In any case, readers can believe what they want.
  5. That might be normal for a site where no optimization has been done, but as an experienced webmaster I could never accept that kind of performance. From experience, I could now build a 2000-URL site on a brand-new domain and have all submitted URLs shown as indexed within one week of publishing the site. That's not to say you will absolutely rank number one for every keyword phrase; that depends on the competitiveness of your market and the quality of your content.
  6. Roughly double the number in both cases, so approx. 500 and 100 URLs when doing a site: search. In one of my previous posts on another thread I said that, as a loose guide, the minimum number of URLs to expect is the total of the homepage, categories and products (plus manufacturers if you use them). Now, firstly, you want to make sure that all your submitted URLs are in the index and part of the number returned by the site: operator search, which in the OP's case they're not: he only has 150 / 2160. Secondly, you need to bear in mind that certain other URLs ought also to be included in the site: search number. These include paginated links, which, contrary to what some people might believe, are not duplicate pages (content) of the first page in the series. Product review pages also get indexed, but you need to be careful with these, because if there are no actual reviews they generate soft 404 errors in webmaster tools. So @Jack_mcs, I think a point you're making is that the site: search number will always be greater than the number of submitted URLs, which is correct, but if it's way off then that's an indication that something is wrong. Every site is different, and the only way to work things out is for the webmaster to do his own analysis of what's included in the index.
  7. Two of my smaller sites:
  8. It might be normal for a new store that has not been configured, or has been configured incorrectly. You have 150 of your 2160 submitted URLs actually in the index, and I am willing to go further and say that those 150 URLs are not performing at their optimum. A 2000-URL store is a small-to-medium-sized shop, and in my experience, if I were to start a shop of that size and did not see at least 90% of my submitted URLs indexed, I would be thinking there's something wrong with my setup. In short, what I am saying is this shop is not optimized.
  9. This is totally wrong...what you are describing is the exact definition of duplicate content. A page should be uniquely identified by its URL. By inefficiency I simply mean URLs in the index that ought not to be there.
  10. I am going to suggest that sites like these are incorrectly configured or not configured at all. You can make a loose estimate of how many pages should be in the index: add your homepage, number of category pages and total products together, and that's the basic number of pages you need in the index. If you have many more URLs than that basic number in the index, then you could have inefficiency and possible PageRank spread.
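The loose estimate described above can be written out as a small calculation. This is only a sketch of the rule of thumb from the post; the function names, the example counts and the "way off" tolerance factor are my own assumptions, not anything official.

```python
def expected_index_size(categories, products, manufacturers=0):
    """Loose lower bound on indexed URLs: homepage + category pages
    + product pages (+ manufacturer pages, if the shop uses them)."""
    return 1 + categories + products + manufacturers

def looks_inefficient(indexed, categories, products, manufacturers=0, slack=2.0):
    """Flag a possible problem when the index holds far more URLs than
    the basic page count suggests (duplicate or parameter URLs)."""
    return indexed > slack * expected_index_size(categories, products, manufacturers)

# A shop with 40 categories and 2000 products should have roughly
# 2041 URLs in the index as a baseline.
baseline = expected_index_size(40, 2000)
```

The slack factor is arbitrary; the point of the post is that a site: count wildly above the baseline suggests inefficiency, and one far below it suggests indexing problems.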
  11. Most likely a strong external link to that particular page is causing it to perform well in the SERPs. Instead of Sitemap SEO I would recommend using the related products module; it's another way of providing link juice to products.
  12. Some people believe that paginated pages are duplicate content of the first page in the series, and then apply the <link rel="canonical"> tag on them pointing at page 1, which is not correct.
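To illustrate the point: each page in a paginated series should carry a self-referencing canonical, not one pointing at page 1. A minimal sketch follows; the URL pattern and function name are invented for illustration, not taken from any particular addon.

```python
def canonical_tag(base_url, page=1):
    """Emit a self-referencing canonical for a paginated category page.
    Pointing every page's canonical at page 1 (the mistake described
    above) hides pages 2+ and the products linked only from them."""
    url = base_url if page == 1 else f"{base_url}?page={page}"
    return f'<link rel="canonical" href="{url}" />'
```

So page 3 of a category canonicalizes to itself (...?page=3), and only true duplicates of page 1 would ever canonicalize to page 1.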
  13. There are various ways you can go about improving your internal link profile. A rule of thumb is to keep every page within 2-3 clicks of the homepage, i.e. from the homepage you should be able to navigate to any page within 3 clicks. So you want to avoid putting down many categories and burying your products deep. As stated before, in a new osCommerce install the products are starved of link juice. One add-on that addresses this is Jack MCS' All Products SEO. However, if you understand how linking works, you can achieve the same effect without creating additional pages. Actually, the new products module works in a similar fashion to All Products SEO. Where All Products SEO is better is with its use of filters, whereas the new products module only has pagination links to spread link juice around. My suggestion to improve the new products module would be to add the option of filtering, e.g. by category. I think there might be an add-on for that, not sure. Another mistake I see people make is to use rel="canonical" incorrectly, specifically with regard to pagination, and this starves products even more of link juice.
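The 3-click rule above can be checked mechanically. Here's a sketch that computes each page's click depth from the homepage with a breadth-first search over an internal link graph; the graph, page names and function name are made-up examples, and in practice you would build the graph from a crawl of your own site.

```python
from collections import deque

def click_depths(links, home="index.php"):
    """BFS from the homepage; returns {page: minimum clicks to reach it}.
    Pages missing from the result are unreachable via internal links."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, ()):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Example graph: homepage -> category -> products
links = {
    "index.php": ["cat_widgets.php"],
    "cat_widgets.php": ["product_1.php", "product_2.php"],
}
depths = click_depths(links)
# Pages violating the 3-click rule of thumb:
deep_pages = [p for p, d in depths.items() if d > 3]
```

Pages that show up deeper than 3 clicks (or not at all) are the ones starved of link juice that the post describes.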
  14. "...I don't know about anyone else but from experience, the links in the header and footer receive the most internal links. Which leaves the body of the product page and category pages lacking in authority. I know you're thinking, "Isn't that the job of the breadcrumb?". My opinion is the breadcrumb helps but the only effect it has is displaying the trail in the serps. I'm unsure if it does anything otherwise..." Links in the header and footer are typically contact-us / about-us type pages. These pages should carry a meta noindex tag, as they don't really have content worth indexing. Note: noindex, not nofollow. There's no real need to use nofollow; if a link is not trustworthy, it's best to just remove it from the page. Linking out to relevant webpages is also seen as good practice and can increase your performance in the SERPs. Breadcrumbs do a great job of spreading link juice around the site and should not be touched.
The subject of internal linking and on-page optimization carries huge value in the webmaster's effort to position their site on the web and should not be underestimated. Another tip I can give is to use the information in webmaster tools to help shape your site: there is a section there under Search Traffic called Internal Links. Think of it this way: if your home page has 1000 internal links to it and one of your product pages has only 2, then relative to the home page that product page is pretty unimportant, and it will most likely drop out of the index and/or perform poorly. It's not only the number of links that matters, though. For the product page mentioned above, if one of those 2 links comes from an important page like the homepage, the product can still do well in the SERPs. This is typical of osCommerce in the beginning stages of building a site: the product pages are starved of link juice, especially on a site with many products.
But this is not a negative for osCommerce that should be solved with code. Rather, the webmaster must understand how linking works, and then he can improve his SEO himself. With time, products are sold and modules kick in (the bestsellers, also-purchased and featured products modules, etc.), and these all help to spread link juice around the site.
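The internal-links comparison above can be approximated offline: count inbound internal links per page from the same kind of crawled link graph and compare each page against the homepage, much like the webmaster tools report the post mentions. The page names below are invented for the example.

```python
from collections import Counter

def inbound_counts(links):
    """Count internal links pointing at each page, similar in spirit
    to the Search Traffic -> Internal Links report described above."""
    counts = Counter()
    for targets in links.values():
        counts.update(targets)
    return counts

links = {
    "index.php": ["about.php", "product_1.php"],
    "about.php": ["index.php"],
    "product_1.php": ["index.php"],
}
counts = inbound_counts(links)
# A product with far fewer inbound links than the homepage is
# relatively unimportant and may perform poorly in the index.
```

This only captures the link counts, not where the links come from; as the post notes, a single link from the homepage can outweigh several links from minor pages.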
  15. No. Here's a tip to get the script to load locally.