


About peterr

Profile Information

  1. Hi, You will have to look in your web server logs and find out what agent name the spider simulator uses, then place _that_ agent name in .htaccess. The .htaccess file needs to be in the 'web root' path, which is simply the path that is: So, to answer your question, yes, you would need to 'consolidate' the old .htaccess contents plus the 'new' (this contrib) .htaccess contents to create a new .htaccess. You will need to be careful about the order of the commands, though. :) Peter
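As a sketch of how the agent names can be dug out of the logs, assuming the common Apache "combined" log format (the sample log lines and path below are placeholders, not from any real site):

```shell
# Hypothetical sample of Apache combined-format access log lines; on a
# real server you would read the live log (e.g. /var/log/apache2/access.log).
cat > sample_access.log <<'EOF'
66.249.66.1 - - [01/Jan/2005:00:00:00 +0000] "GET /catalog/ HTTP/1.1" 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
10.0.0.5 - - [01/Jan/2005:00:01:00 +0000] "GET /catalog/ HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Yahoo! Slurp)"
EOF

# In the combined format the user-agent string is the sixth
# double-quote-delimited field; list the distinct agents seen.
awk -F'"' '{print $6}' sample_access.log | sort -u
```

Whatever agent string the spider simulator reports in your own log is the one to match in .htaccess.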
  2. Hi, 1. Do you have 'prevent spider sessions' set in admin? 2. Are searches on Google, Yahoo, etc., for your website including the session ID? 3. You ask if the .htaccess file supplied can be used as is; well, it is set up to look for any of: * msnbot * slurp * googlebot so you would have to modify it according to the spider/s that are showing session IDs in web search engine results for your site. 4. Placement .... it goes in the web root path, of course. 5. Good article at 6. The best advice I can give, for you to be sure that this will work for you, is to set up a 'test' path, which would be a complete copy of your osCommerce (/catalog/ path) files, so that you have: (you will have to mod the 2 configure.php files to point to the test path). Then add the .htaccess file from the contribution in the ../test path, and modify this line: RewriteCond %{HTTP_USER_AGENT} !(msnbot|slurp|googlebot) [NC] to .................... RewriteCond %{HTTP_USER_AGENT} !(firefox) [NC] From memory, the "NC" ensures the match is not case-sensitive. Now, I'm assuming you have Firefox, because anyone who wants a secure browser shouldn't be using IE. :lol: Anyway, you no doubt get the idea: you modify the .htaccess in the .../test path to reflect the agent name of the browser that you use. Now you should be ready to test the mod_rewrite; use this URL: and if the /test path and .htaccess have been modified correctly, the URL _should_ be re-written as: Setting up a "test" path might seem like a bit of work; however, you should have one anyway, so that modifications are never done on a live website. They are done in 'test' and only moved to the live site when appropriate. HTH Peter
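To make the test concrete, here is a minimal sketch of what the relevant part of the /test .htaccess could look like after the modification above. Only the RewriteCond on HTTP_USER_AGENT is quoted from the contribution; the session-stripping rule shown is an illustrative assumption, not the contribution's actual rule:

```apache
RewriteEngine On

# Pass the request through untouched for any agent that is NOT your
# test browser; [NC] makes the agent match case-insensitive.
RewriteCond %{HTTP_USER_AGENT} !(firefox) [NC]
RewriteRule .* - [L]

# Hypothetical session-stripping rule: if the query string carries an
# osCsid, redirect to the same URL without the query string. The
# contribution's real rule may differ.
RewriteCond %{QUERY_STRING} osCsid= [NC]
RewriteRule ^(.*)$ /test/$1? [R=301,L]
```

With Firefox and a URL containing an osCsid parameter the redirect should fire; with any other browser it should not, which confirms the agent match is doing its job.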
  3. Hi, The web server log entries for spiders do not have a session ID; therefore, if none of your spider log entries have a session ID (sid), then you have no need for this contribution, and you will only be complicating matters by using it. The old saying: if it ain't broke .... don't fix it. It is not essential that you have a robots.txt; however, it is advisable, as it will cut down on the 404 messages, and _most_ spiders/bots look for it. They do not have to 'obey' the rules you place in the robots.txt file, however _most_ do. This is what we usually put on osCommerce sites:
User-Agent: *
Disallow: /login.php
Disallow: /create_account.php
Peter
  4. Hi, Usually the name of the web server log file will contain the domain name as a suffix. Therefore, your website must be a ".COM", which is why the log file _looks_ like an MS-DOS app. Just open it in any text editor; Notepad will do, and Crimson Editor is free and very good. Peter
  5. Bob, Why would the thread need to be monitored? If people have no questions, then there _is_ no activity, right? :) I see other people have answered your questions about what logs to look at .... the web server logs, or raw access logs. The only things relevant to look for in the logs, for this contribution, are spider entries with the session ID included. Anything else is irrelevant to this contribution and should be posted elsewhere. :D Between the 'readme', the 'install' file, and a sample .htaccess file, which all come with the contribution, there is sufficient information for you to install/use the contribution. However, you need to read the 'readme', etc., and also read the early posts in this thread, to find out if you _really_ do need this contribution. Peter
  6. Hi, This is from a site where STS is used and the popup works. The code goes between the "<body>" tags of course, not the "<head>" tags:
<td align="center" class="smallText">
<script language="javascript"><!--
document.write('<a href="javascript:popupWindow(\'\')"><img src="images/imagecache/imagefilename.jpg" border="0" alt="Alt text" title=" Title text " width="100" height="72" hspace="5" vspace="5"><br>Click to enlarge</a>');
//--></script>
<noscript>
<a href="" target="_blank"><img src="images/imagecache/imagefilename.jpg" border="0" alt="Alt text" title=" Title text " width="100" height="72" hspace="5" vspace="5"><br>Click to enlarge</a>
</noscript>
</td>
HTH Peter
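For reference, the popupWindow() helper the snippet calls is normally defined in the page's head JavaScript in osCommerce. The version below is a sketch from memory; the window options are an assumption, not copied from any particular install:

```javascript
// Sketch of an osCommerce-style popupWindow() helper; the window
// options below are illustrative only.
function popupWindow(url) {
  window.open(url, 'popupWindow',
    'toolbar=no,location=no,directories=no,status=no,menubar=no,' +
    'scrollbars=no,resizable=yes,width=600,height=400');
}
```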
  7. Hi, I can see what you mean. If I try "artist" Duo Feterman by using , it doesn't work. If I try , I can see the product for the artist Duo Feterman. Why does one work and not the other ? Simple, ...... have you seen the source code generated ? The one that works uses this template: The one that doesn't work uses this template: Why doesn't the second one work ? Because you have no "$content" defined. :D Peter
  8. Hi, This is the official topic/thread for the Cart Conversion Suite (contribution 3136). Thanks, Peter
  9. Hi, The file /admin/gv_mail.php has a minor bug. The following code block:
if (SEARCH_ENGINE_FRIENDLY_URLS == 'true') {
  $message .= HTTP_SERVER . DIR_WS_CATALOG . 'gv_redeem.php' . '/gv_no,' . $id1 . "\n\n";
} else {
  $message .= HTTP_SERVER . DIR_WS_CATALOG . 'gv_redeem.php' . '?gv_no=' . $id1 . "\n\n";
}
If a site is using SE friendly URLs set to true/enabled, the code above is of course writing out the URL as follows:,c8s37c and if a customer tries to use it, this causes osCommerce to 'go berserk', and the following message appears: Cookies are not blocked. The fix is in the first branch: the separator before the voucher number must be '/gv_no/' rather than '/gv_no,':
=================
if (SEARCH_ENGINE_FRIENDLY_URLS == 'true') {
  $message .= HTTP_SERVER . DIR_WS_CATALOG . 'gv_redeem.php' . '/gv_no/' . $id1 . "\n\n";
} else {
  $message .= HTTP_SERVER . DIR_WS_CATALOG . 'gv_redeem.php' . '?gv_no=' . $id1 . "\n\n";
}
=================
Peter
  10. Hi Don, None of us can say what search engines will do. :D We can only relay our experiences, and from my experience, whilst it did take a number of months, the mod_rewrite from this contribution did 'inform/tell' the search engines, and the session ID was removed. Three of the big search engines had the 'oscsid' in the link (what you call the top). Btw, your post has the session ID in a link to your site; this is actually a "bad" thing to do, as when Googlebot, etc. next crawl this forum, they will pick it up, and then you potentially have another one to contend with. Peter
  11. Hi, Steve's reply is very informative and should help. How long will it take? Good question. It is now over 4 months since we put this contrib on a site; MSN were very quick to remove it, Yahoo were next, and Google still have it there, although it is not showing in search result links from Google, or in the content that is displayed, but they haven't updated their cache since 28 Oct 2004. It's only 4 results, and people would follow the links, not look at the cache (much?). I have found this is true with other sites: one domain was taken off the air a few months ago, MSN and Yahoo (slurp) were reasonably fast to remove the links, but Google still has over 800 links to the site. How long? The first 2 won't take long, but don't rely on Google to be quick about removing those session IDs from search results. :D Regards, Peter
  12. Hi, The link to Gubed is Peter
  13. Hi, If you still get the error message , see post #8 here: Peter
  14. Hi, If you were able to chmod the configure.php file to 444 beforehand and cannot now, delete the file, then upload it again, then chmod it to 444. Peter
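The delete / re-upload / re-lock sequence above, sketched as shell commands (the file content and location are placeholders; with FTP-only access you would do the same three steps through your FTP client):

```shell
# 1. Remove the old read-only configure.php (placeholder location):
rm -f configure.php
# 2. Re-upload it -- simulated here by recreating a placeholder file:
echo "<?php // osCommerce configuration ?>" > configure.php
# 3. Lock it down again so osCommerce stops warning about it:
chmod 444 configure.php
```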
  15. Hi, Are you sure each product has a manufacturer assigned to it ? Peter