[Contribution] Ultimate SEO URLs - by Chemo



I have some trouble with this contribution. My shop runs on a shared SSL server, and when I am logged into the shop with a client's account and select a category in the category box, the login box asks me to authenticate again. When I click to log in, I am still connected without re-authenticating, I can't close the user session, and log off no longer works.

Does anybody have the same problem and know how to solve it???

I have the autologon contrib installed, but I have disabled it and I still have the same symptom.


Hi Chemo!!

Just to let you know that I've run into the same problem with "Ultimate SEO URLs" (vs. cName pName) when adding new products. I don't use Easy Populate; I enter them one by one from the admin panel. So when I add a new product, the URL is linked to the product's ID and not to the cName pName.

To work around this, I now duplicate an existing product in the catalog, and it works!!

Thanks for this great contribution!!

OSC2.2


Internal Server Error

The server encountered an internal error or misconfiguration and was unable to complete your request.

Please contact the server administrator, [email protected] and inform them of the time the error occurred, and anything you might have done that may have caused the error.

 

More information about this error may be available in the server error log.


Chemo,

Quick question for you. I asked something like this before, but after running a few spider tools on my site, they are caching all links with the long (if not longer) original osC strings. It isn't until someone actually clicks on the first link that they get the proper string with the .html endings. The whole first page is still using the old strings while it's being crawled...am I correct? If so, that negates all the positive effects I thought it was having. I also noticed my rank on the search engines has slipped slightly, and they ARE caching the super long osC strings.

 

Suggestions?

What you describe is exactly how a properly configured installation is supposed to function. Upon entry to the store all URLs will have the SID appended. After the first click it will then pull the osCsid from the cookie.

As for the spider tools and Google: make sure you have an updated spiders.txt file. There is a contribution where they keep up with the latest and greatest spiders and upload revised spiders.txt files. Also, in your admin control panel under Sessions, make sure you have the setting to prevent spider sessions enabled. If you are using a spider tool to check the site, make sure it's also listed in your spiders.txt file or you may get inaccurate data (session IDs).
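For anyone curious how that first-click behaviour works under the hood, here is a simplified sketch of the kind of check an osCommerce-style session handler makes when building a link. The function and variable names are illustrative only, not the actual osC API:

  <?php
  // Simplified illustration (not the real osCommerce code): append the
  // session id to a URL only when the visitor has no session cookie yet
  // and has not been identified as a spider via spiders.txt.
  function append_session_id($url, $session_name, $session_id, $is_spider)
  {
      // Spiders never get a session id appended.
      if ($is_spider) {
          return $url;
      }

      // If the browser already sent the session cookie, the SID is not
      // needed in the URL -- this is the "after the first click" case.
      if (isset($_COOKIE[$session_name])) {
          return $url;
      }

      // First request of the visit: carry the session id in the URL.
      $separator = (strpos($url, '?') === false) ? '?' : '&';
      return $url . $separator . $session_name . '=' . $session_id;
  }
  ?>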

I have some trouble with this contribution. My shop runs on a shared SSL server, and when I am logged into the shop with a client's account and select a category in the category box, the login box asks me to authenticate again. When I click to log in, I am still connected without re-authenticating, I can't close the user session, and log off no longer works.

Does anybody have the same problem and know how to solve it???

I have the autologon contrib installed, but I have disabled it and I still have the same symptom.

This contribution should not affect the login / logoff functionality. Verify that it is not another contribution causing the issues.

Hi Chemo!!

Just to let you know that I've run into the same problem with "Ultimate SEO URLs" (vs. cName pName) when adding new products. I don't use Easy Populate; I enter them one by one from the admin panel. So when I add a new product, the URL is linked to the product's ID and not to the cName pName.

To work around this, I now duplicate an existing product in the catalog, and it works!!

Thanks for this great contribution!!

I would suggest upgrading to the latest Ultimate SEO URLs v2.0, or revising the code in admin/categories.php.

 

Find this code:

  if ( eregi('(update)', $action) ) include('includes/reset_seo_cache.php');

replace with this:

  // Ultimate SEO URLs - by Chemo
  // If the action will affect the cache entries
  if ( eregi("(insert|update|setflag)", $action) ) include_once('includes/reset_seo_cache.php');
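As a side note for anyone on a newer PHP build where the ereg functions are deprecated or removed: an equivalent check can be written with preg_match(). This is a sketch of a drop-in alternative, not part of the contribution itself:

  // Ultimate SEO URLs - by Chemo (preg_match variant, illustrative only)
  // If the action will affect the cache entries
  if ( preg_match('/(insert|update|setflag)/i', $action) ) include_once('includes/reset_seo_cache.php');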

 

Bobby

 

:thumbsup:


Internal Server Error

The server encountered an internal error or misconfiguration and was unable to complete your request.

Please contact the server administrator, [email protected] and inform them of the time the error occurred, and anything you might have done that may have caused the error.

 

More information about this error may be available in the server error log.

Please ask your webhost provider to install the .htaccess entries properly for you. This is a server issue and not a contribution bug. If the general code that I provide in the contribution does not work on your setup, there is no need to clutter the thread with support requests that we will not be able to solve.

 

Each server setup is different and it looks like the only one that will be able to solve this for you is your host.

 

Bobby


The .htaccess was in my catalog folder, but I checked the root and there is also an .htaccess file there, so I deleted the one in the root (because my store is in my catalog folder), but the .htaccess file in the root keeps reappearing!

So I have now deleted the .htaccess in the catalog and am left with the one in the root, but I still get the server error message :(


Please ask your webhost provider to install the .htaccess entries properly for you. This is a server issue and not a contribution bug. If the general code that I provide in the contribution does not work on your setup, there is no need to clutter the thread with support requests that we will not be able to solve.

 

Each server setup is different and it looks like the only one that will be able to solve this for you is your host.

 

Bobby

 

OK, I wasn't saying it was a contribution bug, though; I was just looking for some help to solve the problem. Cheers


  // Ultimate SEO URLs - by Chemo
  // If the action will affect the cache entries
  if ( eregi("(insert|update|setflag)", $action) ) include_once('includes/reset_seo_cache.php');

 

Alright, Bobby!!! It solved my problem!

 

Thanks a lot!!!! :D

OSC2.2


Quick question Bobby.

 

Installed your Ultimate SEO URLs v1.4 last week and it works great! I have also had the configuration cache contribution installed since last October (faster page loads, fewer DB queries). No problems with that either.

Should I be considering upgrading to your Ultimate SEO URLs v2.0? If so, do I need to remove the configuration cache contribution?

 

Please advise.

 

Thanks


I would upgrade...and you don't have to uninstall the config cache contribution.  They will both co-exist nicely.

 

Bobby

 

 

Hi Bobby

 

 

Firstly, thanks for this great contribution.

 

I have one small problem: I can't get the rewrite to work - I get 404s. I've read the thread about .htaccess and tried both scenarios, for / and for /catalog/, but still no joy.

 

Any ideas would be greatly appreciated

 

Here is my .htaccess for reference:

 

# $Id: .htaccess,v 1.4 2001/04/22 20:30:03 dwatkins Exp $

RewriteEngine On
# Change "folder" to your catalog directory name
RewriteBase /
RewriteRule ^(.*)-p-(.*).html$ product_info.php?products_id=$2&%{QUERY_STRING}
RewriteRule ^(.*)-c-(.*).html$ index.php?cPath=$2&%{QUERY_STRING}
RewriteRule ^(.*)-m-(.*).html$ index.php?manufacturers_id=$2&%{QUERY_STRING}

#
# This is used with Apache WebServers
# The following blocks direct HTTP requests in this directory recursively
#
# For this to work, you must include the parameter 'Limit' to the AllowOverride configuration
#
# Example:
#
#<Directory "/usr/local/apache/htdocs">
#  AllowOverride Limit
#
# 'All' will also work. (This configuration is in your apache/conf/httpd.conf file)
#
# This does not affect PHP include/require functions
#
# Example: http://server/catalog/includes/application_top.php will not work

<Files *.php>
Order Deny,Allow
Deny from all
</Files>
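One quick way to narrow down a 404 from the rewrite rules (this is just a troubleshooting sketch, not part of the contribution) is to temporarily point one rule at a throwaway script and confirm that the captured parameters actually reach PHP. Bear in mind that a <Files *.php> deny block in the same directory will also block direct requests to any PHP file there, including a test script like this. A hypothetical rewrite-test.php might look like:

  <?php
  // rewrite-test.php -- hypothetical throwaway script for debugging only.
  // Temporarily point a rule at it, e.g.:
  //   RewriteRule ^(.*)-p-(.*).html$ rewrite-test.php?products_id=$2&%{QUERY_STRING}
  // then request /anything-p-42.html and see what arrives.
  header('Content-Type: text/plain');
  echo "Rewrite reached PHP.\n";
  echo 'REQUEST_URI: ' . $_SERVER['REQUEST_URI'] . "\n";
  echo "GET parameters:\n";
  print_r($_GET);
  // Restore the original rule and delete this file when finished.
  ?>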


Chemo,

 

I'm trying to upgrade from 1.4 to 2.0 but get the following error when I overwrite the seo_cache.php file in includes. I just overwrote the old PHP files with the new ones; what do I have to change?

 

Fatal error: Call to a member function on a non-object in /home/pc/public_html/catalog/includes/seo_cache.php on line 47

 

bye

Chayenne


Hi

 

Just to say thanks for a brilliantly put together contribution - now I just need to sit back and wait for the search engines to do their thing :rolleyes:

 

One thing that may be of use to some of you: I was having problems with the rewrite method giving me 'page not found' 404 errors, so I added the following directive to the .htaccess file and it worked a treat:

 

Options +FollowSymLinks

 

 

Also, just to let anyone thinking of using this know that it also works fine on a shared SSL setup - I know there is a rewrite contrib out there which doesn't.

 

Cheers

 

Karl

Handy Candy

The Internet Agency


What you describe is exactly how a properly configured installation is supposed to function. Upon entry to the store all URLs will have the SID appended. After the first click it will then pull the osCsid from the cookie.

As for the spider tools and Google: make sure you have an updated spiders.txt file. There is a contribution where they keep up with the latest and greatest spiders and upload revised spiders.txt files. Also, in your admin control panel under Sessions, make sure you have the setting to prevent spider sessions enabled. If you are using a spider tool to check the site, make sure it's also listed in your spiders.txt file or you may get inaccurate data (session IDs).

 

Chemo,

OK - what exactly is the spiders.txt file for? This scares me, as it seems only a handful of people here understand SEO. Knowing you are one of them: is this properly configured in osC, and what is it there for?

As for Sessions > Prevent Spider Sessions, I have that set. Does the spiders.txt file let the store recognise a spider and disable sessions for it? I'm guessing that's what it's there for. Do you think non-legit spiders are going to keep identifying themselves the same way on each crawl, or is it pointless to update this file since most of the reputable 'big boys' are already listed?

ALSO - so this is OK?

On first entry, ALL of the URLs in the code look like this:

http://www.examplesite.com/Category-One-c-...bcfda42af36cc82

 

Won't the spider cache/capture the long URLs, since it is not, in essence, 'clicking' on a link? It takes in all of the first links it sees and then follows the links from those captured pages. That is how it was explained to me, and if you read some of the Google/Yahoo SEO pages on their own sites, they seem to hint at that as well.

 

S


Me too:

 

Fatal error: Call to a member function on a non-object in /www/s/susanrm1/htdocs/catalog/includes/seo_cache.php on line 47

 

Fatal error: Call to a member function on a non-object in /www/s/susanrm1/htdocs/catalog/includes/seo_cache.php on line 47

 

And Bobby - I don't see the link to the 2.0 support forum in the Contributions area. :-(

 

Susan


Make sure you do the upgrade in the order he has it.

 

Upgrade

 

1. Overwrite the current files with the new ones. Upload the cache.class.php file.

2. Call the install-seo.php file in your web browser and select UNINSTALL, then turn right around and select INSTALL. This will clear the current settings and get the new ones in there. Call the install-cache.php file in your browser and select INSTALL.

3. Update the code in step 3(1). Also, depending on which version you are upgrading from, you may also need to do steps 3(2) and 3(3).

4. Just to make sure you have the latest code, replace your tep_href_link() function with the one provided.

5. If you are not upgrading from the latest version, you may need to perform step #5.

6. Update your code with step #6.

7. Perform step #7.

8. Step #8 should already be completed from a past install.

9. Reconfigure the settings in the admin control panel.

Make sure to run the install-seo.php and install-cache.php scripts.

 

Should help.

 

Paul


Paul,

 

Make sure you do the upgrade in the order he has it.

 

Upgrade

 

        1. Overwrite the current files with the new ones. Upload the cache.class.php file.

        2. Call the install-seo.php file in your web browser and select UNINSTALL.

 

Should help.

 

Paul

 

I wish that had helped. It's right after uploading the files that the error occurs; it's not even possible to call the install-seo.php file because of the error.

 

One thought, Bobby: I did not install the version for caches before. Could that be the issue? Is the non-cache version trying to call something that doesn't exist?

 

Also, I too am seeing the spider test results with session IDs. Here's an example of a spider tool: Spider Tool. If you can find a way to get rid of the extra characters, please let me know! Otherwise, I love this contrib.

 

Susan

Aspiring Arts


...

I'm trying to upgrade from 1.4 to 2.0 but get the following error when I overwrite the seo_cache.php file in includes. I just overwrote the old PHP files with the new ones; what do I have to change?

 

Fatal error: Call to a member function on a non-object in /home/pc/public_html/catalog/includes/seo_cache.php on line 47

 

bye

Chayenne

Chayenne,

 

Whenever you get that type of fatal error (call to a member function on a non-object) it means that you are trying to use a method of a class that has NOT been initialized. In simple terms, you did not perform step #3(1). That change should REMOVE the 1.4 code and replace it with the 2.0 code (which initializes the class).

 

Please go back and redo step #3(1) and be sure to remove the old 1.4 code.
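To illustrate the point, here is a minimal, purely illustrative PHP sketch (the class and variable names are made up, not the contribution's own) showing how calling a method on an object that was never created produces exactly this fatal error, and how initializing the class first fixes it:

  <?php
  // Purely illustrative example -- not the contribution's actual class.
  class UrlCache {
      function get_url($id) {
          return 'cached-url-for-' . $id;
      }
  }

  // Broken: $cache was never created, so calling a method on it triggers
  // "Fatal error: Call to a member function on a non-object".
  // echo $cache->get_url(1);

  // Fixed: initialize the class first (this is what the step #3(1) code
  // does for the contribution's own class), then the method call works.
  $cache = new UrlCache();
  echo $cache->get_url(1);
  ?>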

Hi

 

Just to say thanks for a brilliantly put together contribution - now I just need to sit back and wait for the search engines to do their thing  :rolleyes:

 

One thing that may be of use to some of you: I was having problems with the rewrite method giving me 'page not found' 404 errors, so I added the following directive to the .htaccess file and it worked a treat:

 

Options +FollowSymLinks

Also, just to let anyone thinking of using this know that it also works fine on a shared SSL setup - I know there is a rewrite contrib out there which doesn't.

 

Cheers

 

Karl

Handy Candy

The Internet Agency

Thank you for your kind words...and nice suggestion about setting FollowSymLinks.

 

...you get 10 Chemo bonus points :)

 

I'm upgrading too and I get exactly the same message!

 

Please advise.

 

Thanks

Once again...it appears as though you did not replace the code in application_top.php with the new code.

 

First, remove the old 1.4 code and replace it with the code provided in step #3(1).

 

Chemo,

OK - what exactly is the spiders.txt file for? This scares me, as it seems only a handful of people here understand SEO. Knowing you are one of them: is this properly configured in osC, and what is it there for?

As for Sessions > Prevent Spider Sessions, I have that set. Does the spiders.txt file let the store recognise a spider and disable sessions for it? I'm guessing that's what it's there for. Do you think non-legit spiders are going to keep identifying themselves the same way on each crawl, or is it pointless to update this file since most of the reputable 'big boys' are already listed?

ALSO - so this is OK?

On first entry, ALL of the URLs in the code look like this:

http://www.examplesite.com/Category-One-c-...bcfda42af36cc82

 

Won't the spider cache/capture the long URLs, since it is not, in essence, 'clicking' on a link? It takes in all of the first links it sees and then follows the links from those captured pages. That is how it was explained to me, and if you read some of the Google/Yahoo SEO pages on their own sites, they seem to hint at that as well.

 

S

The behavior that you describe is how a correctly configured store should function.

 

The spiders.txt file works like this: when a visitor comes to the store, whether it be a spider or a warm, live body, the code checks the browser user agent against the spiders.txt file. If the declared user agent matches a known spider user agent, the osCsid is suppressed. The problem is that the default installation spiders.txt file was created circa 2001-2002 and many new spiders have popped up since. I recommend you set "Prevent Spider Sessions" to TRUE and keep the spiders.txt file updated on a regular basis (every 6 months or sooner).
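For readers who want to see that idea in code, here is a simplified sketch of the check described above. It is not the actual osCommerce code, and the file path and function name are illustrative:

  <?php
  // Illustrative sketch of a spiders.txt user-agent check (not the real osC code).
  function is_spider($user_agent, $spiders_file = 'includes/spiders.txt')
  {
      $user_agent = strtolower($user_agent);
      foreach (file($spiders_file) as $spider) {
          $spider = strtolower(trim($spider));
          // Skip blank lines (and any comment lines) in spiders.txt.
          if ($spider == '' || $spider[0] == '#') continue;
          // A substring match against the declared user agent is enough.
          if (strpos($user_agent, $spider) !== false) return true;
      }
      return false;
  }

  // If the visitor is a known spider, the store simply never appends an
  // osCsid, so no session ids show up in the URLs that spider crawls.
  $spider_flag = is_spider($_SERVER['HTTP_USER_AGENT']);
  ?>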

 

Me too:

 

Fatal error: Call to a member function on a non-object in /www/s/susanrm1/htdocs/catalog/includes/seo_cache.php on line 47

And Bobby - I don't see the link to the 2.0 support forum in the Contributions area. :-(

 

Susan

Once again...it appears as though you did not replace the code in application_top.php with the new code.

 

First, remove the old 1.4 code and replace it with the code provided in step #3(1).

 

Paul,

I wish that had helped. It's right after uploading the files that the error occurs; it's not even possible to call the install-seo.php file because of the error.

 

One thought, Bobby: I did not install the version for caches before. Could that be the issue? Is the non-cache version trying to call something that doesn't exist?

 

Also, I too am seeing the spider test results with session IDs. Here's an example of a spider tool: Spider Tool. If you can find a way to get rid of the extra characters, please let me know! Otherwise, I love this contrib.

 

Susan

Aspiring Arts

OK...this is why the directions say to first upload the new files, THEN execute the install scripts in the browser, BEFORE you modify files :)

 

To get around this, simply comment out the new code in application_top.php => don't delete it, simply comment it out. Then run the install scripts. After you are done with the install scripts, go back to application_top.php and uncomment the code.
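As a concrete (and purely illustrative) sketch of what 'comment it out' means here - the real lines are whatever step #3(1) added to your application_top.php, so treat the names below as placeholders:

  <?php
  // Illustrative only -- the real lines are whatever step #3(1) added to
  // application_top.php (a class include plus object creation). They are
  // shown here commented out, which is the state they should be in while
  // the install scripts run:

  // include('includes/classes/seo.class.php');   // hypothetical path
  // $seo_urls = new SEO_URL($languages_id);      // hypothetical name

  // After install-seo.php and install-cache.php have completed, remove the
  // leading "//" so the class is initialized again on every page load.
  ?>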

 

Bobby


The spiders.txt file works like this: when a visitor comes to the store, whether it be a spider or a warm, live body, the code checks the browser user agent against the spiders.txt file. If the declared user agent matches a known spider user agent, the osCsid is suppressed. The problem is that the default installation spiders.txt file was created circa 2001-2002 and many new spiders have popped up since. I recommend you set "Prevent Spider Sessions" to TRUE and keep the spiders.txt file updated on a regular basis (every 6 months or sooner).

 

I have an updated spiders.txt file from the latest contrib, and Prevent Spider Sessions is TRUE. I also added the name of the testing spider, but the session ID still came through. This is why I provided a link, so you can possibly help us troubleshoot once this latest bug spree is over.

 

Once again...it appears as though you did not replace the code in application_top.php with the new code.

 

First, remove the old 1.4 code and replace it with the code provided in step #3(1).

OK...this is why the directions say to first upload the new files, THEN execute the install scripts in the browser, BEFORE you modify files :)

 

To get around this, simply comment out the new code in application_top.php => don't delete it, simply comment it out. Then run the install scripts. After you are done with the install scripts, go back to application_top.php and uncomment the code.

 

Bobby

 

This is helpful. What would also be helpful is if your install instructions mentioned that 1.4 needs to be uninstalled first. This is a critical first step that wasn't mentioned before.

 

Will try when I'm less sleepy. Thanks for helping.

 

Thanks,

Susan


I have an updated spiders.txt file from the latest contrib, and Prevent Spider Sessions is TRUE. I also added the name of the testing spider, but the session ID still came through. This is why I provided a link, so you can possibly help us troubleshoot once this latest bug spree is over.

This is helpful. What would also be helpful is if your install instructions mentioned that 1.4 needs to be uninstalled first. This is a critical first step that wasn't mentioned before.

 

Will try when I'm less sleepy. Thanks for helping.

 

Thanks,

Susan

Susan,

 

I had another person on the CRE Loaded forum pose the possibility that the contribution is causing osCsids to be cached in the Google index. Luckily, I have been creating my own search engine spider and search engine (to be released as osC Search Engine Spider) and am far enough along to modify it slightly to check whether this was indeed a valid concern.

 

What I have found is that a properly configured system will NOT send the osCsids to Google spiders. It turns out that person was using a spider simulator that was sending the Mozilla user agent and not the Googlebot user agent.

 

If you want to test your setup, I have the beta tool available online here: Google, Chemo's Spider, and Firefox user agent simulator. You can input your URL, choose the user agent of your choice, and hit submit. It presents lots of data, but the important info is whether or not the osCsids are appended to the URLs. If you choose the Firefox (normal customer) user agent option, it should append the osCsids as normal. If you then choose the Google user agent option, it SHOULD NOT append the osCsids.
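If you would rather check from the command line than through the online tool, a rough equivalent (just a sketch; the URL below is a placeholder for your own store) is to fetch a page twice with different user agents and look for the osCsid parameter in the returned HTML:

  <?php
  // Rough self-test sketch: fetch the store home page with a browser-like
  // user agent and then with Googlebot's, and see whether osCsid appears
  // in the generated links. Replace the URL with your own store.
  function fetch_with_agent($url, $agent)
  {
      $ch = curl_init($url);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      curl_setopt($ch, CURLOPT_USERAGENT, $agent);
      $html = curl_exec($ch);
      curl_close($ch);
      return $html;
  }

  $url     = 'http://www.example.com/catalog/index.php';
  $browser = fetch_with_agent($url, 'Mozilla/5.0 (Windows; U) Firefox/1.0');
  $spider  = fetch_with_agent($url, 'Googlebot/2.1 (+http://www.google.com/bot.html)');

  echo 'Browser fetch contains osCsid: ' . (strpos($browser, 'osCsid') !== false ? 'yes' : 'no') . "\n";
  echo 'Googlebot fetch contains osCsid: ' . (strpos($spider, 'osCsid') !== false ? 'yes (check spiders.txt)' : 'no') . "\n";
  ?>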

 

I have tested all of my client setups and found zero problems. If you encounter an error it is due to your setup and not my contribution.

 

Maybe we both need some sleep, because the tone of your post seems a bit unappreciative of a free contribution. Remember, I have spent well over 100 hours of my personal time developing this for you free of charge and with less than $40 in donations. Further, I spend countless hours on this forum supporting my contributions (and generally anyone with osC support issues). Go look at 90% of the other contributions and see if the authors give my level of support. I have answered every post in all my support threads, usually within minutes or hours...one day at the latest.

 

If you don't want to "suffer" through this bug spree uninstall my contribution and have a Coke and a smile.

 

For the rest of the store owners who know I support my work and will get all the issues resolved (which at this point are mainly upgrade errors from inexperienced webmasters): sit tight and let me create a separate upgrade installation file to make it a step-by-step process. I agree that it could be spelled out more, and that would make upgrading easier...but creating free contributions in between trying to feed my family is a bit more difficult than it sounds.

 

Cordially,

 

Bobby

