
[CONTRIBUTION] Ultimate SEO URLs v2.1 - by Chemo


Not sure why you are confused, Paul .. I only support 2.1d ORIGINAL updated, so download the latest one; that is 2.1d ORIGINAL updated.

 

You can of course download any you like; I just won't support the versions that I didn't upload/maintain.

 

There were two 2.1d ORIGINAL so I used the 23 Nov 2008 version.

 

I have it up and running on my sandbox site and it looks and works great straight out of the box. I think that is a first, as nearly every contribution I have used so far had some sort of bug.

 

I have found one related issue that I would like to ask for advice about. I am using the Google Base - Froogle Data Feeder v1.20 to send product info to Google once a month. The output is a text file that Google uploads automatically. I just looked at the content after installing SEO and the URL format is the old one.

 

Does anyone here also use this contribution and have a solution, or should I post in the thread for that contribution?

 

There is a version of the feeder system for use with SEO-G (whatever that is); does anyone know if that would work with this contribution?

 

Cheers, Paul.

Find a different feeder; any moderately well coded contribution will use the proper tep_href_link wrapper function and will therefore output SEO URLs.
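To see why this matters, here is a minimal, self-contained sketch (the wrapper body and the example.com URLs are invented for illustration; a real store defines tep_href_link in includes/functions/html_output.php): a feeder that concatenates URLs by hand always emits the old format, while one that calls the wrapper picks up whatever URL scheme the store currently produces.

```php
<?php
// Stand-in for the osCommerce wrapper. When Ultimate SEO URLs is
// installed, the real tep_href_link is what turns dynamic URLs like
// product_info.php?products_id=42 into SEO ones. The mapping below is
// hypothetical, for demonstration only.
function tep_href_link($page, $parameters = '') {
  if ($page == 'product_info.php' && $parameters == 'products_id=42') {
    return 'http://www.example.com/widget-p-42.html';
  }
  return 'http://www.example.com/' . $page
       . ($parameters ? '?' . $parameters : '');
}

// A feeder that builds URLs itself always emits the old format:
$old_style = 'http://www.example.com/product_info.php?products_id=42';

// A feeder that goes through the wrapper emits the store's current scheme:
$new_style = tep_href_link('product_info.php', 'products_id=42');

echo $old_style . "\n"; // old format, ignores the contribution
echo $new_style . "\n"; // SEO format, because the wrapper produced it
```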

Edited by FWR Media


I'm having some trouble with this, and I can't figure out where I went wrong!

 

Problem #1

Every time I reload my page, the SEO URLs add-on installs itself all over again, getting a new gID.

Does anyone know why?

 

Problem #2

1364 - Field 'categories_seo_url' doesn't have a default value

 

insert into categories_description (categories_name, categories_id, language_id) values ('', '27', '4')

 

I'm using version 2.1d.

Thanks in advance

Sounds like a very old version of SEO URLs 2.1d .. try 2.1d ORIGINAL updated, but you'll lose the ability to set category/product URLs manually.

The simple answer, then, is to remove the fields from the database? Since I don't need to alter the URLs manually, it doesn't matter.
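If you would rather keep the fields, an alternative sketch is to give the column a default value so the stock INSERT stops triggering error 1364. This assumes the column name from the error message above and a VARCHAR type; check your actual schema first:

```sql
-- Hedged sketch: give the contribution's column an empty default so the
-- unmodified categories_description insert no longer fails with 1364.
ALTER TABLE categories_description
  MODIFY categories_seo_url VARCHAR(255) NOT NULL DEFAULT '';
```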


SEO HTTP Error 404 - THE PAGE CANNOT BE FOUND

 

The addresses appear right, like this: http://www.mysite.it/visti-in-tv-c-52.html...gt59bqlsnu6sdm6, but it returns a 404 error.

My .htaccess is based in the root directory; see the following:

# $Id: .htaccess,v 1.3 2003/06/12 10:53:20 hpdl Exp $
#
# This is used with Apache WebServers
#
# For this to work, you must include the parameter 'Options' to
# the AllowOverride configuration
#
# Example:
#
# <Directory "/usr/local/apache/htdocs">
#   AllowOverride Options
# </Directory>
#
# 'All' will also work. (This configuration is in the
# apache/conf/httpd.conf file)

# The following makes adjustments to the SSL protocol for Internet
# Explorer browsers

<IfModule mod_setenvif.c>
 <IfDefine SSL>
SetEnvIf User-Agent ".*MSIE.*" \
		 nokeepalive ssl-unclean-shutdown \
		 downgrade-1.0 force-response-1.0
 </IfDefine>
</IfModule>

# If Search Engine Friendly URLs do not work, try enabling the
# following Apache configuration parameter
#
# AcceptPathInfo On

# Fix certain PHP values
# (commented out by default to prevent errors occurring on certain
# servers)
#
#<IfModule mod_php4.c>
#  php_value session.use_trans_sid 0
#  php_value register_globals 1
#</IfModule>

# Ultimate SEO URLs BEGIN
Options +FollowSymLinks
RewriteEngine On
RewriteBase /

RewriteCond %{QUERY_STRING} ^options\=(.*)$
RewriteRule ^(.*)-p-(.*).html$ product_info.php?products_id=$2%1
RewriteRule ^(.*)-p-(.*).html$ product_info.php?products_id=$2&%{QUERY_STRING}
RewriteRule ^(.*)-c-(.*).html$ index.php?cPath=$2&%{QUERY_STRING}
RewriteRule ^(.*)-m-(.*).html$ index.php?manufacturers_id=$2&%{QUERY_STRING}
RewriteRule ^(.*)-pi-(.*).html$ popup_image.php?pID=$2&%{QUERY_STRING}
RewriteRule ^(.*)-t-(.*).html$ articles.php?tPath=$2&%{QUERY_STRING}
RewriteRule ^(.*)-a-(.*).html$ article_info.php?articles_id=$2&%{QUERY_STRING}
RewriteRule ^(.*)-pr-(.*).html$ product_reviews.php?products_id=$2&%{QUERY_STRING}
RewriteRule ^(.*)-pri-(.*).html$ product_reviews_info.php?products_id=$2&%{QUERY_STRING}
RewriteRule ^(.*)-pm-([0-9]+).html$ info_pages.php?pages_id=$2&%{QUERY_STRING}
RewriteRule ^(.*)-i-(.*).html$ information.php?info_id=$2&%{QUERY_STRING}
RewriteRule ^(.*)-links-(.*).html$ links.php?lPath=$2&%{QUERY_STRING}
# Added polls and newsdesk
#RewriteRule ^(.*)-po-([0-9]+).html$ pollbooth.php?pollid=$2&%{QUERY_STRING}
 RewriteRule ^(.*)-n-(.*).html$ newsdesk_info.php?newsdesk_id=$2&%{QUERY_STRING}
 RewriteRule ^(.*)-nc-(.*).html$ newsdesk_index.php?newsPath=$2&%{QUERY_STRING}
 RewriteRule ^(.*)-nri-(.*).html$ newsdesk_reviews_info.php?newsdesk_id=$2&%{QUERY_STRING}
 RewriteRule ^(.*)-nra-(.*).html$ newsdesk_reviews_article.php?newsdesk_id=$2&%{QUERY_STRING}
# BOF: Faqdesk support added by faaliyet
 RewriteRule ^(.*)-f-(.*).html$ faqdesk_info.php?faqdesk_id=$2&%{QUERY_STRING}
 RewriteRule ^(.*)-fc-(.*).html$ faqdesk_index.php?faqPath=$2&%{QUERY_STRING}
 RewriteRule ^(.*)-fri-(.*).html$ faqdesk_reviews_info.php?faqdesk_id=$2&%{QUERY_STRING}
 RewriteRule ^(.*)-fra-(.*).html$ faqdesk_reviews_article.php?faqdesk_id=$2&%{QUERY_STRING}
# EOF: Faqdesk support added by faaliyet
# Ultimate SEO URLs END

# Block Bad Bots
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule .* - [F]


Actually, I don't know if it depends on my html_output:



////
// Ultimate SEO URLs v2.1
// The HTML href link wrapper function
if (SEO_ENABLED == 'true') { //run chemo's code
 function tep_href_link($page = '', $parameters = '', $connection = 'NONSSL', $add_session_id = true, $search_engine_safe = true) {
   global $seo_urls;
   if ( !is_object($seo_urls) ) {
     if ( !class_exists('SEO_URL') ) {
       include_once(DIR_WS_CLASSES . 'seo.class.php');
     }
     global $languages_id;
     $seo_urls = new SEO_URL($languages_id);
   }
   return preg_replace('/&/','&',$seo_urls->href_link($page, $parameters, $connection, $add_session_id));
 }
} else { //run original code
// The HTML href link wrapper function
 function tep_href_link($page = '', $parameters = '', $connection = 'NONSSL', $add_session_id = true, $search_engine_safe = true) {
global $request_type, $session_started, $SID;

if (!tep_not_null($page)) {
  die('<br><br><font color="#ff0000"><b>Error!</b></font><br><br><b>Unable to determine the page link!<br><br>');
}

if ($connection == 'NONSSL') {
  $link = HTTP_SERVER . DIR_WS_HTTP_CATALOG;
} elseif ($connection == 'SSL') {
  if (ENABLE_SSL == true) {
	$link = HTTPS_SERVER . DIR_WS_HTTPS_CATALOG;
  } else {
	$link = HTTP_SERVER . DIR_WS_HTTP_CATALOG;
  }
} else {
  die('<br><br><font color="#ff0000"><b>Error!</b></font><br><br><b>Unable to determine connection method on a link!<br><br>Known methods: NONSSL SSL</b><br><br>');
}

if (tep_not_null($parameters)) {
  $link .= $page . '?' . tep_output_string($parameters);
  $separator = '&';
} else {
  $link .= $page;
  $separator = '?';
}

while ( (substr($link, -1) == '&') || (substr($link, -1) == '?') ) $link = substr($link, 0, -1);

// Add the session ID when moving from different HTTP and HTTPS servers, or when SID is defined
if ( ($add_session_id == true) && ($session_started == true) && (SESSION_FORCE_COOKIE_USE == 'False') ) {
  if (tep_not_null($SID)) {
	$_sid = $SID;
  } elseif ( ( ($request_type == 'NONSSL') && ($connection == 'SSL') && (ENABLE_SSL == true) ) || ( ($request_type == 'SSL') && ($connection == 'NONSSL') ) ) {
	if (HTTP_COOKIE_DOMAIN != HTTPS_COOKIE_DOMAIN) {
	  $_sid = tep_session_name() . '=' . tep_session_id();
	}
  }
}

if ( (SEARCH_ENGINE_FRIENDLY_URLS == 'true') && ($search_engine_safe == true) ) {
  while (strstr($link, '&&')) $link = str_replace('&&', '&', $link);

  $link = str_replace('?', '/', $link);
  $link = str_replace('&', '/', $link);
  $link = str_replace('=', '/', $link);

  $separator = '?';
}

if (isset($_sid)) {
  $link .= $separator . $_sid;
}

return $link;
 }
}

 

Any help would be much appreciated, thanks.


What is ..

RewriteCond %{QUERY_STRING} ^options\=(.*)$

 

Try without it.

Done, but same as before:

 

HTTP Error 404 - File or directory not found.

Internet Information Services (IIS)

 

Any other suggestions? :-)

Thanks in advance.

Edited by qtm

Perhaps the server is not set up correctly to allow rewrites. As long as it's Linux and the server settings are correct, it'll work.
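On Apache, "set up correctly" usually means two things: mod_rewrite is loaded, and .htaccess files are allowed to set rewrite directives. A sketch of the server-side settings (the module and directory paths are examples, not your actual values; note the 404 page quoted above mentions IIS, which ignores .htaccess entirely, so this only applies if the site is really on Apache):

```apache
# httpd.conf or a vhost file -- adjust paths for your server.
# mod_rewrite must be loaded:
LoadModule rewrite_module modules/mod_rewrite.so

# ...and .htaccess must be permitted to use Options and rewrite rules:
<Directory "/usr/local/apache/htdocs">
    AllowOverride Options FileInfo
</Directory>
```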

Then my problem is setting up the server, and I don't know how to do that. Many thanks anyway.


Can anyone recommend a contribution for generating Google sitemaps that works with this contribution? There are a number of sitemap contributions; I'm just not sure which one to choose. Thank you.

http://www.oscommerce.com/community/contributions,3233




I have Ultimate_SEO_URLSv21d_UPDATED-07-NOV-2008.zip installed and it works.

 

I have a problem with special character conversions, however. Almost all of my product descriptions begin with the "#" symbol. I would like it to stay in the URL, but of course it is removed (as is "-"). I have tried #=># and -=>-, which of course does not work. I have changed "Filter Short Words" to a blank.

 

Anyone have a suggestion as to what I must do? I realize I am probably headed down the wrong track in what I have been trying.


I have installed 2.1d ORIGINAL updated on a Zeus server and I get 404 errors.

I have installed this on Apache servers before without any problems; now that I am installing it on a Zeus Web Server, I get 404 errors.

 

I think it is a Rewrite issue

 

This is the info I got from Netregistry support:

 

If you've previously used Apache to implement some rewrite rules on your website using mod rewrite, but are hosted on our Zeus server cluster - then all you need to do is convert your Apache rewrite rules to Zeus, and paste them into the Rewrite section of your Webhosting settings.

Converting your existing Apache rewrite to Zeus syntax by hand:

 

Apache:

RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^[^/]*\.html$ index.php

Zeus Web Server:

match URL into $ with ^[^\/]*\.html$
if matched then
	set URL = index.php
endif

The match expression itself can remain intact with no changes, whereas the arcane RewriteCond directive is rewritten in a form much closer to natural language. The REQUEST_FILENAME variable of Apache is the URL variable in Zeus, upon which a pattern match is attempted in the first line of the Zeus rule. If this match succeeds, then the URL field used for the remainder of the request is set to the desired page.
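Following that pattern, a first attempt at converting one of the rules from the .htaccess above, the category rule (RewriteRule ^(.*)-c-(.*).html$ index.php?cPath=$2), might look like this. This is an untested sketch: how Zeus exposes captured groups and the query string may differ, so treat the $2 back-reference as an assumption to verify against the Zeus documentation:

```
match URL into $ with ^(.*)-c-(.*)\.html$
if matched then
	set URL = index.php?cPath=$2
endif
```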

 

So, can anyone help with converting the Apache rewrite rules to Zeus?

Is it just the rewrite rules in .htaccess that will need to be converted,

or is there some code in seo.class.php that will need to be changed?

 

I would really appreciate any help I can get, as I would really

like to get this working on my site on the Zeus Web Server.

Hope someone can help.

 

Kevin


I have just installed Ultimate SEO URLs v2.6 (FullPackage) - faaliyet following the instructions, but get the following error when loading my website:

 

Parse error: syntax error, unexpected '}' in /home/woodleysnauk/public_html/catalog/includes/functions/html_output.php on line 13

 

Has anyone experienced this? I'm happy to share my code with anyone who wants to take a look.

 

Site:

http://www.woodleysnapshot.co.uk/catalog

You have probably made a mistake when making the changes to that file (catalog/includes/functions/html_output.php). Either check the file for stray code at or near line 13, or redo the changes to that file starting from a backup.

 

SJC


I'm running this mod and noticed it broke the Extra Pages-Info Box w/Admin 4.6.1, probably because of the new page names it creates... is there a workaround for this?

Thank you for Ultimate SEO URLs.

I installed v2.6 and it works well, but I am using the Arabic language and the URLs appear different.

Please help me.

This is my website: http://www.hediah.com

I am still waiting for an answer on how to get correct URLs in the Arabic language.


Hello,

 

I just installed Ultimate SEO URLs 2.1d (updated) and I am having a problem with the URLs. Even though I set up the contribution to not add cPath to the product URLs in the admin panel, the URLs are showing the cPath, like:

http://www.sacred-jewelry.com/yogajewelry/chakra-jewelry-c-21.html?osCsid=teid7g9gc06oudhgeb8ohkggq6

 

This is the second site I have installed this contribution on, and I never had this happen with the first one. I am also using the URL Validation, STS and More Pics contributions here.

 

I am sure I must have overlooked something and there is a simple fix for this... :blush:

 

Thanks in advance for any help!

Edited by jailaxmi



I am getting this problem at the end of my category urls

 

-c-103_93.html?page=2&sort=5a

 

Is there a way to change it to something better, like below?

 

-c-103_93-2.html

 

-c-103_93-page2.html

 

This only comes up when going to the second page of the category to see more products.

 

Also, Google seems to be listing old site URLs like the one below:

 

http://www.mysite.co.uk/index.php?cPath=103_60&page=1

 

which redirects to

 

http://www.mysite.co.uk/polo-shirts-childr..._60.html?page=1

 

But it should be the one below, which works; Google is still listing the above ones even though there is no link or reference to them on my site.

 

http://www.mysite.co.uk/polo-shirts-childr...s-c-103_60.html

 

I have been running Ultimate SEO URLs for months now, but Google still lists the above. :-(
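Separately from any contribution, a legacy query-string URL can be 301-redirected to its SEO form in .htaccess. A sketch, taking the cPath value from the URLs above but using a placeholder for the real category slug (everything except the cPath is hypothetical):

```apache
# Redirect the legacy dynamic URL to its SEO equivalent with a 301.
# The query string can only be matched via RewriteCond; the trailing '?'
# on the target strips it from the redirect.
RewriteCond %{QUERY_STRING} ^cPath=103_60(&page=1)?$
RewriteRule ^index\.php$ /your-category-slug-c-103_60.html? [R=301,L]
```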

Have you tried the Validation contribution that comes with v2.1d updated?

 

http://addons.oscommerce.com/info/2823

 

I believe it could help.

Edited by jailaxmi


If you mean the -c-21, that has to be there.

