
Automatically send data feed to Froogle


gottaloveit


I downloaded the 1.2 version that was posted today. I created a schedule on google base, and it ran, but it failed. It says:

 

Outcome: Data feed upload failed. No items are live.
Number of items processed: 0
Number of inserted items: 0
Uploaded at: Feb 12, 2009 9:00 pm EST
Processed at: Feb 12, 2009 9:09 pm EST

Errors:
Error: Your data feed wasn't found at the specified location.

 

I have the googlebase.php file in www.server.com/shop/catalog and I have my .htaccess file in the same directory, and it looks like this:

 

# $Id: .htaccess,v 1.3 2003/06/12 10:53:20 hpdl Exp $
#
# This is used with Apache WebServers
#
# For this to work, you must include the parameter 'Options' to
# the AllowOverride configuration
#
# Example:
#
# <Directory "/usr/local/apache/htdocs">
#   AllowOverride Options
# </Directory>
#
# 'All' with also work. (This configuration is in the
# apache/conf/httpd.conf file)

# The following makes adjustments to the SSL protocol for Internet
# Explorer browsers

<IfModule mod_setenvif.c>
  <IfDefine SSL>
    SetEnvIf User-Agent ".*MSIE.*" \
             nokeepalive ssl-unclean-shutdown \
             downgrade-1.0 force-response-1.0
  </IfDefine>
</IfModule>

# If Search Engine Friendly URLs do not work, try enabling the
# following Apache configuration parameter
#
# AcceptPathInfo On

# Fix certain PHP values
# (commented out by default to prevent errors occuring on certain
# servers)
#
#<IfModule mod_php4.c>
#  php_value session.use_trans_sid 0
#  php_value register_globals 1
#</IfModule>

#############################
# Begin Google Base File Rewrite Code
RewriteRule ypg_outfile.txt googlebase.php
# End Google Base File Rewrite Code
#############################

 

What am I missing?

 

Thanks


Now I get:

 

Errors:
Error: We did not understand the header (first) row of your file. Please make sure that you are using valid attribute names.
Bad data: <body>
Line #: 2

 

 

Has anyone got the fix for this?

 

Thanks

 

TMM


Has anyone got the fix for this?

 

Thanks

 

TMM

 

Hi

 

I have upgraded to 1.2 but switched off all the additional settings, and the feed works fine. Can someone confirm whether the script works with $optional_sec = 1; and the options set?

 

Thanks

 

TMM


I downloaded the 1.2 version that was posted today. I created a schedule on google base, and it ran, but it failed. It says:

Outcome: Data feed upload failed. No items are live.
Number of items processed: 0
Number of inserted items: 0
Uploaded at: Feb 12, 2009 9:00 pm EST
Processed at: Feb 12, 2009 9:09 pm EST

Errors:
Error: Your data feed wasn't found at the specified location.

I have the googlebase.php file in www.server.com/shop/catalog and I have my .htaccess file in the same directory.

What am I missing?

Can someone tell me what I'm doing wrong?

 

Thanks!!


Hi there,

 

Some of the items in my catalogue are tax exempt, but there doesn't seem to be a way to deal with this, so they are listed with tax added in Google. Is there any way to handle this with this contribution?

 

Thanks

 

Dave


I downloaded the 1.2 version that was posted today. I created a schedule on google base, and it ran, but it failed. It says:

Outcome: Data feed upload failed. No items are live.
Number of items processed: 0
Number of inserted items: 0
Uploaded at: Feb 12, 2009 9:00 pm EST
Processed at: Feb 12, 2009 9:09 pm EST

Errors:
Error: Your data feed wasn't found at the specified location.

I have the googlebase.php file in www.server.com/shop/catalog and I have my .htaccess file in the same directory.

What am I missing?

Can anyone tell me what is wrong with this .htaccess file? If I try to go to www.site.com/shop/catalog/ypg_outfile.txt, I get a page-not-found error.

 

Thanks.


Can someone tell me what I'm doing wrong?

 

Thanks!!

Add the following to your .htaccess, just above the Google Base code that you added.

# Turn on the rewrite engine
Options +FollowSymLinks
RewriteEngine on

Regards

Jim


Has anyone got the fix for this?

 

Thanks

 

TMM

 

I'm having the same issue, but I'm using the latest version and turning off the options didn't work for me. Can you share all the settings that made it work? The auto-upload wasn't working, so I copied from the Firefox window into Notepad. Could it be an issue with doing this and the encoding?

 

It's using these attributes:

 

link title description expiration_date price image_link genre id weight


Don't copy the text from your browser window -- it messes up the formatting. Go to View > Page Source and copy the file from there. Don't try the automatic upload until the manual file is accepted without errors.

 

Once you have the manual file working, check that the URL to the text file is working correctly before submitting to Google.

 

Regards

Jim
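
A quick way to make that URL check is a throwaway PHP script like the sketch below. This is only an illustration: it assumes allow_url_fopen is enabled on your server, and the URL shown is a placeholder you must replace with your own feed location.

<?php
// Rough sketch: fetch the feed URL and print its first line so you can
// confirm the rewrite works and the header row holds attribute names
// rather than HTML. Replace the URL with your own feed location.
$feed = @file_get_contents('http://www.example.com/catalog/ypg_outfile.txt');
if ($feed === false) {
  echo "Could not fetch the feed -- check the RewriteRule and the URL.\n";
} else {
  $lines = explode("\n", $feed);
  echo 'Header row: ' . $lines[0] . "\n";
}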


Don't copy the text from your browser window -- it messes up the formatting. Go to View > Page Source and copy the file from there. Don't try the automatic upload until the manual file is accepted without errors.

 

Once you have the manual file working, check that the URL to the text file is working correctly before submitting to Google.

 

Regards

Jim

 

 

Thanks! That fixes that problem. Now my problem is that my store is in a subfolder and the product links don't have the subfolder in them, so they go nowhere. Where do I set the location of the catalog so that the links build correctly?


Try something like this: In googlebase.php, change Line 101 to:

  $imageURL = HTTP_SERVER.'/subfolder'.DIR_WS_IMAGES;

and Line 117:

   $productURL = HTTP_SERVER.'/subfolder/product_info.php/products_id/';

Substitute the name of your subfolder where I put subfolder. There may be another line that you need that I've forgotten, depending on your settings.

 

Regards

Jim


Try something like this: In googlebase.php, change Line 101 to:

  $imageURL = HTTP_SERVER.'/subfolder'.DIR_WS_IMAGES;

and Line 117:

   $productURL = HTTP_SERVER.'/subfolder/product_info.php/products_id/';

Substitute the name of your subfolder where I put subfolder. There may be another line that you need that I've forgotten, depending on your settings.

 

Regards

Jim

 

I tried that second one already and it didn't change anything. Maybe I messed something up...will give it another go...thanks


Try something like this: In googlebase.php, change Line 101 to:

  $imageURL = HTTP_SERVER.'/subfolder'.DIR_WS_IMAGES;

and Line 117:

   $productURL = HTTP_SERVER.'/subfolder/product_info.php/products_id/';

Substitute the name of your subfolder where I put subfolder. There may be another line that you need that I've forgotten, depending on your settings.

 

Regards

Jim

 

The images one works if you put another / after subfolder. Now to figure out why the other one doesn't seem to make any difference at all.
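
For anyone following along, here is roughly how those two lines end up looking with the extra slash that made the image links work. This is only a sketch based on the posts above; "subfolder" and the line numbers are placeholders that you must match to your own copy of googlebase.php.

// Around line 101 of googlebase.php -- note the extra / after the subfolder,
// which the post above found was needed for the image links.
$imageURL = HTTP_SERVER . '/subfolder/' . DIR_WS_IMAGES;

// Around line 117 -- Jim's line for the product links, unchanged.
$productURL = HTTP_SERVER . '/subfolder/product_info.php/products_id/';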


  • 3 weeks later...
Hi there,

 

Some of the items in my catalogue are tax exempt, but there doesn't seem to be a way to deal with this, so they are listed with tax added in Google. Is there any way to handle this with this contribution?

 

Thanks

 

Dave

 

I am facing the same problem as Dave. Dave, did you get an answer for this?


I am facing the same problem as Dave. Dave, did you get an answer for this?

 

 

Not sure if the contribution can support this, but here's how you can control tax/shipping for Google Product Search at the SKU level.

 

http://www.loveyourfeed.com/how-to-submit-...ata-feed-92.htm

 

You can obviously control these settings on a global account level from within your Google Base account.

Ryan Douglas

SingleFeed.com
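
If the feed script itself needs to be patched instead, one possible approach is sketched below. It assumes the standard osCommerce tax functions are loaded and that the script already has the product's tax class id from its products query; the variable names are illustrative, not taken from googlebase.php.

// Illustrative only: list tax-exempt products at their net price and add
// tax only for products that have a real tax class.
if ((int)$products_tax_class_id == 0) {
  $feed_price = $products_price;
} else {
  $feed_price = tep_add_tax($products_price, tep_get_tax_rate($products_tax_class_id));
}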


I have a question... I use this contrib or one like it... I am sure it's this one... I submit and I am getting errors due to some of the item descriptions containing quotation marks...

 

example...

 

Jim Beam "Traditional" mirror

 

Google calls it a bad line because of the quotation marks.

How can I strip the quotation marks from the generated feed?

 

Thanks in advance,

 

Lonny
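
One way to handle this is to clean each field before it is written out. A minimal sketch, assuming the feed is built from plain PHP strings; the helper name below is made up, not part of googlebase.php.

<?php
// Hypothetical helper: strip characters that Google Base rejects from a
// product field before it goes into the tab-delimited feed.
function clean_feed_field($value) {
  $value = str_replace(array('"', "\t", "\r", "\n"), ' ', $value); // drop quotes, tabs and line breaks
  return trim(preg_replace('/\s+/', ' ', $value));                 // collapse repeated whitespace
}

echo clean_feed_field('Jim Beam "Traditional" mirror'); // Jim Beam Traditional mirror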


Add the following to your .htaccess, just above the Google Base code that you added.

# Turn on the rewrite engine
Options +FollowSymLinks
RewriteEngine on

Regards

Jim

 

Hi Jim,

 

I appear to have the same issue as Jason: I am getting the same error message from Google and cannot navigate to the file via my browser. However, I do already have the rewrite code in my .htaccess file, as shown below. Any ideas where I've gone wrong? Cheers.

# $Id: .htaccess 1739 2007-12-20 00:52:16Z hpdl $
#
# This is used with Apache WebServers
#
# For this to work, you must include the parameter 'Options' to
# the AllowOverride configuration
#
# Example:
#
# <Directory "/usr/local/apache/htdocs">
#   AllowOverride Options
# </Directory>
#
# 'All' with also work. (This configuration is in the
# apache/conf/httpd.conf file)

# The following makes adjustments to the SSL protocol for Internet
# Explorer browsers

#<IfModule mod_setenvif.c>
#  <IfDefine SSL>
#	SetEnvIf User-Agent ".*MSIE.*" \
#			 nokeepalive ssl-unclean-shutdown \
#			 downgrade-1.0 force-response-1.0
#  </IfDefine>
#</IfModule>

# If Search Engine Friendly URLs do not work, try enabling the
# following Apache configuration parameter

# AcceptPathInfo On

# Fix certain PHP values
# (commented out by default to prevent errors occuring on certain
# servers)


# php_value session.use_trans_sid 0
# php_value register_globals 1
Options +FollowSymLinks
RewriteEngine On 
RewriteBase /
#############################
# Begin Google Base File Rewrite Code
RewriteRule your-outfile.txt googlebase.php
# End Google Base File Rewrite Code
#############################

RewriteRule ^(.*)-p-(.*).html$ product_info.php?products_id=$2&%{QUERY_STRING}
RewriteRule ^(.*)-c-(.*).html$ index.php?cPath=$2&%{QUERY_STRING}
RewriteRule ^(.*)-m-(.*).html$ index.php?manufacturers_id=$2&%{QUERY_STRING}

RewriteCond %{HTTP_HOST} !^www.togsuk.com$
RewriteRule ^(.*)$ http://www.togsuk.com/catalog/$1 [R=301]

SetEnvIfNoCase Request_URI IP_Trapped\.txt ban
<Files ~ "^.*$">
order allow,deny
allow from all
deny from env=ban
</Files>

# Deny domain access to spammers and other scumbags
RewriteEngine on

php_flag register_globals off

SetEnvIfNoCase User-Agent "^libwww-perl*" block_bad_bots 

Deny from env=block_bad_bots

# ban spam bots 
RewriteCond %{HTTP_USER_AGENT} almaden [OR]

RewriteCond %{HTTP_USER_AGENT} ^Anarchie [OR]

RewriteCond %{HTTP_USER_AGENT} ^ASPSeek [OR]

RewriteCond %{HTTP_USER_AGENT} ^attach [OR]

RewriteCond %{HTTP_USER_AGENT} ^autoemailspider [OR]

RewriteCond %{HTTP_USER_AGENT} ^BackWeb [OR]

RewriteCond %{HTTP_USER_AGENT} ^Bandit [OR]

RewriteCond %{HTTP_USER_AGENT} ^BatchFTP [OR]

RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]

RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:[email protected] [OR]

RewriteCond %{HTTP_USER_AGENT} ^Buddy [OR]

RewriteCond %{HTTP_USER_AGENT} ^bumblebee [OR]

RewriteCond %{HTTP_USER_AGENT} ^CherryPicker [OR]

RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]

RewriteCond %{HTTP_USER_AGENT} ^CICC [OR]

RewriteCond %{HTTP_USER_AGENT} ^Collector [OR]

RewriteCond %{HTTP_USER_AGENT} ^Copier [OR]

RewriteCond %{HTTP_USER_AGENT} ^Crescent [OR]

RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]

RewriteCond %{HTTP_USER_AGENT} ^DA [OR]

RewriteCond %{HTTP_USER_AGENT} ^DIIbot [OR]

RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]

RewriteCond %{HTTP_USER_AGENT} ^DISCo\ Pump [OR]

RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]

RewriteCond %{HTTP_USER_AGENT} ^Download\ Wonder [OR]

RewriteCond %{HTTP_USER_AGENT} ^Downloader [OR]

RewriteCond %{HTTP_USER_AGENT} ^Drip [OR]

RewriteCond %{HTTP_USER_AGENT} ^DSurf15a [OR]

RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]

RewriteCond %{HTTP_USER_AGENT} ^EasyDL/2.99 [OR]

RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]

RewriteCond %{HTTP_USER_AGENT} email [NC,OR]

RewriteCond %{HTTP_USER_AGENT} ^EmailCollector [OR]

RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]

RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]

RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]

RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]

RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]

RewriteCond %{HTTP_USER_AGENT} ^FileHound [OR]

RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]

RewriteCond %{HTTP_USER_AGENT} FrontPage [NC,OR]

RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]

RewriteCond %{HTTP_USER_AGENT} ^GetSmart [OR]

RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]

RewriteCond %{HTTP_USER_AGENT} ^gigabaz [OR]

RewriteCond %{HTTP_USER_AGENT} ^Go\!Zilla [OR]

RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]

RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]

RewriteCond %{HTTP_USER_AGENT} ^gotit [OR]

RewriteCond %{HTTP_USER_AGENT} ^Grabber [OR]

RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]

RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]

RewriteCond %{HTTP_USER_AGENT} ^grub-client [OR]

RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]

RewriteCond %{HTTP_USER_AGENT} ^HTTrack [OR]

RewriteCond %{HTTP_USER_AGENT} ^httpdown [OR]

RewriteCond %{HTTP_USER_AGENT} .*httrack.* [NC,OR]

RewriteCond %{HTTP_USER_AGENT} ^ia_archiver [OR]

RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]

RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]

RewriteCond %{HTTP_USER_AGENT} ^Indy*Library [OR]

RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]

RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]

RewriteCond %{HTTP_USER_AGENT} ^InternetLinkagent [OR]

RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]

RewriteCond %{HTTP_USER_AGENT} ^InternetSeer.com [OR]

RewriteCond %{HTTP_USER_AGENT} ^Iria [OR]

RewriteCond %{HTTP_USER_AGENT} ^JBH*agent [OR]

RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]

RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]

RewriteCond %{HTTP_USER_AGENT} ^JustView [OR]

RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]

RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]

RewriteCond %{HTTP_USER_AGENT} ^LexiBot [OR]

RewriteCond %{HTTP_USER_AGENT} ^lftp [OR]

RewriteCond %{HTTP_USER_AGENT} ^Link*Sleuth [OR]

RewriteCond %{HTTP_USER_AGENT} ^likse [OR]

RewriteCond %{HTTP_USER_AGENT} ^Link [OR]

RewriteCond %{HTTP_USER_AGENT} ^LinkWalker [OR]

RewriteCond %{HTTP_USER_AGENT} ^Mag-Net [OR]

RewriteCond %{HTTP_USER_AGENT} ^Magnet [OR]

RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]

RewriteCond %{HTTP_USER_AGENT} ^Memo [OR]

RewriteCond %{HTTP_USER_AGENT} ^Microsoft.URL [OR]

RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]

RewriteCond %{HTTP_USER_AGENT} ^Mirror [OR]

RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]

RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Indy [OR]

RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*NEWT [OR]

RewriteCond %{HTTP_USER_AGENT} ^Mozilla*MSIECrawler [OR]

RewriteCond %{HTTP_USER_AGENT} ^MS\ FrontPage* [OR]

RewriteCond %{HTTP_USER_AGENT} ^MSFrontPage [OR]

RewriteCond %{HTTP_USER_AGENT} ^MSIECrawler [OR]

RewriteCond %{HTTP_USER_AGENT} ^MSProxy [OR]

RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]

RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]

RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]

RewriteCond %{HTTP_USER_AGENT} ^NetMechanic [OR]

RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]

RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]

RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]

RewriteCond %{HTTP_USER_AGENT} ^NICErsPRO [OR]

RewriteCond %{HTTP_USER_AGENT} ^Ninja [OR]

RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]

RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]

RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]

RewriteCond %{HTTP_USER_AGENT} ^Openfind [OR]

RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]

RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]

RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]

RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]

RewriteCond %{HTTP_USER_AGENT} ^Ping [OR]

RewriteCond %{HTTP_USER_AGENT} ^PingALink [OR]

RewriteCond %{HTTP_USER_AGENT} ^Pockey [OR]

RewriteCond %{HTTP_USER_AGENT} ^psbot [OR]

RewriteCond %{HTTP_USER_AGENT} ^Pump [OR]

RewriteCond %{HTTP_USER_AGENT} ^QRVA [OR]

RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]

RewriteCond %{HTTP_USER_AGENT} ^Reaper [OR]

RewriteCond %{HTTP_USER_AGENT} ^Recorder [OR]

RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]

RewriteCond %{HTTP_USER_AGENT} ^Scooter [OR]

RewriteCond %{HTTP_USER_AGENT} ^Seeker [OR]

RewriteCond %{HTTP_USER_AGENT} ^Siphon [OR]

RewriteCond %{HTTP_USER_AGENT} ^sitecheck.internetseer.com [OR]

RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]

RewriteCond %{HTTP_USER_AGENT} ^SlySearch [OR]

RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]

RewriteCond %{HTTP_USER_AGENT} ^Snake [OR]

RewriteCond %{HTTP_USER_AGENT} ^SpaceBison [OR]

RewriteCond %{HTTP_USER_AGENT} ^sproose [OR]

RewriteCond %{HTTP_USER_AGENT} ^Stripper [OR]

RewriteCond %{HTTP_USER_AGENT} ^Sucker [OR]

RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]

RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]

RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]

RewriteCond %{HTTP_USER_AGENT} ^Szukacz [OR]

RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]

RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]

RewriteCond %{HTTP_USER_AGENT} ^URLSpiderPro [OR]

RewriteCond %{HTTP_USER_AGENT} ^Vacuum [OR]

RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]

RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]

RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]

RewriteCond %{HTTP_USER_AGENT} ^[Ww]eb[Bb]andit [OR]

RewriteCond %{HTTP_USER_AGENT} ^webcollage [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]

RewriteCond %{HTTP_USER_AGENT} ^Web\ Downloader [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebEMailExtrac.* [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebHook [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebMiner [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebMirror [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]

RewriteCond %{HTTP_USER_AGENT} ^Website [OR]

RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]

RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]

RewriteCond %{HTTP_USER_AGENT} ^Webster [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]

RewriteCond %{HTTP_USER_AGENT} WebWhacker [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]

RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]

RewriteCond %{HTTP_USER_AGENT} ^Whacker [OR]

RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]

RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]

RewriteCond %{HTTP_USER_AGENT} ^x-Tractor [OR]

RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]

RewriteCond %{HTTP_USER_AGENT} ^Xenu [OR]

RewriteCond %{HTTP_USER_AGENT} ^Zeus.*Webster [OR]

RewriteCond %{HTTP_USER_AGENT} ^Zeus

RewriteRule ^.* - [F,L]

RewriteCond %{HTTP_REFERER} ^http://www.togsuk.com$

RewriteRule !^http://[^/.]\.togsuk.com.* - [F,L]

# anti xss script 1 - pci compliance - by pixclinic
Options +FollowSymLinks
RewriteEngine On 
RewriteCond %{QUERY_STRING} base64_encode.*\(.*\) [OR]
RewriteCond %{QUERY_STRING} (\<|%3C).*script.*(\>|%3E) [NC,OR]
RewriteCond %{QUERY_STRING} (\<|%3C).*iframe.*(\>|%3E) [NC,OR]
RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
RewriteRule ^(.*)$ index_error.php [F,L]
RewriteCond %{REQUEST_METHOD} ^(TRACE|TRACK)
RewriteRule .* - [F]

# extra anti uri and xss attack script 2 - sql injection prevention
Options +FollowSymLinks
RewriteEngine On
RewriteCond %{QUERY_STRING} ("|%22).*(>|%3E|<|%3C).* [NC]
RewriteRule ^(.*)$ log.php [NC]
RewriteCond %{QUERY_STRING} (<|%3C).*script.*(>|%3E) [NC]
RewriteRule ^(.*)$ log.php [NC]
RewriteCond %{QUERY_STRING} (java script:).*(;).* [NC]
RewriteRule ^(.*)$ log.php [NC]
RewriteCond %{QUERY_STRING} (;|'|"|%22).*(union|select|insert|drop|update|md5|benchmark|or|and|if).* [NC]
RewriteRule ^(.*)$ log.php [NC]
RewriteRule (,|;|<|>|'|`) /log.php [NC]


Dear Jim,

 

I also had the same problem as Jason, so I used the code you provided, and now I have a new problem.

I get the following:

 

Error: We were unable to connect to the host holding your data feed.

 

What am I doing wrong?

 

Thanks for your help.


Dan:

You need to change your-outfile.txt to the name of the file that you want to give to Google.

 

Larry:

Google cannot find your web site. Check for spelling errors in your URL in your Google Feed admin.

 

Regards

Jim


Dan:

You need to change your-outfile.txt to the name of the file that you want to give to Google.

 

Larry:

Google cannot find your web site. Check for spelling errors in your URL in your Google Feed admin.

 

Regards

Jim

 

Jim,

Thanks, but it still does not work. I checked and triple-checked, and it still gives me the same error. Could it be something with file permissions? I am so frustrated by now and have no idea what to do next.

Your help is greatly appreciated.


Jim,

Thanks, but it still does not work. I checked and triple-checked, and it still gives me the same error. Could it be something with file permissions? I am so frustrated by now and have no idea what to do next.

Your help is greatly appreciated.

 

I just commented out the file name in my .htaccess file, and now I am getting:

Error: Your data feed wasn't found at the specified location.

I did not change anything in my Google admin, so I am very confused now, because it seems that it can connect to my host, but not when I provide the file.

Thanks for your help.

