
Ban IP Address & Security Issue


sunshynecraftsbeads


Hi,

 

I was wondering if anyone could tell me how to do the following:

 

1. How do I ban a person from my store? (There is one person who is on my store every day for hours on end, and I am not sure what they are doing.)

 

2. Is there a program or way to find out if your store is being hacked or has been hacked? Someone emailed me from my store telling me that I had issues with my site but would not share any further information.

 

 

Any help, comments or suggestions would be greatly appreciated.

 

Thank you in advance,

Tracie


If you have protected your site's admin directory then this person could well be bluffing you. Have you tried osC_Sec?
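For anyone wondering how to protect the admin directory: the usual approach is HTTP authentication via a .htaccess file inside /admin. A minimal sketch, assuming an Apache server and that you have already created a .htpasswd password file (the path shown is only an example):

AuthType Basic
AuthName "Restricted Admin Area"
# adjust the path to wherever your .htpasswd file actually lives
AuthUserFile /home/youraccount/.htpasswd
Require valid-user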

- Stop Oscommerce hacks dead in their tracks with osC_Sec (see discussion here)
- Another discussion about infected files ::here::
- A discussion on file permissions ::here::
- Site hacked? Should you upgrade or not, some thoughts ::here::
- Fix the admin login bypass exploit here
- Pareto Security: New security addon I am developing, a remake of osC_Sec in PHP 5 with a number of fixes
- BTC:1LHiMXedmtyq4wcYLedk9i9gkk8A8Hk7qX


Tracie,

 

You can add the following to the .htaccess file in your store's root:

<Limit GET PUT POST>
# replace 00.00.00.0 with the IP address you want to ban
order allow,deny
deny from 00.00.00.0
allow from all
</Limit>
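For reference only: on newer servers running Apache 2.4 or later, the old order/allow/deny syntax was replaced by Require directives. A rough equivalent of the block above (assuming mod_authz_core is loaded; the IP is still just a placeholder) would be:

<RequireAll>
Require all granted
Require not ip 00.00.00.0
</RequireAll>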

 

 

 

Chris


Jody,

 

 

The root .htaccess file is located in the store's root. So, if you have your store installed in the /catalog directory, you would find it there.

 

 

NOTE: .htaccess files only work on Apache (typically Linux) servers; Windows IIS does not use the .htaccess file. If you have a Linux server and do not see the .htaccess file when viewing the server with an FTP client, try using your hosting provider's file manager to see it. Hosting accounts often have the .htaccess file 'hidden' from view because filenames beginning with a dot are treated as hidden files.

 

 

 

Chris


Like I told you in your other post, if you can't use the stupid file_manager you have to use FTP or cPanel.


If I suggest you edit any file(s) make a backup first - I'm not perfect and neither are you.

 

"Given enough impetus a parallelogramatically shaped projectile can egress a circular orifice."

- Me -

 

"Headers already sent" - The definitive help

 

"Cannot redeclare ..." - How to find/fix it

 

SSL Implementation Help

 



  • 1 month later...

Thank you for the excellent topic/advice.

I have a guest who is on my website for days on end without logging out.

 

I don't know what he is doing; it seems that he is just browsing around, but for security reasons I have tried to block his IP.

 

When I add the code to my .htaccess file, I get an endless 500 error screen on the frontend and when logging in to the admin area.

 

If I remove the code my site is back to normal.

 

Any advice please will be HIGHLY apreciated!

 

Here is my .htaccess file for your info, the code is at the bottom:

 

# Ultimate SEO URLs BEGIN

Options +FollowSymLinks

RewriteEngine On

RewriteBase /catalog/

 

RewriteCond %{QUERY_STRING} ^options\=(.*)$

RewriteRule ^(.*)-p-(.*).html$ product_info.php?products_id=$2%1

RewriteRule ^(.*)-p-(.*).html$ product_info.php?products_id=$2&%{QUERY_STRING}

RewriteRule ^(.*)-c-(.*).html$ index.php?cPath=$2&%{QUERY_STRING}

RewriteRule ^(.*)-m-(.*).html$ index.php?manufacturers_id=$2&%{QUERY_STRING}

RewriteRule ^(.*)-pi-(.*).html$ popup_image.php?pID=$2&%{QUERY_STRING}

RewriteRule ^(.*)-t-(.*).html$ articles.php?tPath=$2&%{QUERY_STRING}

RewriteRule ^(.*)-au-(.*).html$ articles.php?authors_id=$2&%{QUERY_STRING}

RewriteRule ^(.*)-a-(.*).html$ article_info.php?articles_id=$2&%{QUERY_STRING}

RewriteRule ^(.*)-pr-(.*).html$ product_reviews.php?products_id=$2&%{QUERY_STRING}

RewriteRule ^(.*)-pri-(.*).html$ product_reviews_info.php?products_id=$2&%{QUERY_STRING}

RewriteRule ^(.*)-i-(.*).html$ information.php?info_id=$2&%{QUERY_STRING}

# BOF: "Extra pages-info box w/ admin" support added by faaliyet

RewriteRule ^(.*)-pm-([0-9]+).html$ info_pages.php?pages_id=$2&%{QUERY_STRING}

# EOF: "Extra pages-info box w/ admin" support added by faaliyet

RewriteRule ^(.*)-links-(.*).html$ links.php?lPath=$2&%{QUERY_STRING}

# Added polls and newsdesk

#RewriteRule ^(.*)-po-([0-9]+).html$ pollbooth.php?pollid=$2&%{QUERY_STRING}

RewriteRule ^(.*)-n-(.*).html$ newsdesk_info.php?newsdesk_id=$2&%{QUERY_STRING}

RewriteRule ^(.*)-nc-(.*).html$ newsdesk_index.php?newsPath=$2&%{QUERY_STRING}

RewriteRule ^(.*)-nri-(.*).html$ newsdesk_reviews_info.php?newsdesk_id=$2&%{QUERY_STRING}

RewriteRule ^(.*)-nra-(.*).html$ newsdesk_reviews_article.php?newsdesk_id=$2&%{QUERY_STRING}

# BOF: Faqdesk support added by faaliyet

RewriteRule ^(.*)-f-(.*).html$ faqdesk_info.php?faqdesk_id=$2&%{QUERY_STRING}

RewriteRule ^(.*)-fc-(.*).html$ faqdesk_index.php?faqPath=$2&%{QUERY_STRING}

RewriteRule ^(.*)-fri-(.*).html$ faqdesk_reviews_info.php?faqdesk_id=$2&%{QUERY_STRING}

RewriteRule ^(.*)-fra-(.*).html$ faqdesk_reviews_article.php?faqdesk_id=$2&%{QUERY_STRING}

# EOF: Faqdesk support added by faaliyet

# Ultimate SEO URLs END

 

# Block Bad Bots

RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]

RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:[email protected] [OR]

RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]

RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]

RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]

RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]

RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]

RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]

RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]

RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]

RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]

RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]

RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]

RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]

RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]

RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]

RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]

RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]

RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]

RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]

RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]

RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]

RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]

RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]

RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]

RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]

RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]

RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]

RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]

RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]

RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]

RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]

RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]

RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]

RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]

RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]

RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]

RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]

RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]

RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]

RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]

RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]

RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]

RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]

RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]

RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]

RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]

RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]

RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]

RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]

RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]

RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]

RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]

RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]

RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]

RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]

RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]

RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]

RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]

RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]

RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]

RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]

RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]

RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]

RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]

RewriteCond %{HTTP_USER_AGENT} ^Zeus

RewriteRule .* - [F]

 

<Limit GET PUT POST>

 

order allow,deny

deny from 180.76.5.194 // the ip address you want to ban goes here

allow from all

</Limit>


Sorry about the above post, I've got it sorted.

 

It should only be:

 

<Limit GET PUT POST>

 

order allow,deny

deny from 180.76.5.194

allow from all

</Limit>

 

Added it, and it now works without errors.

 

Thanks a lot for this topic again!!


You can ban entire ranges of IP addresses but you risk denying legitimate visitors to your site as well.
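(For illustration only: Apache's deny from also accepts a network in CIDR notation, so a whole range can be banned in one line, using the same Apache 2.2 style syntax seen earlier in this thread. The /24 network below is just an example, and the mask must be a valid prefix length between 0 and 32.)

order allow,deny
deny from 180.76.5.0/24
allow from all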

 

A couple of things about manually banning IP addresses. Firstly, I see you found out just how fruitless that pursuit is: there are plainly endless IP addresses malicious visitors can use, and at some point you have to go out or sleep, and during that time an attacker could 'do bad things'...

 

Secondly, it is possible on some webservers to bypass the code above.

 

<Limit GET PUT POST>

order allow,deny
deny from 180.76.5.194
allow from all
</Limit>

 

Some versions of Apache do silly things when an attacker sends a malformed request that is not a recognized request type; the commonly known types being GET, POST, PUT, HEAD, etc.

 

So if you are going to use that method of banning an IP address, first try it without the Limit directive:

 

order allow,deny
deny from 180.76.5.194
allow from all

Next, the IP you are trying to block belongs to the good old Baidu spider, and there are a number of addons and scripts posted in the security section that deal with this particular spider.

 

Lastly, again, have you tried osC_Sec? It will not ban an arbitrary IP that you simply want banned, but it will catch hack attempts and write those bans to your .htaccess file automatically. That is what you want to happen: when a prohibited request is made to your site, this addon will, at the most basic setting, block the action, and if you set it to ban IPs, it will do that too. osC_Sec also deals with the Baidu spider by blocking all requests from it.



2. Is there a program or way to find out if your store is being hacked or has been hacked? Someone emailed me from my store telling me that I had issues with my site but would not share any further information.

 

You could install the osCommerce Virus & Threat Scanner and Site Monitor add-ons. They spot known bad code.
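As a very rough illustration of the idea behind such scanners (this is not the actual add-on code, just a sketch), a small PHP script can walk the catalog directory and flag files containing patterns commonly found in injected code:

<?php
// Sketch only - not the real scanner. Flags .php files under ./catalog
// that contain strings often seen in injected/obfuscated malware.
$suspicious = array('eval(base64_decode(', 'eval(gzinflate(', 'FilesMan');
$files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator('./catalog'));
foreach ($files as $file) {
  if (substr($file->getFilename(), -4) !== '.php') continue;
  $code = file_get_contents($file->getPathname());
  foreach ($suspicious as $pattern) {
    if (strpos($code, $pattern) !== false) {
      echo 'Suspicious pattern "' . $pattern . '" in ' . $file->getPathname() . "\n";
    }
  }
}
?>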

 

HTH

 

G

Need help installing add ons/contributions, cleaning a hacked site or a bespoke development, check my profile

 

Virus Threat Scanner

My Contributions

Basic install answers.

Click here for Contributions / Add Ons.

UK your site.

Site Move.

Basic design info.

 

For links mentioned in old answers that are no longer here follow this link Useful Threads.

 



No need to go through all this. The Baidu Spiders will obey your robots.txt file. See this:

http://www.baidu.com...er_english.html

 

 

Someone forgot to inform the Baidu spider that it is supposed to follow the rules. I have banned the spider from all of my clients' sites as well because it was indexing admin files. BAD spider!

 

 

 

Chris


I have never had a problem denying Baidu on any of my sites. You may have something incorrect in robots.txt. Try reading http://www.baidu.com/search/spider_english.html for correct information. If you still have a problem, try contacting them; they are very easy and cooperative to work with.

Also, after you have banned the IP, the spider is not going to be capable of checking the robots.txt file, so you will continue to see attempts in your logs. Remember: there is usually more than one way to skin a cat.
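For reference, the standard robots.txt entry to keep Baidu out entirely would just be the following (a sketch; robots.txt sits in the site root, and it only helps if the spider actually honours it):

User-agent: Baiduspider
Disallow: /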


Most of the complaints I have seen about Baidu are associated with server load: webservers lag excessively while a single site on the server is being trawled by the Baidu servers, because the Baidu crawlers load almost every object on the site, including images, in a repeated fashion, using up large amounts of data.

 

There are literally hundreds of thousands of people complaining about this, even after they have added the code to their robots.txt. Some reports indicate that the crawler causes the site to become unreachable, creating lag times of up to 30 seconds while being crawled. That is close enough to a denial of service attack, had it been intentional.

 

The data use and connection rate issues, and the ignoring of robots.txt, have no doubt been raised with Baidu; as you can see, it is the first topic discussed in the FAQs at that link (FAQ 3), and some of the other FAQs are obviously responses to people bringing these issues up with them. So I doubt any of their misconfigured settings are intentional; more likely they are just bug-ridden.

 

If Baidu or any other crawler is ignoring robots.txt and making too many connections at too high a frequency, then the best way to get your website's speed back up is to ban the crawler based on its User-Agent, which (in the case of the Baidu spider) will always contain the word Baidu.
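A minimal sketch of that User-Agent ban in .htaccess, using the same mod_rewrite style as the bad-bot block posted earlier in this thread (the [NC] flag makes the match case-insensitive):

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Baidu [NC]
RewriteRule .* - [F]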

 

If your shared hosting account, or home-hosted server, has a limited bandwidth allocation and a crawling server is using an excessive amount of bandwidth in comparison to other crawlers, then that may also be grounds to use .htaccess to ban the connection.

 

.htaccess banning will reduce the data usage to its very minimum.

 

If you are using a dedicated hosting server then it should be able to handle heavy crawling, and you may not notice any difference at all in your site's performance.



It might be wise to check the IP of the Baidu bot. If it is ignoring the robots.txt file then the bot is probably an impostor.

 

"The hostname of Baiduspider is *.baidu.com or *.baidu.jp. Others are fake hostnames."

 

Just a thought.


That could well be the case, since identifying the harvester is based on the HTTP User-Agent, which can be spoofed with ease.

 

See: http://www.useragentstring.com/Baiduspider_id_248.php

 

I doubt many in the 'west' would bother resolving IPs when it comes to harvester issues, though, unless it was a harvester that had a real impact on their site visitors, like Googlebot for example.

 

Even if someone constructed a script to resolve the IP address whenever the word Baidu was found in the User-Agent, that could result in a denial of service under heavy traffic because of the resources needed to resolve IP addresses. Things get even more resource-intensive if the get_browser() function is invoked in conjunction with IP resolution to determine the crawler name as well as resolving the IP address.
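If someone did want to verify a claimed Baiduspider, a rough sketch of the reverse-then-forward DNS check might look like the following (the function name is mine, and in practice the result would need to be cached to avoid exactly the resource problem described above):

<?php
// Sketch only: verify a visitor claiming to be Baiduspider via DNS.
function is_genuine_baiduspider($ip) {
  $host = gethostbyaddr($ip);                           // reverse lookup
  if ($host === false || $host === $ip) return false;   // no usable PTR record
  // Baidu documents its crawler hostnames as *.baidu.com or *.baidu.jp
  if (!preg_match('/\.baidu\.(com|jp)$/i', $host)) return false;
  // forward-confirm: the hostname must resolve back to the same IP
  return gethostbyname($host) === $ip;
}
?>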



Right guys, I have the same bloody bot on mine for 4 days now.

 

I've just tried to put the

 

<Limit GET PUT POST>
order allow,deny
deny from 180.76.5.0/999
allow from all
</Limit>

code into my .htaccess, and each time I upload it I get a 500 internal error on the site, both admin and storefront. Remove it and it's fine again.

 

Now I've tried it without the Limit statement, and with just one IP on its own, and both do the same.

 

Any ideas?


  • 1 month later...

Archived

This topic is now archived and is closed to further replies.
