
Phishing attack


nttcar


Hi,

I got banned by my hosting server and I really cannot figure out why I am still getting this. I have made sure my folders are set to 755 and all files to 644, and the two configure.php files are at 444. Can anyone help me resolve this issue? I think I am on osCommerce 2.2 (the 2003 release).

 

Thanks for your help.

 

 

My logs show

 

66.249.71.165 - - [01/Sep/2011:04:14:12 -0500] "GET /product_info.php?currency=TWD&products_id=310&oscsid=25pntj1s177bqa9umo5ne716k4&language=cn&osCsid=6rkbb00v4fofgar0s5vc5tr5r0 HTTP/1.1" 200 32357 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

220.181.94.233 - - [01/Sep/2011:04:14:13 -0500] "GET /create_account.php?language=tw&osCsid=i859td1rbt3ihkrub5q3t196n0 HTTP/1.1" 200 62265 "-" "Sogou web spider/4.0(+http://www.sogou.com/docs/help/webmasters.htm#07)"

220.181.94.233 - - [01/Sep/2011:04:14:24 -0500] "GET /conditions.php?osCsid=taf7i30m3n2mmlfej9fp0m2qj3 HTTP/1.1" 200 33043 "-" "Sogou web spider/4.0(+http://www.sogou.com/docs/help/webmasters.htm#07)"

124.115.0.25 - - [01/Sep/2011:04:14:52 -0500] "GET /product_reviews.php?action=buy_now&language=cn&osCsid=7cdac7049335dc04372a557386f24aa5&products_id=148 HTTP/1.1" 302 425 "-" "Sosospider+(+http://help.soso.com/webspider.htm)"

124.115.0.25 - - [01/Sep/2011:04:14:53 -0500] "GET /shopping_cart.php?language=cn&osCsid=7cdac7049335dc04372a557386f24aa5 HTTP/1.1" 200 36732 "-" "Sosospider+(+http://help.soso.com/webspider.htm)"

220.181.94.233 - - [01/Sep/2011:04:14:56 -0500] "GET /question_answer.php?language=cn&osCsid=4cu238fj6afpuajvn0piauoac1 HTTP/1.1" 200 39786 "-" "Sogou web spider/4.0(+http://www.sogou.com/docs/help/webmasters.htm#07)"


Call me blind and stupid, but someone please point out the "phish" in the log.

 

I'm not seeing it.



Here is the email from rsa.com

It appears that your website http://usaasiatrade.com has been hacked by a fraudster. It is now hosting a phishing attack against Rabobank.

Please remove the fraudulent folders/files as soon as possible and secure your website, as it has been compromised.

Please note that it is possible that the fraudulent content is embedded in your website's legitimate files.

http://www.usaasiatrade.com/ext/rab.php

 

I did delete the /ext folder, but how can I stop that crawling? And now that I have moved to another host, how can I stop them from creating the file again?

Thanks a lot!




Follow these steps to clean and secure your website:

 

1) Lock down your site by using an .htaccess password so your customers are not attacked by the hacker's code (a minimal example follows this list).

 

2) FTP all of the files to your local machine and use a program like WinGrep to identify and remove all malicious and anomalous files containing hacker code. Look for keywords such as 'base64', 'eval', 'decode'.

 

3) Delete the files on your hosting account before uploading the clean files.

 

4) FTP the clean files back to your hosting account, then read and implement the security patches and contributions found in these two threads: Admin Security and Website Security.

 

5) Change all of your passwords: FTP, CPANEL, STORE ADMIN and DATABASE

 

6) Make sure file and directory permissions are set correctly: directories no higher than 755, files no higher than 644, and the TWO configure.php files no higher than 444.

 

7) If your site has been 'blacklisted' as an attack site by Google, log into Google Webmaster Tools and submit the site to be re-indexed and verified, so that it is removed from the 'blacklist'.

 

8) Remove the .htaccess password protection so your customers can resume making purchases from your website.

 

9) Monitor your website using the newly installed contributions to prevent future hacker attacks.

 

10) If you feel you cannot perform any of the above steps, you should seek professional help to ensure all malware is removed.
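
As a minimal sketch of the step 1 lockdown, assuming your host permits Basic Auth overrides in .htaccess (the AuthUserFile path is illustrative and must point at a real .htpasswd file, ideally kept outside the web root):

AuthType Basic
AuthName "Store temporarily closed for maintenance"
# Illustrative path -- create the .htpasswd with your host's control
# panel or the htpasswd utility, outside the document root.
AuthUserFile /home/youraccount/.htpasswd
Require valid-user

Remove these lines again at step 8 so your customers can get back in.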

 

 

Chris


Thank you so much, I have followed your suggestions. I also added .htaccess and robots.txt rules as below, but I am still seeing the bots constantly hitting my website; it seems that either the way I defined the .htaccess rules or the robots.txt does not work. Can someone give me some ideas?

 

robots.txt

User-agent: Baiduspider
User-agent: Baiduspider-video
User-agent: Baiduspider-image
User-agent: sogou spider
User-agent: Sogou web spider
Disallow: /

 

.htaccess

RewriteEngine on
Options +FollowSymlinks
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} ^Baiduspider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Sogou
RewriteRule ^.*$ - [F]

 


220.181.94.233 - - [18/Sep/2011:09:35:18 -0500] "GET /checkout_shipping.php?osCsid=pg90rlql1ipag4dd1ahru1djn6 HTTP/1.1" 403 489 "-" "Sogou web spider/4.0(+http://www.sogou.com/docs/help/webmasters.htm#07)"

220.181.108.177 - - [18/Sep/2011:09:35:18 -0500] "GET /index.php?cPath=42&osCsid=h0cm3egiq1p2trt6v8r2imbvr5 HTTP/1.1" 200 39221 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"

220.181.108.165 - - [18/Sep/2011:09:35:24 -0500] "GET /index.php?cPath=42&osCsid=gd9e4mrhne4ogg018p15i6nno7 HTTP/1.1" 200 39330 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"

202.160.178.59 - - [18/Sep/2011:09:35:26 -0500] "GET /index.php?cPath=86 HTTP/1.0" 200 37583 "-" "Mozilla/5.0 (compatible; Yahoo! Slurp China; http://misc.yahoo.com.cn/help.html)"

220.181.94.233 - - [18/Sep/2011:09:35:29 -0500] "GET /privacy.php?language=tw&osCsid=3f3732sdd7mvods8joctehhfo4 HTTP/1.1" 403 479 "-" "Sogou web spider/4.0(+http://www.sogou.com/docs/help/webmasters.htm#07)"

220.181.108.156 - - [18/Sep/2011:09:35:29 -0500] "GET /index.php?cPath=42&osCsid=fohn1rqdesdednutl315s67n90 HTTP/1.1" 200 39375 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"

220.181.108.153 - - [18/Sep/2011:09:35:35 -0500] "GET /index.php?cPath=42&osCsid=487tjq4jdk7uvteqf6o3une3p2 HTTP/1.1" 200 39205 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"

220.181.94.233 - - [18/Sep/2011:09:35:39 -0500] "GET /shipping.php?osCsid=pv28n7h8q1pm6amfs3rhqn3236 HTTP/1.1" 403 480 "-" "Sogou web spider/4.0(+http://www.sogou.com/docs/help/webmasters.htm#07)"

220.181.108.111 - - [18/Sep/2011:09:35:41 -0500] "GET /index.php?cPath=42&osCsid=3q1mvg10ese22l3phfbulmeoa4 HTTP/1.1" 200 39199 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"

220.181.108.168 - - [18/Sep/2011:09:35:46 -0500] "GET /index.php?cPath=42&osCsid=31nrqu7c2qp7fi56oj11bjb1c1 HTTP/1.1" 200 39195 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"


That bot is a known rogue that ignores the robots.txt file.

 

You have to block it with .htaccess.

 

Read this
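
A minimal sketch of such a block, for anyone following along. Note from the log excerpts above that Baidu identifies itself as "Mozilla/5.0 (compatible; Baiduspider/2.0; ...)", so an anchored pattern like ^Baiduspider never matches it; matching the bot name anywhere in the User-Agent string does:

RewriteEngine on
# Match the bot name anywhere in the User-Agent header -- Baidu's UA
# begins with "Mozilla/5.0", so an anchored ^Baiduspider never fires.
RewriteCond %{HTTP_USER_AGENT} Baiduspider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Sogou [NC]
RewriteRule .* - [F]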



My suggestion: get rid of the rewrite stuff.

 

Also, you have some old information. The Baidu spider does obey the robots.txt file now.

 

http://www.baidu.com...er_english.html

 

 

Try this in robots.txt and give it 48 to 72 hours to work.

 

 

User-agent: Baiduspider
Disallow: /

User-agent: Baiduspider-image
Disallow: /

User-agent: Baiduspider-video
Disallow: /

User-agent: Baiduspider-news
Disallow: /

User-agent: Baiduspider-favo
Disallow: /

User-agent: Baiduspider-cpro
Disallow: /

User-agent: Baiduspider-ads
Disallow: /

User-agent: Baidu
Disallow: /

 

You have to allow Baidu for it to read robots.txt.


"Old information:" aside, .htaccess works right away and you don't have to worry if they follow the robots.txt rules or not.



Many administrators don't allow .htaccess files at all, as in "AllowOverride None", and there is good reason for using "None": when "AllowOverride" is anything other than "None", every single HTTP request must check the directory tree for .htaccess files, which increases the server load.

 

In addition, when a .htaccess file is used, every denied Baidu request causes two server log entries, one in the access log and another in the error log, which bloats the log files excessively. When the Baidu spider starts hitting the server multiple times a second, the server load can become intolerable. The log file problem is what I experienced when denying the Baidu spider in the server's virtual host configuration file.
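
For reference, a sketch of what that server-config-level deny can look like in Apache 2.2-era syntax; the directory path is illustrative, and this belongs in the main server or virtual host config, not in .htaccess:

<Directory "/var/www/yourstore">
    # Tag requests whose User-Agent contains the unwanted bot names...
    SetEnvIfNoCase User-Agent "Baiduspider" bad_bot
    SetEnvIfNoCase User-Agent "Sogou" bad_bot
    # ...then refuse them with the Apache 2.2 access-control directives.
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Directory>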

 

Thus my suggestion to at least try the robots.txt file, because even if you deny Baidu with .htaccess, it doesn't solve the problem, it just covers it up. In fact, as far as server load is concerned, you are probably better off allowing it to traverse the site.

 

There are always multiple ways to accomplish a goal. Some methods are practical, others are not; some may work for you and some may not be possible, and there are advantages and disadvantages to most alternatives. It's nice to have all the information and actually understand the various choices, but in the end, do what you want based on your capabilities, knowledge and requirements.

 

From Apache.org:

http://httpd.apache....o/htaccess.html

"You should avoid using .htaccess files completely if you have access to httpd main server config file. Using .htaccess files slows down your Apache server. Any directive that you can include in a .htaccess file is better set in a Directory block, as it will have the same effect with better performance."

 

More from Apache.org:

 

"When (not) to use .htaccess files

 

In general, you should never use .htaccess files unless you don't have access to the main server configuration file. There is, for example, a prevailing misconception that user authentication should always be done in .htaccess files. This is simply not the case. You can put user authentication configurations in the main server configuration, and this is, in fact, the preferred way to do things.

 

.htaccess files should be used in a case where the content providers need to make configuration changes to the server on a per-directory basis, but do not have root access on the server system. In the event that the server administrator is not willing to make frequent configuration changes, it might be desirable to permit individual users to make these changes in .htaccess files for themselves. This is particularly true, for example, in cases where ISPs are hosting multiple user sites on a single machine, and want their users to be able to alter their configuration.

 

However, in general, use of .htaccess files should be avoided when possible. Any configuration that you would consider putting in a .htaccess file, can just as effectively be made in a <Directory> section in your main server configuration file.

 

There are two main reasons to avoid the use of .htaccess files.

 

The first of these is performance. When AllowOverride is set to allow the use of .htaccess files, Apache will look in every directory for .htaccess files. Thus, permitting .htaccess files causes a performance hit, whether or not you actually even use them! Also, the .htaccess file is loaded every time a document is requested.

 

Further note that Apache must look for .htaccess files in all higher-level directories, in order to have a full complement of directives that it must apply. (See section on how directives are applied.) Thus, if a file is requested out of a directory /www/htdocs/example, Apache must look for the following files:

 

/.htaccess

/www/.htaccess

/www/htdocs/.htaccess

/www/htdocs/example/.htaccess

 

And so, for each file access out of that directory, there are 4 additional file-system accesses, even if none of those files are present. (Note that this would only be the case if .htaccess files were enabled for /, which is not usually the case.)

 

The second consideration is one of security. You are permitting users to modify server configuration, which may result in changes over which you have no control. Carefully consider whether you want to give your users this privilege. Note also that giving users less privileges than they need will lead to additional technical support requests. Make sure you clearly tell your users what level of privileges you have given them. Specifying exactly what you have set AllowOverride to, and pointing them to the relevant documentation, will save yourself a lot of confusion later."


Now that's what makes this a great forum.

 

People sharing what they've learned and "dug up" is what makes us all better.

 

If it was me, I'd probably add a code snippet to application_top.php to detect the bot (since they have a fixed set of IP addresses they use) and redirect them to the asteroid belt between Mars and Jupiter...
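
A rough sketch of that idea in PHP, placed near the top of application_top.php; the address prefixes below are taken from the log excerpts in this thread, not a verified list of the bots' current ranges, and the redirect target is a placeholder:

// Sketch only: prefixes taken from the logs above -- verify against
// the bots' currently published address ranges before relying on this.
$bot_prefixes = array('220.181.94.', '220.181.108.', '124.115.0.');
$remote_addr = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '';
foreach ($bot_prefixes as $prefix) {
    if (strpos($remote_addr, $prefix) === 0) {
        // Off to the asteroid belt (well, a placeholder URL) they go.
        header('Location: http://www.example.com/');
        exit;
    }
}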



In general, you should never use .htaccess files unless you don't have access to the main server configuration file

 

Most people are not running their own web servers, so if .htaccess is available, that is their only option. If you compare the CPU usage of an HTTP request that is stopped by a directive in .htaccess to the cost of actually completing that request, completing it uses far more resources, hands down. They are called spiders because that's what they do: they crawl through every layer of your site that can be found by following links.

 

Back in the day, when web servers ran on Pentium 200 MMX machines with a single core behind a 1 Mbps connection, sure, checking the .htaccess file with every HTTP request was costly. These days it is insignificant compared to the CPU and bandwidth cost incurred when countless, often pointless and sometimes even malicious, spiders crawl through your site.

 

If you are on a shared host and they have given you access to the main server config file, then you should think about backing your site up and changing hosts. Most users of osCommerce are on shared web hosting services; it's not optimal, but it's cheap. They will therefore not have access to the Apache conf file.

 

Bandwidth and CPU use aside, what nttcar needs to do, if he hasn't done so already, is patch the faulty code in the outdated version of osCommerce, clean out the files that are hosting the rogue code, and protect the admin directory; I wouldn't worry too much about all the other spiders that come crawling through the site.

 

nttcar, you could also try out the add-on osC_Sec if you want; it will help prevent attackers from exploiting your website. See the link in my signature.

- Stop Oscommerce hacks dead in their tracks with osC_Sec (see discussion here)
- Another discussion about infected files ::here::
- A discussion on file permissions ::here::
- Site hacked? Should you upgrade or not, some thoughts ::here::
- Fix the admin login bypass exploit here
- Pareto Security: New security addon I am developing, a remake of osC_Sec in PHP 5 with a number of fixes
- BTC:1LHiMXedmtyq4wcYLedk9i9gkk8A8Hk7qX


My difference of opinion is based on the realities of hosting in the modern environment, which is almost always shared hosting now. Very few people host their own web servers anymore, and the great majority of osCommerce users are on shared hosting, so having access to the Apache config file is simply out of the question for most users.

 

So yes, if you have access to the Apache conf file, then by all means add your directives directly there, because as the documentation states, it is more economical to implement your server-wide settings in one location. But that is just not the reality for the great majority of people who run osCommerce on leased web space.

 

The other practical reality is that, conceding that most users will have no choice but to either use .htaccess or not implement the directives at all, the difference in CPU and bandwidth consumption between blocking spider activity at the gate and allowing the spider to actually load pages is considerable. Sure, adding the directives directly to the Apache conf file would be the more economical way of dealing with it, but you do not have access, or rather should not have access, to the Apache conf file in a shared hosting environment.

 

The other point is that .htaccess gives you the advantage, in some instances, of applying rules to one directory tree and not all of them. Basic HTTP authentication is one such method that should only be applied to the directory (and sub-directories) it is intended to protect.

 

So it is an if/else situation: if you have access to the Apache conf file, i.e. you are on a virtual server or you are hosting your own web server, then by all means add your blocking code there. If not, then use the .htaccess file in the root directory of your website to do so.

 

If you want to block this particular spider and do not have a working .htaccess file, then you will have to use PHP code to do so.

 

In the application_top.php files (both of them)

 

Find:

  Released under the GNU General Public License
*/

Below that, add the following:

 
/**
 * Baiduspider block -- answer the bot with an empty 404 so it
 * gets nothing worth indexing.
 */
if ( isset( $_SERVER['HTTP_USER_AGENT'] )
     && ( false !== strpos( $_SERVER['HTTP_USER_AGENT'], 'Baiduspider' ) ) ) {
    header( 'HTTP/1.1 404 Not Found' );
    header( 'Content-Length: 0' );
    die();
}

 

This will return an empty 404 Not Found response to the harvesting server.



You have made some very good and accurate points.

As I mentioned in my original post, and this has been my recent experience: the Baidu spider does obey the robots.txt file now. Try the robots.txt entries I posted above and give them 48 to 72 hours to work.

You have to allow Baidu for it to read robots.txt. In other words, if you block the spider, it will not be able to read the robots file.

 

Good luck.

