
osCommerce

The e-commerce.

How can I increase the performance of my osCommerce app?


mick


Hi!

I've installed, customized, and populated a sheet music store.

 

With 150,000 products, performance is very poor. It takes about a minute to execute a search in the catalog! :cry:

 

I've already:

1) added indexes to the MySQL tables

2) removed the drop-down manufacturers menu

3) enabled caching

4) turned off the product counts in the categories list
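One quick way to confirm those indexes are actually being used is MySQL's EXPLAIN. Here is a sketch against the stock osCommerce schema (adjust table and column names if your customized store differs):

```sql
-- EXPLAIN shows the access plan without running the query.
-- "type: ALL" for a table means a full scan, i.e. no index is helping there.
EXPLAIN
SELECT COUNT(*) AS total
FROM products p, products_to_categories p2c
WHERE p.products_id = p2c.products_id
  AND p.products_status = '1'
  AND p2c.categories_id = '32';
```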

 

The site URL is http://www.demusica.com/catalog/default.php . It's an Italian site, but anyone can see how slow it is.

 

Can someone help me? I'd like to improve the search query, the paging query, and the product listing query. Any suggestions?

 

Thanks :wink:

Mick


Either the Best Sellers or the What's New box seems to be slowing you down a lot.

 

For fun ... turn off one and see what your load time is, then turn that on and turn off the other and see the time ... then turn off both.

 

I think it's the Best Sellers box.


Something on your main page is taking an awfully long time to load....

 

Perhaps you could make the best sellers a static list (rather than pulling the info from the database)? That would help a little.

 

There's not a lot you can do about the categories when you have that many products, though.

 

ps: Love the header, those colors are very nice : )


I've analyzed the slow query log (queries > 2 sec) of the MySQL database, and here are the results:

 

 

*********************

HOME PAGE retrieving:

*********************

# Query_time: 5 Lock_time: 0 Rows_sent: 1 Rows_examined: 293786

use demusica_com;

select count(*) as total from products p, products_to_categories p2c where p.products_id = p2c.products_id and p.products_status = '1' and p2c.categories_id = '32';

 

# Query_time: 6 Lock_time: 0 Rows_sent: 3 Rows_examined: 677431

select p.products_id, p.products_image, p.products_tax_class_id, if(s.status, s.specials_new_products_price, p.products_price) as products_price from products p left join specials s on p.products_id = s.products_id where products_status = '1' order by p.products_date_added desc limit 3;

 

# Query_time: 9 Lock_time: 0 Rows_sent: 10 Rows_examined: 343482

SET timestamp=1049191104;

select p.products_id, pd.products_name, products_date_available as date_expected from products p, products_description pd where to_days(products_date_available) >= to_days(now()) and p.products_id = pd.products_id and pd.language_id = '4' order by date_expected desc limit 10;

 

# Query_time: 8 Lock_time: 0 Rows_sent: 0 Rows_examined: 333474

select distinct p.products_id, pd.products_name from products p, products_description pd where p.products_status = '1' and p.products_ordered > 0 and p.products_id = pd.products_id and pd.language_id = '4' order by p.products_ordered desc, pd.products_name limit 10;

 

*********************************

SCHOTT Catalog RETRIEVING

*********************************

# Query_time: 6 Lock_time: 0 Rows_sent: 1 Rows_examined: 293786

select count(*) as total from products p, products_to_categories p2c where p.products_id = p2c.products_id and p.products_status = '1' and p2c.categories_id = '32';

 

# Query_time: 19 Lock_time: 0 Rows_sent: 1 Rows_examined: 791477

select count(*) as total from products_description pd, products p left join manufacturers m on p.manufacturers_id = m.manufacturers_id, products_to_categories p2c left join specials s on p.products_id = s.products_id where p.products_status = '1' and p.products_id = p2c.products_id and pd.products_id = p2c.products_id and pd.language_id = '4' and p2c.categories_id = '32';

 

# Query_time: 30 Lock_time: 0 Rows_sent: 124421 Rows_examined: 1040319

select p.products_model, pd.products_name, m.manufacturers_name, p.products_id, p.manufacturers_id, p.products_price, p.products_tax_class_id, IF(s.status, s.specials_new_products_price, NULL) as specials_new_products_price, IF(s.status, s.specials_new_products_price, p.products_price) as final_price from products_description pd, products p left join manufacturers m on p.manufacturers_id = m.manufacturers_id, products_to_categories p2c left join specials s on p.products_id = s.products_id where p.products_status = '1' and p.products_id = p2c.products_id and pd.products_id = p2c.products_id and pd.language_id = '4' and p2c.categories_id = '32' order by pd.products_name;

 

# Query_time: 26 Lock_time: 0 Rows_sent: 20 Rows_examined: 915918

select p.products_model, pd.products_name, m.manufacturers_name, p.products_id, p.manufacturers_id, p.products_price, p.products_tax_class_id, IF(s.status, s.specials_new_products_price, NULL) as specials_new_products_price, IF(s.status, s.specials_new_products_price, p.products_price) as final_price from products_description pd, products p left join manufacturers m on p.manufacturers_id = m.manufacturers_id, products_to_categories p2c left join specials s on p.products_id = s.products_id where p.products_status = '1' and p.products_id = p2c.products_id and pd.products_id = p2c.products_id and pd.language_id = '4' and p2c.categories_id = '32' order by pd.products_name limit 0, 20;

 

# Query_time: 7 Lock_time: 0 Rows_sent: 0 Rows_examined: 338716

select distinct p.products_id, pd.products_name from products p, products_description pd, products_to_categories p2c, categories c where p.products_status = '1' and p.products_ordered > 0 and p.products_id = pd.products_id and pd.language_id = '4' and p.products_id = p2c.products_id and p2c.categories_id = c.categories_id and '32' in (c.categories_id, c.parent_id) order by p.products_ordered desc, pd.products_name limit 10;

 

************************************************************

What do you think? Slow, eh?

The problem is the database; its performance is poor. Is there a way to improve the queries?
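For what it's worth, here is a sketch of composite indexes matching the WHERE clauses in the queries above. The columns come from the stock osCommerce schema, the index names are made up, and you'd want to compare EXPLAIN output and rows-examined counts before and after:

```sql
-- products_to_categories is filtered on categories_id and joined on products_id
CREATE INDEX idx_p2c_cat_prod
  ON products_to_categories (categories_id, products_id);

-- products is filtered on products_status and sorted on products_date_added
CREATE INDEX idx_products_status_date
  ON products (products_status, products_date_added);

-- products_description is joined on products_id and filtered on language_id
CREATE INDEX idx_pd_prod_lang
  ON products_description (products_id, language_id);
```

Indexes trade some insert/update speed and disk space for read speed, which is usually the right trade for a catalog this size.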

 

Thanks a lot! :wink:

Mick


Add some indexes to your database.

Reduce image size.

Add gzip compression contrib.

Eliminate count of hits.

-------------------------------------------------------------------------------------------------------------------------

NOTE: As of Oct 2006, I'm not as active in this forum as I used to be, but I still work with osC quite a bit.

If you have a question about any of my posts here, your best bet is to contact me through either email or PM via my profile, and I'll be happy to help.


Ah... the timings above were taken on my old test server.

The actual timings on the production server are better. :wink:

 

Thanks for the hints, but in any case even a quad Xeon with 2 GB of RAM could have problems here, and it would cost much more than the e-commerce app itself!!! :lol:

 

Adding the gzip contribution, reducing image size, and eliminating the hit count would improve overall performance, but they wouldn't improve the performance of the queries above. :cry:

 

:idea: I think the queries are very complex; there are too many joins. Perhaps database indexes would help. Is that right?

 

Thanks,

Mick


Are you able to make a database dump available for project developers? This can help us optimize the database queries during development.

 

Generally for the database server (is it dedicated?) more memory is always good. How much memory do you have on the server?

:heart:, osCommerce


Indexes would indeed help, especially if you have a lot of products.

 

However, there are a few things in MySQL that cause inconsistencies. MySQL version 3.23, which nearly everyone has, doesn't support subqueries or multi-table updates. This means we have to use PHP to hold arrays and essentially replicate a 'cursor', which amounts to a table scan of every record in the table each time we have to perform an update using more than one table's data.
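To illustrate the difference (the tables here are from the osCommerce schema, but the statement itself is only a hypothetical example, not something the stock code does), a multi-table update that is one statement in MySQL 4.0 has no single-statement equivalent in 3.23:

```sql
-- MySQL 4.0+ only: update one table using values from another in a single
-- statement. Under 3.23, PHP must SELECT the matching rows into an array
-- and issue one UPDATE per row -- the "cursor" replication described above.
UPDATE products p, specials s
SET p.products_price = s.specials_new_products_price
WHERE p.products_id = s.products_id
  AND s.status = 1;
```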

 

Anyway, if you upgrade to MySQL 4.0, it does allow multi-table updates, and you can rewrite several of the queries to be more efficient and run faster.

 

That being said, it is my opinion that this is not your problem.



to hpdl:

The site is hosted on a shared server. I don't think it's a very high-performance server (512 MB of RAM).

 

I'm happy to give you the database dump. Can I send it to you via email (4-5 MB)? Obviously I'll alter the confidential data.

 

------------------------------------------------------------------------------

to wizardsandwars:

The database time IS my problem. If you look at the queries above, you can see three very long queries: 19 sec, 30 sec, and 26 sec. All of them involve the "schott" category (categories_id = '32'), the biggest one in my catalog (124,000 products).

The longest query processes over 1,000,000 rows because of the joins.

 

Thanks,

Mick


And I take it that you have no indexes?

 

A category of that size would certainly cause problems without indexes.



I see now.

 

Yeah, those queries are written that way because MySQL doesn't support correlated subqueries. Indexes would definitely help though, especially on the tables on the right side of the join. They won't help on the tables to the left of the join because, with the outer join, you have to process every row in that table anyway.
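Concretely (index names made up; columns from the stock osCommerce schema), the right-hand tables in those LEFT JOINs are specials and manufacturers, so their join columns are the ones worth indexing:

```sql
-- specials is the right side of "products p LEFT JOIN specials s
-- ON p.products_id = s.products_id":
CREATE INDEX idx_specials_products ON specials (products_id);

-- manufacturers is joined on manufacturers_id; in the stock schema this is
-- already the primary key, so add this only if your table lacks it:
CREATE INDEX idx_manufacturers_id ON manufacturers (manufacturers_id);
```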



With a product list of 150K items, would you consider helping with an Oracle or Sybase port of the database? -- I too would REALLY encourage you to donate this database (perhaps sanitizing the data) to the development team. It might help them in testing future releases!

 

Perhaps a future abstraction into a robust database layer would be in order. A much lower-featured endeavor (phpShop) appears to take such a technical approach, but it's not nearly as ambitious in terms of features! -- Kudos to the osCommerce development team!

 

Please post your results with the additional indexes -- we'd all be interested.


Sybase would be by far the least expensive option, should he decide to upgrade the DBMS. Unfortunately, I don't know of any client services that they offer for the web, so it would have to use an ODBC connection, which would still be a little slow.



Mick ... it sounds like your host may also be throttling your site, since you are on a shared server.

 

I would be interested in testing this database on one of mine to see what the results are. Drop me an email and let me know and we can make arrangements for you to see this.


Wow, nice design.

 

Hmmm, your top page is definitely making my 12 Mbps ADSL look like a modem.

 

I wonder if you could take out the "Prodotti in arrivo" (incoming products) section as a temporary solution, while the dev team comes up with an answer.

 

Would Postgres be any quicker than MySQL in this situation? I've always imagined Sybase and Oracle being pretty expensive.



Remember to tweak your my.cnf file too, if you have access to it. For example, this is one of the config files that comes with the MySQL source code. I used it to tweak the settings on my server a few days ago, and sql-bench shows about a 30% improvement in performance over the default settings.

 

# Example mysql config file for large systems.
#
# This is for large system with memory = 512M where the system runs mainly
# MySQL.
#
# You can copy this file to
# /etc/my.cnf to set global options,
# mysql-data-dir/my.cnf to set server-specific options (in this
# installation this directory is /var/db/mysql) or
# ~/.my.cnf to set user-specific options.
#
# One can in this file use all long options that the program supports.
# If you want to know which options a program supports, run the program
# with the --help option.

# The following options will be passed to all MySQL clients
[client]
#password       = your_password
port            = 3306
socket          = /tmp/mysql.sock

# Here follow entries for some specific programs

# The MySQL server
[mysqld]
port            = 3306
socket          = /tmp/mysql.sock
skip-locking
set-variable    = key_buffer=256M
set-variable    = max_allowed_packet=1M
set-variable    = table_cache=256
set-variable    = sort_buffer=1M
set-variable    = record_buffer=1M
set-variable    = myisam_sort_buffer_size=64M
set-variable    = thread_cache=8
# Try number of CPUs*2 for thread_concurrency
set-variable    = thread_concurrency=8
server-id       = 1

# Uncomment the following if you are using BDB tables
#set-variable   = bdb_cache_size=64M
#set-variable   = bdb_max_lock=100000

# Uncomment the following if you are using InnoDB tables
#innodb_data_home_dir = /var/db/mysql/
#innodb_data_file_path = ibdata1:10M:autoextend
#innodb_log_group_home_dir = /var/db/mysql/
#innodb_log_arch_dir = /var/db/mysql/
# You can set .._buffer_pool_size up to 50 - 80 %
# of RAM but beware of setting memory usage too high
#set-variable = innodb_buffer_pool_size=256M
#set-variable = innodb_additional_mem_pool_size=20M
# Set .._log_file_size to 25 % of buffer pool size
#set-variable = innodb_log_file_size=64M
#set-variable = innodb_log_buffer_size=8M
#innodb_flush_log_at_trx_commit=1
#set-variable = innodb_lock_wait_timeout=50

# Point the following paths to different dedicated disks
#tmpdir         = /tmp/
#log-update     = /path-to-dedicated-directory/hostname

[mysqldump]
quick
set-variable    = max_allowed_packet=16M

[mysql]
no-auto-rehash
# Remove the next comment character if you are not familiar with SQL
#safe-updates

[isamchk]
set-variable    = key_buffer=128M
set-variable    = sort_buffer=128M
set-variable    = read_buffer=2M
set-variable    = write_buffer=2M

[myisamchk]
set-variable    = key_buffer=128M
set-variable    = sort_buffer=128M
set-variable    = read_buffer=2M
set-variable    = write_buffer=2M

[mysqlhotcopy]
interactive-timeout


I'm ready to donate my database. Linda, Harald, can I have your email addresses? (It's about 5 MB.)

 

A question: isn't MySQL one of the faster databases? I think that changing databases wouldn't improve my performance...

 

I'm very grateful to all the friends who are helping me. Now I'm going to apply your suggestions, and then I'll publish the results.

 

Have a good day,

Mick


Sybase would be by far the least expensive option, should he decide to upgrade the DBMS. Unfortunately, I don't know of any client services that they offer for the web, so it would have to use an ODBC connection, which would still be a little slow.

 

There is a phpSybaseAdmin (port of phpMyAdmin)

Sybase ASE for Linux is free AFAIK.

 

And Mick, you can e-mail me the SQL file and I will make it available to the team on our internal server.

 

Thanks,

Mattice

"Politics is the art of preventing people from taking part in affairs which properly concern them"


Hi,

I found some bottlenecks in my osCommerce app. I will explain what I did:

 

1) Queries with a JOIN to "specials" slow the app down enormously!!! So I removed all those joins and sped things up by about 90%.

 

2) There is a double row count for paging (one in the split_page_results constructor and one in product_listing.php). I removed the second count.

 

3) I customized the row counting in the split_page_results constructor to use a more efficient count query.
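For anyone curious what an efficient count could look like: the stock listing count joins four tables, but the LEFT JOINs to specials and manufacturers, and the join to products_description, don't change the number of rows per product (assuming every product has a description row for the shop language). A sketch of a reduced count, using the stock osCommerce tables:

```sql
-- Count active products in a category without the joins that don't
-- affect the total (specials, manufacturers, products_description):
SELECT COUNT(*) AS total
FROM products p, products_to_categories p2c
WHERE p.products_id = p2c.products_id
  AND p.products_status = '1'
  AND p2c.categories_id = '32';
```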

 

Now my little big app runs normally, with about half a second of latency.

If you want more details, ask me.

 

Thanks all guys,

:D Mick :D


There is a phpSybaseAdmin (port of phpMyAdmin) 

Sybase ASE for Linux is free AFAIK.

 

Now *that* is some very useful information. I had no idea; Sybase is very easy to use and administer. It probably wouldn't be realistic to rewrite all of the osC MySQL queries in T-SQL, but if someone ever wanted to give it a go, I could probably knock them all out in pretty short order.

 

I write SQL queries for Sybase 8 hours a day. :lol:



Michele, are you interested in benchmarking your store/database against MySQL 3.x and 4.x?

 

Probably wouldn't be realistic to rewrite all of the osC MySQL queries in T-SQL, but if someone ever wanted to give it a go, I could probably knock them all out in pretty short order.

 

I think Sybase will be one of the first databases we'll add support for once work on database independence is started. The Sybase database is available for download for educational purposes, which is great :D

:heart:, osCommerce


Archived

This topic is now archived and is closed to further replies.
