
Archived

This topic is now archived and is closed to further replies.

Chemo

Speed / Performance Optimizations


Now that you have your store up and running with products uploaded, it's time to start looking at your page performance. Generally speaking, customers love fast-loading pages. If they have to wait for a page to load, your conversion rate suffers. So let's get down to it and discuss a few basic but essential performance optimizations.

 

IMAGES

By far the worst offender for slow-loading pages! The simple answer is to install an automatic thumbnail generator with cache features. It is important to use an autothumbnailer that caches the image instead of generating it "on the fly", as this decreases server load tremendously. The effects may not be noticeable on a store with 1 or 2 visitors per hour, but if you do any kind of serious volume you will definitely want to cache the images. After all...if you've spent the server resources to create the thumbnail image, why not store it and use it later instead of creating it again?
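The idea can be sketched in a few lines of PHP. This is a minimal illustration, not any particular contribution: cached_thumbnail() and the cache-path scheme are made-up names, and it assumes the GD extension and JPEG source images.

```php
<?php
// Minimal caching-thumbnailer sketch (hypothetical helper, not a stock
// osCommerce function). Assumes the GD extension and JPEG sources.

// Pure helper: where the cached copy for a given source and size lives.
function thumbnail_cache_path($src, $cache_dir, $width, $height) {
  return $cache_dir . '/' . md5($src . '|' . $width . 'x' . $height) . '.jpg';
}

function cached_thumbnail($src, $cache_dir, $width, $height) {
  $cache_file = thumbnail_cache_path($src, $cache_dir, $width, $height);

  // Resample only when the cache is missing or stale; otherwise the
  // stored file is reused and the server does no image work at all.
  if (!file_exists($cache_file) || filemtime($src) > filemtime($cache_file)) {
    $image = imagecreatefromjpeg($src);
    $thumb = imagecreatetruecolor($width, $height);
    imagecopyresampled($thumb, $image, 0, 0, 0, 0,
                       $width, $height, imagesx($image), imagesy($image));
    imagejpeg($thumb, $cache_file, 85);
    imagedestroy($image);
    imagedestroy($thumb);
  }
  return $cache_file;
}
```

The point of the pure path helper is that a hit costs one file_exists() check instead of a decode/resample/encode cycle.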

 

CONTRIBUTIONS

Install ONLY the ones that you need to operate and ONLY the ones that will either save time or add value to the shopping experience. Which contributions fit these requirements will depend on the type of store you have and the products for sale. Each contribution you install adds overhead to process its logic, whether it be pure PHP code or MySQL queries (or both). As a general rule, KEEP THE NUMBER OF CONTRIBUTIONS INSTALLED TO A MINIMUM. Too many times a store owner will go contribution-happy and install every toy possible. This adds tremendous bloat to the code base and performance suffers...the intent was to make a better shopping experience, but the net effect is to DEGRADE the experience due to slow-loading pages.

 

REMOVE FEATURES NOT USED

There are several features of stock osCommerce that stores may or may not use and that would help performance if removed. For example: banners, the "requests since" footer display, who's online, etc.

 

If you don't use the features remove (comment out) the code.

 

Another benefit of eliminating those features is that they no longer prevent the store from using the MySQL query caching features: the cache is flushed on each INSERT, so features that write to the database on every page request render the cache useless. Of course, this also means storing the sessions on the filesystem!
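For context, you can check whether the query cache is even enabled on your server, and whether it is actually getting hits (these are standard MySQL variable/status names; note that the query cache was removed entirely in MySQL 8.0):

```sql
-- Is the query cache enabled, and is it actually being used?
SHOW VARIABLES LIKE 'query_cache%';
SHOW STATUS LIKE 'Qcache%';
```

If Qcache_hits stays near zero while pages are being served, something is flushing or bypassing the cache on every request.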

 

STS - Simple Template System

I wanted to give special attention and space to this contribution since it is one of the worst offenders for queries and is also one of the more popular template systems available for osCommerce. If you have a significant number of categories on your store, this contribution will kill your page performance: it performs a full table scan on each page request. If you have more than a few hundred categories it can increase your page load time by 200-300%, depending on server specifications.

 

In short, if you want speed don't use a template system...learn to modify the monolithic application that is osCommerce.

 

SESSIONS

If you store the sessions in the database, consider adding a (primary) multi-column index on the sesskey and expiry columns. This will take up more physical space but will be much faster, and the session lookup will be a const query type.
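Against a stock MS2 sessions table this would look roughly like the statement below. This is a sketch: verify your actual table and column names first, and note that dropping and recreating a primary key briefly locks the table.

```sql
-- Assumes the stock osCommerce sessions table, where sesskey is
-- currently the single-column primary key. The composite key covers
-- the lookup: WHERE sesskey = '...' AND expiry > ...
ALTER TABLE sessions
  DROP PRIMARY KEY,
  ADD PRIMARY KEY (sesskey, expiry);
```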

 

CACHING - Stock Code and Page Cache

The ability to cache data is very important to performance...especially if you have a large number of categories, products, and/or orders. I recommend creating a directory ABOVE the publicly accessible document root and giving it the proper permissions for the server to read and write. Creating it ABOVE the document root ensures that would-be hackers cannot access it with their web browser. Point your cache directory setting at this file path to keep the cache from being stored in the "/tmp" folder, which can cause issues.
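As a concrete sketch (the paths are illustrative; a scratch prefix stands in for the account's home directory so the commands are safely runnable anywhere):

```shell
# Illustrative layout only. On a real host the prefix would be the
# account home (e.g. /home/store) with public_html as the document root.
PREFIX="${TMPDIR:-/tmp}/store_example"

mkdir -p "$PREFIX/public_html"   # document root (stand-in)
mkdir -p "$PREFIX/cache"         # cache directory, ABOVE the web root
chmod 755 "$PREFIX/cache"        # adjust owner/mode for your PHP user

# Then point the osCommerce cache directory setting at $PREFIX/cache/
```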

 

Once you have the cache folder created and settings configured TURN ON THE CACHE FEATURES. This is especially effective with the category box!

 

The Page Cache contribution was created for those stores that have so many contributions installed it would be nearly impossible to optimize the code. The correct answer is to not bloat the store with contributions that are not needed...but if you find yourself in that position then Page Cache may be for you. I use the Page Cache contribution as a last resort...not as a first line choice.

 

COMPRESS YOUR PAGE OUTPUT

Most are aware of page compression via GZIP. The optimal setting is compression level 1 for speed, as higher levels will not result in a significant reduction of page size. However, a quick trick is to physically compress the data being GZIP'd by removing extra whitespace and line breaks.

 

Here is the function that I use:

function compress_buffer($buffer) {
  # Strip line breaks and collapse whitespace between tags
  return str_replace("\n", ' ', preg_replace('/\>\s+\</', '> <', $buffer));
}

Of course, this is offered as a starting point for you and you'll have to figure out how to use it in combination with GZIP.

 

The physical page compression combined with GZIP compression will reduce the output HTML by some 60-75% and will dramatically improve page performance, especially on dial-up connections.

 

CODE / QUERY OPTIMIZATION

I wanted to approach this topic last as it has the most variety. This is due to the fact that every store is different. They all start with the same code base but each store installs different contributions which results in great variation as to which code needs to be optimized.

 

A good starting point would be to install a per-page query output contribution. This will allow you to see which queries are being executed on a per-page basis and easily identify those that are redundant or taking excessive time to execute. Once you pinpoint what needs to be done, it's easier to actually do it...never perform ANY optimization blindly. ALWAYS modify code for a purpose. Sounds like common sense? You would be surprised...
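If you'd rather not install anything, the core idea of such a contribution is just a thin wrapper around the query call. A minimal modern-PHP sketch (timed_query() and query_report() are made-up names, not part of osCommerce):

```php
<?php
// Per-page query logging sketch. Each query's SQL and wall-clock time
// are recorded so the slowest ones can be dumped in the page footer.
$GLOBALS['query_log'] = array();

function timed_query($sql, $runner) {
  $start = microtime(true);
  $result = $runner($sql);   // e.g. a closure wrapping tep_db_query()
  $GLOBALS['query_log'][] = array(
      'sql'     => $sql,
      'seconds' => microtime(true) - $start,
  );
  return $result;
}

// Slowest queries first, for a footer report.
function query_report() {
  $log = $GLOBALS['query_log'];
  usort($log, function ($a, $b) {
    if ($a['seconds'] == $b['seconds']) return 0;
    return ($a['seconds'] < $b['seconds']) ? 1 : -1;
  });
  return $log;
}
```

Sorting the log before printing makes the redundant and slow queries jump out immediately.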

 

Once again, I won't approach specific contribution optimizations but will only discuss those that are common to every osC store. One such contribution is an abstraction of the MS3 tax class for MS2. This replaces the stock MS2 tax code with the new tax class for MS3, which is much more efficient and uses fewer queries per page. The tax query is EXECUTED ON EACH PRICE DISPLAY EVEN IF THE SETTING TO DISPLAY TAX IS DISABLED. It's important to know that fact, since most assume that turning off the option in the admin control panel stops the tax query...this is false, and even with the setting turned off it will STILL RUN THE TAX QUERY. Why is this important? Because the tax query is one of the most server-intensive queries in all of osCommerce! For a 2-minute install this contribution will improve page performance and server load tremendously.

 

The also_purchased module (product info pages) is a powerful upsell tool; however, it is the absolute most server-intensive query in the osCommerce application. This is especially true for stores that have a significant number of products and orders. I've seen this bring a dual 2.8 GHz CPU, 1 GB RAM server to its knees...by running just that one query. As you accumulate more orders you will get a feel for which products would be ideal cross-sell items, so it would be advisable to install an X-Sell contribution. It may be a bit more work to maintain, but the increase in product page performance is well worth the effort.

 

On each page request the contents of the configuration table are loaded as define statements. The implication is that on each page request the server must perform a table scan to generate the data, which, depending on how many contributions you have installed, adds overhead to each page load. An excellent contribution that addresses this issue is Faster Page Loads, Less DB Queries. It caches the data and eliminates the database query, saving the table scan except when the cache is not yet present. This contribution is HIGHLY recommended.
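The idea behind that kind of contribution can be sketched as follows. This is an illustration, not the contribution's actual code; load_configuration() and the cache file name are invented. The first request writes the configuration key/value pairs out as a PHP file; later requests include() that file and skip the database entirely.

```php
<?php
// Configuration-cache sketch (hypothetical names, not the actual
// contribution). $db_loader is whatever callable reads the
// configuration table and returns key => value pairs.
function load_configuration($cache_file, $db_loader) {
  if (file_exists($cache_file)) {
    $config = include $cache_file;          // cache hit: no query
  } else {
    $config = $db_loader();                 // cache miss: one table read
    file_put_contents($cache_file,
        '<?php return ' . var_export($config, true) . ';');
  }
  // Mirror the stock behavior: expose each setting as a constant.
  foreach ($config as $key => $value) {
    if (!defined($key)) define($key, $value);
  }
  return $config;
}
```

Deleting the cache file (e.g. from the admin after a settings change) is all it takes to force a fresh read.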

 

This post is not meant to be a definitive guide to optimizing an osCommerce store, but will hopefully give you a few pointers as to which areas should be addressed as a priority in your optimization efforts. I hope this post will spark the creativity and discussion that makes the community the valuable resource that it is...

 

Cordially,

 

Bobby Easland


One little item I like to install and recommend highly is the Zend Optimizer for increased PHP performance. Overall it will enhance the performance of ALL PHP scripts on your server.

 

Using the Zend Encoder in conjunction will dramatically reduce the actual file size of your PHP pages and create a compound performance increase.

 

http://www.zend.com

 

Just my 2 cents. :blink:


I had a blind date last night. Her name was :. .:: :.: .:. .::.

 

My contributions to the cause.


I prefer Turck MMCache. It's open source, free, feature-rich, and very robust. They have performance benchmark data on the website, and it beats or matches the Zend Performance Suite (eval version) in most metrics.

 

Bobby


Kevin,

 

Turck is much faster than using Zend Optimizer as a standalone application...they are not even in the same league. Turck blows Zend Optimizer out of the water. Why? Because all Zend Optimizer does is go over the produced intermediate code and optimize it for faster execution. Turck, however, not only optimizes the intermediate code BUT ALSO CACHES IT, thereby saving the overhead of compiling it again.

 

However, there is an "industry standard" suite produced by Zend that compares to Turck: Zend Performance Suite which is now only offered as part of the Zend Platform and STARTS at $995 USD per year.

 

Let's see...$1,000 per year or a free, open source, mature, robust, and proven PHP accelerator? The choice is clear for me...

 

Bobby


Kevin,

I'd agree 100% with what Bobby has written above. Industry standard or no industry standard, I've seen with my own eyes the differences between the two aforementioned pieces of software, and the Zend stuff is nowhere near as good. There's a MASSIVE difference.

(quoting Bobby's "COMPRESS YOUR PAGE OUTPUT" section above)

How can I do this? Can you provide any links or something that may assist me in doing this?

 

thanks

How can I do this? Can you provide any links or something that may assist me in doing this?

Thanks

 

 

Not totally sure, but I guess that in this function:

 

function tep_gzip_output($level = 5) {

 

you replace :

 

$contents = ob_get_contents();

 

with :

 

$contents = compress_buffer(ob_get_contents());

 

after you added the function to gzip_compression.php


Treasurer MFC

(quoting Amanda's suggestion above)

 

 

Would it be advisable to use this statement:

 

return str_replace("\n", ' ', preg_replace('/\>\s+\</', '> <', $buffer));

 

in the page cache buffer compress function instead of the current :

 

return preg_replace('/\>\s+\</', '> <', $buffer); ?



How can I do this? Can you provide any links or something that may assist me in doing this?

Thanks

As usual Amanda is correct.

 

Add the function to the includes/functions/gzip_compression.php file and use the code in tep_gzip_output(). It should look something like this:

function compress_buffer($buffer) {
  # Return the compressed buffer
  return str_replace("\n", ' ', preg_replace('/\>\s+\</', '> <', $buffer));
}

/* $level = compression level 0-9, 0=none, 9=max */
function tep_gzip_output($level = 1) {
  if ($encoding = tep_check_gzip()) {
    $content = ob_get_contents();
    $contents = compress_buffer($content);
    unset($content);
    ob_end_clean();

    header('Content-Encoding: ' . $encoding);

    $size = strlen($contents);
    $crc = crc32($contents);

    $contents = gzcompress($contents, $level);
    $contents = substr($contents, 0, strlen($contents) - 4);

    echo "\x1f\x8b\x08\x00\x00\x00\x00\x00";
    echo $contents;
    echo pack('V', $crc);
    echo pack('V', $size);
  } else {
    $contents = ob_get_contents();
    ob_end_clean();
    echo compress_buffer($contents);
  }
}

 

Of course, turn on the GZIP feature through the admin control panel and set the level to 1.

 

(quoting the page cache question above)

Yes it would :) Depending on the size of the page and number of line breaks it could save an extra few percent...but everything counts!

 

Bobby


So here are some questions I would have for the server gurus....

 

What is the advantage/disadvantage to using tep_gzip_output?

 

Pages are smaller and faster? But what server load increase is there?

 

Also, if you have Zend on the server, should you run with Turck or remove Zend Optimizer?

Chris


osC Contributions I have published.

 

Note: Some I only provided minor changes, updates or additions!


Chris,

 

The advantage to using GZIP output is that the server physically compresses the contents and sends them to the browser. When the browser receives them, it unzips and renders the code. Basically, it means a smaller amount of data to transfer.

 

The pages are smaller and faster...and the load on the server is insignificant. It is HIGHLY recommended!

 

Zend Optimizer can co-exist with Turck but is not recommended (per the website). The only reason to have both is if scripts are encoded with Zend...otherwise just install Turck.

 

Bobby

(quoting Bobby's reply above)

 

 

Bobby, is there a server setting for this?

 

I ask because after attempting to use GZIP and seeing no change, I started to look through the code, and it appears that in app_top it is not even getting to the include(DIR_WS_FUNCTIONS . 'gzip_compression.php');

 

If you look at the code:

 

 if ( (GZIP_COMPRESSION == 'true') && ($ext_zlib_loaded = extension_loaded('zlib')) && (PHP_VERSION >= '4') ) {
   if (($ini_zlib_output_compression = (int)ini_get('zlib.output_compression')) < 1) {
     if (PHP_VERSION >= '4.0.4') {
       ob_start('ob_gzhandler');
     } else {
       include(DIR_WS_FUNCTIONS . 'gzip_compression.php');
       ob_start();
       ob_implicit_flush();
     }
   } else {
     ini_set('zlib.output_compression_level', GZIP_LEVEL);
   }
 }

 

Since the version check passes, I never get to the else; I run: ob_start('ob_gzhandler');

 

?? Am I missing something obvious????

 

Tested on both PHP 4.3.9 and 4.3.11

 

Chris




Hi,

 

I would like to use this code to optimise my HTML output but it shows no effect.

Why?

 

tom

 

function tep_check_gzip() {
  global $HTTP_ACCEPT_ENCODING;

  if (headers_sent() || connection_aborted()) {
    return false;
  }

  if (strpos($HTTP_ACCEPT_ENCODING, 'x-gzip') !== false) return 'x-gzip';

  if (strpos($HTTP_ACCEPT_ENCODING, 'gzip') !== false) return 'gzip';

  return false;
}

function compress_buffer($buffer) {
  # Return the compressed buffer
  return str_replace("\n", ' ', preg_replace('/\>\s+\</', '> <', $buffer));
}

/* $level = compression level 0-9, 0=none, 9=max */
function tep_gzip_output($level = 1) {
  if ($encoding = tep_check_gzip()) {
    $content = ob_get_contents();
    $contents = compress_buffer($content);
    unset($content);
    ob_end_clean();

    header('Content-Encoding: ' . $encoding);

    $size = strlen($contents);
    $crc = crc32($contents);

    $contents = gzcompress($contents, $level);
    $contents = substr($contents, 0, strlen($contents) - 4);

    echo "\x1f\x8b\x08\x00\x00\x00\x00\x00";
    echo $contents;
    echo pack('V', $crc);
    echo pack('V', $size);
  } else {
    $contents = ob_get_contents();
    ob_end_clean();
    echo compress_buffer($contents);
  }
}


I just implemented this change and noticed a slight improvement in speed; all the few %'s add up.

Thanks again Bobby!

 

..... now about that index page optimisation or catalogue box optimisation contrib ;)


if your PHP install has the zlib module installed and your host allows you to change php configuration values through .htaccess, you can simply add the following to your top level .htaccess to gzip your entire site to the browser (you'll have to turn gzip Off in your admin):

 

php_value zlib.output_compression 1
php_value zlib.output_compression_level 1

 

I personally prefer this method over using php to buffer, compress, and output (if the host allows these .htaccess modifications).


The only thing necessary for evil to flourish is for good men to do nothing

- Edmund Burke

(quoting the .htaccess suggestion above)

 

 

This method will not remove the whitespace, will it?

 

Chris


osC Contributions I have published.

 

Note: Some I only provided minor changes, updates or additions!

(quoting the .htaccess suggestion above)

True...it is easier and faster to add those settings as an ini_set in application_top, a php.ini configuration setting, or even an .htaccess setting. However, you will miss out on the benefit of compressing the buffer by removing whitespace and line breaks.

 

As a test, check the page size without removing whitespace and line breaks. Then implement the code above (in whatever fashion you want) and notice how the size of the page is reduced by 20-25%.

 

I would rather reduce the page size 25% and THEN gzip the contents, as it makes for an even smaller data transfer. With GZIP output AND physical page compression it is typical to reduce a 50-60 KB page down to about 6-7 KB, which will absolutely fly even over dial-up connections.
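The two stages can be compared with a few lines of PHP. This is a toy demonstration assuming the zlib extension; the sample markup is invented, and real ratios depend entirely on your pages.

```php
<?php
// Rough illustration of the two-stage saving. A repetitive sample
// "page" stands in for real catalog HTML.
$html = str_repeat("<td>\n    <a href=\"product.php\">Item</a>\n</td>\n", 500);

// Stage 1: strip line breaks and inter-tag whitespace (as above)
$stripped = str_replace("\n", ' ', preg_replace('/\>\s+\</', '> <', $html));

// Stage 2: GZIP at level 1
$gzipped_raw      = gzcompress($html, 1);
$gzipped_stripped = gzcompress($stripped, 1);
// Stripping first shrinks the input before compression even starts.
```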

 

Bobby


My server says "HTTP_ACCEPT_ENCODING gzip,deflate" and my compression_level = 1.

 

Could there be something else, maybe a contribution which disables the gzip function? I have no idea.

 

tom


Hi Bobby,

 

thanks for the interesting tips!

 

I've got one question though, at your demo site (mesoimpact.com) I see this at the bottom:

Original Parse Time: .428 s with 92 original queries
Is this an average parse time on an average server for a stock osC install? Or?

 

I am asking this because the number seems rather high to me. My site isn't really optimized for speed yet, and I'm on a cheap (shared) host, but the average parse time is much lower than .428 sec. One live site mostly averages below .15 sec (using the optimized tax class you ported to MS2), and another stock (+BTS) test site always averages below 0.2 sec.

 

No cache as far as I know (osC cache function is disabled), gzip compression enabled.

 

Paul


Please do not PM me for support, I will not respond anyway.


That was the parse time for the page on my server as an average of 100 page refreshes. The time will vary and your mileage may vary.

 

The starting and end point is not significant...the percent performance increase is what counts.

 

Bobby


Hey Bobby,

 

Kind of a related question for you. I kind of sensed things slowing a bit for me now that I have about 6,000 products up, specifically retrieving product images. In speaking with my hosting company (trying to rule out hardware issues), they suggested that since all product images are stored in the same directory, this was contributing to the delays. It also created problems for the FTP daemon being used, since there are so many files in that one directory.

 

Since I will probably be hitting 10,000+ products within a few months, I went through the trouble this weekend of re-writing several of the osC scripts to actually store product images in manufacturer-specific sub-folders. Basically, when a new manufacturer is created (I currently have 118 manufacturers and will probably end up near 300-400), a sub-folder is created using the manufacturers_id. I modified the categories.php script to upload all pictures to their respective sub-folders based on manufacturers_id. I also re-wrote the 5-6 scripts throughout the store, including your category thumbnail browser and the shopping cart class, to use this new structure.

 

All of this to ask if you think I just wasted my time and whether I'm basically trying to solve a performance problem that was never going to exist. The main driver is that I know I will be over 10,000 products soon and figured I might as well tweak the cart now rather than when things hit the skids.....

 

Thoughts?

That was the parse time for the page on my server as an average of 100 page refreshes.  The time will vary and your mileage may vary.

Thanks for the explanation, I probably have to thank my hosting company for the great server speed then. :)


(quoting Bobby's gzip_compression.php instructions above)

This is my full gzip_compression.php file. Is it correct?

 

<?php
/*
 $Id: gzip_compression.php,v 1.3 2003/02/11 01:31:02 hpdl Exp $

 osCommerce, Open Source E-Commerce Solutions
 http://www.oscommerce.com

 Copyright (c) 2003 osCommerce

 Released under the GNU General Public License
*/

function tep_check_gzip() {
  global $HTTP_ACCEPT_ENCODING;

  if (headers_sent() || connection_aborted()) {
    return false;
  }

  if (strpos($HTTP_ACCEPT_ENCODING, 'x-gzip') !== false) return 'x-gzip';

  if (strpos($HTTP_ACCEPT_ENCODING, 'gzip') !== false) return 'gzip';

  return false;
}

function compress_buffer($buffer) {
  # Return the compressed buffer
  return str_replace("\n", ' ', preg_replace('/\>\s+\</', '> <', $buffer));
}

/* $level = compression level 0-9, 0=none, 9=max */
function tep_gzip_output($level = 1) {
  if ($encoding = tep_check_gzip()) {
    $content = ob_get_contents();
    $contents = compress_buffer($content);
    unset($content);
    ob_end_clean();

    header('Content-Encoding: ' . $encoding);

    $size = strlen($contents);
    $crc = crc32($contents);

    $contents = gzcompress($contents, $level);
    $contents = substr($contents, 0, strlen($contents) - 4);

    echo "\x1f\x8b\x08\x00\x00\x00\x00\x00";
    echo $contents;
    echo pack('V', $crc);
    echo pack('V', $size);
  } else {
    $contents = ob_get_contents();
    ob_end_clean();
    echo compress_buffer($contents);
  }
}
?>

(quoting the question above about splitting product images into manufacturer sub-folders)

It is true that having a very large number of files in a directory will affect performance. Creating an effective directory structure helps solve this problem: each sub-directory acts much as an index does for a database table. Your elaborate directory structure will be effective, but the same result could have been achieved by giving each image a unique name. As an example, let's say you have a manufacturer with an ID of 7, the JPG file format, and an eventual 100x100 size. You could have modified your thumbnail script to save the image like this:

$id_$mName_$width_$height.$extension

This would translate to 7_manname_100_100.JPG, which is unique enough to offer the same performance increase as the directory structure.
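The naming rule above can be written down as a tiny helper. thumbnail_name() is a hypothetical name used purely for illustration:

```php
<?php
// Hypothetical helper expressing the naming rule: manufacturer id,
// manufacturer name, thumbnail dimensions, and extension combined
// into one unique, flat filename.
function thumbnail_name($id, $name, $width, $height, $ext) {
  return $id . '_' . $name . '_' . $width . '_' . $height . '.' . $ext;
}
```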

Thanks for the explanation, I probably have to thank my hosting company for the great server speed then. :)

heheheh...sure. Keep in mind though that my development server is just that...it's not my production box. The dev server is the very first dedicated server I have ever owned. I got it a few years back, and each time I get another server I just add the old one to my cluster.

 

So, as a frame of reference: my dev server is an 800 MHz CPU, 128 MB RAM CentOS server running the latest PHP and MySQL. If I put the meso site on my dual 2.4 GHz, 1.5 GB RAM, 100 Mbps burstable, CentOS, load-balanced dedicated server, I'm sure the performance would be much better :)

 

Bobby

