A few weeks ago I mentioned the wesley.pl script from GitHub to optimize images, and how I had modified it to keep (or discard) the EXIF / XMP information. Making sure images are as small as possible saves bandwidth and improves page load times (and Google rank), so I think it's worth discussing my image optimization process in more detail.
To improve page load times (and Google ranking), you should make sure all JPEG, PNG, and GIF files are properly optimized. Instead of writing my own script around jpegtran, pngcrush, and gifsicle, I used Mike Brittain's wesley.pl script on GitHub. It works great, though I did have to modify the "jpegtran -copy" parameter it uses: I need to keep the EXIF on larger files and strip it from thumbnails. I posted the diff on the GitHub Issues page.
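The keep-or-strip decision can be sketched as a small shell helper. The helper name and the 200-pixel threshold below are my own illustration, not part of wesley.pl:

```shell
# Pick the jpegtran "-copy" argument from the image width: thumbnails get
# their metadata stripped, larger images keep EXIF / XMP intact.
# The 200-pixel threshold is an assumed cutoff for illustration only.
choose_copy_flag() {
    width=$1
    if [ "$width" -le 200 ]; then
        echo "none"   # thumbnail: strip all metadata
    else
        echo "all"    # full-size image: keep EXIF / XMP
    fi
}

# Example invocation (jpegtran from libjpeg):
#   jpegtran -copy "$(choose_copy_flag 800)" -optimize -outfile out.jpg in.jpg
```

The real script applies this per file; the point is just that "-copy all" preserves metadata while "-copy none" discards it.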
Update 2012-12-31: In case Mike doesn't merge my diff, which adds a "--copy=[all|comments|none]" command-line argument (see my comment below for more info), you can download the patched wesley.pl script here instead.
I recently updated a script that checks Apache httpd process sizes and saves the information to an SQLite database file. As part of some new functionality in the script, I needed to modify the SQLite database to add a column and some indexes. When creating new tables you can use "create table if not exists $table (...)", but the same "if not exists" condition is not available when adding columns or creating indexes.
There are several ways to build a similar "if not exists" test for columns and indexes, but all of them (or at least the ones I found) rely on long SQL statements and/or stored procedures. I wanted something more flexible and Perl-based, so I wrote the following code to set hash elements (%dbcol_exists and %dbidx_exists) keyed by column and index names. The section that retrieves the table / index names from the database and sets the hash elements is highlighted in the following snippet of code.
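The same idea can be sketched with the sqlite3 command-line shell. My script does this in Perl with DBD::SQLite; the table, column, and index names below are made up for illustration:

```shell
# Throwaway database for the example.
DB=$(mktemp /tmp/procs.XXXXXX)
sqlite3 "$DB" "CREATE TABLE IF NOT EXISTS procs (pid INTEGER, size INTEGER);"

# Add the "checked" column only if "PRAGMA table_info" does not list it yet
# (column 2 of the pipe-separated output is the column name).
if ! sqlite3 "$DB" "PRAGMA table_info(procs);" | cut -d'|' -f2 | grep -qx checked; then
    sqlite3 "$DB" "ALTER TABLE procs ADD COLUMN checked INTEGER;"
fi

# Create the index only if sqlite_master does not list it yet.
if ! sqlite3 "$DB" "SELECT name FROM sqlite_master WHERE type='index';" | grep -qx idx_pid; then
    sqlite3 "$DB" "CREATE INDEX idx_pid ON procs (pid);"
fi
```

The Perl version simply caches the same PRAGMA / sqlite_master lookups into hashes so each name is only queried once.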
check_httpd_limits.pl compares the size of running Apache httpd processes, the configured prefork / worker / event MPM limits, and the server’s available memory. The script exits with a warning (or error message) if the configured limits exceed the server’s available memory.
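The core comparison can be sketched in a few lines of shell. The numbers below are made-up placeholders; the real script reads them from httpd.conf, the running processes, and /proc/meminfo:

```shell
max_clients=256            # MaxClients / ServerLimit from httpd.conf (assumed)
avg_httpd_kb=24576         # average resident httpd process size, in kB (assumed)
mem_available_kb=2097152   # derived from /proc/meminfo (assumed)

# Project the memory used if Apache forks the maximum number of processes.
projected_kb=$(( max_clients * avg_httpd_kb ))
if [ "$projected_kb" -gt "$mem_available_kb" ]; then
    echo "WARNING: $max_clients processes could use ${projected_kb} kB," \
         "but only ${mem_available_kb} kB is available"
else
    echo "OK: limits fit in available memory"
fi
```

With the placeholder values above, 256 average-size processes would need far more memory than is available, so the check prints a warning.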
A little while ago I had to reboot a client's VM because the web server forked too many processes. The site used PHP, but the web server had not been configured for the resulting larger process size. I searched for a tool that would analyze the size of running httpd processes and project the impact of starting the maximum number of processes allowed by MaxClients or ServerLimit, but didn't find anything, so I ended up writing my own.
check_httpd_limits.pl does not use any third-party Perl modules, unless the --save/days/max command-line options are used, in which case you will need to have the DBD::SQLite module installed. It should work on any UNIX server that provides the /proc/meminfo, /proc/&lt;pid&gt;/exe, /proc/&lt;pid&gt;/stat, and /proc/&lt;pid&gt;/statm files. You will probably have to run the script as root for it to read the /proc/&lt;pid&gt;/exe symbolic links.