Shocktober 2013: C.H.U.D.

Last night Robin and I watched the 1984 movie C.H.U.D. It was definitely one of those movies I remember seeing in the video rental stores as a kid, when I would ogle the covers of the VHS rentals in the horror section, much too afraid to watch any of them, my brain making up content based on the artwork. I have since discovered, and am still discovering, that what my brain conjured was much worse than what was really contained on the magnetic tape inside those plastic cassettes.

C.H.U.D. (VHS Box Art)

C.H.U.D. is definitely one of those movies for me. Most of it can best be described as “awesomely awful”. From the music and the effects to the story and pacing, it’s just kind of all over the place. The seemingly unintentional comedy throughout the thing is pretty great. At times you have to wonder whether the cast and crew realised they were making something truly terrible and were just having fun with it, or whether everyone was truly committed to making a quality cinematic experience. There’s even a government conspiracy angle: “us vs. The Man”.

The cameos by actors who were, most likely, quite unknown at the time got me really excited, too. Sam McMurray (a.k.a. Dr. Vic Schweiber from Freaks and Geeks) and a super young John Goodman both make an appearance here.

I think that overall this was better than I expected.

I’ll just leave this here.

Shocktober 2013: Creature from the Black Lagoon

Two years ago Robin and I tried to pull off Shocktober. The goal: 31 horror films (or, at least, films somewhat related to the genre) for the 31 days in October. We only made it through 29 last time, but we’re planning on trying again this year. This is specifically a throwback to the Shocktober that used to run on Channel 50 out of Detroit when we were both kids growing up on the Ontario / Michigan border.

We have started off Shocktober 2013 this year with the 1954 film Creature from the Black Lagoon.

Below the videos you’ll find the list of films that we watched in 2011.

Continue reading

Moodle performance testing — 2.4.6 vs. 2.5.2 vs. 2.6dev

Yesterday I posted some updated performance information about currently released versions of Moodle and Tim Hunt from The Open University asked me if I could run the same tests on the latest in-development Moodle 2.6 code.

So I went ahead and re-ran my automated tests using the same setup as before. Since this was meant to be a straight comparison of the various releases of Moodle, I kept the database backend consistent across the tests. I also changed the methodology slightly by restarting the web server before each initial page load to clear out the PHP APC opcode cache, so that the first page load would be indicative of a worst-case scenario where nothing is cached in either Moodle or the opcode cache. In previous tests the web server was not restarted between tests, and I attempted to pre-warm the opcode caches before the initial testing started.

The Moodle versions compared here are:

  • Moodle 2.4.6
  • Moodle 2.5.2
  • Moodle 2.6dev

This is meant to be a preview of the performance of Moodle 2.6 as it is currently in a pre-alpha state. The code freeze is scheduled for Monday, Oct. 7, 2013, after which the QA testing phase will begin. For more details, refer to the Moodle Roadmap.

For details on the testing environment, please refer to my original post — Testing environment. The only difference between that original listing and the setup I used for these tests is that I was exclusively using MariaDB 5.5.33a_0 and PHP 5.4.20_0 as installed from MacPorts.

You can use the code I created for these tests as it’s publicly available on my Github account – moodle_perf_scripts.

Continue reading

Moodle performance analysis revisited (now with MariaDB)

This is a follow-up to my previous post on this subject. Please refer to that post for the testing environment setup and the scenarios that were tested. Below I will outline what has changed since those results were gathered and the problems I was addressing with these changes.

I also ran the original tests with MariaDB 5.5.31 and the graphs on this post are comparing 2.4.5 and 2.5.1 with Percona MySQL and MariaDB on each version of Moodle using only the default filesystem MUC caching implementation.

Reducing manual work

Previously I was running each test manually in my browser and then copying the web server error log file into a named file (to run an analysis script against later). This had two problems:

  1. The time between each request in my web browser was not consistent. Sometimes I refreshed pages immediately after the previous page had loaded, and sometimes I waited more than five minutes between subsequent page requests.
  2. I had to manage a lot of separate log files which meant a lot of possibilities for mistakes and also running the same script over and over again on each file.

I solved these problems by creating some automated Selenium test scripts (using the Firefox IDE extension) and adding a URL parameter to each individual page request, so that I could gather the statistics for each page’s requests from a single error log file containing the requests for every page.

This gave me the following improvements:

  • allowed me to put an exactly five second delay between each subsequent page load for the individual tests
  • meant that I would never skip a test or forget one of the steps (i.e. clearing Moodle caches before the first page load in a test)
  • meant that my performance analysis script could examine a log file with all of the requests and output a single file containing all of the processed results
  • the single CSV file meant that generating graphs no longer required manually copying / entering the results into a spreadsheet
  • less (zero?) chance of an error during the entire process
  • meant that the login test only includes results for the actual login request
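The URL-tagging idea above can be sketched roughly like this (the log line format and the “perftest” parameter name below are invented for illustration; they are not what my scripts actually use):

```shell
# Tiny sample error log; real MDL_PERFTOLOG lines look different, and the
# "perftest" tag parameter name is made up for this example.
cat > error_log.sample <<'EOF'
PERF: /index.php?perftest=front_cold time: 1.200 db_reads: 150
PERF: /theme/styles.php?perftest=front_cold time: 0.300 db_reads: 10
PERF: /index.php?perftest=front_warm time: 0.400 db_reads: 60
EOF

# Slice out only the requests that belong to one tagged page load:
grep 'perftest=front_warm' error_log.sample
```

Because every request carries its tag, one combined log can be split per test after the fact instead of juggling one log file per page load.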

You can use the code I created for these tests as it’s publicly available on my Github account — moodle_perf_scripts.

Continue reading

Moodle 2.4.5 vs. 2.5.1 performance and MUC APC cache store

NOTE: The graphs within this post have been updated based on the automated testing I created. The new work is explained in my new post, Moodle performance analysis revisited (now with MariaDB).

As of this writing, the team at Moodle HQ (along with lots of volunteers from around the world) is currently hard at work on Moodle 2.6, and 2.5 has already seen its first maintenance release (2.5.1). At Remote-Learner we are just about to roll out Moodle 2.5.1 for our clients, and we wanted to do some performance testing. Given how much more of Moodle is using the Moodle Universal Cache (MUC), we also wanted to test what kind of performance benefits using Sam Hemelryk’s moodle-cachestore_apc plugin might afford us.

In this case, performance is specifically limited to the server / hosting environment load. We were less interested in the effects of Javascript and end user browser page load speeds at this time.

Testing environment


Moodle was being run via a LAMP stack on a mid-2010 MacBook Pro with 8GB of RAM and a 512GB Seagate Momentus XT hybrid HDD. The web browser was being run from a separate machine on the local network.


Web stack (all versions were installed via MacPorts):

  • Apache 2.2.25_0
  • PHP 5.4.18_0 w/ APC 3.1.13_0
  • Percona MySQL 5.5.32-31.0_0
  • Mozilla Firefox 23.0 (with browser caching disabled – network.http.use-cache set to false)


The Moodle site was set up using a copy of a large production database (2,000 courses, 90,000+ course module instances and nearly 28,000 users) running Moodle 2.4.4. For the purposes of this test that database was upgraded to Moodle 2.4.5 and, for the second and third rounds of testing, upgraded to 2.5.1. DB sessions were disabled (so that session data was written to the filesystem) and the MUC configuration was all default settings (except when switching to the APC store for the third round of testing).

PHP APC was using default settings aside from allowing the use of 256MB of memory for the cache store.

Moodle had the MDL_PERFTOLOG constant set to true in the config.php file so that performance data was written out to the web server error log for each PHP script that was executed.

Testing methodology

The idea was to identify some common pages that should help give a good indication of performance across the three setups we were testing. I identified six specific pages / actions in Moodle to use as comparison points:

  1. Viewing the site index page as an unauthenticated user
  2. Logging into the system as a site administrator (from submitting the login form data to the site index page finishing rendering)
  3. Viewing the site index page as a site administrator
  4. Viewing the site index page as a student that is enrolled in 32 courses
  5. Viewing a very large course page (2696 course module instances) as a site administrator
  6. Viewing the first page of the Moodle gradebook in a course with a very large number of enrolled users (over 131,000)

Each page was loaded four times for each site configuration. The site configurations were:

  1. Moodle 2.4.5 with the default MUC configuration
  2. Moodle 2.5.1 with the default MUC configuration
  3. Moodle 2.5.1 with the default application store set to use apc as the plugin in the /moodledata/muc/config.php file

Before the first page load all of the Moodle caches were purged by running the Moodle CLI script /admin/cli/purge_caches.php. The three subsequent page loads were performed with the caches warmed up. The idea here was to simulate a worst-case performance scenario and then more of a real world scenario where the caches are warmed up.

NOTE: the web server was never restarted so the APC opcode caches were never purged during testing.

After each test the Apache error_log file was copied into a named file and then emptied before the next page request was made. This allowed us to capture all of the requests that were generated by a single page load within Moodle (i.e. images, CSS, JS, etc.) and then total the performance numbers across all of the PHP scripts executed on the web server for a given page load. The numbers below reflect those totals for each configuration.
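As a toy illustration of that totalling step (the log line format below is a simplified stand-in; real MDL_PERFTOLOG output looks different), summing the per-script numbers from one captured log could look like this:

```shell
# A simplified stand-in for one page load's worth of performance log lines:
cat > pageload.log <<'EOF'
PERF: /index.php time: 1.200 db_reads: 150 db_writes: 2
PERF: /theme/styles.php time: 0.300 db_reads: 10 db_writes: 0
PERF: /theme/image.php time: 0.100 db_reads: 4 db_writes: 0
EOF

# Total the metrics across every PHP script hit by the page load:
awk '{
    for (i = 1; i <= NF; i++) {
        if ($i == "time:")      t += $(i + 1)
        if ($i == "db_reads:")  r += $(i + 1)
        if ($i == "db_writes:") w += $(i + 1)
    }
} END { printf "time=%.3f db_reads=%d db_writes=%d\n", t, r, w }' pageload.log
```

The awk script scans each line for a label, adds the field that follows it, and prints one combined row per log, which is essentially what the per-configuration totals below represent.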

In each case there were four key metrics I was interested in:

  1. Execution time
  2. Memory usage (peak memory)
  3. DB reads
  4. DB writes

Continue reading

Gears of My Childhood – Lego

I am currently enrolled in the Learning Creative Learning course offered by the MIT Media Lab and P2PU. The assignment for the second week was to read the foreword to Seymour Papert’s 1980 book Mindstorms: Children, Computers, and Powerful Ideas. The foreword, titled The Gears of My Childhood, is about how, as a very young child, the author learned how gears worked and used them as a framework for learning new and ever more complex ideas by relating those ideas back to gears. The main point is that his passion and love for gears meant he was able to make the leap of connecting them to other concepts, such as linear mathematical functions.

Our task for this assignment is to write about what our own “gears” were when we were children. In my case, that would be Lego.

I believe I was four when I first received a rather large collection of Lego. A friend of my parents had been buying me sets as presents since I was born with the idea that when I was old enough to actually play with them I would have a very good set to use. I believe that my initial collection was just sets of blocks and not the more specific kits that were intended to build a few projects. Though I did enjoy getting those kits I would always end up mixing the pieces from those kits back into the larger pool to build my own creations.

I would come up with ideas for building things like space ships, moon bases, a monorail system, or things from movies and TV shows I liked. I remember building a fairly large version of the helicopter from the TV show Airwolf that I could fit my G.I. Joe action figures inside of. The fact that my pieces did not include highly specialised parts (unlike a lot of the sets made in recent years) meant that I had to work within the constraints of the blocks available to me in order to make something look like what I envisioned.

I also learned about symmetry and consistency. I always wanted the colours of blocks to match up. If I was building a house with four walls and I did not have enough of a certain colour block to make all the walls the same colour, I would try to match each level, or at least use the same pattern, all the way around the walls so that things were symmetrical and equal.

Later in school when I started to get projects to work on, if the project involved making something, I would always try and incorporate Lego into what I was building.

In fact, Lego is somewhat relevant to why I chose my career path. Douglas Coupland’s 1995 novel Microserfs features a group of characters who leave their jobs at Microsoft to start their own company, creating a PC game that is essentially a virtual Lego set. At one point in the story the characters are playing with the gigantic pile of Lego in their office and talking about its influence on their childhoods: things like the binary nature of how the blocks fit together, and what kind of computer code would be produced by someone who selected random block colours when building structures, as opposed to the more logical, programming-centric approach of symmetry. This novel was one of the primary motivators in my wanting to study Computer Science at university.

While I don’t believe I can use Lego in a similar manner to Papert in the sense of Lego being a framework to make it easier to relate to new concepts, it still applies as something which was important to my learning as a child and which I had great passion for. In fact, if anyone is ever looking for a gift for me, I would love one of those $500 3800 piece Lego Death Star kits. =)

Moodle Add-on Evaluations

During my second talk at the 2013 Canadian Moodle Moot in Vancouver, someone in the audience asked me about how my team at Remote-Learner evaluates Moodle plug-ins. We have a checklist that a developer must go through when trying to make a determination as to whether an add-on is suitable for installation on a Moodle site for one of our clients.

In general we will only disapprove a Moodle add-on if it fails for security reasons or it completely breaks some part of core Moodle functionality.


  • Search forums and tracker for issues with the module.
  • Check for recent updates or other indications of continuing maintenance.


  • Ensure that scripts call require_login() or require_course_login() somewhere near the top.
  • Ensure that capabilities are checked against the proper contexts using require_capability() or has_capability().
  • Look in all scripts that have directly executable code – that is not just functions or class definitions
    • Technique – use an IDE (Eclipse, Netbeans, Sublime Text, etc.) and open all files with “folding” enabled and “collapse all”.
  • Ensure that all HTML parameters are retrieved using required_param() or optional_param() and never from globals.
    • Search all code for occurrences of $_ and $HTTP for potential abuses.
    • Ensure that the data type given to required_param() and optional_param() is appropriate for the data expected. If PARAM_RAW is used, ensure it does not end up in a database call.
  • Ensure that the proper Moodle database calls (get_records(), get_records_sql()) are being used and that table prefixes aren’t hard-coded.
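The superglobal search in that checklist can be done mechanically; here is a quick sketch, where the plugin path and file contents are invented placeholders for the example:

```shell
# Placeholder plugin file containing a raw superglobal access
# (directory and file names are made up for this example):
mkdir -p local/myplugin
cat > local/myplugin/view.php <<'EOF'
<?php
$id = $_GET['id']; // should be: optional_param('id', 0, PARAM_INT)
EOF

# Flag raw $_GET/$_POST/$_REQUEST/$_COOKIE access and legacy $HTTP_* variables:
grep -rnE --include='*.php' '\$_(GET|POST|REQUEST|COOKIE)|\$HTTP_' local/myplugin/
```

Every hit is a candidate for conversion to required_param() / optional_param(), or at least a spot that needs a closer manual look.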


  • Ensure that proper plug-in structure is provided (i.e. /db/ and /lang/ directories are used).

Moodle activity modules

  • Ensure that the module includes the required files, API hooks, and database tables as defined in the official Moodle documentation
    • All of the above that are not met must be documented as some requirements are more important than others
  • Does the code require modification of any core files?
    • If so does it just contain the modified core files or does it contain patches or unmodified copies of the source files?
  • Test the module on a local development site:
      1. Install the module on the site (record any warnings or errors)
      2. Add an instance of the activity module to a sample course
      3. Backup and restore the test course to make sure that the module does not cause any errors during either part of the process and that the restored course contains the data from the module instance

Moodle blocks

  • Ensure that if the block is installing database tables that it is using the correct naming format for the database tables:
    • All of the block database tables must be prefixed with the block class name. So if the block class is called block_my_block then all of the database table names must start with block_my_block, e.g.:
      • block_my_block
      • block_my_block_table1
      • block_my_block_table2
      • etc.

Special considerations

  • If something modifies core code we need to evaluate the impact this will have, especially for ELIS clients
  • If there is cron code included, perform a test cron run to make sure it doesn’t break anything

Using Git to fix a typo in every commit of a branch

Full disclosure: how I got into this situation is bad practice on my part but how I got out of it was interesting and I thought it would be worth sharing and potentially helpful to others.


First, some background. I was creating a Moodle plugin to contain both a minified and uncompressed version of jQuery. There already exists a plugin in the community that fills this role–moodle-local_jquery–but it contains more than just jQuery and is meant to work as part of a larger component. It wasn’t exactly what I needed.

For my needs, I wanted a Moodle plugin which only contained jQuery and had the version property set to the specific release date of that version of jQuery. This allows another plugin to require a minimum version of my plugin in order to specify the minimum version of jQuery that it is compatible with. I needed this plugin to work with Moodle versions 2.2 through 2.4. So I went ahead and created a commit in a MOODLE_22_STABLE, MOODLE_23_STABLE and MOODLE_24_STABLE branch for each release of jQuery since 1.6.3 (everything listed on the jQuery download page).

After I had finished going through and creating all the branches, I wanted to test my plugin on a local Moodle 2.2 install. I cloned the MOODLE_22_STABLE branch of the repository into /local/rljquery/ and ran through the installation process. The process stopped before it had finished installing all the plugins, with no error output. I looked through my web server error log file and saw this message:

2013-02-08 11:51:48: (mod_fastcgi.c.2676) FastCGI-stderr: PHP Parse error: syntax error, unexpected end of file in /Users/jfilip/code/moodle22/local/rljquery/version.php on line 33
PHP Stack trace:
PHP 1. {main}() /Users/jfilip/code/moodle22/admin/index.php:0
PHP 2. moodle_needs_upgrading() /Users/jfilip/code/moodle22/admin/index.php:248


I opened up the version.php file in my plugin and noticed the following:

$plugin->version   = 201302040; // Release date of this version of jQuery.
$plugin_requires   = 2011120500.00;
$plugin->component = 'local_rljquery';
$plugin->maturity  = MATURITY_BETA;
$plugin->release   = '1.9.1 (2.2)' // The jQuery release version with the Moodle release in parentheses.

So, I made two very simple syntax errors that I should have caught when first writing the code. The second line has $plugin_requires instead of $plugin->requires, and the last line is missing a semi-colon at the end of the string, before the comment starts. In most cases this is a simple fix, but in my case I had already spread this problem across 30 individual commits.

At that point you would normally have to just throw everything away and re-do every single commit. But I recalled once digging into a way to apply changes across the entire history of a Git repository, so I turned to Google.

The Nuclear Option: filter-branch

There is another history-rewriting option that you can use if you need to rewrite a larger number of commits in some scriptable way — for instance, changing your e-mail address globally or removing a file from every commit. The command is filter-branch, and it can rewrite huge swaths of your history, so you probably shouldn’t use it unless your project isn’t yet public and other people haven’t based work off the commits you’re about to rewrite. However, it can be very useful.

The filter-branch command in Git was what I had remembered. The documentation for the command has an example of using it to execute a shell script to re-write information about the user who performed a commit. I thought I could try this same approach to edit the contents of a file in every commit within a branch.

Eventually I came up with the following two commands, one for each problem. Each uses a perl one-liner to do a search and replace within the version.php file for every commit in the current branch. Because git filter-branch creates a backup so you can revert its changes if the results are not what you expected, I had to use the -f option to force the second command to overwrite the backup left by the first.

git filter-branch --tree-filter "perl -pi -e 's/plugin\_requires =/plugin\->requires =/g' version.php" HEAD

git filter-branch -f --tree-filter "perl -pi -e 's/ \/\/ The jQuery/; \/\/ The jQuery/g' version.php" HEAD

Next I had to check that the changes actually got applied across the commits in the current branch, so I wanted to inspect the first commit in the branch to make sure it was modified correctly. I found the first commit using the following command:

git log --oneline | tail -n 1

bcbd130 Initial commit of jQuery 1.6.3 for Moodle 2.2

And then I wanted to look at just the relevant lines of that file to make sure the changes were made:

git blame bcbd130 -- version.php | tail -n 5

^bcbd130 (Justin Filip 2013-01-23 11:25:26 -0500 28) $plugin->version   = 2011090100; // Release date of this version of jQuery.
^bcbd130 (Justin Filip 2013-01-23 11:25:26 -0500 29) $plugin->requires  = 2011120500.00;
^bcbd130 (Justin Filip 2013-01-23 11:25:26 -0500 30) $plugin->component = 'local_rljquery';
^bcbd130 (Justin Filip 2013-01-23 11:25:26 -0500 31) $plugin->maturity  = MATURITY_BETA;
^bcbd130 (Justin Filip 2013-01-23 11:25:26 -0500 32) $plugin->release   = '1.6.3 (2.2)'; // The jQuery release version with the Moodle release in parentheses

After verifying that this worked correctly, I then applied the same commands to the MOODLE_23_STABLE and MOODLE_24_STABLE branches of my plugin repository, repeating the same check to make sure the earliest commit had been updated as well.

One thing I thought was interesting is that even though I had just modified this commit, it didn’t update the commit timestamp with the current date. For more information about how the filter-branch command works, refer to the documentation page for it.
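If you want to experiment with this technique safely before running it on real work, here is a self-contained sketch in a throwaway repository. The repository name and file contents are invented for the example, and the FILTER_BRANCH_SQUELCH_WARNING variable just silences the warning that newer versions of Git print before running filter-branch:

```shell
# Throwaway demo repository with a single commit containing the typo:
git init -q demo
git -C demo config user.email "you@example.com"
git -C demo config user.name "You"
printf '<?php\n$plugin_requires = 2011120500;\n' > demo/version.php
git -C demo add version.php
git -C demo commit -qm 'Commit containing the typo'

# Rewrite every commit on the current branch, fixing the typo in version.php:
FILTER_BRANCH_SQUELCH_WARNING=1 git -C demo filter-branch --tree-filter \
    "perl -pi -e 's/plugin_requires/plugin->requires/g' version.php" HEAD

# Confirm the rewritten commit now has the corrected line:
git -C demo show HEAD:version.php

# filter-branch keeps the old history under refs/original as a backup;
# delete it once you are happy with the rewritten result:
branch=$(git -C demo symbolic-ref --short HEAD)
git -C demo update-ref -d "refs/original/refs/heads/$branch"
```

Deleting the refs/original backup is the tidy alternative to the -f flag I used above, which simply forces the next filter-branch run to clobber it.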


In the end I was able to fix the problems I had introduced, learn something new about Git, and get my code into a state where it installed correctly and could be peer reviewed and tested within our internal process at Remote-Learner. I also re-learned that I should validate even very simple code before assuming it works and spreading it around to lots of places.



I’m starting up this site which I hope to use to share information about things that interest me. Those things will probably be related to my day job and personal projects that I am working on in my spare time.

Thanks for stopping by!