Boosting performance of a complex Drupal 7 project with Blackfire.io

By Christophe Dujarric, on Aug 23, 2015

Editor’s note: This is a guest blog post by Lee Rowlands

PreviousNext is one of Australia’s most recognizable Drupal agencies. We’ve built some pretty complex sites and were thrilled when we were approached by a household Australian name to help them build a next-generation, more performant version of their energy-comparison site.

We took a fairly bold approach to the project, focussing on some of the key pain points of the previous build, such as:

  • A slow-loading search results page
  • Difficulty for admin users adding and editing energy offers

We settled on a site for admin users to maintain the energy offers, with file-upload support, and a separate consumer site to power the offer search.

In order for both sites to access shared data, the admin site stores and updates data using AWS DynamoDB as the canonical data source.

Published energy offers (available for consumers to search) are stored in an Elasticsearch index, with access to both DynamoDB and Elasticsearch abstracted behind a generic PHP library that contains both the infrastructure concerns and the domain model.

This means the bulk of the code is decoupled from Drupal, and the two sites interact with the storage and retrieval via the domain model interfaces. This decoupling meant we could test domain logic without needing the full Drupal stack.
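
To give a feel for the shape of that abstraction, here is a minimal sketch of the kind of repository interface involved; the interface and method names are illustrative assumptions, not the actual library code:

```php
<?php

/**
 * Illustrative only: a domain-model repository interface in the
 * style described above (all names are assumptions).
 */
interface OfferRepositoryInterface {

  /**
   * Persists an offer to the underlying store.
   */
  public function save($offer);

  /**
   * Loads a single offer by its identifier.
   */
  public function load($offer_id);

  /**
   * Searches offers and returns the matching set.
   */
  public function search(array $criteria);

}
```

Both the DynamoDB-backed store and the Elasticsearch index sit behind implementations of interfaces like this, so neither Drupal site ever talks to the infrastructure directly.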

The key piece of the site functionality is the ability to enter your current energy usage and have the system show you an indicative bill for each of the offers available in your area.

Obviously this means you can’t use reverse-proxy caching technologies like Varnish, as each search relies on user-posted data feeding into the complex calculations performed by the algorithm.

Throughout the build we identified a number of places to optimise the algorithm but, in the interest of avoiding premature optimisation, we waited until the bulk of the site was built before profiling.

Enter Blackfire.io

Having had success using Blackfire.io to profile the Drupal 8 installation process, we opted to use it to help us squeeze the most out of our application.

Setting up was a breeze: we simply added the required repositories, installed the packages and completed the configuration.

Because we were interested in profiling the POST requests to the search form, where the most intensive calculations happen, we submitted the form in the browser and then grabbed the request as a cURL command from the console.

[Screenshot: the search form submission captured as a cURL request in the browser console]

From here it was straightforward to use the Blackfire CLI binary with the cURL string to profile a search submission and the algorithm.

First pass and eliminating HTTP requests

We ran our first pass to generate a baseline for comparison. The test ran on a local VM against a staging AWS DynamoDB instance with realistic data, and against Elasticsearch running on an EC2 instance. We were expecting some network latency, but found that HTTP requests accounted for 94% of the page load time. Even accounting for the fact that production would run in the same data centre rather than on a local VM, this was clearly low-hanging fruit.

The bulk of our search data was stored in Elasticsearch, but each record was associated with the retailer that offered it. Thanks to profiling, we were able to see that the bulk of the load time was spent loading each retailer entity in turn from the DynamoDB store. Given that changes to retailer details (address, logo, etc.) would be very infrequent, we added a cache layer, which was easy to do as access to this data was behind a domain-model interface. Thanks to the trusty service container, we simply created a new cached decorator for our retailer storage service.
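
The decorator itself is simple. Here is a minimal sketch assuming an in-memory cache and an interface named RetailerStorageInterface; the real implementation was registered through the service container and isn't reproduced here:

```php
<?php

// Illustrative names; the actual service and interface differ.
interface RetailerStorageInterface {

  /**
   * Loads a retailer record by ID.
   */
  public function load($retailer_id);

}

/**
 * Decorates retailer storage with a simple in-memory cache.
 */
class CachedRetailerStorage implements RetailerStorageInterface {

  /**
   * The decorated DynamoDB-backed storage service.
   *
   * @var RetailerStorageInterface
   */
  protected $decorated;

  /**
   * Loaded retailers, keyed by ID.
   *
   * @var array
   */
  protected $retailers = array();

  public function __construct(RetailerStorageInterface $decorated) {
    $this->decorated = $decorated;
  }

  /**
   * {@inheritdoc}
   *
   * Hits DynamoDB only on a cache miss; retailer details change
   * rarely, so staleness is not a practical concern here.
   */
  public function load($retailer_id) {
    if (!isset($this->retailers[$retailer_id])) {
      $this->retailers[$retailer_id] = $this->decorated->load($retailer_id);
    }
    return $this->retailers[$retailer_id];
  }

}
```

Because calling code depends only on the interface, swapping the decorator in required no changes anywhere else.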

[Screenshot: Blackfire profile, first pass]

Second pass and hashing

After implementing this cache layer, we had reduced the page load time by 90% compared to our first-pass baseline. Digging into the second pass, we found a lot of time being spent hashing data returned from the Elasticsearch index. In order to track whether any changes needed saving on the admin site, we implemented a hash calculation in the Offer object’s constructor, which could be recalculated at any time to determine whether changes existed that needed to be persisted. Because the domain model allowed storage of offers in both DynamoDB and Elasticsearch, offers loaded back from Elasticsearch were also calculating this hash, but because the consumer site is read-only this was redundant. So we slightly modified the constructor to only calculate the hash if it was missing, saving 56% of the CPU time of the already-improved page.
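
In code terms the fix looked something like the following sketch; the property names and hashing scheme are assumptions for illustration:

```php
<?php

/**
 * A sketch of the constructor change; names and the hashing scheme
 * are assumptions, not the actual domain model.
 */
class Offer {

  protected $values;
  protected $hash;

  public function __construct(array $values) {
    $this->values = $values;

    if (isset($values['hash'])) {
      // Offers denormalized from Elasticsearch on the read-only
      // consumer site already carry a hash: reuse it.
      $this->hash = $values['hash'];
    }
    else {
      // New or updated offers on the admin site need a fresh hash so
      // changes that must be persisted can be detected.
      $this->hash = $this->calculateHash();
    }
  }

  /**
   * Recalculates the change-detection hash over the offer's values.
   */
  public function calculateHash() {
    $values = $this->values;
    unset($values['hash']);
    // A hash over a canonical serialization is enough for change
    // detection; this is not a security measure.
    return sha1(serialize($values));
  }

}
```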

[Screenshot: Blackfire profile, second pass]

Third pass and calculate on write

The bulk of the algorithm for calculating the energy offer estimate requires determining how many days a given tariff overlaps with a given season. On our third pass we saw a lot of time being spent calculating these intersections. But these intersections are static: they only change when the offer changes, so they could be calculated and persisted with the offers when they were saved. After moving this calculation to write time we saw another 10% reduction in CPU time. We were now an order of magnitude faster than the baseline.
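
The intersection calculation is cheap in isolation; the cost came from repeating it for every tariff and season combination on every search. Here is a sketch of the kind of helper involved, with assumed names and inclusive-range semantics:

```php
<?php

/**
 * Counts the whole days two inclusive date ranges overlap. A sketch
 * with assumed names; the real calculation lives in the domain model.
 */
function tariff_season_overlap_days(DateTime $tariff_start, DateTime $tariff_end, DateTime $season_start, DateTime $season_end) {
  // The overlap runs from the later start to the earlier end.
  $start = max($tariff_start, $season_start);
  $end = min($tariff_end, $season_end);
  if ($start > $end) {
    // The ranges do not intersect.
    return 0;
  }
  // +1 because a range starting and ending on the same day spans one day.
  return $start->diff($end)->days + 1;
}

// At write time the result is persisted alongside the offer, so the
// search algorithm reads a precomputed value instead of recalculating
// the intersection on every request.
```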

[Screenshot: Blackfire profile, third pass]

Fourth pass and optimising DateTime creation

On our fourth pass we noticed a lot of CPU time being spent constructing DateTime objects. Many of the offer properties are timestamps, which are stored as strings and converted back into DateTime objects in the denormalizer. We were using the generic DateTime constructor in a utility factory, but in all cases we knew the stored format of the string, so switching to the more performant DateTime::createFromFormat() and statically caching a single timezone object instead of creating a new one each time saved us some more time.
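
A sketch of the factory change follows; the class name and format string are assumptions (in practice we knew the exact format the strings were stored in):

```php
<?php

/**
 * A sketch of the optimised date factory; the class name and format
 * string are assumptions for illustration.
 */
class DateFactory {

  /**
   * Statically cached timezone, created once instead of per call.
   *
   * @var DateTimeZone
   */
  protected static $timezone;

  /**
   * Creates a DateTime from a storage string of a known format.
   */
  public static function fromStorage($value) {
    if (!isset(static::$timezone)) {
      static::$timezone = new DateTimeZone('UTC');
    }
    // createFromFormat() skips the format detection that the generic
    // DateTime constructor performs on every call.
    return DateTime::createFromFormat('Y-m-d H:i:s', $value, static::$timezone);
  }

}
```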

[Screenshot: Blackfire profile, fourth pass]

Wrapping up

There are a number of profiling options on the market for PHP, but Blackfire is by far the easiest we have found to install, use and interpret. The ability to switch between CPU cycles, load time, memory use and I/O time, and to quickly pinpoint pain points, means you can yield real performance improvements in just a few passes, as this case study shows.

I look forward to using Blackfire on my next project and in my open-source contributions.

About the author

Lee Rowlands is a Senior Drupal Developer with one of Australia’s most respected Drupal agencies – PreviousNext. He maintains the Block Content, Forum, Contact and Comment modules in Drupal core, is a member of the Drupal Security Team and a major contributor to Drupal 8.

Editor’s note: We are always looking for great stories about how Blackfire helped our users find bottlenecks and improve the performance of their apps. Feel free to contact us if you are interested in sharing your experience with Blackfire.

Christophe Dujarric

Christophe is the Chief Product Officer at Blackfire. He's an engineer, but probably one of the least "tech" people in the company. He wears many hats, from product management to marketing and sales. He loves the beauty of simple solutions that solve actual problems.