The thoughts, opinions and sometimes the rants of Mark Evans

MySQL Query Analysis - Explain Extended


I have recently discovered a feature in MySQL which is useful in debugging performance issues with queries.

We all know about the EXPLAIN statement. This is used to display information from the optimizer about the query execution plan for a SELECT statement, including how tables are joined and in which order. There is also a second statement, EXPLAIN EXTENDED, which will provide even more information about what is happening under the hood.

To show you how this works, let's take a common query from the current development version of osCommerce 3.x:

EXPLAIN select distinct p.products_id from osc_products p left join osc_product_attributes pa on (p.products_id = pa.products_id) left join osc_templates_boxes tb on (pa.id = tb.id and tb.code = 'Manufacturers'), osc_products_description pd, osc_categories c, osc_products_to_categories p2c where p.products_status = 1 and p.products_id = pd.products_id and pd.language_id = 1 and p.products_id = p2c.products_id and p2c.categories_id = c.categories_id

You should get some output similar to the following

MySQL Explain Example

As you can see there is lots of useful information there to help you understand just what is happening. But it seems the MySQL engine has some other things it would like to tell us, and we get to this extra information by adding the keyword EXTENDED in combination with SHOW WARNINGS, which returns the query as it looks after transformation, along with any other notes the optimizer may wish to give us.

Here is the same query again but with some extra information.

MySQL Explain Extended

You can see there is a new column here called "filtered", which was added in MySQL 5.1.12. This column indicates an estimated percentage of table rows that will be filtered by the table condition; for example, with rows = 1000 and filtered = 10.00, roughly 100 rows are expected to remain to be joined with the following table.

Now if we send the statement

SHOW WARNINGS;

We get back exactly how the optimizer has changed the query internally before executing it:

select distinct `osc3`.`p`.`products_id` AS `products_id` from `osc3`.`osc_products` `p` left join `osc3`.`osc_product_attributes` `pa` on(((`osc3`.`p`.`products_id` = `osc3`.`pd`.`products_id`) and (`osc3`.`p2c`.`products_id` = `osc3`.`pd`.`products_id`) and (`osc3`.`pa`.`products_id` = `osc3`.`pd`.`products_id`))) left join `osc3`.`osc_templates_boxes` `tb` on(((`osc3`.`tb`.`code` = 'Manufacturers') and (`osc3`.`tb`.`id` = `osc3`.`pa`.`id`))) join `osc3`.`osc_products_description` `pd` join `osc3`.`osc_categories` `c` join `osc3`.`osc_products_to_categories` `p2c` where ((`osc3`.`c`.`categories_id` = `osc3`.`p2c`.`categories_id`) and (`osc3`.`pd`.`language_id` = 1) and (`osc3`.`p`.`products_id` = `osc3`.`pd`.`products_id`) and (`osc3`.`p2c`.`products_id` = `osc3`.`pd`.`products_id`) and (`osc3`.`p`.`products_status` = 1))

As you can see, to get better performance the engine has changed the order of the query. Using this knowledge it is possible for the developer to understand exactly what the optimizer is doing and possibly find a better way to achieve the same results.
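To try this on one of your own queries, the whole sequence is just two statements. Here is a minimal sketch using a simplified query against the osc_products table from the example above (the plan and the rewritten query will of course depend on your own schema and data):

```sql
-- Ask the optimizer for the extended execution plan
EXPLAIN EXTENDED
SELECT p.products_id
FROM osc_products p
WHERE p.products_status = 1;

-- Fetch the optimizer's notes straight afterwards, in the same
-- session; the Note row contains the rewritten query
SHOW WARNINGS;
```

Note that SHOW WARNINGS must be issued in the same session, straight after the EXPLAIN EXTENDED, as most subsequent statements will clear the warning list.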

Hopefully I am not the only person in the world who didn't know about this EXTENDED option ;-)

Jump in Camp – SQL Server Edition – Day 3


Day 3 of Jump in Camp started roughly the same as Day 2 except that I managed to get a few extra hours of sleep before having breakfast at 7.30.

The first subject of the day was about porting SQL queries to T-SQL. Transact-SQL (T-SQL) is Microsoft's proprietary extension to SQL and varies from the MySQL version of SQL which most open-source developers are used to. For information on the differences and how to convert MySQL to T-SQL, Microsoft provide a useful guide to the most common issues developers will have.

The guide can be downloaded from Guide to Migrating from MySQL to SQL Server 2008, and as you can see there are a number of differences which need to be dealt with by the developer. Most issues can be worked around, and both MySQL and SQL Server can be supported simultaneously, as long as you are prepared to think about how you write SQL statements to take the nuances into account.
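A classic example of those differences is row limiting: MySQL's LIMIT clause does not exist in T-SQL, which (as of SQL Server 2008) uses TOP instead. A small sketch, using a made-up products table purely for illustration:

```sql
-- MySQL: first 10 products ordered by name
SELECT products_id, products_name
FROM products
ORDER BY products_name
LIMIT 10;

-- T-SQL (SQL Server 2008): the same result using TOP
SELECT TOP 10 products_id, products_name
FROM products
ORDER BY products_name;
```

Offsets are messier still: LIMIT 10 OFFSET 20 has no direct TOP equivalent and is usually emulated with ROW_NUMBER() in SQL Server 2008, which is exactly the kind of nuance the migration guide covers.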

The other path open to developers is to use something like Doctrine, which will abstract these differences away from the developer. SQL Server support has also been added to Kohana and Lithium, so if you use either of those frameworks, adding support for SQL Server should be much less painful.

Just as I started to have panic attacks about just how much work it would be to get osCommerce to support SQL Server, we were shown a rather cool tool called SSMA (SQL Server Migration Assistant). This will take a MySQL database and convert both its structure and its data into a format which SQL Server can understand. I tested this tool against the current 3.x database structure being used in osCommerce, and because we hadn't really ever supported any databases apart from MySQL I expected a number of problems to be flagged up.

After running the tool, however, I was pleased to see that the entire database structure and all of the sample data we provide was migrated to SQL Server without a hitch. Kudos to the SSMA team for their hard work in making it light work for me! If you are interested in trying SSMA out you can download it from Microsoft SQL Server Migration Assistant 2008 for MySQL.

Much discussion was had about how we could best support SQL Server in osCommerce, and we decided that a new database layer was needed in order to provide support for all of the database platforms osCommerce will officially support, and also to allow extensibility, so that other platforms can be added without needing to re-write large swathes of core code. This work is currently in progress and hopefully the fruits of our labour can be committed to the Github repository soon.

After lunch came a series of interesting talks about how to optimize your T-SQL once you have it working. Most developers these days should know about the EXPLAIN statement in MySQL, which will tell you how MySQL will execute a SELECT statement.

SQL Server has a pretty awesome graphical tool built into SQL Server Management Studio which provides more than enough information to help developers decipher what is going on under the hood and get the best performance possible out of SQL Server. I won't go into details on how to use the tools I have mentioned here, but I will be blogging about this specifically very soon, with a walkthrough on converting the current osCommerce 3.x database structure to SQL Server and on using SQL Server Management Studio to discover how your T-SQL is performing.

After an afternoon of discussions and talks about optimisation of T-SQL and SQL Server in general it was time for us to prepare for a small road-trip to the Space Needle where we would be spending the evening having dinner and relaxing.

I think this had to be one of the highlights of the whole Jump in Camp, as the view from the top was amazing. The food and drink in the restaurant were also top notch, and I couldn't find fault with anything apart from the wind, which was absolutely freezing out on the observation deck.

The evening was rounded off with some good debates in the hotel bar over a few beers about what interoperability really means, and of course some really good banter about getting a Linux version of the SQL Server driver ;-)

It was then time for sleep, in readiness for the fun of Day 4.

Jump in Camp – SQL Server Edition – Day 2


Despite my initial reservations, it seems getting up in time for breakfast at 7.30am was easier than I thought. Due to the timezone difference I actually awoke at about 4am and then had trouble getting back to sleep. Luckily there was free WiFi at the hotel, so I spent a few hours checking up on what was happening in the rest of the world.

The first job for the day after breakfast was to head over to the Microsoft Redmond Campus. A short coach ride and we were there. I knew the campus was big but didn’t quite realise how big until I was really there.

Our first stop was Building 35, which is the home of SQL Server. After getting our security passes organised we were escorted upstairs to a meeting room, where we were introduced to Ted Kummert and Nigel Ellis, who gave an overview of the history of SQL Server along with some discussion of SQL Azure.

The expected question soon arose, one which I am sure makes Microsoft cringe every time someone asks it… that being "When will there be a *nix version of the SQL Server driver?". Much discussion ensued about the many reasons the open-source community and some commercial companies want it, and the many reasons it isn't yet available from Microsoft. I sure hope that one day it will be available, but at this moment in time I am not expecting anything to be done about it in the near future.

Next up we had the chance to visit the Microsoft store and visitor centre, where we got to see some of the many different products Microsoft has worked on over the years. After some shopping and playing with the Xbox Kinect it was time to head back to the hotel for some lunch before the afternoon's activities.

The first thing up in the afternoon was a surprise session which we knew nothing about, involving us drumming and playing various musical instruments (and there was even some bad dancing at one point). The idea was to get us to understand how different people working together can contribute in harmony, even if the tools they are using are different.

Once we were all loosened up it was time to crack out the laptops and get down to the real reason we were all there… coding!

The first challenge was to work on porting our various applications to either the SQLSRV (native) driver or the PDO_SQLSRV driver. This was obviously easier for the projects that had abstraction layers in place already; for osCommerce it was a little bit more work, so progress was initially slow.

Towards the end of the afternoon my brain was getting pretty fried; the timezone difference and lack of sleep were certainly taking their toll. Luckily the evening didn't involve going too far, with dinner in the hotel and then some Xbox gaming.

As with everything else at the hotel, the food they provided was excellent; I ate so much I thought I was going to burst! It certainly hindered me for the next part of the evening. Harald's flight had also landed by this time, so I was looking forward to catching up with him again.

So onto the gaming… In one of the various rooms in the hotel, two Xbox 360s had been set up with the new Kinect controller, and after spending some time getting them calibrated the game was on: the winner would get to take a Kinect controller home with them. As you can imagine there was some frantic competition, and the game settled on was "Rally Ball", which is part of the Kinect Adventures game.

This short video should give you an idea of the game

After numerous rounds and also numerous beers, the eventual winner was none other than Paul Reinheimer. By the end I was totally exhausted from all the cheering (and drinking), so it was time to hit the hay and try to do a better job of sleeping through the timezone difference, ready for what was in store for Day 3!

Jump in Camp - SQL Server Edition - Day 1


It's taken me a bit longer than I really wanted to get around to blogging about the awesome Jump in Camp which I attended recently, but here is the first of a few blog posts!

The Jump in Camp was organised by Microsoft and ran from November 15th-19th 2010. It was held in Redmond (Seattle) which, as most people know, is where the head office of Microsoft is situated.

Day 1 started off at a very early time for me, at 6am, when my alarm clock went off and I stumbled downstairs to make myself some coffee. Once alertness had been restored it was time to check I had all the important things I would need to take with me to Seattle. Passport, money and MacBook Pro were all present and accounted for, so off I headed on my trek across London to Heathrow Airport, where I was due to meet up with Harald before our 9hr 50min flight.

After going through the usual check-in and security procedures it was time to relax and wait for Harald to arrive on his flight from Germany. It all seemed so simple until mother nature intervened and Harald's flight was delayed due to fog at Heathrow earlier in the day. For a while it looked like he might just make it, but it wasn't meant to be, so flight BA49 left LHR on its 7,726 km journey without him.

The flight itself was long and fairly uneventful, apart from the person sitting on the other side of the aisle asking me how to fix various problems with his own MacBook; it seems even faking being asleep didn't slow down the barrage of questions like "What is a megabyte?" and "How do I make the screen brighter?". Eventually he ran out of questions, so I was left alone to peruse the onboard entertainment and pray for the flight to land already.

Finally it was all over; the plane set itself down at Sea-Tac airport and we were allowed to disembark and head towards passport control, which I was expecting to be a stressful experience. In the end it went fairly smoothly, and after giving up my fingerprints and a photo I was granted access to the US of A.

The next challenge was to find the shuttle to the hotel. After spending some time going round in circles I bumped into Sam de Freyssinet, and it seems that with our combined brain power, and some asking for directions, we finally found the shuttle, along with the rest of the group (Kevin Marvin, Jason Coward and Nate Abele) who would be travelling to the hotel with us.

Upon arrival at Willows Lodge I was pretty impressed (the real thing was better than the photos); I would have to say it's possibly the nicest hotel I have ever stayed in! Once I'd dropped off my bags it was time to catch up with some friends I'd made at the previous Jump in Camp: Juozas, Josh Holmes and Ashay Chaudhary.

By now I had been awake for more hours than I could manage to count, so there was only one thing on my mind… BEER! And it seems it was all organized for us to visit the Redhook Woodinville Brewery, so off we went!

After some good food and a tour of the brewery (I didn't spray everyone with beer this time round) it was time for sleep, and to try to adjust to the timezone difference, which wasn't working in my favor!

It seems Microsoft didn't listen to my feedback from the previous camp and had booked breakfast for 7.30 – 8.30am; I had no idea how I was going to make it up that early.

Details of Day 2 coming soon.

MAMP and Dynamic Virtual Hosts


I recently came across a blog post by David Coallier which detailed a cool way to deal with local development with Apache. You can read it at Wildcard /etc/hosts, an alternative. This is something I had been after for a very long time, so I decided to try it for myself.

I don't use Homebrew; instead I use MacPorts, so here is how I managed to achieve the same thing.

First things first: make sure you have MacPorts installed and working.

Then let's install dnsmasq using MacPorts, so open up a terminal shell and type:

sudo port install dnsmasq

This will download and install dnsmasq to /opt/local/sbin

Next up you need to edit the dnsmasq.conf file to set up your local DNS entries. To do this type:

sudo vim /opt/local/etc/dnsmasq.conf

I then added the following 2 lines to the configuration file

address=/com.local/127.0.0.1
listen-address=127.0.0.1

After saving the file I then needed to make sure that dnsmasq was started automatically whenever my MacBook was restarted. I did this by typing:

sudo port load dnsmasq

Next up I needed to add the new DNS server to my network configuration so that it would be used when trying to resolve domain names. I did this by going to

System Preferences -> Network -> Selected My Network Interface -> Advanced -> DNS

By clicking the + symbol and adding 127.0.0.1 to the list of DNS servers, I was all set.

I tested this was working by going back to the terminal window. First I pinged an external domain, to make sure I hadn't screwed my DNS up, and the replies came back as normal. Then I pinged one of my new .com.local addresses and it resolved correctly too.

I was now all set. Next up was configuring MAMP so that I didn't need to set up a virtual host for every site I was working on.

I did this by editing /Applications/MAMP/conf/apache/httpd.conf and adding the following lines to the end of the file:

NameVirtualHost *:80
UseCanonicalName Off

LogFormat "%V %h %l %u %t \"%r\" %s %b" vcommon
CustomLog logs/access_log vcommon

VirtualDocumentRoot /Applications/MAMP/htdocs/%1
VirtualScriptAlias /Applications/MAMP/cgi-bin

After a quick restart of MAMP I was now able to use any .com.local domain name and it would map correctly to a folder inside /Applications/MAMP/htdocs/

To test this I created a new folder inside /Applications/MAMP/htdocs/ called testsite

And added a simple index.php file which just did an echo "Hello World".

Then booting up a browser I went to http://testsite.com.local and voila, I was treated to the simple words "Hello World". My life was now complete, no more faffing around with vhosts and /etc/hosts!

Thanks to David for helping me realise just how easy it could be to set all this stuff up.

How One Line of Javascript Killed the Internet


Warning: The post below is a rant based on my own personal views and should be read as such.

There is a constant battle which rages across the Internet almost every day. Users want up-to-date content and they don't want to pay for it; content producers want to create more and more content, but can't do so without a way to monetize that content.

There have been some recent pushes towards content which is no longer free and sits behind a paywall. In some instances this works out great, when users cannot get the content you are offering from any other source; but for general news and views, a quick search in Google or Bing will generally find people an alternative source of the information they want.

This pushes content providers towards a more "Ad-Supported" model, where content producers try harder and harder to squeeze increased revenue from an ever diminishing market which is seeing lower and lower CPM rates.

This causes quite a conundrum, as getting a larger share of a shrinking revenue pool doesn't do the content providers much good. Therefore they start to look at other ways of monetization, such as behavioural targeting and ad-retargeting, in order to get better value from the "Ad-Supported" model. This is where the "one line of javascript" comes in.

One thing I hate as an end user is waiting for content to be shown to me when it seems to take forever (or at least over 6 seconds in real terms). Seeing that blank white page, and the "waiting for…" status message at the bottom of the browser window, can quite quickly see my blood pressure rise to the point where I just click the close button and try to find somewhere else to get the information I was looking for.

This can cause big problems for content producers, they want to try and get a larger share of the ad revenue but by doing that they reduce the number of users who want to visit their website due to it taking forever to get to the real content, therefore driving down their ad-revenues.

Enter the all too familiar Best Practices for Speeding Up Your Web Site, which gives advice on how to make your website faster: reduce DNS lookups, use a CDN, defer javascript calls (or move them to the bottom of the page), and so on. Quite often, however, the code used to serve ads and conduct behavioural targeting has to sit at the top of the page, due to document.write or other reasons which I won't go into here. That means there is not much the developer can do to optimize it, and the end user suffers.

This problem can only get worse as content providers try more and more ways of targeting ads at the end user and implement multiple different providers on the same pages, slowing the site down even further.

There needs to be a sense check somewhere by the content providers to decide whether a targeting / retargeting platform is really delivering value. If it isn't, and they decide to implement an alternative, then please, please, please, for the sake of the whole Internet, take off the platform that isn't working.

2011, in my opinion, will be the year of performance, where sites that perform badly will be penalised and moved further down the search engine rankings; whilst this will be great for end users, it will be a disaster for content providers.

Jump in Camp - SQL Server Edition


Something very exciting is happening next week: myself and a bunch of other people will be descending on Seattle for the latest Microsoft Jump in Camp.

Following on from the camp I attended in Zurich (Here is what happened), the next camp focuses primarily on SQL Server and how it can be leveraged by applications such as osCommerce, to give our end users a wider choice of platforms on which they can run osCommerce.

Historically osCommerce has only supported MySQL and Apache, based around *nix operating systems. This has served the project well for a number of years, but I think the time has now come to expand the range of platforms we officially support.

In the upcoming 2.3.0 release we have been working on improving support for Windows/IIS, and very soon it will be even easier to install osCommerce in that environment. The next logical step is to expand support to other database back-ends, with a logical choice being MSSQL.

I am very much looking forward to spending time with a number of very clever people from both Microsoft and other projects from the open-source community for what I expect will be an awesome event.

For more information on what is planned for the camp take a look at the Jump in Camp Website

Love Thy Project


I've been planning to write again for a while, but things such as "life" have gotten in the way. Today, however, I decided to stop finding reasons to avoid writing and actually sit down and write about something which has been bothering me for a while.

Being part of an open-source project can invoke a wide range of emotions, from feeling proud you are creating something cool for others to benefit from, to the deep despair of trying to get a release ready whilst your family/friends compete for your attention.

The majority of people in the community are pretty understanding about the dedication you bring to a project, and understand when you don't provide firm commitments on when something will be ready (a.k.a. "it will be ready when it's ready"). There are, however, a minority of people who think that you "suck" and have no idea how an open-source project should be run.

See Exhibit A:

I've lost count of the number of times I've read comments similar to the following:

  • Version X is dead, no-one cares anymore
  • You suck because you don’t know when version X will be ready
  • You should stop wasting your time, project X is doing it better

From the poster's perspective some or all of the above may be true; however, from the project's perspective some or all of these statements may be false. There are most likely people who do care about version X and who prefer that a release is done right rather than done quickly.

Enter the conundrum of what vocal users want vs what the project/silent users want.

Finding the right balance is never easy for a project; there is no magic formula to follow so that they get it right every time. All a project can do is follow what it believes is the right course of action and hope the community agrees and follows.

This can sometimes be misinterpreted as the project not listening or ignoring feedback, but rest assured that for most projects all constructive feedback is noticed and used to help formulate decisions even if a project doesn’t reply to every single post you make.

Posting comments like

  • Don’t bother your comment will likely be ignored
  • The project team don’t care so why even bother
  • I wonder if the team will even bother to read this

is a really simple way of making all of your points, however valid, irrelevant and easily forgettable.

So when giving feedback to a project, remember it is run by people and powered by motivation; keep your feedback on topic, detached from emotion and constructive, and rest assured it will be heard.

So, to sum up the theme of this post in as few words as possible.

You have your way. I have my way. As for the right way, the correct way, and the only way, it does not exist. – Friedrich Nietzsche

Jump in Camp - Final Day and Final Thoughts


The final day at Jump in Camp started with a very hungover breakfast, due to the festivities of the brewery and card games the night before. Surprisingly, quite a few people made it in time for the talk by Jason Stowe on Spatial Data, a topic I have been interested in for quite some time; Jason's talk and demonstrations turned what I already thought was a really cool thing into an awesomely cool thing!

One thing that is sorely lacking from a UK standpoint is access to free geo-spatial data for the UK, which would allow some pretty awesome tools to be built. I am going to spend some more time looking into this to see if I can find a free source; if anyone knows of any, please feel free to get in touch.

The rest of the morning and the afternoon were used for people to get back to coding and start implementing the things we had been looking into over the last few days. I spent the afternoon working with the Web Platform Installer, looking into how we can get osCommerce to install smoothly on a Windows machine; some more details of this work will be available soon.

Soon it was time for some people (including myself) to head back to Zurich Airport to catch our respective flights home. For myself it was a close call, and I made it with just 5 minutes to spare before the plane left without me!

Final Thoughts

Looking back on the few days spent in Switzerland, I am amazed by what was achieved in such a short space of time. A large part of this is down to the brilliant minds provided by Microsoft and the brilliant organisational skills of Nicole Zahnd. In my mind this camp was a resounding success and I hope they have many more of these in the future.

Allowing the open-source projects direct access to the people at Microsoft who can help them make a difference made the whole process simple, and it was a refreshing look into the heart of Microsoft. Surprisingly, not everyone there is a "borg"-style drone; they have some brilliant people working there who have a real passion for open-source software and interoperability.

I would like to say thanks to all of the people from Microsoft, Futurecom and of course all of the others who attended; together you made it an awesome few days for me. I hope to have the chance to meet everyone again at some point soon.