Coding With Pen and Paper

Producing a rough sketch on a piece of paper may seem more like the realm of a designer desperately trying to get an idea out of their head and into a physical form to share with others, but I find it can also help me develop backend code.


When starting a new bit of work I normally have a rough idea in my head of how I’m going to modularise the code and what functionality each piece will need. However, if I plunge straight into coding I can at times duplicate functionality that may be better condensed into one function or a class that other classes can extend.

By listing all the functionality I require first, I can then begin to separate these concerns into classes and functions. This helps me to spot where functionality is needed in several places, and any variations that may be required, e.g. giving the option to return data in differing formats.


My head can be a great place to start work and it all seems very clear, but putting it down on paper really helps me visualise how code can work together.

Is there a certain flow of data that must be followed? Lining up each step with a simple visual and showing how data can flow through a process and the possible outcomes at each step will highlight any areas that need special attention.


There are so many times when I’m speaking to clients or colleagues and I wish they could see the picture that was in my head. Things that seem very logical to me can sometimes not be vocalised as simply.

Sketching out the concept helps others to understand what the plan is and how all the pieces will work together.


As I move forward with the work it can be easy to start focusing on the small details of each individual piece of functionality. Having a good visual reference for the work as a whole helps me to keep considering the overall functionality and how it will all fit together.

In much the same way that testing can help you organise your thoughts on how your code should be structured, I find that using pen and paper is a great way to get my thoughts in order and keep the work on track.

Curse of the Redesign: Moving to the Live Environment

When we develop sites we start off locally and in a development environment, and we continue this way as we evolve the design and functionality.

But there comes a point where the site we’re working on reaches the stage where it will become the live site. At this point it’s preferable to move it to the live environment.


We try to match development and live environments as much as possible, but exact matches aren’t always achievable due to various constraints.

By working on a live environment as early as possible any differences in configuration such as file permissions can be worked out early on.

With PHP and Drupal moving to Composer-based workflows there are constraints on PHP versions for different modules. Even Drupal core will be enforcing a minimum PHP version. Discrepancies like this need to be spotted and resolved as early as possible to ensure the site functions as expected when launched.
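As a minimal illustration of where such a constraint lives, a composer.json can declare the PHP version a project requires (the version number here is just an example); Composer will then refuse to install on a server that doesn’t satisfy it:

```json
{
    "require": {
        "php": ">=7.1"
    }
}
```

If the live server runs an older PHP than the development machine, this mismatch shows up at install time rather than on launch day.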


As I talked about in my previous post on Handling Existing Content, we often import content and do some manipulation of it. Further to this both the client and team at Headscape will add additional content.

We don’t want to risk this being incorrectly updated or lost. Having a copy of the site on the live server gives a clear distinction as to where the true source of content lies.

File uploads are another source of content that should not be ignored. By working on the live environment you ensure that file uploads and access work correctly, as described above with regard to permissions.

It also saves the time-consuming task of moving any uploads from a development environment to the live environment before launch.

SSL Certificates

It’s rare to work on a site that does not include an SSL certificate. Even simple sites often have a contact form so data can be sent securely, and combined with search engines ranking HTTPS sites more favourably it makes a lot of sense.

I won’t go into detail here but in a previous post I detailed how to test an SSL certificate without the domain name.


Overall, working on the actual live environment provides a nice level of reassurance before you make that all-important DNS switch.

A simple thing like ensuring the server is correctly sending out emails can make launch day an enjoyable experience rather than a mad scramble to debug why your killer feature isn’t working as expected.

Curse of the Redesign: Handling Existing Content

As I spoke about in the previous post about redesigns, my work generally focuses on clients who have existing sites that need to be moved to a new Content Management System and have a new design applied to them.

Whilst this can have its advantages for design work, by allowing the front end developers far more freedom with markup, it presents some challenges for the backend.


Every database I have received has been from a different CMS; some have been MySQL and some Microsoft SQL Server, but all have required some discovery.

We do a lot of work in Drupal so I always import the old database to MySQL to make working between the two databases as smooth as possible.

Once I have the old database imported I then set about piecing together the structure. Most databases are actually fairly simple once you start to look into them, and they follow the same patterns.

A base table with a master ID
Associated tables with extra data tied to the master ID
A table joining master IDs together, e.g. blog posts and authors
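To make the pattern concrete, here is a rough sketch using Python’s built-in sqlite3 module. All table and column names are invented, but the shape mirrors the three-part structure above:

```python
import sqlite3

# Hypothetical legacy schema: base table keyed by a master ID,
# an associated table with extra data, and a join table.
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE post (master_id INTEGER PRIMARY KEY, title TEXT)")
c.execute("CREATE TABLE post_body (master_id INTEGER, body TEXT)")
c.execute("CREATE TABLE author (author_id INTEGER PRIMARY KEY, name TEXT)")
c.execute("CREATE TABLE post_author (master_id INTEGER, author_id INTEGER)")

c.execute("INSERT INTO post VALUES (1, 'Hello')")
c.execute("INSERT INTO post_body VALUES (1, 'First post body')")
c.execute("INSERT INTO author VALUES (10, 'Alice')")
c.execute("INSERT INTO post_author VALUES (1, 10)")

# Once the pattern is clear, one query reassembles a full content item.
row = c.execute(
    """SELECT p.title, b.body, a.name
       FROM post p
       JOIN post_body b ON b.master_id = p.master_id
       JOIN post_author pa ON pa.master_id = p.master_id
       JOIN author a ON a.author_id = pa.author_id""").fetchone()
print(row)  # ('Hello', 'First post body', 'Alice')
```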

Once you have found the patterns it is usually the same for all content types that you will be importing.

Tidy Up

Running an import of data gives us a great chance to tidy up both content and data.

Tidying up the content can be as simple as rules like “only import items created in the last five years”.

Or it can be more complicated. We worked on a project where the tagging had got out of hand, so we set a limit of a maximum of five tags in the new site. This meant a decision had to be made on how to handle the import of a subset of tags.

We made a list of priority tags; if any of these were linked to from an item then they were included. If not, the item went through a manual review process. The manual job actually ended up being a lot smaller than anticipated and could be worked through very quickly.
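The rule is easy to sketch in code. This is an illustrative version only; the tag names, the cap of five, and the function name are all made up for the example:

```python
# Hypothetical priority list and cap from the project described above.
PRIORITY_TAGS = {"news", "events", "research"}
MAX_TAGS = 5

def prune_tags(item_tags):
    """Return (kept_tags, needs_manual_review)."""
    kept = [t for t in item_tags if t in PRIORITY_TAGS][:MAX_TAGS]
    if kept:
        return kept, False
    # No priority tags matched, so a person has to choose.
    return item_tags[:MAX_TAGS], True

print(prune_tags(["news", "misc", "archive"]))  # (['news'], False)
print(prune_tags(["misc", "archive"]))          # (['misc', 'archive'], True)
```

Running a rule like this over the whole dataset first shows you how big the manual-review pile will actually be.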

Old data from a CMS is often littered with poor HTML that needs to be cleaned, and a programmatic import of data is the perfect time to do it. A tool such as HTML Purifier can help you clean up that old mess.
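HTML Purifier is a PHP tool; purely to illustrate the whitelist idea behind it, here is a rough Python sketch using the standard library’s html.parser. The allowed-tag list is arbitrary, and a real project should reach for a dedicated cleaner:

```python
from html.parser import HTMLParser

class TagWhitelist(HTMLParser):
    """Sketch of whitelist cleanup: keep only allowed tags,
    drop all attributes, pass text through unchanged."""
    ALLOWED = {"p", "em", "strong", "ul", "ol", "li"}

    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag in self.ALLOWED:
            self.out.append(f"<{tag}>")

    def handle_endtag(self, tag):
        if tag in self.ALLOWED:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        self.out.append(data)

def clean(html):
    parser = TagWhitelist()
    parser.feed(html)
    return "".join(parser.out)

print(clean('<p style="x">Hi <font color="red">there</font></p>'))
# <p>Hi there</p>
```

Old inline styles and deprecated tags like font simply disappear, leaving markup the new site’s stylesheet can control.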

Store IDs

Probably my top tip for importing data is to keep a mapping between the old IDs and the new IDs. This allows repeated updating of the data and also gives you a useful way to compare the two data sets.

It also allows the import of join tables, as mentioned above. If you import authors and store their old IDs, then when you import the blog posts you have a way of joining the two together. The basic logic would be:

Find old author ID for this blog post
Look up mapping between old author id and new author ID
Set the blog post to be connected to the new author ID
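The steps above can be sketched as follows; the data shapes and the IDs are invented for the example, with a simple counter standing in for whatever IDs the new CMS assigns:

```python
# Hypothetical source data pulled from the old database.
old_authors = [{"id": 7, "name": "Alice"}]
old_posts = [{"id": 3, "title": "Hello", "author_id": 7}]

author_map = {}  # old author ID -> new author ID
new_id = 100     # stand-in for IDs the new CMS would assign

for author in old_authors:
    author_map[author["id"]] = new_id
    new_id += 1

new_posts = []
for post in old_posts:
    new_posts.append({
        "title": post["title"],
        # Look up the mapping to connect the post to the new author ID.
        "author_id": author_map[post["author_id"]],
    })

print(new_posts)  # [{'title': 'Hello', 'author_id': 100}]
```

In practice the mapping would be persisted (a simple database table works well) so later import runs can reuse it.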

Choose Candidates

If the dataset is large enough to warrant migration scripts then it is also too large to check every single imported item. Work with the client in selecting some candidate items for data comparison. This will mean you have useful reference points for both you and the client to cross check the imported data.

Use Your CMS Tools

Most Content Management Systems offer an API for data creation and Drupal is no exception. You should always use these APIs for content creation rather than inserting the data straight into the database.

Run It and Run It Again

The great thing about computers is you can get them to do things over and over again very quickly in exactly the same way.

We always have to run scripts many times. This can be due to new fields being added or data needing to be formatted in a different way.

It is also highly likely that lots of new content will be added between the time you get a copy of the old site database and the new site going live. All these items can also be imported, saving the client from adding them manually.
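One way to keep repeated runs safe is to let the stored ID mapping decide between creating and updating. A minimal sketch, with a dict standing in for the new CMS’s content store:

```python
id_map = {}   # old ID -> new ID; persisted between runs in practice
store = {}    # stand-in for the new CMS's content store
next_id = 1

def import_item(old_id, data):
    """Create the item on first sight, update it on later runs."""
    global next_id
    if old_id in id_map:
        store[id_map[old_id]] = data  # re-run: update in place
    else:
        id_map[old_id] = next_id      # first run: create and record mapping
        store[next_id] = data
        next_id += 1

import_item(42, {"title": "Draft"})
import_item(42, {"title": "Final"})  # second run updates, no duplicate
print(store)  # {1: {'title': 'Final'}}
```

Because re-running never duplicates content, you can pull a fresh copy of the old database just before launch and simply run the import again.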

Get Stuck In

The first time I was faced with an unknown database it was quite overwhelming, but many years on and many different databases later I’m no longer fazed by it. My advice is to work logically through the data and always think about repeatedly running the import scripts.

Curse of the Redesign: Existing Content vs New Ideas

I can’t remember the last time I was involved in making a brand new website rather than an update or redesign. Maybe it’s just the sort of work we are suited to at Headscape and you may find differently.

A redesign often brings with it different challenges to making a site from scratch; one of these is challenging the client to provide new types of content. By sharing a vision of how their site could be, you encourage clients to improve their site rather than continue with the same old processes.

Linked Content

One of the best ways to help users find relevant content on your site is to link to other pages that contain similar content or related subjects. You’ll probably know this best from “tags” which often appear on blogs like this one.

Over the years I have encountered both ends of the spectrum for related content.

No Relationships

It may be that the current CMS does not allow it, or it may be that it was never thought of, but I have come across sites where there is no data linking any of the content.

Clients always see the benefit of doing this going forward, but it is often hard to convince them to invest time in going over old pages and adding tags or links to other content. This doesn’t have to be an arduous task done in a single sitting; encourage people to tackle a small section at a time and share the workload around.

Too Many Relationships

Over time, lists of tags can become overpopulated, and some tags end up added twice with minor spelling differences. A complete audit of tags, or other ways of linking content, should be carried out to streamline the process and make it easier for both content editors and site users to navigate between related items.

Limiting the number of tags editors are allowed to choose is also a good way to focus them on the important ones, rather than lots being selected because they don’t want to miss anything out.

Likewise, on the front end, do not output a huge long list of related content. Show ten at most, and then offer a way to “see all related content” through a pre-filtered listing page.

Social Media

Now that social media has been around long enough that the majority of people are using it, both to consume content and to communicate with other people and companies, it is an area that should not be missed.

There are many good tools for automating tweets, for example Buffer, and the major CMS providers, and I’m sure smaller ones too, offer ways to automatically post new and updated content to various social media channels.

If the client is not used to using social media as a marketing channel it may seem like a lot of effort and another area of work. But introducing them to some automated tools will hopefully show them it needn’t be as much extra work as anticipated.


Imagery

I would say the one type of content that clients find the most challenging to add to a website is imagery. This is probably because good quality images are hard to find, and it is tricky to convey exactly what a page is about in a single image.

Most people can at least write some words to populate a page, but taking good quality images is a lot harder. There are stock photography sites, but these are often clichéd and overused. At the design stage we often help the client source images to get them started.

It may not always be possible to find an appropriate image, or the page may not need one. There are two options here. The first is to ensure the design works without an image; we often have banner images as an optional field. The other is to ensure that any placeholder images fit in well with the overall design, at all screen resolutions, and will not be too distracting if used multiple times.

Trying It Out

Adding new elements to a design can really liven things up and improve the overall look and feel of a site. My only word of caution is to make sure that the client can keep producing content for the new elements. For example, if your design falls over when there are too many placeholder images and the client can’t produce new imagery, you may have to reconsider that aspect.

Drupal 6 num_rows

I’ve been working on a site powered by Drupal and needed to find the number of rows a database query returned. Simple enough, and quite a common procedure. But the Drupal database API no longer offers the db_num_rows() function.

I had a quick Google around and the answer seemed to be to run the query twice, the second time as a count query. Two calls to the database when only one is necessary? Not on my watch. The piece of code is to appear on the home page, so it will be called upon frequently.

My solution was to loop through the result set and add the formatted HTML to an array, then check if there are any items in the array: if so, loop through it and output them; if not, display a message that there are currently no results.
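The pattern itself is language-agnostic. As a rough sketch of the same idea (the row data here is invented, standing in for a query result):

```python
# Pretend query result; in the real case this comes from the database.
rows = [{"title": "First"}, {"title": "Second"}]

# Single pass: format each row while iterating the result set once.
formatted = []
for row in rows:
    formatted.append(f"<li>{row['title']}</li>")

# Branch on whether anything was collected instead of counting rows
# with a second query.
if formatted:
    output = "<ul>" + "".join(formatted) + "</ul>"
else:
    output = "<p>There are currently no results.</p>"

print(output)  # <ul><li>First</li><li>Second</li></ul>
```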

I know it’s still doubling up but I think it is a slightly more elegant solution than the double database call.

Destroying CodeIgniter Sessions when closing the browser

I’m really enjoying using CodeIgniter; it’s an excellent framework and has the side benefit of helping you structure your files neatly.

I’ve been working on my first login form using it, and whilst there are lots of tutorials going through this common process I encountered a “problem” with the CodeIgniter sessions persisting when the user closed their browser.

A bit of searching around and I came up with a few recommendations.

The first is to set $config['sess_expiration'] to 0.

This, however, has the effect of creating a cookie that actually lasts two years, due to code in the system/libraries/Session.php file.

The next was to set $config['sess_expiration'] to -1.

This just didn’t work for me, full stop. When I logged in it saved the session variable, but when I went to the next page it had disappeared.

The solution I have used came from this page.

It’s pretty quick to implement the code and you can have control over the expiration time as well as choosing whether your cookie should be persistent.

And now it’s implemented I intend to copy the Session.php file into every project I create that requires sessions. I can handle having to paste in one line of code to the config file.

Google Maps

A lot of the sites we build at work involve a Google map. Some are simple and just allow the user to choose a spot, which populates input fields for latitude and longitude. Others contain large data sets and information about each point, which is normally displayed in one of the Google map bubbles.

Using our standard library code for implementing a map and allowing the user to pick a latitude/longitude point, IE6 was only displaying a grey background with the Google copyright.

My first line of attack was to go through the JavaScript, but even displaying a basic map on the page gave the same results.

So I started from a blank page and managed to get a basic map up and working. Then stepping through my real page I discovered that something in the style sheet was causing the error. I know that IE can throw a wobbly if you don’t declare the width and height of the map div, in particular the width. But both these values were declared in pixels.

I eventually got to the bottom of it: it was a PNG fix I had been using for IE6. I know it’s not practical to make sites look exactly the same across browsers, but when you have a large portion of IE6 users it’s hard to tell them that they will receive a version of the site that doesn’t match up to what they were expecting.