Coding With Pen and Paper

Producing a rough sketch on a piece of paper may seem more like the realm of a designer desperately trying to get an idea out of their head and into a physical form to share with others, but I find it can also help me develop backend code.

Planning

When starting a new piece of work I normally have a rough idea in my head of how I’m going to modularise the code and what functionality each piece will need. However, if I plunge straight into coding I can at times duplicate functionality that may be better condensed into one function in a class that other classes can extend.

By listing all the functionality I require first, I can then begin to separate these concerns into classes and functions. This helps me to spot where functionality is needed in several places, along with any variations that may be required, e.g. giving the option to return data in differing formats.

Visualisation

My head can be a great place to start work and it all seems very clear, but putting it down on paper really helps me visualise how code can work together.

Is there a certain flow of data that must be followed? Lining up each step with a simple visual and showing how data can flow through a process and the possible outcomes at each step will highlight any areas that need special attention.

Collaboration

There are so many times when I’m speaking to clients or colleagues and I wish they could see the picture that was in my head. Things that seem very logical to me can sometimes not be vocalised as simply.

Sketching out the concept helps others to understand what the plan is and how all the pieces will work together.

Reference

As I move forward with the work it can be easy to start focusing on the small details of each individual piece of functionality. Having a good visual reference for the work as a whole helps me to keep considering the overall functionality and how it will all fit together.

In much the same way that testing can help you organise your thoughts on how your code should be structured, I find that using pen and paper is a great way to get my thoughts in order and keep the work on track.

Curse of the Redesign: Moving to the Live Environment

When we develop sites we start off locally and in a development environment, and we keep working this way as we evolve the design and functionality.

But there comes a point where the site we’re working on reaches the stage where it will become the live site. At this point it’s preferable to move it to the live environment.

Configuration

We try to match development and live environments as closely as possible, but various constraints mean we can’t always get an exact match.

By working on the live environment as early as possible, any differences in configuration, such as file permissions, can be worked out early on.

With PHP and Drupal moving to Composer-based workflows, there are constraints on the PHP versions that different modules support. Even Drupal itself will be enforcing a minimum PHP version. Discrepancies like this need to be spotted and resolved as early as possible to ensure the site functions as expected when launched.
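
As a purely illustrative config sketch (the version numbers here are made up), pinning the platform PHP version in the project’s composer.json is one way to make Composer resolve module versions against the PHP the live server actually runs:

    {
        "require": {
            "php": ">=7.1"
        },
        "config": {
            "platform": {
                "php": "7.1.3"
            }
        }
    }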

Content

As I talked about in my previous post on Handling Existing Content, we often import content and do some manipulation of it. Further to this, both the client and the team at Headscape will add additional content.

We don’t want to risk this being incorrectly updated or lost. Having a copy of the site on the live server gives a clear distinction as to where the true source of content lies.

File uploads are another source of content that should not be ignored. By working on the live environment you ensure that file uploads and access work correctly, as described above with regard to permissions.

It also saves the time-consuming task of moving any uploads from a development environment to the live environment before launch.

SSL Certificates

It’s rare to work on a site that does not include an SSL certificate. Even simple sites often have a contact form so data can be sent securely, and combined with search engines ranking HTTPS sites better, it makes a lot of sense.

I won’t go into detail here but in a previous post I detailed how to test an SSL certificate without the domain name.

Reassurance

Overall, working on the actual live environment provides a nice level of reassurance before you make that all-important DNS switch.

A simple thing like ensuring the server is correctly sending out emails can make launch day an enjoyable experience rather than a mad scramble to debug why your killer feature isn’t working as expected.
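
A minimal sketch of that kind of check, assuming you just want to know the server can hand mail off at all (this uses plain PHP mail() rather than the CMS’s own mail system, and the recipient address is a placeholder):

    <?php

    // Quick check that the live server can actually send mail.
    // The recipient address below is a placeholder.
    $sent = mail(
        'you@example.com',
        'Test email from the live server',
        'If this arrives, the live server can send email.'
    );

    echo $sent ? "Handed off to the mailer.\n" : "Could not send mail.\n";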

Curse of the Redesign: Handling Existing Content

As I spoke about in the previous post about redesigns, my work generally focuses on clients who have existing sites that need to be moved to a new Content Management System and have a new design applied to them.

Whilst this can have its advantages for design work, allowing the front-end developers far more freedom with markup, it has some challenges for the backend.

Discovery

Every database I have received has been from a different CMS; some have been MySQL and some Microsoft SQL Server, but all have required some discovery.

We do a lot of work in Drupal so I always import the old database to MySQL to make working between the two databases as smooth as possible.

Once I have the old database imported I then set about piecing together the structure. Most databases are actually fairly simple when you start to look into them, and they follow the same patterns:

A base table with a master ID
Associated tables with extra data tied to the master ID
A table joining master IDs together, e.g. blog posts and authors

Once you have found the patterns, they are usually the same for all the content types you will be importing.
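
As a rough sketch of what following those patterns looks like (the table and column names are invented, since every CMS differs), pulling a blog post together with its authors usually comes down to one join through the linking table:

    <?php

    // Illustrative only: table and column names will vary per CMS.
    $old = new PDO('mysql:host=localhost;dbname=old_site', 'user', 'pass');

    // Base table joined to its authors via the linking table.
    $stmt = $old->prepare(
        'SELECT p.post_id, p.title, a.author_id, a.name
         FROM posts p
         JOIN post_authors pa ON pa.post_id = p.post_id
         JOIN authors a ON a.author_id = pa.author_id
         WHERE p.post_id = :id'
    );
    $stmt->execute([':id' => 123]);

    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        // Each row is the post plus one of its authors.
        printf("%s by %s\n", $row['title'], $row['name']);
    }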

Tidy Up

Running an import of data gives us a great chance to tidy up both content and data.

Tidying up the content can be as simple as a clause such as only importing items created in the last five years.

Or it can be more complicated. We worked on a project where the tagging had got out of hand, so we set a limit of five tags in the new site. This meant a decision had to be made on how to handle the import of a subset of tags.

We made a list of priority tags: if these were linked to from the item then they were included; if they were not, the item went through a manual review process. The manual job actually ended up being a lot smaller than anticipated and could be worked through very quickly.
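
The tag decision itself boils down to very little code once the priority list exists — a sketch along these lines, with invented tag names:

    <?php

    // Hypothetical priority list agreed with the client.
    $priorityTags = ['research', 'events', 'policy', 'funding', 'news'];

    // Tags attached to the old item being imported.
    $itemTags = ['misc', 'research', 'gossip'];

    // Keep any priority tags, capped at the limit of five.
    $keep = array_slice(array_values(array_intersect($itemTags, $priorityTags)), 0, 5);

    // Nothing matched, so this item joins the manual review pile.
    $needsManualReview = empty($keep);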

Old data from a CMS is often littered with poor HTML that needs to be cleaned, and a programmatic import of data is the perfect time to do it. A tool such as HTML Purifier can help you clean up that old mess.
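
With HTML Purifier the clean-up step is only a few lines; the default configuration already strips broken and unsafe markup:

    <?php

    require 'vendor/autoload.php'; // HTML Purifier installed via Composer

    $config = HTMLPurifier_Config::createDefault();
    $purifier = new HTMLPurifier($config);

    // $dirtyHtml would be a body field pulled from the old database.
    $dirtyHtml = '<p>Some legacy markup<font color="red"> with unclosed tags';
    $cleanHtml = $purifier->purify($dirtyHtml);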

Store IDs

Probably my top tip for importing data is to keep a mapping between the old IDs and the new IDs. This allows repeated updating of the data and also gives you a useful way to compare the two data sets.

It also allows the import of join tables, as mentioned above. If you import authors and store their old IDs, then when you import the blog posts you have a way of joining the two together. The basic logic would be:

Find the old author ID for this blog post
Look up the mapping between the old author ID and the new author ID
Set the blog post to be connected to the new author ID
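
In code the mapping is nothing more than a lookup keyed on the old ID — a rough sketch with made-up values (in practice we keep the mapping in its own table so later runs can reuse it):

    <?php

    // Old author ID => new author ID, built up as authors are imported.
    $authorMap = [
        42 => 7,   // old author 42 became new author 7
        58 => 12,
    ];

    $oldAuthorId = 42; // old author ID attached to the blog post being imported

    if (isset($authorMap[$oldAuthorId])) {
        // Attach the new author ID to the blog post being created.
        $newAuthorId = $authorMap[$oldAuthorId];
    } else {
        // No mapping yet: import the author first, or flag it for review.
    }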

Choose Candidates

If the dataset is large enough to warrant migration scripts then it is also too large to check every single imported item. Work with the client to select some candidate items for data comparison. This gives both you and the client useful reference points to cross-check the imported data.

Use Your CMS Tools

Most Content Management Systems offer an API for data creation and Drupal is no exception. You should always use these APIs for content creation rather than inserting the data straight into the database.
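
In Drupal 8, for example, that means creating content through the entity API rather than with SQL inserts — roughly like this, where the content type and field names are just examples (in Drupal 7 the equivalent would be building a node object and calling node_save()):

    <?php

    use Drupal\node\Entity\Node;

    // Create the blog post through the entity API, not a raw database insert.
    // 'blog_post' and the field values are examples; they depend on your content types.
    $node = Node::create([
        'type' => 'blog_post',
        'title' => $row['title'],
        'body' => [
            'value' => $cleanHtml,
            'format' => 'basic_html',
        ],
    ]);
    $node->save();

    // Record the old ID against the new one for re-runs and join tables.
    $map[$row['post_id']] = $node->id();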

Run It and Run It Again

The great thing about computers is you can get them to do things over and over again very quickly in exactly the same way.

We always have to run scripts many times. This can be due to new fields being added or data needing to be formatted in a different way.

It is also highly likely that lots of new content will be added between the time you get a copy of the old site’s database and the new site going live. All these items can also be imported, saving the client from adding them manually.
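
The stored ID mapping is what makes re-running safe: each item is either updated in place or created fresh, never duplicated. A minimal sketch, again assuming Drupal 8’s entity API and example field names:

    <?php

    use Drupal\node\Entity\Node;

    foreach ($oldItems as $row) {
        if (isset($map[$row['post_id']])) {
            // Imported on a previous run: load the existing node and update it.
            $node = Node::load($map[$row['post_id']]);
            $node->setTitle($row['title']);
        } else {
            // New since the last run: create it.
            $node = Node::create(['type' => 'blog_post', 'title' => $row['title']]);
        }
        $node->save();
        $map[$row['post_id']] = $node->id();
    }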

Get Stuck In

The first time I was faced with an unknown database it was quite overwhelming, but many years on and many different databases in, I’m no longer fazed by it. My advice is to work through the data logically and always think about repeatedly running the import scripts.

MARINEXUS website approaching launch

The website for the MARINEXUS project is approaching a launch date which should be before the Christmas break.

In anticipation I created a new holding page to make things that little bit more interesting.

The website will be developed in the new year to run on WordPress, allowing users to adjust the content and add news and events when the project hits full swing.

MARINEXUS holding page

MarLIN Recording Blog re-launched

The new version of the MarLIN Recording Blog has gone live today.

MarLIN, the Marine Life Information Network, has an active community of recorders who submit sightings of marine life through the MarLIN website.

The blog, built on WordPress, will be updated with interesting and unusual sightings as well as information from MarLIN events.

The blog also updates the MarLIN Twitter account when a new post is added or edited.

Lilly & Day launched

Based in London, Lilly & Day provide garden and landscape design.

Working with designer Random Badger, I built the site on WordPress, allowing the client to make updates to text or upload new portfolio images.

The portfolio gallery uses the jQuery plugin Galleria to progressively enhance the user experience for those able to use JavaScript.

JC Surf launched

John Copley has been part of the surf scene since the sixties and now works as an agent for a large number of worldwide surf brands.

Working with Random Badger, I created a site to showcase the companies and products that JC can provide. The site is aimed at getting the user interested and encouraging them to get in touch.

The site also contains a small profile about the man himself.

URBANE Project launched

The URBANE project is a study of urban coastal defences. The website provides information about the project and those involved. The partners page has a Google map linked to an unordered list, giving two ways to view more details on the partners.

The overlay effect that I wrote for the partners list is utilised again in the image galleries.

The site is built on HTML5 with some subtle use of CSS3.