Adapt Authoring Tool Hello World

I’ve created a small series of blog posts showing how you can create a simple Adapt elearning course using the Adapt Authoring Tool. These posts assume you have already installed the Authoring Tool; if you have not, please follow the very comprehensive installation instructions.

To start off we will create a storyboard that will guide our development of the course.

Further posts will cover

Arduino Indoor Outdoor Temperatures

There have been many times when I’ve left the house and noticed the difference in temperature between inside and outside my home. I thought a fun project would be to use an Arduino to record the inside temperature and combine it with the outside temperature.

I had a couple of ideas on how to do this. The first involved two Arduinos, one inside and one outside, both using TMP36 sensors to monitor the temperatures. The second was a single Arduino inside that recorded the temperature using the sensor and then used the Open Weather Map API to obtain the outside temperature.

Both ideas had some pros and cons, but I went for option two. It may not be as accurate as having an outside sensor, but I think the data reported by the API is good enough to show the difference between inside and outside.

The code for this is available in a GitHub repo.

Parts used

Arduino Uno WiFi

TMP36 Sensor

Jumper wires

You’ll also need

A web and database server running PHP and MySQL. In the code used here this is an internal server that does not accept requests from outside of the network. If you are going to use a server open to traffic from outside of your network, you will need to add extra security measures to ensure that unauthorised requests cannot add data.
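As a hedged illustration of one such measure (this is not part of the project’s code), the request could be signed with a shared-secret HMAC that the receiving script verifies before accepting the reading. SHARED_SECRET and the signature parameter here are assumptions for illustration, and the Arduino would need to compute the same signature:

  // Recompute the signature over the submitted temperature and compare it
  // to the one sent with the request, using a constant-time comparison.
  $expected = hash_hmac('sha256', $_GET['temp'], SHARED_SECRET);
  if (!hash_equals($expected, $_GET['signature'] ?? '')) {
    header('HTTP/1.1 401 Unauthorized');
    exit;
  }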

Setup

Database

Create a database and then run the SQL that is in the data.sql file. This creates a table that stores the indoor and outdoor temperatures, along with a unique id and a created-at date and time.
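As a rough sketch, the table could look something like this when created through PDO (the column names match those used later in the code, but the exact SQL in data.sql may differ):

// A minimal sketch of the kind of table data.sql creates.
$db->exec("CREATE TABLE data (
  id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  indoor_temperature DECIMAL(5,2) NOT NULL,
  outdoor_temperature DECIMAL(5,2) NOT NULL,
  created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
)");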

Data receive page

The first part of the script includes the Composer autoload file and the settings file, and then imports Guzzle’s HTTP client, which will be used to send a GET request to the Open Weather Map API.

require __DIR__ . '/vendor/autoload.php';

require './settings.inc';

use GuzzleHttp\Client;

There is then the basic security check for the key parameter in the query string.

if (isset($_GET['key']) && $_GET['key'] === BASIC_KEY) {

We create a connection to the database, or output the error message and exit if the connection fails.

  try {
    $db = new PDO("mysql:host=" . DB_HOST . ";dbname=" . DB_NAME, DB_USER, DB_PASS);
  } catch (PDOException $e) {
    echo $e->getMessage();
    exit;
  }

The next section of code creates a new Guzzle HTTP client with the base URI set to the Open Weather Map API.

We then set the query string parameters to send and perform a GET request.

The response body is first converted to a string and then JSON decoded for easier use with PHP. This decoded response is a PHP object.

We now access the temperature returned by the API call and use this as our outdoor temperature reading.

  $client = new Client(['base_uri' => 'https://api.openweathermap.org/']);

  $params = [
    'query' => [
      'id' => CITY_ID,
      'appid' => API_KEY,
      'units' => 'metric',
    ],
  ];

  $request = $client->request('GET', 'data/2.5/weather', $params);

  $response = json_decode((string)$request->getBody());

  $outdoorTemperature = $response->main->temp;

The last part of the logic inserts a row into the database. To ensure that the temperature recordings are in the correct format we cast them as floats and use the number_format function to round to two decimal places.

  $data = [
    'indoor_temperature' => number_format((float)$_GET['temp'], 2),
    'outdoor_temperature' => $outdoorTemperature,
  ];

  $sql = "INSERT INTO data (indoor_temperature, outdoor_temperature) VALUES (:indoor_temperature, :outdoor_temperature)";

  $stmt = $db->prepare($sql);

  $stmt->execute($data);

After inserting the row into the database we return a 201 success response code to show that the row has been created.

  // Return a 201 Created response.
  header('HTTP/1.1 201 Created');
  exit;

The last part, in the else statement, returns an unauthorised response header if the query string does not contain the key parameter or if the key parameter is incorrect.

} else {
  header('HTTP/1.1 401 Unauthorized');
  exit;
}

Arduino sketch

The first portion of code includes the library for the Arduino Uno WiFi, which allows us to use the Ciao library for sending HTTP requests. We then declare some constants that will be used when sending the request. You will need to replace the SERVER_ADDRESS and KEY values to match those used on your web server for receiving the data. Finally we set our TMP36 sensor input to analog pin zero.

#include <UnoWiFiDevEd.h>

#define CONNECTOR "rest"
#define SERVER_ADDRESS "YOUR_SERVER_ADDRESS"
#define METHOD "GET"
#define KEY "YOUR_KEY"

int temperaturePin = 0;

Inside the setup function we initialise the Ciao library ready for use.

void setup() {
  Ciao.begin();
}

Inside our loop function we first calculate a five minute interval, as this is how often we want to take a recording. We then initialise the last sample time to five minutes in the past (using unsigned wrap-around arithmetic) so a recording is taken on the first pass through the loop.

  const unsigned long fiveMinutes = 5 * 60 * 1000UL;
  static unsigned long lastSampleTime = 0 - fiveMinutes;

Next we get the number of milliseconds that have passed and check if five minutes have passed since the last recording. If so, we add five minutes to our last sample time ready for the next recording.

  unsigned long now = millis();
  if (now - lastSampleTime >= fiveMinutes)
  {
    lastSampleTime += fiveMinutes;

Our next few lines of code take the reading from the sensor and convert it to degrees Celsius.

    // Get the raw analog reading from the sensor.
    int temperatureReading = analogRead(temperaturePin);

    // Convert that reading to a voltage; for a 3.3V Arduino use 3.3 instead of 5.0.
    float voltage = temperatureReading * 5.0;
    voltage /= 1024.0;

    // Temperature in degrees Celsius: the TMP36 outputs 10 mV per degree
    // with a 500 mV offset, so ((voltage - 0.5) * 100).
    float temperatureC = (voltage - 0.5) * 100;

The last part of code builds up a URI that we send to the server using the Ciao library.

    // Build the GET request URI
    String uri = "/data.php?key=";
    uri += String(KEY);
    uri += "&temp=";
    uri += String(temperatureC);

    // Send the data to the webserver.
    CiaoData data = Ciao.write(CONNECTOR, SERVER_ADDRESS, uri);

Index page and chart

The last piece of code to look at is the index.php file for the web server. This draws a line chart using the indoor and outdoor temperatures.

We first include the settings file and then make a database connection.

require './settings.inc';

// Try to connect to the database.
try {
  $db = new PDO("mysql:host=" . DB_HOST . ";dbname=" . DB_NAME, DB_USER, DB_PASS);
} catch (PDOException $e) {
  echo $e->getMessage();
  exit;
}

The SQL query gets all of the recordings in the past 24 hours and orders them oldest to newest. We then add all of these results to an array that we will use later to output as JSON.

// Initialise the array so json_encode() still outputs an array when there are no rows.
$recordings = [];

$query = $db->query("SELECT * FROM data WHERE created_at >= NOW() - INTERVAL 1 DAY ORDER BY created_at ASC;");

while ($row = $query->fetch(PDO::FETCH_ASSOC)) {
  $recordings[] = $row;
}

In the head section we set some basic styling for the chart.

  <style>

    body {
      font-family: sans-serif;
      color: #444;
    }

    .line {
      fill: none;
      stroke-width: 3;
    }

    .line__indoor {
      stroke: #ffab00;
    }

    .line__outdoor {
      stroke: #34e823;
    }

    .axis path,
    .axis line {
      fill: none;
      stroke: #000;
      shape-rendering: crispEdges;
    }

    .axis text {
      font-size: 10px;
    }

  </style>

The body has no content, just two script tags. The first tag includes the D3 library.

<script src="https://d3js.org/d3.v5.min.js"></script>

The second script tag has the logic for creating the chart.

We first create a variable to hold the JSON data that we created earlier.

    var recordings = <?php echo json_encode($recordings); ?>;

Then we set some variables for margins, width and height and append an SVG element to the body that will contain the chart.

    // Set variables for margins, width and height.
    var margin = {top: 50, right: 50, bottom: 50, left: 50},
        width = window.innerWidth - margin.left - margin.right,
        height = window.innerHeight - margin.top - margin.bottom;

    // Create an svg element and append it to the body element.
    var svg = d3.select('body').append("svg")
        .attr("width",  width + margin.left + margin.right)
        .attr("height", height + margin.top + margin.bottom)
        .append("g")
        .attr("transform", "translate(" + margin.left + "," + margin.top + ")");

The next section sets up a time parser matching the format returned from our database, and then builds an array of all of the temperatures so we can get the min and max values.

    var timeConv = d3.timeParse("%Y-%m-%d %H:%M:%S");

    var temperatureRange = [];

    // Create an array of all the temperatures so we can get the min and max values.
    recordings.forEach(function(recording) {
        temperatureRange.push(recording.indoor_temperature);
        temperatureRange.push(recording.outdoor_temperature);
    });

We then create scales for our X and Y axes using time for the X axis and a linear scale for the temperatures on the Y axis.

For the yScale domain I’ve subtracted 4 from the minimum and added 4 to the maximum so the chart has a bit of breathing room above and below the lines. We then append the axes to the chart.

    var xScale = d3.scaleTime().range([0,width]);
    var yScale = d3.scaleLinear().rangeRound([height, 0]);

    xScale.domain(d3.extent(recordings, function(d){
        return timeConv(d.created_at);
    }));

    yScale.domain([parseFloat(d3.min(temperatureRange)) - 4, parseFloat(d3.max(temperatureRange)) + 4]);

    // Create the axis.
    var yaxis = d3.axisLeft().scale(yScale);
    var xaxis = d3.axisBottom().scale(xScale);

    svg.append("g")
        .attr("class", "axis")
        .attr("transform", "translate(0," + height + ")")
        .call(xaxis);

    svg.append("g")
        .attr("class", "axis")
        .call(yaxis);

The last part of the code adds the two lines to the chart and gives them slightly different classes so we can colour the lines differently.

    // Create the indoor line.
    var indoorLine = d3.line()
        .x(function(d) {
            return xScale(timeConv(d.created_at));
        })
        .y(function(d) {
            return yScale(d.indoor_temperature);
        })
        .curve(d3.curveMonotoneX);

    svg.append("path")
        .data([recordings])
        .attr("class", "line line__indoor")
        .attr("d", indoorLine);

    // Create the outdoor line.
    var outdoorLine = d3.line()
        .x(function(d) {
            return xScale(timeConv(d.created_at));
        })
        .y(function(d) {
            return yScale(d.outdoor_temperature);
        })
        .curve(d3.curveMonotoneX);

    svg.append("path")
        .data([recordings])
        .attr("class", "line line__outdoor")
        .attr("d", outdoorLine);

Improving

There are a couple of things I’d like to do to improve this.

  1. Security. To allow the data to be sent to a public-facing server, stronger security measures than a basic key should be used, such as the HMAC signing sketched earlier in this post.
  2. Allow different timeframes on the chart, e.g. past X hours, days or weeks. For this I’d need to group the data, as showing a reading for every 5 minutes of an hour over 7 days would clutter a chart; a sketch of that grouping follows this list.
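As a hedged sketch of that second improvement (this query is an assumption, not code from the repo), hourly averages over the past week could be fetched like this, so each chart point represents an hour rather than a five minute reading:

$query = $db->query("SELECT DATE_FORMAT(created_at, '%Y-%m-%d %H:00:00') AS hour,
    AVG(indoor_temperature) AS indoor_temperature,
    AVG(outdoor_temperature) AS outdoor_temperature
  FROM data
  WHERE created_at >= NOW() - INTERVAL 7 DAY
  GROUP BY hour
  ORDER BY hour ASC");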

When to use a custom search solution

This post also appears on the Headscape site.

Website search can often be a thorny subject. Expectations of search capabilities have been set very high by search engines. The default search offered by content management systems (CMSs) is often a bit basic by comparison. For example, the results are often not the most relevant to the search term used because the CMS is only offering a simple “does this page contain this word” method.

Our discovery work for the Competition Appeal Tribunal site revealed a specialised set of search requirements. When working with a CMS, I’m a firm believer that you go with the flow of its strong and weak points, and try not to fight it. With that philosophy in mind, my instinct was not to struggle to make native CMS search fit the requirements but rather to develop a custom solution for this site.

We opted for Elasticsearch, which is described as a “RESTful search and analytics engine”, combined with the very well maintained and documented Elasticsearch PHP library.

As soon as you put some data into Elasticsearch it does a good job of returning relevant results. But where it really excels is when you begin to fine-tune it to your needs.

As an example, there is a field on the site that we needed to be able to both search as full text and sort on. This can be accomplished by duplicating the field into two different “types”, allowing one to have all of the full-text search capabilities whilst the other is used for sorting.
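As a sketch of that approach using the Elasticsearch PHP library (the index and field names here are made up, and this assumes Elasticsearch 7.x, where a keyword sub-field is the usual way to make a text field sortable):

require __DIR__ . '/vendor/autoload.php';

use Elasticsearch\ClientBuilder;

$client = ClientBuilder::create()->build();

// Map one source field two ways: "text" for full text search and a
// "keyword" sub-field for sorting.
$client->indices()->create([
  'index' => 'decisions',
  'body' => [
    'mappings' => [
      'properties' => [
        'case_name' => [
          'type' => 'text',
          'fields' => [
            'sortable' => ['type' => 'keyword'],
          ],
        ],
      ],
    ],
  ],
]);

Full text searches then query case_name, while sorts use case_name.sortable.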

On another project, for US law firm Buckley Sandler (built on Drupal), the different content types and fields did not all have the same relevance in search results. By applying a “boost” to specific content types and fields we were able to deliver the most relevant results to users. Whilst Drupal’s native search API does allow a boost to be applied to certain fields, using Elasticsearch we were able to boost specific content types, individual pages and taxonomy terms.
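Field-level boosting, for example, looks something like this with the PHP library (the field names and weights are illustrative, not from either project); boosts for content types, pages and terms are layered on with additional query clauses:

// Matches in the title count three times as much as matches in the body.
$results = $client->search([
  'index' => 'content',
  'body' => [
    'query' => [
      'multi_match' => [
        'query' => 'enforcement actions',
        'fields' => ['title^3', 'body'],
      ],
    ],
  ],
]);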

This level of customisation may not be required on your project, but if it is I highly recommend trying Elasticsearch as I’m confident it can meet your needs.

Chatbots, always available customer support or know nothings?

This post also appears on the Headscape site.

After reading Build Chatbots with PHP by Christoph Rumpel I started making some simple chatbots and quite quickly moved on to more complex interactions. It didn’t take long to get something basic up and running, which in turn, got me really excited about the possibilities.

But – there’s always a ‘but’ isn’t there – it also didn’t take long for the bots to fail to have answers, especially when I showed them to colleagues who didn’t know the precise phrases the bots were listening for.

To try and get over this I used Dialogflow for the backend as it has natural language parsing capabilities. It also has a very nice UI that will allow even non-technical users to build up conversations. After a short tutorial I found Dialogflow easy to use and very powerful for the little effort needed to start using it.

Research

As was highlighted by others asking my bot questions it did not understand, it is essential to do in-depth user research before you launch your bot. The last thing you want is to spend a lot of time and money only for people never to use the chatbot because they had a bad first experience.

Just like any web project, speaking to the people who will be interacting with your site or bot will show you interactions you could never have imagined by yourself.

Monitor

It’s highly likely that you run some analytics or monitoring code on your website to ensure users are moving through the site as expected. You should also be monitoring the chatbot’s interactions to see where users take conversations and if any of these lead to dead ends.

This could be in the form of capturing the sessions with a tool such as Full Story, or logging the conversations to a database that you can analyse at a later date.
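A minimal sketch of the logging option, assuming a PDO connection as in earlier posts and an illustrative conversation_log table:

// Store each exchange so failed or dead-end conversations can be reviewed.
$stmt = $db->prepare("INSERT INTO conversation_log (user_message, bot_reply) VALUES (:user_message, :bot_reply)");
$stmt->execute([
  'user_message' => $userMessage,
  'bot_reply' => $botReply,
]);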

Learn

Machine learning is great, but machines need help as well. Whilst Dialogflow has a machine learning aspect to it, there will more than likely be cases where a user asks a question that it cannot learn from. By spotting new forms of interaction early you can return to the research phase to see how others would phrase the same question and then feed these alternatives into the backend.

You may also find people are looking for specific areas of your site and your overall information architecture could do with some readjustment. Or better still, maybe people are coming to you looking for a service or product you do not yet offer but could do, leading to new lines of business.

Chatbots are an exciting development and offer the potential for a form of customer support that’s always on hand. But they are not something that can be set up and never looked at again; otherwise their lack of knowledge can leave users feeling frustrated and abandoning your site.

Blogging, Completed It Mate

Oddly enough I’d had this post in mind over the summer and then some gentle encouragement from Paul Boag on an episode of The Boagworld Podcast convinced me to finish writing it.

Towards the end of last year I had several ideas for blog posts rattling round my head. I made a conscious decision to actually finish the blog posts instead of leaving them as “Draft” and thinking “I’ll finish that soon”.

Off To A Flyer

When you have lots of ideas the posts just flow; you have momentum and enthusiasm and you’re a blogging master.

I even started setting myself reminders and began blogging quite regularly. Then it stopped, like it does for so many people, and I’m not sure why. I missed a week, promised I’d finish the post next week, and then all of a sudden I was out of the habit.

Why Blog

I had a think about what made me start blogging in the first place:

Had some ideas
Self documentation
Shore up my own thoughts and processes
Help others

It was good to finally finish off some of the blog posts I had stored as drafts and that made me look at other aspects of my job and think “I could post about that”.

I’m sure most developers document parts of their jobs that they don’t do so frequently. My own website seems a good place to keep those documents.

A bit like rubber ducking, the act of writing out a process really helps me straighten out my thoughts and ensure I’ve covered all aspects of a job.

It’s been really rewarding to see links coming in from drupal.org and to see some of the search terms people have used to end up on my blog, such as “Testing SSL certificates without a domain name”.

When to Blog

Maybe I aimed a little too high with once a week at the start and things started to feel forced.

My aim going forward is to commit to once a month but if I have a series of posts I’ll possibly look at releasing those a little closer together.

Ultimately I want it to be a thing I enjoy rather than forcing myself to do it.

Thanks To Anyone Who Blogs

I’ve learnt so much from blog posts over the years. From people just sharing a little bit of knowledge they’ve learnt, to people consciously documenting the process they’ve been through.

It’s also given me a huge appreciation of people like Paul Boag who have consistently produced lots of high quality content over a long period of time.

Maybe, just maybe, this blog post will make you write a blog post, and you never know, maybe I’ll find that blog post and it will help me. Even if that never happens there is a good chance it will help someone, even if it’s just you getting your thoughts in order.

Today I am Not Working From Home

I’m a full time remote worker and have a room at home set up as an office I can shut myself away in. As I’ve talked about before, I love working from home, but this week, for the first time, I decided to try remote working from a public space. I chose a local Wetherspoons as it seemed to tick all the boxes and was a short walk from home.

Setting Off

The night before, I actually made a list to ensure I’d have everything I needed. Small things like earphones, which I don’t normally use, would come in very handy in a public space. One last mental check before leaving home and I was off. Luckily it was a dry morning and having a walk first thing was very nice.

Essentials, Free WiFi and Power

A lot of places have free WiFi these days so I knew that wouldn’t be a problem, but it was only on my way that I realised I would need power to last the whole day. Rookie error. I opted for the honest approach and just asked when I arrived. Handily there was a nice table tucked away at the back with a power socket.

What Do I Do When I Need the Toilet?

Everything was going well and then I hit my first “What do I do now?” moment: when I needed to pop to the toilet, what would I do with all my stuff? By this time there was a table of retired gentlemen sat beside me. I had protectors for my belongings.

Not as Many Screen Breaks

When I’m at home I find myself getting up to make a drink a bit more often, and I normally have a few paces around the kitchen while I wait for the kettle to boil. Being somewhere where food and drink are brought to you, I found myself sitting in the same place for a lot longer than I usually would.

Planning Calls

As I mentioned in my “Make working from home work for you and your team” post I like to make myself available for quick chats with colleagues. Today I had no client calls planned and didn’t end up needing to chat with anyone else. I had my earphones and I suppose I would just have had to try and talk at a polite volume had I needed to.

How Long Can I Stay?

Then came the second “What do I do now?” moment. When I arrived I ordered food and drink, but by now I felt like I had outstayed my welcome. That was nothing to do with the staff, just my internal thoughts. So I ordered another drink and a small bite to eat.

Overall Productivity

When lunchtime came I made sure to leave and have some time away from the “desk” just as I do at home. I got plenty done away from home and don’t feel it impacted my productivity at all. It was nice to be somewhere different and it adds to my enjoyment that I’m lucky enough to have a job where I can do such a thing.

Would I do it Again?

Yes, but certainly not every day. Ignoring the cost of buying food and drink so I feel like I can stay, I do like being in the home office. Maybe I shall save it as a once a month treat.

Fractal – A Backend Developer’s Perspective

One of the great things about attending conferences is sitting in on a talk and learning about something you didn’t even realise you wanted to know about.

During DrupalCon Vienna I saw a talk by Anton Staroverov and Tassilo Groeper from Wondrous titled Decouple your Twig from PHP and make Frontenders happy! In the talk they showed a tool called Fractal which has been developed by Clearleft. After seeing the presentation I felt this would be a very useful tool so set about looking into it further.

Easy to Use

That evening I started to try out Fractal and within 10 minutes I had a build up and running and had added a component to it.

Since then at Headscape we have switched the default templating language to Twig to tie it into our Drupal 8 development. We’ve also used its static output to produce pattern libraries for clients.

Reference Point for All

Not only is Fractal a useful tool for frontend developers, as mentioned above; we also use it to show clients both individual components and components working together.

As a predominantly backend developer I also find it useful when, for example, I need to add a button. I can easily find all the button options in the nice frontend UI and see what classes I need to add.

Helps Building Templates and Reusable Components

Having one place where we can list all of the components and then using this as the actual templates for the production build is very useful for me.

During development, whether that is the early stages or updates in the future, I find having a reference point of components speeds up the process.

For example if we have a new content type I can look at the various view modes we already have to see how I should be outputting the content.

Given how quick it is to get something up and running with Fractal, I’d encourage you to try it out. It might take a little bit of settings adjustment, and maybe even splitting your templates and CSS out into more componentised methods, but I think the benefits you’ll gain will be worth it.

The Importance of Real Data

At the start of a project there are many different objectives, and everyone involved has their own sense of the priorities or tasks they wish to tackle first. For me that is generally sorting the backend into a nicely organised structure that will allow flexibility in the output both now and in the future.

A big part of this is getting real data in so the front end output is as close to final content as possible. This allows us to realistically analyse how the content and design will work together.

Does it Fit

As flexible as the web is with its layouts, there are still times when a design will only allow a certain amount of text or number of items to appear before it looks squeezed. For example, text in a sidebar can be excellent if it’s short and snappy. However, if the actual content is a long paragraph it will be very hard to read.

I’ve also experienced this with titles where they can be in a large font size. Taking into consideration how that works over multiple lines is a useful exercise.

Lining Up

Adding columns or boxes to a layout is a useful way of defining content areas. Without real data these areas normally contain filler text, occasionally the same filler text, which can mean boxes are the same height as each other.

On many occasions I’ve seen a design change as soon as real content is output, as one box contains many more lines than the other, causing them to no longer match in height.

There are ways to handle this with JavaScript and, increasingly, with CSS, but it is still a consideration at this point in time.

Missing Anything

This is normally the realm of imagery, as it is the hardest part of content to populate. Instead of just using test images, take the time to ensure the design works with no images, or with a placeholder repeated multiple times.

CMS Tools

Content editors often need to add classes or styles to text entered in WYSIWYG editors to help the front end presentation. By their nature these editors don’t, or can’t, always output perfect markup. By using them to populate the site at design and development time you can save a lot of headaches by working with the markup they do produce rather than battling against it.

Overall

On every project I find the use of real data and content smooths out the development process, as all involved parties can see a “real world” example of how the site will function with its content.

Drupal 8 Workflow Notifications with Rules Part 3

In previous posts I’ve covered enabling and editing a Workflow and then sending an email when a new content moderation state has been saved.

In this post we will complete the loop by sending emails when a content moderation state has been updated.

In this example an editor will publish the node and the rule will email the node creator to tell them it has been published.

Head to the Rules admin page /admin/config/workflow/rules and click Add reaction rule

This time we want to choose After updating content moderation state from the React on event select.

Adding a rules event for updating content moderation state

This time round we will need two conditions.

  1. Check the unchanged moderation state
  2. Check the new moderation state

Click the Add condition button and then choose Data comparison from the select.

For the data to compare value you can either use the data selection or enter

content_moderation_state_unchanged.moderation_state.value

Then for the data value enter review

This is the moderation state the node was in before it was saved.

Now click save.

Moderation state unchanged data comparison condition

Add another data comparison condition, and this time enter the following:

Data to compare value

content_moderation_state.moderation_state.value

Data value

published

New moderation state data comparison condition

Next we need to fetch the entity so we can use its values in our email.

Back on the edit page for this rule click Add action and choose Fetch entity by id from the select.

For Entity type enter node and for the Identifier value enter

content_moderation_state.content_entity_id.value

Then click save.

Now we can send an email to the node creator.
Click Add action and choose Send email.

In the Send to field use direct input and enter

{{entity_fetched.uid.entity.mail.value}}

This is a token that gets replaced with the user’s email address.

For subject we can just enter something appropriate such as Published Notification

For the message we will use the direct input mode and a token that provides the entity title:

Your content {{entity_fetched.title.value}} has been published.

Email action

Save that and then save the rule.

As is often the case with Drupal it’s worth clearing your cache before testing.

Rules is still in its infancy for Drupal 8, but I feel it provides enough functionality and is well enough supported that it can be included in a production site.

Changing the Default Content Moderation State in Drupal 8

I have been working with the Content Moderation module in Drupal 8 and one slight niggle we came across was the default option for the Moderation state.

If the current state was In Review we would like the Change to option to also be In Review rather than Draft.

Current default moderation state

The desired default state option

This means the default is to keep the entity in its current state rather than switching to a new state. Switching to a new state is then an active task for the editor.

I did this by implementing hook_form_alter() in a custom module.

/**
 * Implements hook_form_alter().
 */
function my_module_form_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  if (array_key_exists('moderation_state', $form)) {
    // Load the node being edited so we can read its current state.
    $node = $form_state->getFormObject()->getEntity();

    $moderationState = $node->get('moderation_state')->getValue();

    // When creating a new node there is no state yet, so leave the default.
    if (isset($moderationState[0]['value'])) {
      $form['moderation_state']['widget'][0]['state']['#default_value'] = $moderationState[0]['value'];
    }
  }
}

In this function we first check if the form contains the moderation_state key.

Next we load the node so we can access its properties.

The moderation state is then retrieved from the node.

The last part is to check if the moderation state has a value. If we are creating a node this will not be set.

In the if statement we then set the default value to the current state.