Preparing for GDPR: Data Mapping

One of my responsibilities at Accelerate Places is to ensure that the business (across all of its limited companies) is compliant with the changes to data protection law under GDPR.

I’ve found a useful tool from the Isle of Man Information Commissioner: their Know Your Data, Mapping the 5 W’s (PDF) toolkit.

I’m currently using the toolkit to create an inventory of all the personal data processed. Once that is done, I’ll be able to map the processing of personal data and work through which is the most appropriate legal basis for the processing.

Building the next generation network with Okta

Last week, I was asked by Okta to speak at Okta Forum London about our implementation of their Single Sign-On and Multi-Factor Authentication products. We had considered Okta when we first implemented identity management, at the beginning of our journey into the cloud; however, we didn’t sign with them until some time later, having watched their product develop at great speed.

I was lucky to be introduced by Sami Laine, who is Director of Technical Marketing at Okta. He gives an excellent introduction to multi-factor authentication and how it is used to secure access to applications. My talk begins approximately 15 minutes and 7 seconds into the video below.

I was asked to tell a story and I hope I didn’t disappoint. I gave a flavour of the journey that we have been on, moving from thin terminals and Lotus Notes to embracing the cloud and minimising on-premise infrastructure. Having a single identity was key, because friction would mean that users would forget their passwords, forget which system they were supposed to be using or look for alternatives, and that’s how shadow IT takes hold. We work hard to provide a great experience and have built an IT estate we are proud of, that works for our growing business.

Our Okta implementation was painless. There was a lot of preparation involved, but the actual roll-out was achieved overnight by three members of staff, with no professional service time. That is not just a testament to my team, but also to the documentation that Okta provides.

We have implemented more of Okta’s features as time has gone on, protecting our applications with Multi-Factor Authentication and increasing automation in on- and off-boarding. Identity management is core to our ongoing strategy of securely enabling staff to work efficiently and effectively, wherever they may be, and I am pleased to have been able to share our journey with Okta’s customers and prospects.

A Multilingual WordPress Wedding

I’ve been building and hosting websites on and off for the last fifteen years, but, despite speaking French and German to degree level, it’s only recently that I’ve had to think about how to cater for non-English speakers.

As is the fashion for engaged couples, I built a wedding website with my fiancé. A lot of our guests were coming from overseas and so we decided that we wanted our guests to RSVP online and be able to access a lot more information than we could fit on the invite. Weddings are stressful, travel is stressful and we wanted to do as much as we could to make life easy for our guests. That and we didn’t fancy collating emails, written RSVPs or managing a spreadsheet.

There is an excellent RSVP plugin for WordPress, which lets you require guests to enter a code identifying them before they RSVP. We used a spreadsheet to generate random four-letter codes, then wrote each code onto the RSVP slip that we sent with the invite.
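We used a spreadsheet, but the same code generation can be sketched in a few lines of Python (the function name and the batch size of 50 are illustrative, not from our actual setup):

```python
import random
import string

def generate_codes(count, length=4, seed=None):
    """Generate unique random uppercase codes, one per invite."""
    rng = random.Random(seed)
    codes = set()
    # keep drawing until we have enough distinct codes
    while len(codes) < count:
        codes.add("".join(rng.choice(string.ascii_uppercase)
                          for _ in range(length)))
    return sorted(codes)

# e.g. codes for 50 invites
codes = generate_codes(50)
```

With four uppercase letters there are 456,976 possible codes, so collisions with a guest guessing someone else’s code are unlikely at wedding scale.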

I should say that my fiancé is Spanish. This meant we had guests who spoke only Spanish. We could have duplicated every page and simply had a complicated menu, but that seemed inelegant, would provide a fairly poor user experience and didn’t offer guests a means of RSVPing entirely in Spanish.

To solve this issue we used Takayuki Miyoshi’s Bogo plugin. Bogo builds on the localisation functionality in WordPress core. Every post we created was assigned a language (either English or Spanish) and Bogo handled displaying the correct version to each visitor, based on their language preference. When plugins are called, Bogo uses the GNU gettext translation files included with the plugins to display the correct language. We found that editing these files (with Poedit) allowed us to customise the messaging in the RSVP flow, as some of the standard wording was a bit informal.

For menus, there is no need to create two: you simply add all the pages to one menu, in the order you want them to be seen, and Bogo filters the menu at runtime to display only content relevant to the visitor. The plugin even includes a handy language switcher, which let visitors whose language was incorrectly sniffed find content in their language (I wrote a function based on the advice in the Automatically switch WordPress language based on User’s Browser Language support forum thread).
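The language sniffing essentially comes down to parsing the browser’s Accept-Language header and picking the best supported match. Our WordPress function was PHP, but the logic can be sketched language-agnostically (here in Python; the defaults of English and Spanish mirror our site, everything else is illustrative):

```python
def preferred_language(accept_language, supported=("en", "es"), default="en"):
    """Pick the best supported language from an Accept-Language header."""
    prefs = []
    # header looks like "es-ES,es;q=0.9,en;q=0.8"
    for part in accept_language.split(","):
        piece = part.strip()
        if not piece:
            continue
        if ";q=" in piece:
            tag, q = piece.split(";q=", 1)
            try:
                weight = float(q)
            except ValueError:
                weight = 0.0
        else:
            # no explicit quality value means q=1.0
            tag, weight = piece, 1.0
        prefs.append((tag.strip().lower(), weight))
    # try languages in descending order of preference
    prefs.sort(key=lambda p: p[1], reverse=True)
    for tag, _ in prefs:
        primary = tag.split("-")[0]  # "es-ES" -> "es"
        if primary in supported:
            return primary
    return default
```

A Spanish browser sending `es-ES,es;q=0.9,en;q=0.8` would be served the Spanish posts, while anything unsupported falls back to English.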

All in all, we were very pleased with how straightforward it was to use Bogo and WordPress to create a multilingual site. Where we ran into problems, there is a lot of help content on the Bogo support forum.

For others considering building a wedding website using WordPress, multilingual or otherwise, we would recommend the Quick and Easy FAQs plugin, which, because every question is stored as a separate article, allowed us to provide translations using Bogo. The plugin also offers a variety of ways to display the FAQs, which allowed us to get creative.

Now it’s time to get back to our wedding planning Kanban board.

Notes on Let’s Encrypt

I’ve been using SSL certificates from Let’s Encrypt for a while now and though there’s no direct integration between certbot and nginx, it was all relatively straightforward to configure.

I’ve created a small nginx config file that can be included in any server block where I’m using LE certificates. This means domain validation can use the webroot challenge.

location ~ /.well-known {
    root /path/to/challenges;
    allow all;
}
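For context, this is roughly how the snippet gets wired into a site: the include goes in the plain-HTTP server block so challenge requests are answered before any redirect fires. The snippet filename, domain and paths here are placeholders, not my actual config.

```nginx
# hypothetical server block; filenames and paths are placeholders
server {
    listen 80;
    server_name example.com;

    # answer ACME webroot challenges from the shared challenges directory
    include snippets/letsencrypt.conf;

    # send everything else to HTTPS
    location / {
        return 301 https://$host$request_uri;
    }
}
```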

To request a new certificate for a new domain (using letsencrypt-auto, but that can be swapped out for certbot-auto depending on the client version) run the following as root, replacing the text in brackets:
/opt/letsencrypt/letsencrypt-auto certonly --agree-tos --email <email> --webroot -w </path/to/challenges> -d <domain>

If the certificate should cover multiple subdomains, that’s possible by specifying -d <domain> multiple times. The caveat is that, if you later stop making the .well-known directory available on just one of the subdomains on the certificate, your renewal will fail (as I found out).

To set up auto-renewal, add the following to the root crontab:
30 2 * * 7 /opt/letsencrypt/letsencrypt-auto renew -q --post-hook "service nginx reload" >> /var/log/le-renew.log
This runs the renewal job at 2:30am every Sunday, reloads nginx after a successful renewal, and logs the output to /var/log/le-renew.log.

To avoid the problem of renewal failing after you decommission a subdomain, it’s very straightforward to request a new certificate covering only the domains still available. Because Let’s Encrypt clients will only automatically renew certificates that are nearing expiry (which also keeps you from hitting the rate limits), you will likely need to force the renewal using the following command and then reload the web server:
/opt/letsencrypt/letsencrypt-auto renew --force-renewal --allow-subset-of-names

An Ode to APIs

The importance of data-driven decisions throughout the enterprise cannot be overestimated. Understanding the data flowing through an organisation is key to providing customers with products that they will love. This has long been known and understood. In which case, why does providing access to the data stored within our business applications so often seem to be an afterthought?

Is it too much to ask to be able to query the data stored within our application landscape without having to download a CSV and rely on q? If you haven’t met q, it is, according to its website, “a command line tool that allows direct execution of SQL-like queries on CSV [files]”. As a tool to quickly calculate totals, chop and slice data and find interesting patterns it is a lifesaver, but using the command line (or worse, a spreadsheet) isn’t the way I want to get access to data.
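To illustrate the kind of ad-hoc querying q enables, here is a rough Python equivalent that loads CSV rows into an in-memory SQLite table and runs SQL over them. The data and column names are invented, purely for illustration:

```python
import csv
import io
import sqlite3

# invented stand-in for a CSV export from a business application
CSV_DATA = "customer,amount\nacme,10\nacme,15\nglobex,7\n"

def sum_by_customer(csv_text):
    """Run a SQL aggregate over CSV rows, q-style."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)",
                     [(r["customer"], float(r["amount"])) for r in rows])
    # same shape of query you would hand to q on the command line
    return dict(conn.execute(
        "SELECT customer, SUM(amount) FROM sales GROUP BY customer"))
```

Handy for a one-off total, but it makes the point: every such script starts with a manual CSV export, which is exactly the step an API would remove.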

In 2016, providing a small suite of reports, such as CSV exports that lack finer detail and graphs that cannot be customised, will no longer be considered enough. More and more SMEs will be investing in data warehouses and data visualisation tools, meaning access to the underlying data becomes increasingly important. Furthermore, I can only believe that organisations’ requirement to integrate applications will become even more pressing, meaning process owners will be looking to vendors for the ability to access structured data using APIs and to call out to other applications and integration platforms using webhooks and outbound messages.

I am lucky to work with vendors who provide APIs to access the data stored within their applications. Jive gives me more information in its reporting API than I know what to do with; Kissflow supports webhooks to integrate workflow processes with other applications. All it takes is a little imagination to make data flow through back-office applications, support decision-making and reduce double typing.

Free your data, bring on the APIs! That’s what I say.