
How to add the VideoJS Yarn package to Rails

It’s not surprising that a web tutorial should follow the old ‘how to draw a duck’ format… “draw two circles for the head and body, draw two circles for the eyes, draw the rest of the duck!” Case in point: the videojs project doesn’t elaborate beyond “npm install…”, which is fine if you’re using npm every day. If you’re new to the process, or are using something a little more complex like Rails’ Yarn/Webpacker integration, here’s how to add the VideoJS Yarn package to Rails.

Here’s how to get going, and how to get the new Video.js theme pack installed while you’re at it. I’m using the Jumpstart Pro provisioning of Yarn, Webpacker etc., which I won’t touch here. So I guess you’ll have to draw the rest of that duck yourself! GoRails has some good webpacker content.

Note that this project uses the (weird) new normal of putting stylesheets within the javascript structure which is described in this Tailwind tutorial (similar to the way Jumpstart is built).

Once you have Yarn functioning as part of your project it’s quite simple. Just as a primer let’s look at the npm installation guide on the Mux themes page. We’re not going to copy this directly…

$ npm install --save video.js @videojs/themes

// Base Video.js theme
import 'video.js/dist/video-js.css';

// City
import '@videojs/themes/dist/city/index.css';

That needs to be split into three steps:

  1. install the npm components using Yarn
  2. amend the import statements and place them in your application.scss file
  3. add the unmentioned javascript components into your application.js file

1. Yarn

Add the npm components using Yarn (you’ll recognise this from the npm install line earlier).

yarn add video.js @videojs/themes

Some warnings in the output are to be expected; take a note of them in case you see errors later.

2. Webpacker’s application.scss

You’ll see a series of ‘@import’ statements in your file already. We need to convert the “import” lines from the tutorial to “@import”.

@import 'flatpickr/dist/flatpickr.css';

@import 'video.js/dist/video-js.css';
//@import '@videojs/themes/dist/city/index.css';
//@import '@videojs/themes/dist/fantasy/index.css';
//@import '@videojs/themes/dist/forest/index.css';
@import '@videojs/themes/dist/sea/index.css';

You’ll notice at this point, if your server is running, that the CSS files will recompile. Any errors will appear at this time. (I’m using Jumpstart’s Foreman-managed processes.)

3. Webpacker’s application.js

Finally, in application.js we need to require the ‘video.js’ package’s script. Note the styles use @videojs and the script uses video.js. No idea why it’s different; if you have any clues or wildly inaccurate guesses, leave them in the comments.

// Rails functionality
window.Rails = require("@rails/ujs")

// VideoJS playback script (package name differs from the @videojs styles)
require("video.js")


Another webpack compile and a browser refresh, if configured. And that’s how to add the VideoJS Yarn package to Rails 6 and webpacker!

Test it!

Here’s a test using a sample m3u8 (HLS) stream, the same format used by most modern live streams, after adding the VideoJS package via webpacker.

<video-js controls autoplay preload class="vjs-theme-sea" id="player-id" data-setup='{"liveui": true,"fluid": true}'>
  <source src="" type="application/x-mpegurl">
</video-js>
A screenshot showing VideoJS content playing back with the ‘Sea’ theme

Here’s the sample video loaded alongside some teaser text for my upcoming product. If you’re interested in asset management and a pay per view live streaming platform for sports teams and leagues, sign up to get info!

While you are here

I’ve recently launched social media platforms for my company on both Facebook and LinkedIn. I’d love it if you would like/follow/share the profiles if you can. My focus is heavily on live streaming solutions but as you can tell from my recent blog posts web development is an ongoing part of my business.


Thanks for reading! If you have any questions drop a comment below, or email me.


Databases: MySQL to PostgreSQL conversion

This isn’t an attempt to write an authoritative or exhaustive guide, but it should help you get a MySQL to PostgreSQL conversion done. More importantly, next time I have to do it I won’t have to google so much.

Here’s the scenario: an old Rails 4/5 app is sitting on a server which doesn’t cost a lot to run, but it’s the only thing left on the server. Ruby and Rails have both become somewhat more efficient over the years. Moving it onto my Dokku Rails 6 server would only add undesired RAM requirements.

One proposed solution is to drop the old site onto Heroku and let their magic beans keep it operational until such time as the site can be upgraded or, more likely, migrated and retired. Two snags:


  • Heroku uses PostgreSQL by default, and MySQL only comes as a premium add-on. You can get a whole 5MB database for free… <sarc/>.
  • Getting old databases from MySQL into PostgreSQL format for migration isn’t straightforward as there is no “export to Postgres” option in Sequel Pro, for example.

Step 1: Set up Postgres locally

Setup on macOS is easy using the Homebrew package manager; if you don’t have it installed, check out the Homebrew homepage.

# install postgres, it'll likely
# update Homebrew itself at the same time
brew install postgresql

Have a read of the install notes, which may alter the following and will tell you how to run the server as an on-boot service. The command below will fire up Postgres for now.

# then start the database
brew services start postgresql

Handily, the brew installer creates an account in your user’s name with a blank password.

Step 2: Set up a local database

This article elaborates on the entire PSQL setup process, handy if you want to know more.

Log in to the psql prompt by typing ‘psql postgres’:

➜  ~ psql postgres
psql (12.4)
Type "help" for help.


Then, to find which users exist and confirm yours has been created as expected:

postgres=# \du
                                   List of roles
 Role name |                         Attributes                         | Member of
-----------+------------------------------------------------------------+-----------
 david     | Superuser, Create role, Create DB, Replication, Bypass RLS | {}
 postgres  | Superuser, Create role, Create DB                          | {}

Next, create a database (if you need to create a custom user first, don’t skip the steps covered in the article above):

CREATE DATABASE databasename;

Step 3: install the migration tool

pgloader is a command-line tool for doing exactly what we’re looking for. Installation for Linux via apt, or for any platform via Docker, is covered in the GitHub repo. With our Homebrew setup it’s easy:

brew install --HEAD pgloader

Step 4: Migrate your database


Scaleway (not an affiliate link), a hosting company with some interesting offerings on their platform, has an article on the migration element.

The format is as follows:

pgloader mysql://mysqluser:password@<mysql-server>:<mysql-port>/<source-database> postgresql://<pgsql-role>:password@<pgsql_server>:<postgresql-port>/<target-database>

So if your MySQL data is in your local database here’s a sample:

pgloader mysql://localuser:localpass@localhost:3306/mysqldatabasename postgresql://david:@localhost:5432/psqldbname
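A side note on those connection strings: a password containing characters like @ or : will break the URI unless it’s percent-encoded. A quick Ruby sketch (hypothetical credentials) of one way to assemble a connection URI safely:

```ruby
require "uri"

# Hypothetical credentials -- percent-encode the password first so special
# characters can't break the pgloader-style connection URI.
password = URI.encode_www_form_component("p@ss:word")
mysql_uri = URI::Generic.build(
  scheme:   "mysql",
  userinfo: "localuser:#{password}",
  host:     "localhost",
  port:     3306,
  path:     "/mysqldatabasename"
)
puts mysql_uri  # => mysql://localuser:p%40ss%3Aword@localhost:3306/mysqldatabasename
```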

Your database is probably on your VPS, firewalled from public port access (at least it should be), so you’ll need to use an SSH tunnel.

Information on how to test your local connection (and more) can be found here, but generally the tunnel command follows this format:

ssh -L tunnelport:thismachine:localmysqlport remoteuser@remoteserver

In practical terms, executing something like the following lets you access your remote database’s port 3306 using local port 3307.

ssh -L 3307:localhost:3306 remoteuser@remoteserver

Now that we have our tunnel in place we make a very minor adjustment to the pgloader script:

# replace port number 3306 with 3307
pgloader mysql://localuser:localpass@localhost:3307/mysqldatabasename postgresql://david:@localhost:5432/psqldbname

Et voilà, mysql to postgresql conversion complete.


The whole point of this is to get your legacy data into your new Heroku database. The Heroku documentation on importing to Postgres is the place to start.
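One thing the conversion alone doesn’t cover: the Rails app itself also needs to switch adapters. A sketch of the config change (keys assumed; Heroku injects DATABASE_URL in production):

```yaml
# config/database.yml (sketch) -- switch the adapter to PostgreSQL.
# On Heroku the connection details come from the DATABASE_URL env var.
production:
  adapter: postgresql
  encoding: unicode
  url: <%= ENV["DATABASE_URL"] %>
```

You’ll also want to swap the mysql2 gem for pg in the Gemfile and run bundle install before pushing.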


Resolving X-Frame-Options DENY issues – Ghost, WordPress and More

TL;DR if you are having issues with X-Frame-Options DENY appearing in your logs, you’ll probably want to change the setting to X-Frame-Options SAMEORIGIN for the site or your overall web server settings.

I recently rebuilt and relaunched my company website using the website building platform Ghost.

Aside: it’s been a long time since I’ve posted here! Most of the content on this website is a distilling of my ‘just google it’ history. More recently there’s been a permanent job and a freelance focus on live streaming both of which are a bit more kept behind the curtain.

One quirk needing to be dealt with is that, like many modern services, Ghost focusses on what it’s good at (blog management) and, for my needs, requires a secure reverse proxy in front of it. Until Caddy takes over the world, Nginx is the option of choice. Ghost knows how to interact with Nginx and automatically sets up a LetsEncrypt certificate when you’re getting going. Everything works pretty smoothly.

However, the home page dashboard view is a “View Site” interface, for previewing updates etc. My Nginx configuration blocks this sort of website-in-website previewing to avoid bad actors hijacking hosted images on any client websites. This is what it looks like:

Screenshot of Ghost admin interface with "refused to connect" message for homepage preview

Chrome’s inspector reveals the problem: “Refused to display ‘’ in a frame because it set ‘X-Frame-Options’ to ‘deny’.”

Chrome console message

The solution is quite simple. LetsEncrypt creates a file called something like /etc/nginx/snippets/ssl-params.conf (in other versions maybe ssl-dhparams.conf) containing a line referring to X-Frame-Options.

# add_header X-Frame-Options DENY;
add_header X-Frame-Options SAMEORIGIN;

Please note this will override your global Nginx settings for this server. If you have multiple domains on your server, you may wish to override this global setting in your Nginx website’s / location block instead. Just make sure it comes after the ssl-params.conf call.

Nginx configuration block to solve X-Frame-Options DENY configuration


Screenshot of Ghost admin interface with fixed homepage preview

PS. remember to check your logs!


Daily Oops

AKA “Your PHP installation appears to be missing the MySQL extension which is required by WordPress.”

This afternoon’s glorious freelance waste of time was debugging a server’s PHP installation, which got broken with an experimental installation of iRedMail. Do NOT install iRedMail on a production machine; it’s meant for fresh installs only.

iRedMail’s removal script is great… if you want to remove apache, mysql, php, etc. etc. etc. Don’t do it.

That’s my scenario; you may have gotten this error in a different manner. Either way, the solution is similar.

I rebooted my Ubuntu 12.04 Linode VPS (referral link) and found my Rails/Passenger sites came up first time, but my WordPress/MySQL-reliant PHP sites failed, giving this error or a variant thereof:

Your PHP installation appears to be missing the MySQL extension which is required by WordPress.

You’ve probably already tried to reinstall your “php5” and “php5-mysql” packages, but found apt said they’re correctly installed.

Enter this line to find all packages that are in a “purged” state.

dpkg --get-selections | grep purge

Purged packages are uninstalled but still known to dpkg, which is confusing, as you can’t simply install over them.

 apache2 purge
 libapache2-mod-php5 purge
 libaprutil1 purge
 php5-cli purge
 php5-common purge
 php5-imap purge

So you want to remove + reinstall the pertinent packages:

sudo apt-get remove php5-common

and then do the apt-get install command as per your original setup. So for me:

sudo apt-get install libapache2-mod-php5 php-pear php5 php5-cgi php5-cli php5-common php5-curl php5-gd php5-json php5-ldap php5-mcrypt php5-mysql php5-pgsql php5-readline


sudo apachectl restart

Your mileage may vary.


CS Lewis, according to Alister McGrath (audio)

Lots of people quote CS Lewis, perhaps not knowing where he actually comes from or what he actually means. A Northern Irish boy, raised in East Belfast, he struggled through an early faith journey, eventually becoming an atheist before heading to Oxford. That’s not where the story ended. I saw the below quote tweeted in a positive, life-affirming sense (I imagine), when in reality Lewis was overtly and unapologetically Christian in his writings after conversion.

“There are far better things ahead
than any we leave behind”

Alister McGrath (Oxford educated scientist, resident theologian, and author of far too many books) spoke today at a pair of events hosted by the QUB Chaplaincies at The Hub on Elmwood Avenue. Below are two recordings provided by The Hub team of the talks.

The former might be described as building the framework behind why we believe, touching on a little of how Lewis was a “travelling companion” to McGrath in his journey into faith; the latter is a broad overview of Lewis, his influences (including how the Northern Irish landscape influenced Narnia) and legacy of thought.

“Why I am a Christian, by a lapsed atheist…”

“CS Lewis’s Vision of Christianity and its relevance today”

For what it’s worth, McGrath’s biography, C. S. Lewis: A Life: Eccentric Genius, Reluctant Prophet, is available on Amazon.

(affiliate link, current earnings in four years £0.00)


Adding links to PDFs using PHP ezPDF

I’ve been making some changes to a site which already uses the ezPDF class for creating PDFs from PHP. How to insert a link?

Helper methods seem to exist, but they are not documented in the code and are a bit confusing to read. Instead, you can just embed a hyperlink into any (I think) text entry and it’ll be converted into a clickable link, using ezPDF’s callback syntax, which (if memory serves; treat this as unverified) looks like <c:alink:http://example.com>link text</c:alink>.


Note: you can get the complete “R&OS” ezPDF class here


Creating a download click tracker in Rails 4 – Also, Run-ins with Turbolinks

A Friday lunchtime double feature for you.

The task: create a click tracker, to keep a rough (accuracy isn’t of core importance, but why skimp?) measure of the number of times a link has been clicked by site users.

Here’s some sample code I used for the new version of GiantsLive which contains a digital downloads feature.

The task is a fairly simple one; it doesn’t even require a controller. Here we use a migration to create a stats table, and a model to provide a simple API.

The ‘object_key’ I’ve used refers to an Amazon S3 object’s unique name (within a bucket); this is because each ‘fixture’ has several video files available to download.


class CreateFixtureDownloadTrackers < ActiveRecord::Migration
  def change
    create_table :fixture_download_trackers do |t|
      t.integer :fixture_id
      t.string :object_key
      t.integer :download_count, default: 0
    end
    add_index :fixture_download_trackers, [:fixture_id, :object_key], unique: true, name: :tracker_index
  end
end


class FixtureDownloadTracker < ActiveRecord::Base
  belongs_to :fixture

  def self.track(fixture, key)
    find_or_create_by(fixture: fixture, object_key: key).incr
  end

  def self.counter(fixture, key)
    find_or_create_by(fixture: fixture, object_key: key).download_count
  end

  def incr
    self.download_count += 1
    save
  end
end


# Find the current status of the counter
FixtureDownloadTracker.counter(fixture, obj_key)

# Track a click on an item
FixtureDownloadTracker.track(fixture, obj_key)

You could create a helper method to present the detail neatly, but that’s up to you. Hope that code is helpful, if you have any suggestions on how to clean it up any further do use the comments below.
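If you want to poke at the pattern outside Rails, the model above boils down to find-or-create plus increment. Here’s a minimal plain-Ruby sketch of the same idea (hypothetical; a Hash stands in for the database table):

```ruby
# Minimal stand-in for the tracker: a Hash with a default of 0 plays the
# role of the table, keyed on [fixture_id, object_key].
class DownloadCounter
  def initialize
    @counts = Hash.new(0)  # a missing key behaves like a freshly created row
  end

  def track(fixture_id, object_key)
    @counts[[fixture_id, object_key]] += 1
  end

  def counter(fixture_id, object_key)
    @counts[[fixture_id, object_key]]
  end
end

tracker = DownloadCounter.new
tracker.track(1, "videofile.mp4")
tracker.track(1, "videofile.mp4")
puts tracker.counter(1, "videofile.mp4")  # => 2
```

The real model gets the same semantics from find_or_create_by, with the unique index guarding against duplicate rows.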

Part 2: Turbolinks causing 403 problems

Or maybe just one. Rails 4.0 introduces a new feature called Turbolinks, which is intended to reduce the number of server requests caused by a page change; in very simple terms, instead of fetching CSS, JS and font files every time you click a link, it just asks for the HTML + images. I was getting a strange quirk in my tracking code whereby every click was creating two increment actions. I finally figured out the problem by looking at my logs. An excerpt here:

Started GET "/fixtures/get/1/videofiledotext" for at 2013-09-06 14:16:30 +0100
Processing by FixturesController#get_download_uri_for as HTML
Parameters: {"fixture_id"=>"1", "object_key"=>"videofiledotext"}
Redirected to
Completed 403 Forbidden in 54ms (ActiveRecord: 43.6ms)
Started GET "/fixtures/get/1/videofiledotext" for at 2013-09-06 14:16:30 +0100
Processing by FixturesController#get_download_uri_for as HTML
Parameters: {"fixture_id"=>"1", "object_key"=>"videofiledotext"}
Redirected to
Completed 302 Found in 50ms (ActiveRecord: 19.2ms)

You’ll notice the “Completed 403 Forbidden” first, then “Completed 302 Found” second. Turbolinks appears to have been requesting the file, failing and then resorting to a page reload. This caused the tracker to be triggered twice, and the statistics to be incorrect.

There is an open ticket on github regarding this issue. [h/t @StuartGibson]

A fix

As per the Turbolinks documentation, adding the HTML data attribute data-no-turbolink on a parent tag will disable any Turbolinks action from occurring. So, in my case:

<ul data-no-turbolink="">...

Problem solved.


New Music From Gungor: “Wayward & Torn”

The new album ‘I am Mountain’ is on its way, American Songwriter have a preview of a track. Sounding like a background hum from the set of Justified, it has a moody southern blues feel. While the track is supposedly set apart from the rest of the album, it paves a welcome path.

Listen to the track now

If you haven’t heard of [one of my favourite bands] Gungor… A little context, in the form of a video:

iTunes pre-order & preview available here.


Sorry, I didn’t quite catch that? — Sally the sign language interpreter

A while ago I launched a new website for my friend Sally, who’s both a PhD candidate at Queen’s and a successful professional sign language interpreter for hire.

Check out the website at

The simple site has two aspects: professional and academic. It’s somewhat adaptive (but perhaps not quite ‘responsive’?), has been developed to accommodate video content on each page if a video BSL interpretation is available and in future will allow Sally to receive feedback on various research projects.

Sally is a BSL/English interpreter and a first year PhD translation student. Through her research, she is exploring the demographics of the deaf signing population in Northern Ireland and the resources and opportunities available to this linguistic minority.

She adopts a balanced approach to practice and research, and is currently organising an academic conference on Sign Language and the Politics of Recognition in addition to her studies and interpreting work.

Freelance professional sign language interpretation services in Belfast, across Ireland and the UK.


Converting HD Video for use in an SD DVD project

Why a DVD, why can’t everyone just go digital…?

Unfortunate as it is, real world clients’ requirements tend to rely on 1990s technology.

I’ve been putting together DVDs for the New Wine conference, which generally consists of:

  • complete the edit…
  • export full quality
  • export to MPEG-2 in Compressor
  • drag into DVD Studio
  • burn the disk

This doesn’t work very well for projects which originate in high definition, partly because of what happens to your finely rendered type and partly because of the poor transcoding magic performed going directly from 1080 to 576 for DVD.


Like me, plenty of people out there have learnt everything they know about videography from working on pet projects and tinkering with the tools they have to hand, and stuff like this just isn’t really “in the manual.”

I think I might’ve finally worked out a workflow that works, should you wish to put your beautiful HD video onto a standard definition DVD.

Starting point:

Most of my video projects are in Apple ProRes — captured live or post-event using my Ninja device. The specific variety of ProRes doesn’t matter, frame rates (60i or 50i) and colour depth aren’t the problem here. We’re in 1080 and it looks pretty great. On the computer. And in whatever web format we tend to use.

Output from the edit:

Using your editing software, you will typically output a full-resolution copy of the edit, or a reference movie: just the audio and rendered parts, with pointers to all your other source video files, in a file much smaller than your full copy.

What you would do for the web:

At this point, I usually drag the reference movie into MPEG Streamclip. Sometimes it misbehaves and doesn’t understand the reference pointers; so it might be best just to output a full QuickTime file for transcoding.

Vimeo (the discerning videographer’s publishing platform of choice?) has a helpful guide on good settings for upload, but basically I do this (my saved “Vimeo 720” preset):

  • Choose MPEG-4 export, using H.264
  • Bump the quality to 100% (no point compromising here)
  • Limit the data rate to 3500 or 5000Kbps depending on how good you want it to look (I tend to choose 5000 for final exports, 3500 for key drafts, 1000 for quick previews)
  • Do whatever you want with sound, AAC / Stereo / 44.1 / 256kbps
  • I tend to scale to 1280 x 720 for output
  • Choose the right frame rate
  • Select “Interlaced Scaling” and “Deinterlace Video”
  • Everything else should be right

And a visual representation of the above, if it helps…

This creates a pretty good image, and the interlaced/deinterlace couplet fixes most “why did my text go fuzzy” issues.
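As a sanity check on those data-rate choices, the video-only file size is easy to work out (audio ignored):

```ruby
# Rough size arithmetic for the data rates above:
# kilobits per second of video -> megabytes per minute of footage.
rates = [5000, 3500, 1000]
mb_per_min = rates.to_h { |kbps| [kbps, kbps * 60 / 8 / 1000.0] }
mb_per_min.each { |kbps, mb| puts "#{kbps} Kbps ~= #{mb} MB/min" }
# => 5000 Kbps ~= 37.5 MB/min, 3500 ~= 26.25, 1000 ~= 7.5
```

So a 35-minute talk at the 5000 Kbps setting lands around 1.3 GB before audio, which is worth knowing before you queue a batch of uploads.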

When creating DVDs:

MPEG Streamclip is not the tool for this job. iDVD is Disney. Toast does burn video DVDs but it [ignorant assumption] won’t give you the ability to make the type of interface that you actually want.

So you need to use Compressor + DVD Studio Pro. Nightmare. But they can do pretty much everything you want them to.

Bad assumption:

You can just use the source file, converting it with one of the provided DVD formats (in Compressor) or dropping it into DVDSP directly, and it will create a usable disc. It will, but it won’t look good. All your rendered images and text will be of very poor quality and, while much less noticeable, the video itself will not be as good.

Larry to the rescue:

Often you’ll find a video question answered on Larry Jordan’s site, and this is no exception, though it took me a few days of casual searching to find a link to the article entitled Solving Video Compression Problems When Down-Sizing HD to SD — you may notice some similarity in the title of this article…

The basic solution here is this:

  • “Pre-compression”: resize the HD movie to an SD frame size without compressing it further (transcode to ProRes HQ, from 1080 down to 1024×576 if you’re working in the PAL world); full details of which settings to use are in the article
  • Then use that exported file as your DVD track source.

Obviously this will take a bit longer to output, but it will provide far cleaner images from which to produce your DVD. In practice it takes significantly longer: on a 2009 iMac, transcoding five 35 minute files took just over 14 hours. Fine if you’ve nothing else to do!
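For a sense of scale, the arithmetic on that batch works out like this:

```ruby
# How much slower than real time was the 2009 iMac pre-compression pass?
footage_minutes   = 5 * 35      # five 35-minute files
transcode_minutes = 14 * 60     # just over 14 hours
ratio = transcode_minutes.to_f / footage_minutes
puts ratio.round(1)  # => 4.8
```

Roughly five minutes of transcoding per minute of footage, so budget accordingly for longer programmes.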

P.S. If you spot any flaws, or have any suggestions or additions to this article please leave a comment below and I’ll consider updating it.

Mastered DVDs