Experimental Qt and QML in the browser

Update: You might also be interested in the discussion on Hacker News.

I was amazed when I read Morten Johan Sørvig’s blog post about bringing Qt to Native Client (NaCl) and Emscripten in September last year. I have been following multiple efforts to bring Qt to the web for some time, mostly because I wanted to make my own Qt projects easily available to others, but also because I find the whole concept of running C++ code in a browser fascinating.

This is definitely not the first time someone has worked on bringing Qt to the browser. Qt4 was ported to Emscripten by Simon St James and published with a lot of examples. However, it seems the port was discontinued when Qt5 came along, due to the changes that came with that transition. In the meantime, Lauri Paimen, Anton Kreuzkamp and many others have contributed to qmlweb, a QML interpreter written from scratch in JavaScript. That project does, however, not make it easy to use any C++ classes you may have defined, and most of my projects are QML/C++ hybrids.

These projects brought something important to the table and took us one step closer to writing QML for the browser. Personally I find QML very enjoyable because of its clean and intuitive structure in comparison to other options like XML. And the ability to easily bind GUI elements to high-performance C++ code has made Qt my favorite framework for many projects.

I tried to port Qt5 to Emscripten myself about a year ago. That turned out to be a real challenge. I based my work on the changes Simon St James made to Qt4 and tried to figure out how to make the same changes in Qt5. After a couple of weeks, I got to the point where I had QtCore running and QtDeclarative (QML) compiling, but other things caught up with me and I had to leave that project behind. The biggest disappointment was that I never got around to making any demos.

I decided to pick up the ball a couple of months ago. This time I knew about Morten Johan Sørvig’s efforts and asked him to give me some info on the status of his port. At that point he had support for compiling to NaCl ready (which means Chrome only), and he had already started porting to Emscripten using pepper.js as glue between Emscripten and NaCl. There was still a bit left to do before any examples could be up and running in Firefox. I decided to give it a go and see if I could contribute something.

A couple of weeks later, I managed to get Qt running in Firefox with a number of QML examples. The process was a lot of fun, and I learned much about what happens inside Qt’s sources. It was interesting to see how often I could write many lines of code and rebuild multiple times, only to arrive at a one-line change that fixed the issue at hand. I was really happy when I could finally send a patch to Qt’s NaCl branch. The main changes were to Qt’s event loop, which would hang if control wasn’t returned to the browser. I had to add a couple of hacks to keep Qt’s mutexes from locking up everything (this needs to be properly fixed later, perhaps by creating pthread stubs), and OpenGL needed to be explicitly set to OpenGL ES 2, in addition to a bunch of smaller modifications.
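To illustrate the general idea, here is a minimal sketch of the usual Emscripten pattern (my own illustration, not the actual patch to Qt): a loop that would normally block until the application quits has to be turned into a callback that the browser invokes, so that control is returned to the browser after every iteration.

// Minimal sketch of the usual Emscripten pattern, not the actual Qt patch.
// A loop that blocks in native code would hang the browser tab, so one
// iteration of event processing is wrapped in a callback instead.
#include <emscripten.h>

static void processEventsOnce()
{
    // Hypothetical placeholder for one iteration of event processing
    // (delivering input events, updating timers and animations, rendering).
}

int main()
{
    // fps = 0 lets the browser schedule the callback (requestAnimationFrame);
    // the last argument makes the call behave like an infinite loop, so
    // main() never runs past this point.
    emscripten_set_main_loop(processEventsOnce, 0, 1);
    return 0;
}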

QML sandbox running in Firefox. See the links below for a live demo.

Below are two resulting demos of QML running in the browser. I’m amazed by the performance in both Firefox and Chrome. Just be aware that the examples are quite big, because nothing has yet been done to reduce the size of the generated JavaScript. Further, there may be plenty of bugs, and the examples may not work in some browsers. Remember that this is still very experimental and only a small taste of what may be possible with Qt in the browser in the future:

What amazes me the most is the fact that we are now running Qt’s own JavaScript engine, which is written in C++ and compiled to JavaScript with Emscripten, inside the browser’s own JavaScript engine! I really want to see if we can get QML’s WebEngineView running inside itself inside Firefox. That’s a step on the way to true inception, which reminds me of a great talk by Gary Bernhardt titled “The Birth & Death of JavaScript”.

Using Blender and make-na to create DNA art

In CINPLA we are developing teaching material for a new course in biology and programming. One topic we’re working on is bioinformatics and DNA. I figured that an artistic rendering of the DNA double helix would be nice to have and decided to try to make one in Blender. At first I considered building it from the ground up using Blender’s design tools, but I quickly realized that it would be too time-consuming and hard to get the proportions right. After all, I’d like it to be as close to the real structure of DNA as possible.

I then remembered that the RCSB Protein Data Bank (PDB) hosts a large number of molecular structures that have been described by researchers over the past decades. The structures can be downloaded freely as long as they are cited properly. I went looking for files with the DNA molecule and found a couple, such as 1BNA and 1D66, but they turned out to be a bit too short or too hard to untangle from the other molecules.

Then I found make-na, a fantastically simple open source tool that will generate a PDB file from any DNA sequence. Just open the make-na server and type in your favorite combination of the letters A, T, C and G. Out comes a PDB file with the corresponding molecular structure of DNA.

To open the PDB file in Blender, click File > User Preferences > Add-ons and search for PDB. Enable this add-on by clicking the check box next to it:

(You may have to restart Blender after enabling the add-on.)

Click File > Import > Protein Data Bank (.pdb):

Voilà! You should now have a DNA molecule ready for rendering in Blender.

I tweaked the material settings, enabled Cycles Render and added a couple of lights to make the image you see below:

Here is the .blend file used to create the image, if you’d like to modify it to make your own version:

For more technical figures, I’m thinking about using the cartoon style in JSmol or PyMOL, which produces ribbon diagrams of proteins. Those make the DNA strands look more like this:


The next step is to load this image into Inkscape and start annotating the different parts of DNA.

The above images and the .blend file are licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

An upcoming C++ library for HDF5 files

I recently started working on a new C++ library for reading and writing HDF5 files. I got the idea when I was working on a few files in Python and C++ at the same time. The Python library h5py is just way more comfortable than the HDF5 C++ API. But with the new features in C++11 and C++14, I figured it should be possible to make a C++ library that is just as easy to use as h5py. And I must admit that I believe I’m on the way to making it even easier.

One design goal I have set for this project is to make the library forgiving. By that I mean that the library assumes you know what you are doing and tries to make it possible, even though there might be some side effects.

One example of such a side effect is creating a dataset in an HDF5 file and later changing its size. Because of limitations in the HDF5 standard, the original dataset will still take up space in the file, although it’s no longer in use. I believe the HDF Group designed it this way because it is hard to free already allocated space in a file without truncating it first. So they optimized for performance rather than flexibility, which is a fair choice to make. However, I assume you want flexibility and that you’d rather see that extra space taken up than see your program crash for trying to reuse a dataset name. So this code is perfectly fine within my library:

#include <armadillo>
#include <elegant/hdf5>

using namespace elegant::hdf5;
using namespace arma;

int main() {
    File myFile("myfile.h5");
    myFile["my_dataset"] = zeros(10, 15);
    myFile["my_dataset"] = zeros(20, 25);
    return 0;
}

This opens myfile.h5 for reading and writing and sets my_dataset to a 10×15 matrix of zeros before replacing it with a 20×25 matrix. This will, however, leave the space taken up by the 10×15 matrix in the file, even though that data is no longer accessible. Oh, and did I mention that the Armadillo library is already supported?
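To make the side effect more concrete, here is a rough sketch of what such a reassignment has to do underneath, written directly against the plain HDF5 C API (my own illustration, not the library’s actual implementation): the existing link is deleted and a new dataset is created under the same name, while the old data keeps taking up space until the file is repacked with a tool like h5repack.

// Rough sketch using the plain HDF5 C API; not the library's actual code.
#include <hdf5.h>
#include <vector>

int main() {
    hid_t file = H5Fopen("myfile.h5", H5F_ACC_RDWR, H5P_DEFAULT);

    // Remove the existing link. The space used by the old dataset is not
    // reclaimed; the file only shrinks after running h5repack on it.
    H5Ldelete(file, "my_dataset", H5P_DEFAULT);

    // Create a new 20x25 dataset of doubles under the same name.
    hsize_t dims[2] = {20, 25};
    hid_t space = H5Screate_simple(2, dims, nullptr);
    hid_t dataset = H5Dcreate2(file, "my_dataset", H5T_NATIVE_DOUBLE, space,
                               H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    // Fill the new dataset with zeros.
    std::vector<double> zeros(20 * 25, 0.0);
    H5Dwrite(dataset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT,
             zeros.data());

    H5Dclose(dataset);
    H5Sclose(space);
    H5Fclose(file);
    return 0;
}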

In contrast, if you try to do the same in h5py, you will get a RuntimeError:

from h5py import *
from pylab import *
my_file = File("myfile.h5")
my_file["my_dataset"] = zeros((10, 15))
my_file["my_dataset"] = zeros((20, 25))
# output:
RuntimeError: Unable to create link (Name already exists)

As you can see, the syntax is pretty much the same, but the C++ library will be slightly more forgiving than h5py. That, in addition to a range of other nice features, is what I hope will make this new HDF5 C++ library attractive.

An experimental version will hopefully be released soon.

Getting started with unit tests in Qt Creator and Catch

I have written about unit testing in Qt Creator on multiple occasions before. Since then, a new testing framework, Catch, seems to have become the leader in the field. It is similar to UnitTest++ and Google’s gtest and brings the best of both worlds: it is header-only and appears to require less code than the other two.

This is an adapted version of my previous posts that shows how to use Catch in a Qt Creator project. It no longer integrates the testing output into Qt Creator’s issues pane. Instead, I’ve started relying on Travis CI or Jenkins to run the tests and notify me about any newly introduced errors.

An example project using this code structure has been posted on Github by Filip Sund (thanks, Filip!) and further adapted by us with the move to Catch.

The structure of the project is as follows:

├─ MyProject.pro
├─ defaults.pri
├─ app/
│  ├─ app.pro
│  └─ main.cpp
├─ src/
│  ├─ src.pro
│  └─ myclass.cpp
└─ tests/
   ├─ tests.pro
   └─ main.cpp

The main project file, MyProject.pro, will now be based on a subdirs template, and may look like this:

TEMPLATE = subdirs
CONFIG += ordered
SUBDIRS = \
    src \
    app \
    tests

app.depends = src
tests.depends = src

The app.depends and tests.depends statements make sure that the src project is compiled before the application and the tests. This is because the src directory contains the library that will be used by both the app and the tests.

CONFIG+=ordered is needed because depends doesn’t affect the order during make install.


Each of the other .pro files will include defaults.pri to have all the headers available. defaults.pri contains the following:
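# defaults.pri -- at the very least it needs to add the src directory to the
# include path, so that <myclass.h> can be included from both app and tests:
INCLUDEPATH += $$PWD/src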


If the library, main program and tests use common libraries, it is very useful to have the defaults.pri define these dependencies too.


In the src folder, I have myclass.cpp, which contains the class that I want to use and test. src.pro needs to compile to a library, so that it may be used by both app and tests, and could look something like this:

include(../defaults.pri)

CONFIG -= qt

TARGET = myapp
TEMPLATE = lib

SOURCES += myclass.cpp
HEADERS += myclass.h

What this class does is not so interesting; it could be anything. A simple example could be this header file:

#ifndef MYCLASS_H
#define MYCLASS_H

class MyClass {
public:
    double addition(double a, double b);
};

#endif // MYCLASS_H

With this accompanying source file:

#include "myclass.h"

double MyClass::addition(double a, double b) {
    return a * b;
}


I only have a main.cpp file in app, because the app is basically just something that uses everything in the src folder. It will depend on the compiled shared library from src, and app.pro would look something like this:


include(../defaults.pri)

CONFIG += console
CONFIG -= app_bundle
CONFIG -= qt

SOURCES += main.cpp

LIBS += -L../src -lmyapp

The main.cpp file could be a simple program that uses MyClass:

#include <myclass.h>
#include <iostream>

using namespace std;

int main()
{
    MyClass adder;
    cout << adder.addition(10, 20) << endl;
    return 0;
}


In the tests folder I have simply added a main.cpp which will run the tests. Then tests.pro has the following contents:


include(../defaults.pri)

CONFIG   += console
CONFIG   -= app_bundle
CONFIG   -= qt

SOURCES += main.cpp

LIBS += -L../src -lmyapp

This links the tests against the myapp library, so they exercise the same code that the application uses.

The main.cpp file in tests could contain the following, using Catch as our testing library:

#define CATCH_CONFIG_MAIN  // This tells Catch to provide a main() - only do this in one cpp file
#include "catch.hpp"
#include <myclass.h>

TEST_CASE( "MyMath", "[mymath]" ) {
    SECTION("Addition") {
        MyClass my;
        REQUIRE(my.addition(3,4) == 7);
    }
}

This test will fail because my implementation of MyClass::addition is completely wrong:

class MyClass {
public:
    double addition(double a, double b) {
        return a * b;
    }
};

Note that I’m including MyClass through <myclass.h>, which is possible because of the INCLUDEPATH variable in defaults.pri.
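For completeness, the test passes once addition actually adds its two arguments:

double MyClass::addition(double a, double b) {
    return a + b;
}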

This should help you define a project that compiles a library, as well as tests and an application using the library.

Why I won’t be starting my next project in Rust

I have been inspired to learn Rust and Julia lately. The idea was to use these promising languages in all of my new projects. I wanted to learn Rust because it is a safe and fast replacement for C++. I wanted to learn Julia because it is a language tailored for scientific use that might someday replace Python.

I quickly realized that I would be learning both for the wrong reason. I already know C++ and Python well, and should be starting important new projects in these languages.

One article that changed my mind was “Criticizing the Rust Language, and Why C/C++ Will Never Die” by Eax Melanhovich. Not all of the points Melanhovich makes are still valid. For instance, the speed of Rust has improved significantly over just the past few months. However, he is right that the number and quality of development tools for Rust are far below those for C++. And it is hard to argue with the number of vacant positions for C++ programmers in comparison to those for Rust and Julia.

This does, of course, not say anything about the quality of these languages. For all I know, Rust may be (or become) a much safer, faster, and more elegant language than C++. But I understand that many of the benefits I would get from Rust are already available in the new C++11 and C++14 standards, and from using C++ in a modern way.

I found Rust attractive because of its promise of memory safety. It is designed so that you can’t shoot yourself in the foot as easily as you can in C++. However, I’m currently using pointers in C++ the old-fashioned way: I still work with raw pointers, new, and delete when I should be using smart pointers. Sometimes I even use pointers when references or values would be the right choice.
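To make that concrete, here is a small, hypothetical before-and-after of the kind of change I have in mind (my own illustration, not code from any particular project):

// A small, hypothetical example of moving from old-fashioned to modern C++.
#include <memory>

struct Simulation {
    void step() { /* advance the simulation one step */ }
};

// Old habit: a raw owning pointer and a manual delete.
void runOldStyle() {
    Simulation* simulation = new Simulation();
    simulation->step();
    delete simulation;  // easy to forget, or to miss on early returns
}

// Modern style: unique_ptr owns the object and deletes it automatically.
void runModernStyle() {
    auto simulation = std::make_unique<Simulation>();  // C++14
    simulation->step();
}  // simulation is deleted here, even if step() throws

// Often a plain value (or a reference) is the right choice, with no pointer at all.
void runWithValue() {
    Simulation simulation;
    simulation.step();
}

int main() {
    runOldStyle();
    runModernStyle();
    runWithValue();
    return 0;
}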

I realize now that I need to start a project or two where I use only modern C++. That should hopefully teach me how to steer clear of those nagging segfaults sometime in the future. I’ll be reporting back here about my experiences and will make a list of some recommendations for others who are trying to do the same.