This week was the annual International Day of Happiness; I never knew we had a specific day to be really happy, but it turns out that we do.

This inspired me to finish a pet project I've been working on recently: it uses the Azure Machine Learning Project Oxford Emotion API (which I have blogged about before) to take a photo containing up to 64 human faces and order them by happiness. It is really fun to use when talking to groups of people, as I did at one of my recent events.

The website is available for anyone to use at

Some happy people at a recent MS Web Day event

How does it work?

The website builds on the simple emotion API example I published on GitHub a few weeks ago. It is written in ASP.NET Core 1.0 with a bit of jQuery and Bootstrap.

You upload a photo and get a JSON response from the API. I then sort the results by the happiness score and use some jQuery and Bootstrap magic to plot the scores on the original picture.
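The ranking step is simple once you have the JSON. Here is a minimal sketch in plain JavaScript, assuming the Emotion API's response shape of the time (an array of faces, each with a `faceRectangle` and per-emotion `scores`); the sample data and the `rankByHappiness` helper are illustrative, not the site's actual code:

```javascript
// Sort an Emotion API-style response by happiness score, happiest first.
// Assumed shape: [{ faceRectangle: {...}, scores: { happiness, ... } }, ...]
function rankByHappiness(faces) {
  // Copy before sorting so the original response array is left untouched.
  return [...faces].sort((a, b) => b.scores.happiness - a.scores.happiness);
}

// Made-up sample of two detected faces for illustration.
const faces = [
  { faceRectangle: { left: 10, top: 20, width: 50, height: 50 },
    scores: { happiness: 0.12, neutral: 0.80, sadness: 0.08 } },
  { faceRectangle: { left: 200, top: 30, width: 48, height: 48 },
    scores: { happiness: 0.95, neutral: 0.04, sadness: 0.01 } }
];

const ranked = rankByHappiness(faces);
console.log(ranked[0].scores.happiness); // 0.95 — the happiest face first
```

The `faceRectangle` that comes back with each face is what lets you position the score labels over the right people in the original picture.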

You can see my source code here:

What's next?

This project is certainly not finished. Here are some of the things I want to do next:

  • Scale the photo to fit on the screen rather than relying on the awkward drag feature I have right now. If anyone can help with that, I do accept pull requests! 🙂
  • Plot the results as a list as well as on the image. I might steal from Martin Beeby's HowHappyCordova project for this.
  • Do the same for all the other emotion scores (how angry, etc.).


