
Our House, an IoT App

January 4, 2018 7:25 pm
Reading Time: 8 minutes

This post details a recent app build project I’ve been working on. This app was interesting because it also had a Raspberry Pi powered hardware component supplying it with data. The app, Our House, is an IoT home monitoring app. Now, I know what you’re thinking: this has kind of been done. But bear in mind this is the project that got me started in Swift and Pi hacking. I’ve just not bothered to write about it until now.

Premise of the app

This app has been a long-gestating project for me, and it’s still nowhere near finished. It’s been a tremendous learning exercise; I credit most of my Swift knowledge to what I’ve learned from both this app and from this one. However, it didn’t start out Swift-focused at all. In 2013 my partner bought me a Raspberry Pi along with a pack of sensors and things as a Christmas gift. I had a lot of fun with it, programming LEDs and getting to know the Pi. At some point I must have picked up the temperature sensor and thought “I reckon I could build a connected home monitoring tool with this”. This was way before HomeKit and Nest devices were really on the market, although the market would be well in play before I’d got this anywhere near decent.

A connected home monitoring device was something I’d wanted to build because, well, you tend to get a bit more paranoid about this sort of stuff when you’re woken up by your neighbours at 2 in the morning on a hot July night with the rather pressing and urgent message that your shed, and moments later your house, was ablaze. Fire is pretty fucking terrifying when it’s on your house and you’re stood across the street in your boxers looking at it, hoping the fire engine rocks up incredibly fast. It did. It rocked up very fast, but god, seconds felt like hours. I did go back in for my laptop though.

[Image: shed fire]

The thought that I could monitor the house to ensure it wasn’t burning down with my dogs inside it, or, even better, be alerted to rapid temperature increases, is one of those things that just made sense to me at the time.

Early Development

For getting the Pi to read from the DHT22 temperature sensor I’d used this tutorial on Adafruit, and it was invaluable. Same for reading from the photocell resistor I had also picked out of my sensor pack for approximating the current light level in the room. Their libraries make short work of what would have been extremely hard work for me, as the DHT22 can’t be read using Python alone and instead needs to be read using C, and I know no C. I avoid things with C in the name because they are always arcane and hard*. Hence my now undying love of the complete opposite, Swift, which was described at the time of its introduction as “what would Objective-C be like, without the C”, which totally sold me.

Initially, the scope of the project was that I just wanted to create a webpage where I could view the latest stats. The Raspberry Pi (running modified scripts from the above tutorials for the sensors) would also be set up as a web server using lighttpd. The Pi would run a cron job every minute to check the readings on the sensors and take a photo, storing it in a folder accessible over HTTP. The readings I’d store in an SQLite3 database, then pull them into the page via a JavaScript request, using PHP to serve the queries back out as JSON and updating the DOM on a responsive HTML page when the data was returned. Depending on the size of the viewport I’d also load in some graphs using the Google Charts API.

There were of course stumbling blocks, but I think I had this up and working in just a matter of weeks. It was really easy to do.

Swift development

Normally my appetite would have been sated at this point. I had a tool which worked, and it still works to this day. But it won’t have been long after this that Apple announced Swift. I really wanted to learn Swift. Swift would mean I could make native apps, and its syntax looked a lot simpler than Objective-C*. Making a Swift version of the home monitoring app seemed like a great way to learn it. After all, I had a web backend in place already, so I could concentrate purely on the Swift side of things. And so began my journey.

The Swift version of the app has had three or four notable versions, each marked by my approach to the app at the time. I can generally date each sprint to a Swift release, as each release meant converting the app to the new Swift, which got me tinkering with it again.

Version 1

This was me just getting to know Swift, so it was a single ScrollView for the main page of the app. For some reason, probably my naivety, I really stuck to using a ScrollView as a way of controlling the potential for overflow across devices. It flitted through my mind that I could use a TableView. Ultimately, I did use a TableView. But in v1 I used a ScrollView. I spent a lot of time getting annoyed at Auto Layout, but thanks to SwiftyJSON, I got the data from the Pi server into the app and into a UI that was essentially a jazzed-up version of the webpage prototype. I experimented with segueing to a new view to look at relevant graphs using JBChartView, and another view creating a time-lapse of the photos taken from the Pi camera, adding a refresh button to get new data when required.

The downside here was speed. I’d pull all the data back at once from the server, then loop through it, then display it, which added latency before the user saw the information. In v1 this was probably UI blocking. Despite this, I was happy with it until around version 2, although I’d pick it up on occasion and continue to try and get Auto Layout and the contents of my ScrollView to play nice through the Swift 1 – 1.2 transition.
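For flavour, here’s a minimal sketch of the sort of fetch-and-display flow v1 used: pull the Pi server’s JSON with URLSession, parse it with SwiftyJSON, update the UI. The URL and the JSON keys are assumptions rather than the real server’s schema, and the main-queue hop is how you’d keep the display update from blocking the UI.

```swift
import UIKit
import SwiftyJSON

// A rough sketch of the v1 flow. The endpoint URL and the JSON keys
// ("temperature", "humidity") are assumptions, not the real Pi schema.
class ReadingsViewController: UIViewController {
    // Labels created in code to keep the sketch self-contained (layout omitted)
    let temperatureLabel = UILabel()
    let humidityLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        fetchReadings()
    }

    func fetchReadings() {
        guard let url = URL(string: "http://192.168.1.50/readings.php") else { return }
        URLSession.shared.dataTask(with: url) { data, _, error in
            guard let data = data, error == nil,
                  let json = try? JSON(data: data) else { return }
            // The completion handler runs off the main thread,
            // so hop back to the main queue before touching the UI.
            DispatchQueue.main.async {
                self.temperatureLabel.text = "\(json["temperature"].doubleValue) °C"
                self.humidityLabel.text = "\(json["humidity"].doubleValue) %"
            }
        }.resume()
    }
}
```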

Version 2

Version 2 probably encapsulates the time period from Swift 2 onward. I started to tangle with saving my JSON data locally with Core Data, looking at bringing the content in via Background Fetch, adding notifications to Notification Centre and a widget in the Today view, then refactoring and refactoring some more. Still kept fighting with a ScrollView like a moron.
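As a rough illustration, the Background Fetch wiring of that era (the pre-iOS 13 API) looks something like the sketch below. The URL is an assumption, and the real app would decode the response and write it into Core Data before calling the completion handler.

```swift
import UIKit

// A minimal sketch of Background Fetch wiring, not the app's actual AppDelegate.
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Let iOS decide when to wake the app up for a background fetch
        application.setMinimumBackgroundFetchInterval(UIApplication.backgroundFetchIntervalMinimum)
        return true
    }

    func application(_ application: UIApplication,
                     performFetchWithCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        // Hypothetical endpoint; the real app would save the readings into Core Data here
        guard let url = URL(string: "http://192.168.1.50/readings.php") else {
            completionHandler(.failed)
            return
        }
        URLSession.shared.dataTask(with: url) { data, _, error in
            guard let data = data, error == nil else {
                completionHandler(.failed)
                return
            }
            completionHandler(data.isEmpty ? .noData : .newData)
        }.resume()
    }
}
```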

Apple Watch arrived and I was totally into it. watchOS 1 was a lot simpler to deal with, as the app on the phone did the heavy lifting of pulling the data out and then just sent it to the Watch. Along with the main app, I built a glance. A complication would have been more useful.

Version 3

Version 3 covers roughly the time of Swift 3 onwards; here I started to develop the app for multiple devices and sensors. Between the method I was using for creating a carousel* and the growing push towards modularising the main sensor view (i.e. certain sensors might not be available at certain times), this is the moment where it clicked that perhaps a TableView was the right design decision here. It was. It made life better. Behind the scenes I now refactored and refactored, and learnt and unlearnt and relearnt Core Data. It became my nemesis. I started adding onboarding and settings options to manage the server addresses, as my ISP has a habit of switching my IP every six months or so, and the app became more filled out.

*effectively a ScrollView to handle horizontal panning inside a ScrollView for vertical scrolling
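To give a sense of why the TableView clicked, here’s a minimal sketch of a sensor list driven by a table. SensorReading is a hypothetical model type, and in practice the rows would be backed by whatever readings the app has fetched; only the sensors that actually have data get a row.

```swift
import UIKit

// Hypothetical model type standing in for the app's real sensor data
struct SensorReading {
    let name: String
    let value: String
}

class SensorsViewController: UITableViewController {
    // Only available sensors appear here, so missing sensors simply have no row
    var readings: [SensorReading] = []

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return readings.count
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        // Assumes a subtitle-style prototype cell called "SensorCell" in the storyboard
        let cell = tableView.dequeueReusableCell(withIdentifier: "SensorCell", for: indexPath)
        let reading = readings[indexPath.row]
        cell.textLabel?.text = reading.name
        cell.detailTextLabel?.text = reading.value
        return cell
    }
}
```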

Current Version

With the latest version I decided that I wanted an instant overview of the rooms covered by devices, so I developed a front screen to act as an information hub. By this point I had become pretty comfortable with Swift, so I started looking at how I could incorporate 3D assets into the app using SceneKit. I developed a 3D model of the house in Blender, so that a touch on a room would take you to that room’s view.
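The tap-a-room idea boils down to hit-testing the SceneKit view. A minimal sketch, assuming the Blender export ends up as a house.scn scene, the room nodes carry names, and a "ShowRoom" segue exists (all placeholders, not the app’s actual identifiers):

```swift
import UIKit
import SceneKit

class HouseViewController: UIViewController {
    @IBOutlet weak var sceneView: SCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.scene = SCNScene(named: "house.scn")
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Hit-test the 3D scene and use the tapped node's name to pick a room
        if let hit = sceneView.hitTest(point, options: nil).first,
           let roomName = hit.node.name {
            showRoom(named: roomName)
        }
    }

    func showRoom(named name: String) {
        // Hypothetical navigation: segue to the sensor view for this room
        performSegue(withIdentifier: "ShowRoom", sender: name)
    }
}
```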

The 3D model area is affected by the time of day and weather conditions pulled in using the Dark Sky API. I added 3D states for rain and snow using SceneKit’s particle systems, plus one for clear weather which was unanimated, and added a wind state to my to-do list. Before the wind would have any effect I would need to add some bones to the tree outside.
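Swapping those weather states in and out can look something like the sketch below; the .scnp file names and the condition strings (mapped from Dark Sky’s icon field) are assumptions for illustration.

```swift
import SceneKit

// A minimal sketch of switching weather particle effects on a scene node
class WeatherEffects {
    private var currentEffect: SCNParticleSystem?

    func apply(condition: String, to skyNode: SCNNode) {
        // Clear whatever effect was running before
        if let effect = currentEffect {
            skyNode.removeParticleSystem(effect)
            currentEffect = nil
        }
        let fileName: String?
        switch condition {
        case "rain": fileName = "Rain.scnp"
        case "snow": fileName = "Snow.scnp"
        default: fileName = nil // clear weather: no particles
        }
        if let fileName = fileName,
           let system = SCNParticleSystem(named: fileName, inDirectory: nil) {
            skyNode.addParticleSystem(system)
            currentEffect = system
        }
    }
}
```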

I also developed a class for incorporating and displaying information about upcoming events from the family calendar, pulled in using EventKit. Bluetooth-based “at home” detection for me and my partner was a next step I never got round to, but I dropped in a placeholder visual anyway. I refactored a lot of stuff. But most of the time I spent tinkering with the 3D model in Blender, to be honest. I’d got to the point where the Swift side of creating a front screen was pretty straightforward, so it was time to relearn how to model in 3D. That was back in about January 2017; I haven’t really touched it since then.
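The calendar side looked roughly like this; the seven-day window is an arbitrary choice for the sketch rather than what the app actually uses.

```swift
import EventKit

// A minimal sketch of pulling upcoming events from the calendar with EventKit
class UpcomingEvents {
    private let store = EKEventStore()

    func fetch(completion: @escaping ([EKEvent]) -> Void) {
        store.requestAccess(to: .event) { granted, _ in
            guard granted else { completion([]); return }
            let start = Date()
            let end = Calendar.current.date(byAdding: .day, value: 7, to: start)!
            // Build a predicate for the next week across all calendars
            let predicate = self.store.predicateForEvents(withStart: start, end: end, calendars: nil)
            let events = self.store.events(matching: predicate)
                .sorted { $0.startDate < $1.startDate }
            DispatchQueue.main.async { completion(events) }
        }
    }
}
```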

Next Steps

At some point soon I will pick this up again, if for nothing more than to finish off the last bits and pieces, kill a few bugs, and tweak Auto Layout to account for the iPhone X. With Swift 4 comes native JSON handling, so I’ll be switching everything over to that too, having used it extensively in the latest version of my Pioneer AV Receiver Remote app that I’m readying for the store. That Core Data will want a refactor now too. I imagine I’ll tweak a lot of stuff if I start, but at this point it’s more polishing off some old work.
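For the curious, Swift 4’s native JSON handling means Codable, and the switch away from SwiftyJSON looks roughly like this; the field names are assumptions about the Pi’s payload rather than its real schema.

```swift
import Foundation

// A minimal sketch of decoding the Pi's readings with Codable instead of SwiftyJSON
struct Reading: Codable {
    let temperature: Double
    let humidity: Double
    let light: Int
    let timestamp: Date
}

func decodeReadings(from data: Data) -> [Reading] {
    let decoder = JSONDecoder()
    // Assumes the Pi stores Unix timestamps; adjust the strategy to match the real data
    decoder.dateDecodingStrategy = .secondsSince1970
    return (try? decoder.decode([Reading].self, from: data)) ?? []
}
```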

The reality is, HomeKit devices came along, and the Apple Home app also provides me with data on temperature, light and motion via Hue Motion Sensors, as well as controls for my HomeKit devices, so this app is almost redundant even to me, let alone to a potential market of users*. But the Pi servers still tick over to this day, collecting data, and so it still remains a good bit of work and as such deserves to be updated.

This joint was penned by @elmarko
