Building an "IoT-in-a-Box" with Raspberry Pi, camera and sensors

Recently I've started the project of getting a Raspberry Pi to provide everything I need to IoT-ify my house. The general idea is to have one Raspberry Pi do three things:

  1. read and stream data from any connected hardware (including usb-connected sensors, the Raspberry Pi camera, or GPIO pins),
  2. run an MQTT server that relays those messages to/from any IoT devices,
  3. serve a webpage where I can view data and control all available IoT devices

A Bunch O' Uses

Time-lapse controller: With an interface for the Raspberry Pi camera, I can create time-lapses without requiring an external monitor or ssh'ing in to control the Pi camera. With a Pi Zero mounted on a little tripod, this could also work nicely as a device my 9-year-old could use to create stop-motion animations, if he were actually into such things.

Science kit: A friend of mine was working on a project where he wanted to look at bacterial growth in Petri dishes. If I sent him a kit, he would be able to use it out-of-the-box with his phone to both create a time-lapse of bacterial growth in the Petri dish, as well as to monitor temp and humidity. (Or even control temp and humidity, if I include a way of writing "recipes" to control the relays.)

Outdoor science kit: Since the application doesn't really need external internet, you could take the setup outdoors, power it with a USB battery pack, and use your phone's browser to tell it to start logging environmental data. This could be stored on the Pi itself and downloaded later. (I believe this was part of Manylabs' vision underlying the InSPECT project, which is what I work on at work.)

A "DataDots" prototype: I want small, independently addressable IoT LEDs that can be integrated into a variety of spaces and controlled with IFTTT-style "recipes". For example, I want: a small, red LED to sit by my trash can and turn on on Thursday nights to remind me to take out the trash; a blue one on my desk as a "soft" notification that I have an email; many little green ones stuck into the various potted plants that I have to remember to water. I am currently trying to make these with the ESP-8266.

For my "Office Space" project: Generally, I am interested in how technologies can be used to make knowledge work more "humane". Specifically, I want working at my desktop to feel like working at a lab bench. To be able to explore this within my own workspace, I would like to develop a general, flexible platform for messaging between my computer, and all the objects on my desk.

Hardware

Raspberry Pi: I have been using a model 3B+.

Raspberry Pi Camera: I've been using the NoIR Raspberry Pi camera (where "NoIR" means "no IR filter," not "no IR"...) and a mount. But a Pi Zero, camera-only setup would be cool.

USB Sensors: The project I work on used (in an earlier version of our hardware, developed by ManyLabs) the Pololu A-Star 32U4 Micro to manage serial communications with commercial sensors (like this CO2 sensor). The Pololus then plug into the Pi's USB ports, where they communicate the sensor data to software running on the Pi.

From top left, clockwise: a relay, and temperature/humidity, carbon dioxide, and oxygen sensors.
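
For a sense of what this looks like on the Pi side, reading from one of these boards is basically just opening its USB serial port and parsing lines. A rough sketch using pyserial (the port name, baud rate, and line format are assumptions for illustration, not the actual Pololu firmware protocol):

    import serial  # pyserial

    # Port and baud rate are guesses; the real values depend on the Pololu firmware.
    PORT = "/dev/ttyACM0"
    BAUD = 9600

    with serial.Serial(PORT, BAUD, timeout=2) as port:
        while True:
            line = port.readline().decode("utf-8", errors="ignore").strip()
            if line:
                print("sensor reading:", line)  # e.g. a hypothetical "co2:415"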

Software

The Python code turns attached hardware (camera, sensors, relays) into something like "IoT Things", by giving them each an interface that handles MQTT messages addressed to them, and publishes their updates (or data) as MQTT messages.
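
A minimal sketch of that pattern using paho-mqtt (the topic layout and the RelayThing class are illustrative stand-ins, not the actual classes in the repo):

    import paho.mqtt.client as mqtt

    class RelayThing:
        """Wraps one relay as an 'IoT Thing': listens for commands, publishes its state."""

        def __init__(self, client, name):
            self.client = client
            self.state = "off"
            # Topic layout is an assumption, e.g. things/relay1/command and things/relay1/state
            self.command_topic = f"things/{name}/command"
            self.state_topic = f"things/{name}/state"
            client.subscribe(self.command_topic)
            client.message_callback_add(self.command_topic, self.on_command)

        def on_command(self, client, userdata, msg):
            payload = msg.payload.decode()
            if payload in ("on", "off"):
                self.state = payload  # here you'd actually switch the physical relay
                client.publish(self.state_topic, self.state, retain=True)

    client = mqtt.Client()
    client.connect("localhost", 1883)  # the broker running on the Pi itself
    relay = RelayThing(client, "relay1")
    client.loop_forever()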

Additionally, a web server serves a static index.html page from which you can view the camera feed and send MQTT commands to the camera, telling it to take a photo or start a time-lapse. (I intend to have this page also show sensor data and provide controls for relays.)
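
Serving the static page itself can be as simple as Python's built-in web server (the actual repo may do this differently); the page then talks to the broker from the browser over MQTT-over-WebSockets:

    # A minimal way to serve index.html and its assets on port 8000.
    # (Illustrative only; the repo may use a different server.)
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler).serve_forever()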

The code is here:

lahardy/IoT-in-a-Box

How it works

The fun thing about this project for me was figuring out how to structure the Pi code. This is the organization I ended up with:

A mighty fine system diagram, if I do say so myself.

The sensors, relays, camera, and eventually any GPIO-connected sensors, all fit in the above scheme. It's self-contained, in that the Pi can also act as the MQTT server, and so this doesn't require any external internet if the Pi is set up as an access point.
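
Concretely, the broker can just be a stock Mosquitto install on the Pi. Something like the following mosquitto.conf fragment (the ports and anonymous-access setting are examples, not the project's actual config) gives the Python code a plain TCP listener and the browser page a WebSockets listener:

    # Example /etc/mosquitto/conf.d/iot-in-a-box.conf
    # Plain MQTT for the Python code on the Pi:
    listener 1883
    # Fine on a private access-point network; lock this down otherwise:
    allow_anonymous true
    # MQTT over WebSockets, for the browser page:
    listener 9001
    protocol websockets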

I am writing about different parts of this in a little more detail in these posts:

Current Status

5/27/2020: Sensors/relays are detected when plugged into the Pi; sensor data is sent to the MQTT server. Relays can be turned on/off with MQTT messages. The Pi serves a webpage where you can view the camera feed, and send MQTT messages to the camera to take a single image or start a time-lapse series:

Screenshot of my phone, of the RPi camera view.
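
Under the hood, the "take a photo" button just publishes an MQTT message, so the same thing works from any MQTT client. The topic and payload here are illustrative, not necessarily what the repo uses:

    import paho.mqtt.publish as publish

    # Ask the camera Thing for a single capture (topic/payload names are assumptions).
    publish.single("things/camera/command", "snapshot", hostname="raspberrypi.local")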

Next steps: Getting the web page to build itself based on the responses to "roll-call" messages. Fixing a camera feed frame rate issue (the feed is super slow right now, around 2-3 fps).
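
The "roll-call" idea is a simple discovery step: the page publishes a request, and every Thing replies with a description of itself, which the page can then use to build its controls. A sketch of the Thing side (topic names and the JSON shape are assumptions):

    import json
    import paho.mqtt.client as mqtt

    DESCRIPTION = {"name": "relay1", "type": "relay", "commands": ["on", "off"]}

    def on_message(client, userdata, msg):
        # Any message on the roll-call request topic triggers a self-description.
        client.publish("things/roll-call/response", json.dumps(DESCRIPTION))

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("localhost", 1883)
    client.subscribe("things/roll-call/request")
    client.loop_forever()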

(Medium-to-Far) Future Plans

Adding GPIO Sensors: I am considering adding GPIO to the project later, at least for completeness' sake. This could mean either enabling serial communication over GPIO with other digital sensors, or adding simple inputs like a button, capacitive touch sensor, or photodiode with some external circuitry.

Note: the "plug and play" of the USB sensors isn't as easy with GPIO -- I'd need to either hardcode in specific GPIO pins to communicate with specific sensors/devices, or create a post-hoc way to connect changes on these pins with specific devices and MQTT messages.
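
One way to keep GPIO devices at least configurable, if not plug-and-play, is a small hand-written mapping from pin numbers to Thing names that the code reads at startup. A sketch using gpiozero and paho-mqtt (the pin numbers and topic names are made up):

    import signal
    import paho.mqtt.client as mqtt
    from gpiozero import Button

    # Hand-written mapping: which BCM pin corresponds to which named Thing.
    PIN_MAP = {17: "desk-button", 27: "door-switch"}

    client = mqtt.Client()
    client.connect("localhost", 1883)
    client.loop_start()

    def announce(name):
        client.publish(f"things/{name}/state", "pressed")

    buttons = []
    for pin, name in PIN_MAP.items():
        button = Button(pin)
        button.when_pressed = lambda n=name: announce(n)
        buttons.append(button)  # keep references so the Button objects stay alive

    signal.pause()  # wait indefinitely for GPIO events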

Other devices: I am working on a sub-project to create IoT buttons, lights and sensors using the ESP-8266. These will connect on their own to the MQTT server and be powered separately, and not require a Pi. I'll post about it when I get them working with the IoT-in-a-Box.
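
On the ESP-8266 side, MicroPython's umqtt.simple client is one straightforward way to get a standalone light or button talking to the same broker. A rough sketch (Wi-Fi setup is omitted, and the broker address and topic names are assumptions):

    # MicroPython on the ESP-8266; assumes Wi-Fi is already configured at boot.
    import time
    import machine
    from umqtt.simple import MQTTClient

    led = machine.Pin(2, machine.Pin.OUT)  # onboard LED (active-low)

    def on_message(topic, msg):
        led.value(0 if msg == b"on" else 1)

    client = MQTTClient("datadot-1", "192.168.4.1")  # e.g. the Pi's access-point address
    client.set_callback(on_message)
    client.connect()
    client.subscribe(b"things/datadot-1/command")

    while True:
        client.check_msg()   # handle any pending command
        time.sleep_ms(100)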

I also want to play with other input or output devices, like an IoT thermal printer– I could create a real-time graph of home data, printed out on my desk. Or print out email notifications, and never hear another notification sound again. What I'd really like though is to be able to press a button, have the state of my entire workspace recorded at that moment, and a reference to that state printed out. This way restarting a project could look like grabbing a little receipt-- like a restaurant order off a kitchen line– and having it scanned and acted on (like re-opening specific programs, documents, Chrome tabs, etc).

Running "Recipes": I may want to have relays or other output devices respond only in specific circumstances, or record data only in certain conditions. In this case, I'll need a way to create and run new objects that subscribe to different MQTT updates, do some processing, and in response send commands to components. At my work, the software Dataflow currently does this with AWS Lambda, though in the earlier versions these sorts of programs were executed on the Pi. Since this is all MQTT now, it should not be too difficult to add something like a "Recipe Manager" to the above code, that manages "recipes" that subscribe to some MQTT topics, do some logic, and spit out new messages.