Industrial Tracking Device

Introduction

Metanate's client wished to develop a tracking and sensor device for industrial and scientific deployment that submitted its data to a central database over a cellular modem. The device featured a built-in status display, and could be monitored by mobile apps using Bluetooth Low Energy (BLE).

This device was equipped with a variety of sensors and communications hardware. It served as a technology demonstrator and was used in extensive field trials. Following these trials, a range of more specific, cost-engineered devices was developed, as described in Asset tracking/environmental monitoring devices.

The device was battery-powered and needed to operate for several years on a single non-rechargeable battery. It was required to work in areas without cellular or GPS coverage, recording events throughout its deployment that could be extracted and stored in the database once the device was returned to a service depot.

Metanate was commissioned to design and develop various components for this system:

  1. The firmware for the tracking and sensor device.
  2. A server/database for gathering the data from the devices and allowing them to be monitored.
  3. A manufacturing application for the factory that produced the devices.
  4. A service depot application for uploading stored events from the devices to the database and updating their firmware.

System Architecture

Tracking and Sensor Device

Architecture

  • The main CPU was a Nordic nRF51822 System-on-Chip (ARM Cortex-M0).
  • A separate STM32 processor and a bitmapped LCD were used to display device status and diagnostics.
  • The firmware was developed using a Linux-hosted GNU C cross-compiler toolchain.
  • It used a single-threaded, event-driven design running in 32kB of RAM.
  • Sleep modes on the CPU and the attached components were used to minimise power consumption.
  • A "Coulomb counting" algorithm was devised to monitor the up-time of each component and compute the remaining battery life: the devices used non-rechargeable lithium batteries whose voltage stayed almost constant throughout discharge, so the remaining capacity could not be inferred from the battery voltage and was instead computed by accumulating each component's consumption, as sketched below.

Event Recording

  • Several environmental monitoring sensors were fitted, read using I2C and UART interfaces.
  • A three-axis accelerometer was fitted, providing motion detection. The ADC values read from this accelerometer were used to compute the device's orientation.
  • A GNSS receiver was used to obtain the GPS location and real time. There was provision for using GLONASS in appropriate regions.
  • Significant events were recorded in onboard SPI flash. Each event included an elapsed-seconds counter so that its real time could be extrapolated later if GPS time had not been available when it occurred.
  • Event data was stored and transmitted in a compact binary format, because the limited RAM precluded the use of JSON or similar encodings; a sketch of such a format follows this list.
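
As a hedged sketch of what such a format might look like, the record below packs the elapsed-seconds counter, an event type and a short payload into a fixed 16 bytes; the field sizes and type codes are assumptions, not the real layout.

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Hypothetical on-flash event record: fixed header plus a small
     * type-specific payload, packed to avoid padding bytes. */
    #pragma pack(push, 1)
    typedef struct {
        uint32_t elapsed_s;   /* seconds since deployment start, used to
                                 extrapolate real time when GPS time was
                                 unavailable at the moment of the event */
        uint8_t  type;        /* event type code, e.g. EVT_SENSOR below */
        uint8_t  len;         /* number of payload bytes actually used */
        uint8_t  payload[10]; /* type-specific data */
    } event_record_t;         /* exactly 16 bytes per event */
    #pragma pack(pop)

    enum { EVT_MOTION = 1, EVT_GPS_FIX = 2, EVT_SENSOR = 3 };

    /* Encode one sensor reading; the equivalent JSON text would be an
     * order of magnitude larger. */
    size_t encode_sensor_event(uint8_t out[16], uint32_t elapsed_s,
                               uint8_t sensor_id, int16_t value)
    {
        event_record_t r = { elapsed_s, EVT_SENSOR, 3, { 0 } };
        r.payload[0] = sensor_id;
        memcpy(&r.payload[1], &value, sizeof value); /* little-endian Cortex-M */
        memcpy(out, &r, sizeof r);
        return sizeof r;
    }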

Event Transmission

  • Events were normally sent to the database using a cellular call twice daily, to minimise charges; however, certain high-priority events triggered immediate cellular transmission (the scheduling is sketched after this list).
  • Some of the devices were fitted with LoRa UHF transceivers. In suitably-equipped depots they could transmit their events to a LoRa repeater (developed by others), to save cellular call costs and prolong battery life.
  • Bluetooth Low Energy (BLE) status beacons were advertised at a slow rate to conserve battery life. Experience of compatibility issues with Linux kernels led to this approach being replaced by bursts of beacons in the next generation of devices.
  • BLE Peripheral connection support enabled the service depot app to retrieve the stored events from devices and download firmware updates onto them.
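
The twice-daily scheduling with a high-priority override might be structured as in the sketch below; the slot times and the priority test are assumed, not the deployed policy.

    #include <stdbool.h>
    #include <stdint.h>

    /* Twice-daily transmission slots (times assumed for illustration). */
    #define CALL_SLOT_1  (6U * 3600U)    /* 06:00 */
    #define CALL_SLOT_2  (18U * 3600U)   /* 18:00 */

    /* Assumed policy table and modem driver, not shown. */
    extern bool is_high_priority(uint8_t event_type);
    extern void start_cellular_call(void);  /* flushes the queued events */

    /* Called when a new event has been written to flash.  High-priority
     * events force an immediate call; everything else waits for the next
     * scheduled slot, amortising the per-call charges. */
    void on_event_recorded(uint8_t event_type)
    {
        if (is_high_priority(event_type))
            start_cellular_call();
    }

    /* Called from an RTC alarm at each transmission slot. */
    void on_rtc_alarm(uint32_t time_of_day_s)
    {
        if (time_of_day_s == CALL_SLOT_1 || time_of_day_s == CALL_SLOT_2)
            start_cellular_call();
    }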

Security

  • The initial trial phase transmitted data in the clear; security was added before the large-scale trial.
  • Cellular event transmission used HTTPS to encrypt the data.
  • BLE beacon and event data were encrypted, and events were authenticated with CMACs, using AES-128 implemented in the Nordic hardware and per-device keys installed during manufacture.
  • Firmware images were in Nordic DFU format, with AES-128 CMACs added to the metadata to provide authentication and integrity-checking. The CMAC construction is sketched after this list.
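
Both of the last two points rest on the standard AES-CMAC construction (RFC 4493), which can be layered on a single-block AES-ECB primitive such as the one the Nordic hardware provides. The sketch below assumes such a primitive (aes128_ecb) and simplified key handling; verification simply recomputes the tag and compares it against the one stored with the event or in the DFU metadata.

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Assumed single-block AES-128 primitive; on the nRF51 this would be
     * backed by the hardware ECB peripheral. */
    extern void aes128_ecb(const uint8_t key[16], const uint8_t in[16],
                           uint8_t out[16]);

    /* RFC 4493 subkey derivation: shift left one bit, then conditionally
     * XOR the Rb constant into the last byte.  Safe for in == out. */
    static void derive_subkey(const uint8_t in[16], uint8_t out[16])
    {
        int msb = in[0] & 0x80;
        for (int i = 0; i < 16; i++)
            out[i] = (uint8_t)((in[i] << 1) | (i < 15 ? in[i + 1] >> 7 : 0));
        if (msb)
            out[15] ^= 0x87;
    }

    void aes128_cmac(const uint8_t key[16], const uint8_t *msg, size_t len,
                     uint8_t mac[16])
    {
        uint8_t k1[16], k2[16], x[16] = { 0 }, tmp[16], last[16] = { 0 };

        aes128_ecb(key, x, k1);       /* L = AES(K, 0^128) */
        derive_subkey(k1, k1);        /* K1 */
        derive_subkey(k1, k2);        /* K2 */

        size_t n = (len + 15) / 16;
        int complete = (n > 0 && len % 16 == 0);
        if (n == 0)
            n = 1;

        /* Final block: XOR with K1 if complete, else pad with 0x80 00..
         * and XOR with K2. */
        memcpy(last, msg + (n - 1) * 16, complete ? 16 : len % 16);
        if (!complete)
            last[len % 16] = 0x80;
        for (int i = 0; i < 16; i++)
            last[i] ^= complete ? k1[i] : k2[i];

        /* CBC-MAC chain over the leading blocks, then the final block. */
        for (size_t b = 0; b + 1 < n; b++) {
            for (int i = 0; i < 16; i++)
                x[i] ^= msg[b * 16 + i];
            aes128_ecb(key, x, tmp);
            memcpy(x, tmp, 16);
        }
        for (int i = 0; i < 16; i++)
            x[i] ^= last[i];
        aes128_ecb(key, x, mac);
    }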

Data Server/Database

This stored the events submitted by the devices, supporting device development and the field trials while a customer-facing cloud service was being developed elsewhere.
  • It was based on an Apache web server hosted on a Linux system.
  • A MariaDB database contained tables representing each device's event history and current state.
  • Custom FastCGI code was written in C++ to handle the binary-encoded device events; a minimal handler along these lines is sketched after this list.
  • A JSON REST API was implemented for mobile apps developed elsewhere. The original prototype used Perl DBI on the server; subsequently the API was incorporated in the C++ code to support the larger-scale trial phases.
  • Initially a prototype web interface was implemented using Perl Mason to substitute data from the database into HTML page templates on the server.
  • For the trials an improved web-based monitoring interface was written in HTML5 and JavaScript, using jQuery and jQuery UI, with JSON AJAX requests to obtain the data to populate the pages.
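
The production handler was C++; the following C sketch shows the same shape using the FastCGI stdio wrapper, walking a POSTed payload of fixed-size binary records (the 16-byte layout assumed in the firmware sketch above), with the database insert elided.

    #include <stdlib.h>
    #include "fcgi_stdio.h"   /* wraps stdio so printf/fread talk FastCGI */

    #define RECORD_SIZE 16    /* fixed-size binary event records */

    int main(void)
    {
        while (FCGI_Accept() >= 0) {
            const char *cl = getenv("CONTENT_LENGTH");
            long remaining = cl ? strtol(cl, NULL, 10) : 0;
            long stored = 0;
            unsigned char rec[RECORD_SIZE];

            /* Walk the POSTed binary payload one record at a time. */
            while (remaining >= RECORD_SIZE &&
                   fread(rec, 1, RECORD_SIZE, stdin) == RECORD_SIZE) {
                /* decode rec and INSERT into the MariaDB tables (elided) */
                remaining -= RECORD_SIZE;
                stored++;
            }

            printf("Content-Type: text/plain\r\n\r\n"
                   "stored %ld events\r\n", stored);
        }
        return 0;
    }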

Manufacturing Application

This was developed to automate the factory provisioning of devices. It ran on a Linux PC.
  • A barcode scanner was used to identify each device, and read an XML file of its properties that had been created during PCB manufacture.
  • It programmed firmware onto the device using a JTAG connection, for both the main CPU and the display processor.
  • It initialised the device configuration, which in later phases included unique cryptographic keys obtained from the network HSM.
  • It then submitted the device's manufacturing properties to the cloud database using a JSON REST API, as sketched below.
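
The final step might look like the libcurl sketch below; the endpoint URL and JSON field names are placeholders, not the real API.

    #include <stdio.h>
    #include <curl/curl.h>

    /* Submit one device's manufacturing record to the cloud REST API.
     * URL and field names are illustrative placeholders. */
    int submit_device_record(const char *serial, const char *fw_version)
    {
        CURL *curl = curl_easy_init();
        if (!curl)
            return -1;

        char body[256];
        snprintf(body, sizeof body,
                 "{\"serial\":\"%s\",\"firmware\":\"%s\"}", serial, fw_version);

        struct curl_slist *hdrs =
            curl_slist_append(NULL, "Content-Type: application/json");

        curl_easy_setopt(curl, CURLOPT_URL,
                         "https://example.invalid/api/devices"); /* placeholder */
        curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body);

        CURLcode rc = curl_easy_perform(curl);

        curl_slist_free_all(hdrs);
        curl_easy_cleanup(curl);
        return rc == CURLE_OK ? 0 : -1;
    }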

Service Depot Application

Metanate developed an app to process the devices once they had been returned from field deployment to a service depot.
  • The app ran on Linux on a ruggedised tablet, and could process batches of several devices at a time; the per-device workflow is sketched after this list.
  • It used the Qt GUI toolkit to provide a touch-driven graphical interface, displaying the progress of the operations on each device in the batch.
  • It operated as a BLE Central, connecting to a device to perform operations on it. In later phases this required it to fetch a unique encryption key from the network HSM.
  • It first extracted all stored events from the device and submitted them to the database over Wi-Fi, ensuring that the complete history was recorded however poor the cellular coverage had been during deployment.
  • It then updated the device's firmware if newer firmware images were available. If the display processor firmware needed updating, a temporary application was first downloaded onto the main CPU to transfer the new image across, after which the normal main CPU firmware was reinstated.
  • Finally it reset the device and submitted an event to the database indicating the start of a fresh deployment.
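
The per-device workflow can be summarised as a small state machine. The sketch below uses stand-in functions for the real BLE, Wi-Fi and DFU operations, and steps a whole batch so that a GUI could display each device's progress.

    #include <stdbool.h>

    /* One device's progress through the depot workflow; each state maps
     * to a step described above. */
    typedef enum {
        ST_CONNECT, ST_EXTRACT, ST_SUBMIT, ST_UPDATE_FW, ST_RESET,
        ST_DONE, ST_FAILED
    } depot_state;

    extern bool ble_connect(int dev);     /* BLE Central, key from the HSM */
    extern bool extract_events(int dev);  /* pull stored events over BLE */
    extern bool submit_events(int dev);   /* upload to the database, Wi-Fi */
    extern bool update_firmware(int dev); /* DFU, including display processor */
    extern bool reset_device(int dev);    /* reset and log fresh deployment */

    static depot_state step(int dev, depot_state s)
    {
        switch (s) {
        case ST_CONNECT:   return ble_connect(dev)     ? ST_EXTRACT   : ST_FAILED;
        case ST_EXTRACT:   return extract_events(dev)  ? ST_SUBMIT    : ST_FAILED;
        case ST_SUBMIT:    return submit_events(dev)   ? ST_UPDATE_FW : ST_FAILED;
        case ST_UPDATE_FW: return update_firmware(dev) ? ST_RESET     : ST_FAILED;
        case ST_RESET:     return reset_device(dev)    ? ST_DONE      : ST_FAILED;
        default:           return s;
        }
    }

    /* Advance every device in the batch one step per pass. */
    void process_batch(depot_state states[], int n)
    {
        for (bool busy = true; busy; ) {
            busy = false;
            for (int i = 0; i < n; i++)
                if (states[i] != ST_DONE && states[i] != ST_FAILED) {
                    states[i] = step(i, states[i]);
                    busy = true;
                }
        }
    }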