

News and Updates about OpenSensors.IO

First ‘Things’ First

I was pleased to see the recent post by the ODI on the open-shared-closed data spectrum since it resonates with the challenges faced at OpenSensors. To date most of our commercial projects have been at the private end of the spectrum; they are challenging, they are innovative, but they are often not ingesting open data or publishing data as an exhaust.

Are we worried about private IoT messaging? Not too much. Most of our private clients choose to get their own house in order first, after all typically there’s a lot of opportunity to juice existing sensors. First ‘things’ first as they say.

The good news is these deployments are sowing the seeds of sharing behaviours by distributing content internally, releasing data that used to terminate and die. They are unlocking data and distributing it for access via API for dashboards, data science and decision support, which is the first step on a journey to openness.

So as a tech company how do we lead our clients and help them deliver an open data strategy? We provide the tools that allow organisations to manage data entitlements, pushing themselves up the data spectrum to become open. Each of our clients will make their own journey to open up their content; our job is to deliver infrastructure allowing them to manage data at a privacy level that works for them.

This is important stuff. IoT tech companies are developing the smart city data network, and we don’t want it to be private. We want pain free navigation from edge to edge of our urban data grid, whilst feeling secure and confident about the data we consume. Our platforms must secure data whilst facilitating its exchange and entitlement control, so what’s needed to make smart city data exchange a reality? A couple of things spring to mind, we need to …

Evolve Topics and Communities – Expect faster adoption of sharing behaviours within trusted communities. By curating communities with shared interests, expect adoption of localised data exchange, say amongst tenants of a commercial property. Communities sharing data should ease the path to universal open data.

Evolve Exchange Mechanisms – Transparent, pain-free data exchange is key to delivering a functionally rich, lean IoT data infrastructure; the alternative could be akin to a ‘European data mountain’ of needless and costly sensor deployments.

Building the tech stack for these needs is plenty of work, so as we define the business and technical models for IoT we need to act responsibly. Deploying and decommissioning software is cheap, just a couple of mouse clicks away. IoT deployments are very real: they consume natural resources, risk cluttering our environment and can loiter well past their usefulness.

Encouraging sharing behaviour within IoT through lean shared infrastructure will prevent waste. The alternative would be a legacy of urban junk. We made a mess of space by not decommissioning hardware; let’s not do the same with our urban environment, and instead keep it open and centred on communities.

When Sensors and Open Data Collide

So I’m new to IoT, having spent my career trying to find meaning in econometric data (ouch). Given I started life as a structural engineer, I’m pretty excited to be working on data products with physical things; it feels like I’m back home again. So what’s exciting me today about where IoT and data science collide?

As data scientists we’re always looking at new models, new ways of shaping our view on a given data set to eke out some kind of edge. But at some point it feels like we’re chasing our tails with little hope of finding new factors to make our science better. Without new data, or at least the same data in a more granular or timely form, we’re just rehashing the same functional forms over the same content.

Fortunately life is about to get a whole lot more interesting as more connected devices come online. We’re experiencing tangible innovation in IoT, and we’re not talking hand-wavy stuff; at OpenSensors we see hackers, hobbyists and enterprises building the next generation of smart cities with real velocity.

We’re also fired up since we see pretty much everyone embracing openness in their data. Exactly what open data means remains up for debate, but most agree that some flavour of open data is a prerequisite for successful smart cities.

It would be a pretty dumb city where you could only use the data in your own location. So it makes sense to open pathways for data to be exchanged, allowing us all to benefit from advances in technology without a cost to our built environment. The alternative is a proliferation of street clutter used to deliver data already gathered in our smart buildings. Paradoxically not smart! Having delivered connected buildings, transport and personal devices, we can expect a wave of innovation in apps and data science. So what would help to make this happen?

Communities – Architects, hackers and makers provide the crucible of IoT innovation but need support for their creative process. Helping to ease the technical pain points is great; even better is curating communities to support and challenge. Our mission is to build best-of-breed engineering whilst retaining our community roots, leveraging platforms like GitHub and Hackster.

Connect existing things – Increasingly we see OpenSensors used to unlock value in existing device estates. For many enterprises it’s the ‘I’ in IoT that is new; to deliver the ‘I’ they need open, available, performant, secure and low cost messaging and data persistence.

Put open data to work – ‘I can’t define it, but I know it when I see it’, to paraphrase Justice Potter Stewart. The debate about what open data is, and what is rightly or wrongly tagged as such, will remain. But expect innovation in business models firmly founded on principles of open data. ‘Open’ may mean sharing data with your neighbour, your street, your city. Most importantly, we should make it economically attractive for all enterprises to make data available in some form, even if it’s not ‘open’ in the purest sense.

We have an exciting journey ahead delivering significant change to our urban environment. Over a century ago ‘The League of American Wheelmen’ catalysed improvements to America’s transport infrastructure; the open data movement can deliver similarly disruptive change to our urban environment. I hope open data becomes as ubiquitous as our transport network is today; in future no one may recall activists like the ODI, but that would be a sign of open data’s success.


Here at OpenSensors we are committed to making it as easy as possible to get started with your IoT projects. You can now post messages through OpenSensors using HTTP POST as well as MQTT.

To post a message to a topic, simply use the endpoint URL, adding the client-id and password of your device as well as your username in the headers.

An example command using curl is:

curl -X POST -H 'client-id: XX' -H 'password: XXXXXXX' -H 'username: yods' -d '{"value": 1}'

Next up is support for another great IoT protocol: CoAP! My policy is that we will support any open standard and protocol, so if there is a particular protocol you love, feel free to send us an email.

You Use Open Data Every Day

You’ve done your hair, you’ve picked out the perfect outfit and you’re set on making a great first impression. How long have I got? What tube should I get? How soon is the next bus? You check the CityMapper app – you find a route and within seconds know exactly how long it will take to get to your hot date. You may not know it, but open data has just made your trip a lot easier.

Transport for London, OpenStreetMap, Foursquare, Google Maps, Apple Maps and Cyclestreets all provide access to open data for others to use. In this case, CityMapper ingests the real-time open data produced by Transport for London, remixes it with freely available open mapping data, adds a touch of their own special sauce, real-time usage and congestion data from CityMapper users, and finally curates this brew in an accessible form for the user, waiting at the bus stop. In short, the CityMapper team takes the available open data, adds value to it and provides that as a service.

It is not an exaggeration to claim the future development of the city is intertwined with open data. From transport data to air quality data to real-time high-street footfall, as cities become leaner, genuinely smarter and more efficient, the availability of reliable, high-quality data will become more important than ever. Generating open data from our surroundings is unlocking value and insight from our environment, information that is all around us, for the researcher, for the app developer, for the tinkerer, for the activist.

For cities to succeed in building resilient systems and networks, the emerging data ecosystem in the city can’t rely on closed data and closed systems. Devices and sensors in the city won’t function as Jawbone and Fitbit do, two closed devices whose real-time data I couldn’t access and share even if I wanted to. Open data is disrupting the digital landscape: former data-as-commodity brokers, such as Landmark, have fundamentally reshaped the way in which they do business, focusing on curating available data.

How to Make a Battery Last Forever…

Many devices in the IoT won’t be mains powered, but will require some sort of supply with limited capacity. Be it common batteries or rechargeable cells, energy management is a key challenge for the development of IoT products.


The basic anatomy of a program on a microcontroller (the little green boards that run the logic of many digital devices) is an infinite loop: as long as you’re powered, do…

Unfortunately, even “doing nothing” costs energy. While a processor is powered, it’s always going to draw some current for the most basic housekeeping. And it’s not just the processor: voltage regulators, interfaces, nearly everything that is connected to a circuit eats up electricity, and most of it is lost in the form of heat.

The data in this section primarily comes from a very good blog post on power saving techniques for microprocessors by Nick Gammon. Let’s look at an example: just running an empty loop on the Arduino Uno (a commonly used microcontroller for hobbyists) draws about 50 mA. That is, with a standard 9V block (supplying 500 mAh), that Arduino can run just about 10 hours from a battery. Clearly not long enough for a hardware product that aims to participate in the IoT.

Processor manufacturers are well aware of this issue and most platforms support a sleep mode. In the case of the Arduino Uno, adding just four lines of code reduces the cost of the empty loop to 35 mA. This is still significant, but owed mostly to all the other components on the microcontroller board (including a cheerfully blinking light and a wasteful voltage regulator that takes the 9V of the battery and supplies 5V to the board).

Fortunately, ‘real products’ don’t require all the baggage that the Arduino Uno is carrying around. The processor, the Atmel Atmega328, really just draws 0.35 mA in its most optimal sleep mode. If we don’t require particular features of the processor, this can further be reduced to less than 0.5 uA. (Note that this would allow a 500 mAh battery to drive the processor for 10 years – unfortunately, doing absolutely nothing!).
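As a back-of-the-envelope sketch, those draw figures translate directly into runtimes. This assumes an ideal 500 mAh supply; in reality a battery self-discharges within years regardless of load, which is what caps the deepest-sleep case at around a decade:

```python
CAPACITY_MAH = 500.0  # nominal capacity of the 9V block from the example

def runtime_hours(draw_ma):
    """Ideal runtime in hours for a constant current draw in mA."""
    return CAPACITY_MAH / draw_ma

print(runtime_hours(50))      # busy loop on the Uno: 10 hours
print(runtime_hours(0.35))    # bare Atmega328 asleep: ~1430 hours (~2 months)
print(runtime_hours(0.0005))  # deepest sleep at 0.5 uA: ~114 years on paper,
                              # though self-discharge empties the battery first
```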

“But IoT devices are supposed to do things!”, I hear you say. Even more so, the hardware to send information into the Internet can be quite energy hungry – remember the times when it was recommended to switch Wifi off while you were on the road with your laptop? Now, many IoT devices, sensors in particular, only need to work in bursts.

Let’s take a look at one of our favourite core components in connected products: The RFu328 from Wireless Things. It combines a barebones Atmega328 with a transceiver that can send and receive radio messages to and from an Internet-connected hub device. The processor and the radio can be sent into a deep sleep, drawing 0.5 uA. However, there’s a timer inside the radio that can trigger the Atmega328 chip and wake the entire system, ready to send or receive data at about 30 mA. We may even have to supply electric current to external sensor hardware and increase our need to more than 50 mA for a second, but for our overall energy budget that’s rather marginal – most of the time, our device will be asleep for minutes if not hours.
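A rough duty-cycle budget shows why those bursts are marginal. The sleep and wake currents below are the figures quoted above; the one-second burst every five minutes is our own assumption for illustration:

```python
SLEEP_MA = 0.0005    # 0.5 uA deep sleep (from the text)
AWAKE_MA = 50.0      # radio plus sensors during a burst (from the text)
AWAKE_S = 1.0        # assumed burst length in seconds
PERIOD_S = 5 * 60.0  # assumed wake-up once every five minutes

# time-weighted average current over one full sleep/wake cycle
avg_ma = (AWAKE_MA * AWAKE_S + SLEEP_MA * (PERIOD_S - AWAKE_S)) / PERIOD_S
print(round(avg_ma, 4))            # ~0.167 mA on average
print(round(500.0 / avg_ma / 24))  # roughly 125 days from a 500 mAh battery
```

Almost all of the budget is the burst itself; halving the wake-up frequency nearly doubles the battery life.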


For the standard Atmega sleep modes, consult Power saving techniques for microprocessors.

The sleep modes of the RFu328 depend on a simple modification to the hardware as well as a library from Wireless Things. In the absence of in-depth documentation, we learned a lot from PCB designs and software examples from the Oxford Flood Network.

In short, the RFu328 can configure the radio to go into the extended sleep mode by sending the ATSM3 AT command. The Wireless Things library handles a lot of the high-level “+++” string handling to communicate with the SRF radio.

Most of the setup magic happens here:

uint8_t setupSRF(char* sleepString) {  // set sleep mode
  if (!enterCommandMode()) {           // if failed once then try again
    if (!enterCommandMode()) return 1;
  }
  if (!sendCommand(sleepString)) return 2;
  if (!sendCommand("ATSM3")) return 3;
  if (!sendCommand("ATDN")) return 4;
  return 5;
}

and sleepString is a combination of the ATSD command prefix and the sleep duration in milliseconds in hexadecimal notation: ATSD1388 – 5 sec; ATSD4E20 – 20 sec; ATSDEA60 – 1 min; ATSD493E0 – 5 min; ATSD1B7740 – 30 min; ATSD36EE80 – 60 min
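That encoding is easy to reproduce. A small hypothetical helper (ours, not part of the Wireless Things library) shows that each string is just ATSD followed by the duration in milliseconds as upper-case hex:

```python
def sleep_command(ms):
    # 'ATSD' followed by the sleep duration in milliseconds, upper-case hex
    return "ATSD%X" % ms

# the durations from the list above
for secs, expected in [(5, "ATSD1388"), (20, "ATSD4E20"), (60, "ATSDEA60"),
                       (300, "ATSD493E0"), (1800, "ATSD1B7740"), (3600, "ATSD36EE80")]:
    assert sleep_command(secs * 1000) == expected
print("all sleep strings check out")
```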

Every time the radio wakes up from this sleep, it triggers a pin on the Atmega328, and having

LLAP.sleep(WAKE_PIN, RISING, false); // sleep until woken

as part of your loop() takes care of listening to that signal.


So how does the WAKE_PIN (either D2 or D3) get its signal from the SRF? Via a bridge from D11. (A simple wire.) If you have a newer model of the RFu328, there’s a small field labelled ATSM3. This allows you to create a direct solder bridge to D2 or D3 without the need for the wire.

What other hardware modifications may be necessary? Well, most sensors have a quiescent current draw even when they’re not active. Would it not be nice to have them separated from the battery until they’re really required? That’s exactly what we’re going to do. Transistors can be used to interrupt the flow of current until triggered, and the RFu328 has a sufficient number of pins remaining to allow for controlling (gating) a MOSFET.

An OpenSensors example on GitHub.

Closing the Loop Between Maker & Customer for Connected Devices

Onboard IoT hardware as easy as Kindle, at a fraction of the cost.

Last month we released OpenSensors’ Search and Subscribe features. This IoT Day we’re pushing our latest development, Organisations, or Orgs for short.

Orgs is a new feature focused on streamlining hardware production workflows and managing large numbers of devices.

With OpenSensors’ Orgs, you are able to manage all of your devices and data in one place, and pretty soon you will get at-a-glance device health monitoring – giving your customers a seamless user experience equivalent to the Kindle’s, but without Amazon’s budget. Those of you managing thousands of devices in the wild are only too aware of how hard it is to manage the workflow between the factory and individual devices all over the world.

We’re trying to make IoT onboarding easier for hardware startups and connected device manufacturers. That’s why we’ve designed a production-to-customer workflow; think Kindle for IoT.

People managing organisations can now batch provision and manage their new devices. Devices can be searched for and ‘claimed’ by your customers via the UI or API, adding relevant metadata such as location information to help you keep track of your devices and new customers once your connected devices are out in the wild.

It’s an exciting feature, and we hope you’re just as excited as we are. We want to save you time and effort and also enable you to get your connected devices to market quicker and cheaper.

As some of you may have noticed, we released a stripped down Alpha version of Orgs in our last release. We’ve learned from your feedback.

And we’re not finished yet; we have a lot of plans for Orgs and this is the first step in the journey. To celebrate IoT Day we are giving away three months’ worth of Organisation hosting and functionality to IoT startups. Get in touch for a voucher.

For more details about Organisations and how to use them, check out our help pages.

Happy IoT Day!

When you are working with many connected devices, updating the software on them can be tedious. Sometimes sensors are in hard-to-reach locations, and having to get to all of them to update a little bit of code can be a nightmare. Resin has a solution for this: they have made it quick and easy to update the code running on all your connected Linux devices. This is very useful for Raspberry Pi based sensors.

Here we’ll explain how we used Resin to get code running on our devices that publishes to our OpenSensors topic.

Getting Started

Firstly we checked out the Resin getting started guide found here:

We needed to reformat the SD card before we got it to boot properly.

Following the instructions on the getting started guide, we pushed the node.js text2speech project they suggested. We were surprised at how easy it made getting code running on your Pi.

The next step was to push some python code and get it running. We used a barebones hello world python script with the required docker file which we found here:

That too was easy to get working.

Communicating with OpenSensors

To get the extra packages needed to communicate with OpenSensors we added some lines to the docker file:

RUN apt-get install -y python-pip
RUN pip install paho-mqtt

the -y was needed to automatically answer yes during the apt-get install.

Then we added some code that uses the paho-mqtt library to the Python script.

You can check it out here

You’ll have to change the username and device ID and password to get it working.

Resin created the image, uploaded it, and it started running without a hitch. We could see that it was working because the messages were appearing on the OpenSensors online dashboard for my topic. Resin is a great product that solves a very real problem in a clever way; I will be adding it to my toolkit.

Does this help you solve a problem you’ve been having with your connected sensors? If so, get in contact and let us know what you are up to.

Open and Transparent


It’s been an interesting few days in OpenSensors HQ. My fairly harmless comment about privacy and security for the Internet of things on Twitter turned into a fracas. A fairly prominent blogger and activist, Aral Balkan, took issue with our use of the word Open in our company name. Twitter is not the best medium for nuanced debate so I wanted to address the points raised.

The central point of Aral’s comments can be summed up by his tweet If you have a closed platform, don’t call it open. Is that too much to ask for?

For clarity, here is why we are proud to call ourselves OpenSensors.

Who we are

We are an Open Data and Internet of Things startup incubated in the Open Data Institute, founded by our hero Sir Tim Berners-Lee. If you have never come across open data, please check out his great talk about what it is, and also see the Open Knowledge Foundation’s write-up.

The Internet of Things (IoT) is a very broad term for connecting day to day objects and sensors to the internet. In the IoT world open refers to Open Source Software (OSS), Open Data, Open Hardware, Open Protocols and the Open Web. The common thread that holds these ideals together is that accessibility is key to creating value and benefit.

We strongly believe in all of these ideals. We write open source code and we develop firmware for Open Hardware devices and our guiding principle is to support Open Internet of Things and Web protocols.

OpenSensors aims to create a real-time public data exchange. Most public sensor data sets are currently sitting in silos, and we will make them available for reuse by anyone under Open Data Licences. Publishers of data sets include individuals, cities and more. Data such as air quality information, flooding and parking is so much more useful when it’s accessible and reusable by as many people and services as possible.

In order to do this, we have built a hugely scalable core, thanks to existing Open Source projects. We use standard web technologies such as HTTP as well as Server Sent Events for easy real time transfer of data between sensors and the web. In addition, we use Open Sensor protocols such as MQTT to enable M2M applications and we will soon have support for another great open protocol CoAP.

Will all our code be Open?

We will develop open source software (including our core, azondi). We will also contribute back in some way to the huge amount of open source software we use, such as Cassandra, Elasticsearch, Postgres, Netty and a ton of clojurewerkz projects, and we have incubated Cylon, a security library, from Alpha. We have plans to release a ton of other OSS projects built to scratch our own itch. We recognise that we stand on the shoulders of the giants of the computer science world.

All that being said, please be aware that we are not a social enterprise and there will be parts of our code base that will be private. We are ultimately a for-profit company, and our aim is to create a sustainable business model for an engineering-led business to thrive.

  • We want to hire amazing developers to solve hard problems, enable them to unleash their creative energies and love the product.
  • We pay everyone from interns upwards a sustainable wage.
  • We value diversity and spend time and money organising community groups for free to give back.
  • We do not charge to speak at or to arrange community events.
  • We run paid training events where at least 30% of attendee places will be fully or partly covered by OpenSensors for those who don’t have the means to pay the full price.
  • We do not depend on government funding, and our pricing structure is very clear.

We have a freemium model around open data that will hopefully create a lot of value for a lot of people. We will also enable our paid clients to build connected products for a charge in order to pay for servers, salaries, office costs, etc.

We aim to find enough people to give us their hard-earned money by building an amazing product. It is that simple. We do not resell private data or try to create revenue from insight into private user behaviour.

Do we have the right to call ourselves Open and claim a seat at the table?

Hell Yes! No one gets to play at un-appointed gatekeeper in our communities especially using exclusionary language and labels, not even the founder of the web and open data.

European Parliament Approves eCall Technology

European Parliament approves eCall connected car platform

The Internet of Things promises to revolutionise everyday life, embedding and imbuing everyday objects and the world around us with sensors, software and electronics. Through machine-to-machine communication, automation and advanced analytics, we are able to understand and scrutinise our environment and the processes which surround us in ways never before conceived: from high-level analysis allowing automated condition monitoring of critical engine parts, giving engineers the tools to reduce costly operational downtime, to embedding real-time sensors in bridges to predict stresses and flooding. Beyond the Cloud, the Internet of Things brings the internet to the everyday, and there are clear use cases for such technologies in the realm of road safety.

This is where eCall comes in. eCall is a European Commission initiative coming into force on 31 March 2018, mandating the deployment of internet-connected sensors into cars that enable emergency services to be contacted automatically and immediately after a serious road incident within the European Union. EC VP for Digital, Neelie Kroes, argues “EU-wide eCall is a big step forward for road safety. When you need emergency support it’s much better to be connected than to be alone.” eCall will drastically cut European emergency service response times, even in cases where passengers are unable to speak through injury, by sending a Minimum Set of Data (MSD), including the exact location of the crash site.

The deployment of eCall is one of the most ambitious EU-wide programs since the 2007 enlargement, rolling out implementation of the eCall platform to some 230 million cars and 33 million trucks in the European Union. Implementation of eCall at a European level (including Norway, Switzerland etc) benefits consumers and industry through reducing costs due to economies of scale, bringing the installation cost down to as little as €100. The basic pan-European eCall service will be free at the point of use for equipped vehicles. It is likely that the eCall technology platform (i.e., positioning, processing and communication modules) will be exploited commercially too, for stolen vehicle tracking, dynamic insurance schemes, eTolling and emerging forms of vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) road safety systems. eCall will be based upon a standardised platform, one system for the entirety of Europe, aimed at enabling both the car and telecoms industries to roll out quickly and to avoid crippling OEM versioning and patching issues.

In terms of privacy, the basic eCall system has been given the green light by the European Commission on the express condition that firm data protection safeguards are in place and that the sensor-equipped vehicles will not push data to external servers except in the case of a crash, or by the actions of the driver, in order to contact the PSAP (Public Safety Answering Point) and will lie dormant until that point. The data transmitted to the emergency services, described as MSD, Minimum Set of Data, are those strictly needed by the emergency services to handle the emergency situation. While in normal operation mode the system is not registered to any telecoms network and no mediating parties have access to the MSD that is transmitted to the PSAPs.

Today the European Parliament’s Internal Market and Consumer Protection Committee MEPs voted on and approved eCall, pushing forward a life-saving Internet of Things technology that will significantly improve European road safety. The UK Government, however, has not followed suit; whilst welcoming the implementation in other member states, it feels that “it is not cost-effective … given the increasing responsiveness of our road network, we feel that smart motorways do the same thing,” remarked Minister Perry on behalf of the Department for Transport. Whilst it can be argued that ‘Smart Motorways’ are far from a worthy substitute for connected cars and V2V/V2I systems, the UK’s criticism belies a certain caution with regards to green-lighting large and costly IT projects. Only time will tell whether the UK Government’s decision has left those drivers not on Britain’s Smart Motorways in the lurch.

Measuring Air Quality on Opensensors

Measuring the air quality of the ODI using an Arduino and a Shinyei PPD-42

So, whilst thinking of a good demonstration for the Opensensors platform, we thought why not see how polluted our workplace is by hooking up a sensor to publish a continuous data stream to the Opensensors messaging broker.
For this we needed an easy-to-use sensor; we settled on the Shinyei PPD-42. We’ll use this to measure the number of potentially hazardous small particulates in the air, with an Arduino connected to a Linux PC (or Raspberry Pi).

To run this mini-project you will need:

  1. Shinyei PPD-42
  2. Arduino UNO
  3. Computer with Linux installed (you can use a Raspberry Pi)

We are basing this run-through on a project called DustDuino that uses the Shinyei PPD-42 sensor with an Arduino and a WiFi module. Check it out here. We used this project as our reference when setting up the sensor and writing the Arduino code.

Firstly we follow step 2 of the instructions for hooking up the sensor to the Arduino. Then we download the code from the project’s GitHub repository by opening the link for the code DustDuinoSerial.ino, selecting Raw and saving that page.

Opening this up in the Arduino IDE, we now upload it to our Arduino Uno by connecting the Arduino and pressing upload.

You can check the data is coming in by using the Arduino IDE’s serial monitor.

We then need to figure out how to send the incoming serial message to the Opensensors message broker.

To do this we chose to write a Python script. We used the Paho MQTT Python module (the successor to the Mosquitto Python client). I’m going to assume that you already have Python installed, as it comes pre-packaged on most versions of Linux. If you don’t have it already, you’ll need to install pip to download and set up the module. On Ubuntu or Debian this can be done with the following command:

sudo apt-get install python-pip

Once pip is installed we can install the Paho MQTT Python client module using the following command:

sudo pip install paho-mqtt

You can find out how to use the Python module by having a read through the website linked above. Writing and running Python is really easy.

Hello Python World

Open up your favourite plaintext editor. Enter the line:

print "Hello World"

Save the file. Then in terminal, navigate to your document and run it with Python:


You should see your “Hello World” response. It’s that easy.

Hello Opensensors

To use the Mosquitto client python module we can run the following code to test out publishing. You’ll need to replace my username “Louis” (keeping the speech marks), and password with your details:

Import the MQTT library we need to communicate with the Opensensors message broker:

import paho.mqtt.client as mqtt

Initialise the client with the client ID of our device:

mqttc = mqtt.Client(client_id="939")

Set our username and password:

mqttc.username_pw_set("Louis", password="AbcDEFgH")

Connect to the Opensensors server (replace BROKER_HOST with the OpenSensors broker address):

mqttc.connect("BROKER_HOST")
Publish a message to say hello:

mqttc.publish("/users/Louis/test2", payload="Hello Opensensors!", qos=0, retain=False)



Success, you should now have a functioning sensor :)

Next we need to get the serial working. To find out what your Arduino serial port looks like, we executed the following command in the terminal:

dmesg | grep tty

The output was something like this…

[    0.000000] console [tty0] enabled
[ 3522.192687] cdc_acm 7-1:1.0: ttyACM0: USB ACM device

The second line has details of our Arduino. The ttyACM0 is the device name and ‘/dev/ttyACM0’ is the serial port.
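If you would rather pick the port up programmatically than eyeball the dmesg output, a small helper along these lines (our own addition, not part of the original setup) can extract it:

```python
import re

def serial_port_from_dmesg(line):
    """Pull the tty device name out of a dmesg line and build the port path."""
    m = re.search(r"(tty[A-Za-z]+\d+): USB ACM device", line)
    return "/dev/" + m.group(1) if m else None

# the dmesg line from above
line = "[ 3522.192687] cdc_acm 7-1:1.0: ttyACM0: USB ACM device"
print(serial_port_from_dmesg(line))  # /dev/ttyACM0
```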

Python makes it really easy to open and read the serial port. You can run a little test to check whether it is working by using the following code:

For communication with the Arduino we need to use the serial library:

import serial
ser = serial.Serial('/dev/ttyACM0')  # open first serial port
while True:
    print ser.readline()             # prints each line it reads from serial

Finally we just need to hack together the two pieces. Here is the code we used:

import serial
import paho.mqtt.client as mqtt

mqttc = mqtt.Client(client_id="939")
mqttc.username_pw_set("Louis", password="AbcDEFgH")
mqttc.connect("BROKER_HOST")  # replace BROKER_HOST with the OpenSensors broker address
mqttc.loop_start()            # handle MQTT network traffic in the background

ser = serial.Serial('/dev/ttyACM0')  # open first serial port
while True:
    message = ser.readline()
    print message
    mqttc.publish("/users/Louis/ODI/airquality", payload=message, qos=0, retain=False)

Running this we were publishing our sensor data to Opensensors!

We recommend adjusting the Arduino code to output the data in JSON format. This will make it easier to read and add functionality.
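For instance, if the sketch printed one JSON object per line, the Python side could parse each reading directly. The field names here are made up for illustration; use whatever your sketch emits:

```python
import json

# hypothetical JSON line as printed by a modified Arduino sketch
message = '{"pm25": 12.3, "pm10": 30.1}'

reading = json.loads(message)
print(reading["pm25"])  # 12.3
```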

You can check out the topic producing Open Data we created here!