
OpenSensors.IO

News and Updates about OpenSensors.IO

Next Generation of Workspaces Event

OpenSensors co-hosted a panel for invited guests on the Future of Workspaces with Cushman & Wakefield. The panel included Yodit Stanton, CEO of OpenSensors; Uli Blum, Architect at Zaha Hadid; and Simon Troup, Founder of Fractalpha. Juliette Morgan, a Partner at Cushman & Wakefield, moderated the panel. It was a lively crowd with a sense of urgency – wanting the future now!

Key takeaways

Our panelists gave a view of the current state of data driven workspaces through their different lenses.

Data driven world

For Uli Blum, Architect at Zaha Hadid, the world is increasingly driven by data. Data gives us a much richer understanding of how people work and live in our spaces. He spoke about different work styles, variations in acoustics across a floor, lighting conditions, proxemics, adjacencies, and connectivity. Zaha Hadid wants to understand all of these aspects better and take them into account in design.

Competitive edge

Simon Troup, Founder of Fractalpha, explained that with data you are trying to find the secret sauce that differentiates you from the competition. He gave an example from the financial markets, where having access to data before your competitors is a huge edge over them.

IoT traction

Yodit Stanton, CEO of OpenSensors, shared the traction she is seeing, the practical side of how companies are deploying sensors, and how to get started. Lots of people are putting in desk, meeting room and footfall sensors, trying to understand how many people are in the space and how to design it better. But we also see this workspace occupancy data being combined with facilities data from access control and building management systems for a full view of what is happening.

Say Goodbye to Clipboards! Why Sensors Are Replacing Manual Desk Occupancy Surveys

Over the past two decades, clipboard reports have been the foundation of desk occupancy studies. In a typical study, 12 undergraduates walk a 5km route through an office workspace to document desk and conference room occupancy. The route takes about an hour to complete, and once they finish they start around it again.

One of the primary benefits of desk occupancy sensors is that companies can improve how space is used, with the potential to reduce costs and energy usage. By capturing and centralising utilisation information in a timely, automatic, non-intrusive manner, analytics programs can find places for improvement.

  • Staffing cost – Manual surveys are expensive, and the biggest expense in such studies is labor. The staff cost is not just for gathering the information; additional resources are needed to do the reporting on the data.
  • On-going staff training expense – Because of the high turnover rate among surveyors with clipboards, companies spend a surprisingly high amount on ongoing training and hiring. Often this is a very large hidden expense.
  • Errors – Walking a long tedious route gets boring, surveyors make mistakes, and the quality of the study suffers.
  • Sampling rates – Because of the large staff cost, manual surveys are usually constrained to about a week. Sensors give you a better picture of what is going on because you measure for a longer period of time, i.e. a minimum of 8 weeks or permanently. Also, rather than sampling what is going on every hour, you can now sample every 5-10 minutes. The rule of thumb is that you've got to sample at twice the event frequency to have confidence in what you're doing. If you're doing an hourly survey, you're really only capturing events that last 90 minutes to 2 hours with any kind of accuracy. On a 10 minute sample, you're catching things that are 20 minutes to half an hour long. On a 5 minute sample, you're probably catching events that are 10-15 minutes long (see the sketch after this list).
  • Reporting – The whole point of the study. With manual surveys, whether using pencil and paper or software, staff still need to generate reports. With OpenSensors, the sensors’ data becomes a feed and the reporting and dashboards are ready made and don’t require on-going work to be generated. The whole operation becomes less of a manual process of moving data around; we link with CAFM systems and any other facilities management systems. The process becomes API driven and enables multiple stakeholders to analyse the data.
  • Security – Sensors are less disruptive than having people constantly walking through the office.
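
The sampling-rate rule of thumb above is easy to sanity-check. Below is a minimal Python sketch (the intervals are illustrative, not from any specific deployment) applying the "sample at twice the event frequency" heuristic to estimate the shortest occupancy event each survey interval can reliably catch.

    # Rule of thumb: to capture an event reliably you need roughly two samples
    # within its duration, so the shortest reliably-captured event is about
    # twice the sampling interval.
    sampling_intervals_minutes = [60, 10, 5]  # hourly walk-around vs. sensor polling

    for interval in sampling_intervals_minutes:
        shortest_event = 2 * interval
        print(f"Sampling every {interval} min -> reliably captures events "
              f"of roughly {shortest_event} min or longer")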

Utilisation studies can help you manage the desk sharing ratio and unit mix for your flexible working office. Workspace occupancy sensors are replacing manual surveys as a timely, automatic, non-intrusive way to manage wasted desk space and save on cost and energy usage.

Why Use Sensors for Workspace Design?

Workspace designers are using OpenSensors' capabilities to enable their customers to optimise their usage of real estate; smart buildings deliver productivity and improved UX for employees.

Why use sensors for workspace design?

Designers turn to IoT technology and OpenSensors' digital data layer to address the needs of owners, facilities managers and building tenants. Innovative new IoT technology and OpenSensors' data reports, alerts and dashboards give designers a detailed understanding of how people are actually using the space, rather than gut feel about building performance.

A game-changer for the industry

  • Winning more deals, both for new developments and re-fits of iconic buildings
  • Lower cost than manual surveys
  • Real-time information to facilities managers and even tenants
  • Private data combined with public
  • Understand Air Quality factors for building wellness assessments

Sensors to replace manual work

For the first time, deployment and maintenance of smart IoT sensors has become a cheaper alternative to manual occupancy questionnaires and surveys. Sensors can have sampling rates of anywhere from once every few seconds to once every 30 minutes. This sensor data can be correlated with information from Building Management Systems (BMS) to provide richer context and considerably more insight than manual surveys; common interfaces include BACnet, KNX and other major systems. These data can be combined not only with private building data but also with public data such as outdoor pollution.
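
As a rough illustration of that combination, here is a pandas sketch (the file names and column names are assumptions for illustration, not a prescribed schema) that attaches the most recent public outdoor PM2.5 reading to each indoor occupancy sample by timestamp:

    import pandas as pd

    # Hypothetical exports: indoor occupancy samples from the sensor platform
    # and outdoor PM2.5 readings from a public air quality feed.
    occupancy = pd.read_csv("occupancy.csv", parse_dates=["timestamp"])
    outdoor = pd.read_csv("outdoor_pm25.csv", parse_dates=["timestamp"])

    # Both frames must be sorted by time before an as-of join.
    occupancy = occupancy.sort_values("timestamp")
    outdoor = outdoor.sort_values("timestamp")

    # Attach the most recent outdoor reading to each indoor sample,
    # tolerating up to 30 minutes between the two feeds.
    combined = pd.merge_asof(
        occupancy, outdoor, on="timestamp",
        direction="backward", tolerance=pd.Timedelta("30min"),
    )
    print(combined.head())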

How does it work?

OpenSensors have built hardware, installation and network provider partnerships to help architectural firms implement smart IoT devices efficiently. We have found that the most successful IoT projects follow a phased implementation approach: Design Phase, Proof of Concept, Pilot, and Deployment. The design phase asks questions such as which sensors to use and who will install and maintain them. For the Proof of Concept, a lab evaluation should include hooking up 5-8 sensors all the way through a gateway to data collection in the cloud. This gives enough real data to verify that the queries and the analytics are feasible. The Pilot phase ensures that the sensors work at scale and that the gateway configuration has been made easy for the deployment specialists; a pilot should involve about 40 sensors, depending on sensor density. At this point, you can scale up to the number of sensors and the bandwidth required for full deployment.

Practical Examples

Heat maps can help define predictable patterns of usage, including peak demand, for:

  • Desks – real-time information on which desks are in use and which are available
  • Conference rooms – Do you have the right number of meeting rooms, and are they the right size?
  • Breakrooms – Where do tenants tend to go and hang out? Are some breakrooms over- or under-utilised?
  • Corridors and hallways (footfall monitors) – Are some paths through the offices more used than others? Why?
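
As a sketch of how such a heat map can be derived from raw occupancy data (the CSV layout below is an assumption for illustration, not the platform's exact schema), per-desk samples can be rolled up into an hour-by-desk utilisation table with pandas:

    import pandas as pd

    # Hypothetical samples: one row per desk per polling interval,
    # with occupied = 1 when the desk sensor detected a person.
    events = pd.read_csv("desk_occupancy.csv", parse_dates=["timestamp"])
    events["hour"] = events["timestamp"].dt.hour

    # Fraction of samples in each hour of the day during which each desk was occupied.
    heatmap = events.pivot_table(
        index="desk_id", columns="hour", values="occupied", aggfunc="mean"
    )
    print(heatmap.round(2))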

Sensors help in pitching for new work in a world where people are aware of sensors and how they can drive revenue. Firms with sensor capabilities have adopted data driven design methods, which are replacing gut feel.

Emerging Areas of Practice

Using sensor data enables more accurate planning, and by making the data available to occupants, you enable them to change their behavior and give them real-time insights and finer customization.

Integration

  • Digital scale models: OpenSensors data can be integrated with architects' current CAFM systems and 3D rendering environments.
  • Intelligent / Reactive Environments: OpenSensors data can be integrated with displays for open desk notification.

Top 10 Reasons for Data Driven Design

Two of the biggest risks you face as an architect or space planner are:

  • overlooking a key problem in your design or
  • investing too much space or budget for one of the client’s goals, leading to less satisfactory solutions to other goals.

These are often two sides of the same coin: while experience counts for a lot, it cannot always compensate for a lack of data about how the client is actually using the current space. A small investment in sensors to continually monitor desk usage, hallway traffic, and meeting room and break room occupancy can yield a wealth of hard data on which to base your design decisions. This data allows the team to move past dueling hypotheses and get on the same page about real needs based on current usage patterns. That in turn makes the design and development process more efficient and allows the team to craft better solutions for the client's needs within their space and budget constraints.

The Good Sensor

Our customers and community ask us on a daily basis to recommend a sensor provider to buy from (ping me on hello@opensensors.io if you want us to recommend your sensor). Often the requirement is vague: “I need an air quality sensor to put on my street for $100?” or “What sensors shall I use to understand my space usage?”. My process of assessment has grown more refined over time, because if the sensors we recommend are unsuitable or unusable, our company’s reputation is also on the line by association.

So we have come up with our own unscientific way to rate the quality of a sensor, one that should be simple to apply. Most large scale sensor rollout projects of 1K sensors or more have these requirements as well. Sensor providers that don’t rate highly against our criteria may still produce good sensors, but getting the points below right takes iteration and discipline in design, and providers that do are more likely to be able to deliver.

Battery life

If a sensor is battery powered, the typical expected battery life should be clearly stated. Buyers will often want some explanation of what typical means for your sensor, i.e. if it’s a PIR sensor, have you calculated battery life based on it being triggered once a day? The last thing your customers want is to invest in a lot of sensors, plus the cost of installation, only to find out that the battery life is a fraction of what they expected, as it will still cost them a lot of money to rip the sensors out and return them.

Bonus points for sensors that publish their battery status as standard, so that sensor owners get some warning before the battery needs changing.

Heartbeats

Sensors should periodically tell people whether they are still alive. Depending on your battery and connectivity constraints this can vary; the important thing is that the buyer should not find out that a bunch of devices are not working because they haven’t been heard from in days or weeks. Top tip: a heartbeat every 10-60 minutes, when possible, is sufficient; any more frequent and it ceases to be informative.
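
To make the idea concrete, here is a minimal sketch of a periodic heartbeat publisher (the broker address, topic and payload fields are placeholders for illustration, not OpenSensors specifics), using MQTT via the paho-mqtt helper:

    import json
    import time
    import paho.mqtt.publish as publish

    BROKER = "broker.example.com"          # placeholder broker
    TOPIC = "devices/sensor-42/heartbeat"  # placeholder topic
    INTERVAL_SECONDS = 15 * 60             # every 15 minutes, within the 10-60 minute guideline

    while True:
        # A heartbeat only needs to say "I'm alive" plus a little status,
        # e.g. battery level so owners get some warning before it dies.
        payload = json.dumps({"alive": True, "battery_pct": 87, "ts": int(time.time())})
        publish.single(TOPIC, payload, hostname=BROKER)
        time.sleep(INTERVAL_SECONDS)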

Installation and maintenance procedure

In non-consumer environments, the people installing and maintaining sensors are often not the technical design firms or manufacturers. Does your device clearly tell people how to install it? Do you have helper applications so that they don’t have to configure firmware? We are working on some solutions for this, but more on that later; the hint is that it’s all about enabling people to install sensors efficiently, with a non-technical installer able to walk away knowing that the device has joined the network correctly. Does your sensor come with mounting and fittings?

Do people have to unscrew the casing to change batteries? Have you tested this with people and verified it?

Data Quality

Quality, in my definition, means: is the data from your sensor easily understandable to someone who doesn’t know your domain? The reality is that manufacturers often pass on the analogue value of the particular sensor, and that is too low an abstraction for most people trying to read it. Battery voltage is a good example: during its life an AA battery will go from 1.5V to about 0.8V, but it follows a curve specific to the device and the battery. Understanding how this maps to a percentage or days of life is often complex. If it’s not possible to do much conversion or processing on your sensor or gateway, at least provide a handy explainer when people buy your device so they understand what the data means.
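
As a sketch of that kind of conversion (the voltage-to-charge points below are invented for illustration; a real curve has to be measured for the specific battery and device), a manufacturer could ship a small lookup that interpolates a measured voltage into an approximate percentage:

    # Illustrative discharge curve for an AA cell under a particular load:
    # (voltage, approximate remaining charge %). Real values must be measured
    # for the specific battery chemistry and device.
    DISCHARGE_CURVE = [(1.50, 100), (1.35, 80), (1.25, 55), (1.15, 30), (1.00, 10), (0.80, 0)]

    def battery_percent(voltage: float) -> float:
        """Linearly interpolate a battery percentage from a measured voltage."""
        if voltage >= DISCHARGE_CURVE[0][0]:
            return 100.0
        if voltage <= DISCHARGE_CURVE[-1][0]:
            return 0.0
        for (v_hi, p_hi), (v_lo, p_lo) in zip(DISCHARGE_CURVE, DISCHARGE_CURVE[1:]):
            if v_lo <= voltage <= v_hi:
                # Interpolate between the two surrounding points on the curve.
                frac = (voltage - v_lo) / (v_hi - v_lo)
                return p_lo + frac * (p_hi - p_lo)
        return 0.0

    print(battery_percent(1.28))  # roughly 62%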

Support

Please state clear terms for warranties and return procedures to protect your consumers. Consumer protection should naturally apply.

Finally, developing high quality hardware is hard; I am always amazed at the skill and dedication it takes for hardware designers and engineers to take an idea all the way to the manufacturing stage. We try to manage the community’s expectations on which sensors they should buy versus the attitude of ‘just throw around cheap sensors’. It would be better, in terms of both environmental sustainability and user experience, to get into the habit of doing more with less sensor density. For more on this, see Dr Boris Adryan’s excellent blog post.

I have purposefully not mentioned security in this post, as security assessments come with a lot of complexity; I will aim to write this up sometime soon.

Many Thanks to Toby Jaffey for editing.

Tips for Installing a Community Air Quality Sensor Network

Small air pollution sensor technologies have enabled the deployment of numerous sensors across a small geographic area; these can supplement existing monitoring networks and significantly reduce the cost of longer-term community air pollution studies. This helps mitigate the risk of current approaches to monitoring air quality in a region, which rely on only a dozen or so stations and may give you an average that is not representative of what’s happening where you live.

What are you trying to do?

Air quality is affected by many possible contaminants; in fact, the Environmental Protection Agency (EPA) has identified six “criteria pollutants” as pollutants of concern because of their impacts on health and the environment. The criteria pollutants (http://www.epa.gov/airquality/urbanair/) are:

  1. ozone (O3) http://www.epa.gov/air/ozonepollution/
  2. particulate matter (PM) http://www.epa.gov/air/particlepollution/
  3. carbon monoxide (CO) http://www.epa.gov/airquality/carbonmonoxide/
  4. nitrogen dioxide (NO2) http://www.epa.gov/air/nitrogenoxides/
  5. sulfur dioxide (SO2) http://www.epa.gov/air/sulfurdioxide/
  6. lead (Pb). http://www.epa.gov/air/lead/

Under the Clean Air Act, the EPA has established primary and secondary National Ambient Air Quality Standards (NAAQS) for these six pollutants. As you begin, keep in mind what you want to measure and how that information will be used. Is there some final output or final report you’ve got to get to?

Understand your sensor choices for collecting air quality data

Commercially available sensors can measure the levels of potential contaminants including O3, NO2, NO, SO2, CO, PM2.5 and lead. These devices should be designed to be easy to connect and to provide quality data measurements so that non-technical community groups can deploy them.

Here are some factors to consider in assessing options for sensors to collect air quality data:

  • cost
  • operating lifetime
  • accuracy, precision, and bias of measurement
  • range of sensitivity
  • speed of response time
  • maintenance requirements
  • reliability

For more information on what and how to measure, see https://cfpub.epa.gov/si/si_public_file_download.cfm?p_download_id=519616

Beyond the sensors, you will need to make tradeoffs between cost and redundancy for the best network connectivity.

  • Point to point – lowest cost, greater number of coverage points, least redundancy for each individual point
  • Mesh – higher cost, greater redundancy

Most community-based sensor networks are adopting point-to-point connectivity because of the ease of connection and low cost. We already have a guide to the pros and cons of the connectivity options; use it to find the best network for your project.

Our Process

OpenSensors recommends a phased approach, from proof of concept to full-scale deployment, to ensure a successful installation of an IoT network in a business environment. Our aim is to reduce the time to go live and minimize risk.

Phase 1 Evaluate sensors:

Evaluate different sensors for quality, signal-to-noise ratio, power consumption and ease of setup by trying them out on a very small scale in a lab.

Phase 2 Proof of concept:

Do a full end-to-end test to verify that the queries and analytics are feasible. Connect 5 to 10 sensors to a cloud infrastructure.

Phase 3 Pilot phase:

Move out of the lab into your actual environment. Typically, this requires somewhere between 30 and 100 sensors. We suggest a one to two-month test to ensure that the sensors work at scale and the gateway can handle the load, similar to production usage.

In addition to testing the sensors in the wild, this is the time to think through your onboarding process for the devices. Questions like who will install the sensors feed into design decisions on the firmware, such as how much pre-configuration has to be done. We recommend a ‘just works’ approach and an assumption that all sensors will be installed by people who will not configure firmware. If you need to deploy 200-300 sensors, the installation engineers need to be able to deploy a lot of sensors in a distributed physical environment over a short amount of time, so it is much more efficient for your sensors to be pre-configured. In these situations, we usually give people a simple interface to add metadata such as location and elevation. Sensors should be labelled clearly and their details pre-loaded on a cloud platform like OpenSensors before they are deployed, so that adding metadata is a matter of 1-2 steps.
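
One simple way to keep that to 1-2 steps (the file format and fields here are an assumption for illustration, not a prescribed schema) is to pre-load a table keyed by the label printed on each sensor, so the installer only records the label and where it went:

    import csv

    # Pre-loaded before shipping: one row per sensor, keyed by the label
    # printed on its casing. Example devices.csv contents:
    #   label,device_id
    #   A-001,dev-8f2c
    with open("devices.csv", newline="") as f:
        metadata = {row["label"]: row for row in csv.DictReader(f)}

    def record_install(label: str, floor: str, zone: str, elevation_m: float) -> dict:
        """Attach the installer-supplied location to the pre-loaded device record."""
        record = metadata[label]
        record.update({"floor": floor, "zone": zone, "elevation_m": elevation_m})
        return record

    print(record_install("A-001", floor="3", zone="north-wing", elevation_m=1.2))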

Phase 4 Plan and implement full-scale deployment:

After the pilot phase, there should be enough data to verify network performance and your choices for sensors and connectivity, after which, full deployment can be planned in detail and implemented.

Want to create your own Air Quality project?

The EPA Smart City Air Challenge (https://www.challenge.gov/challenge/smart-city-air-challenge/) is now live. The challenge is trying to help communities figure out how to manage installations of 250 to 500 sensors and make the data public. OpenSensors.io is free to use for community IoT Open Data projects and will be supporting the EPA’s initiative.

Contact us if you would like assistance on sensor selection, network design, or planning a proof of concept deployment.

Path to Smart Buildings

Whether you are a building manager planning efficient space usage or an architect looking to design state-of-the-art buildings, we have broken down the steps to get you to your desired end goal. IoT planning should start with the business needs, of course, and quickly move from the component layer all the way up to the application layer. We need to figure out what core data should be gathered and how to leverage that data effectively. These IoT solutions require an end-to-end, or device-to-cloud, view.

A phased implementation approach works best.

We have found that the most successful IoT projects follow a phased implementation approach: Design Phase, Proof of Concept, Pilot, and Deployment. The design phase asks questions such as which sensors to use and who will install and maintain them. For the Proof of Concept, a lab evaluation should include hooking up 5-8 sensors all the way through a gateway to data collection in the cloud. This gives enough real data to verify that the queries and the analytics are feasible. The Pilot phase ensures that the sensors work at scale and that the gateway configuration has been made easy for the deployment specialists; a pilot should involve about 40 sensors, depending on sensor density. At this point, you can scale up to the number of sensors and the bandwidth required for full deployment.

OpenSensors’ Deployments

We have built hardware, installation and network provider partnerships to help customers get rollouts live efficiently. Either roll out your own network, or we will put you in touch with a local sensor installation specialist to take care of the install and maintenance. We are working with customers and the community to understand what is required at each level of your IoT solution, and we can ease development and integration issues.

Lessons Learned From First Generation IoT Installations

At first glance, Wi-Fi-based sensors seem like a good choice for a non-consumer-facing sensor network; however, we have discovered that Wi-Fi has some significant drawbacks.

Access

One of the biggest drawbacks to Wi-Fi enabled sensors in a corporate environment is gaining access. Corporate IT often has valid security concerns about hundreds if not thousands of sensors joining the network, and has deployed corporate firewalls that block any access. Often this means that we are not allowed to spin up our own Wi-Fi network to act as a gateway for a customer’s IoT sensor network. If IT has already deployed a Wi-Fi network, they are rarely willing to provide the passwords to allow the IoT devices and gateways to take advantage of it. Relying on corporate Wi-Fi can make on-site installations and maintenance extremely complex and painful; the whole project becomes dependent on the goodwill of a network administrator for access every time maintenance needs to be performed.

Power

Wi-Fi has good transmission range, but that comes at the cost of high power usage. With a short battery life, maintenance costs for Wi-Fi sensors are higher than for low-power alternatives. One wireless protocol that we see in many successful deployments is LoRa, because it offers long transmission range at much lower battery usage than Wi-Fi.

Moving to LoRa and other long range protocols

If you follow our blog and publications, you will have noticed we have been talking a lot about network technologies; this isn’t a coincidence. We have spent a long time evaluating and piloting these stacks with our community.

Network access and battery constraints are driving many IoT installations off Wi-Fi and onto long range networks. LoRa is working well for us so far for a number of use cases; most of our customers spin up a private network. The ecosystem of providers is maturing, and we are finding a lot of companies adopting existing sensors for their networks. Gateway providers such as Multi Tech provide good support, allowing the long tail of small scale (under 250 sensor installs) hardware providers to thrive.

LoRa is a wireless protocol that promises between two and five kilometres of transmission range between sensors and gateway; if you haven’t already done so, please read our introduction to what it is. With a separate LoRa network, facilities and/or operations can install and manage the whole operation without the access and security issues of using the corporate Wi-Fi network. A typical network will have hundreds of sensor devices sending messages to a gateway. Because the LoRa gateway is a self-contained system with its own backhaul (e.g. GSM), we can have the LoRa network sit completely outside the corporate firewall and minimise IT security concerns.

One LoRa gateway can normally cover an entire property. This can significantly reduce infrastructure, deployment, and administration costs compared to shorter-range wireless options like Zigbee or Bluetooth that require complex installs. Our aim is to have a system that non-technical engineers can roll out and support; more on how to do this in later blog posts, but in most cases the OpenSensors team is the equivalent of ‘second line support’ to the onsite team, who have integrated our APIs into their helpdesk ticketing systems etc.

LoRa networks can be public or private. An example of a public network is The Things Network, and we continue to work with and support that community. Most current commercial projects are running private networks at this time, but it will be interesting to see how that evolves.

To conclude, LoRa is working well for us at the moment, but we will keep researching other networks so that we understand the pros and cons of all the network providers. Sigfox, for example, is a very interesting offering that we will properly test over the next few months.

Savvy Building Managers Use Sensors to Reduce Operating Expenses

Sensor networks are emerging as a mission critical method for offices and commercial spaces to save money. Offices and commercial spaces are undergoing a smart transformation by connecting and linking HVAC, lighting, environmental sensors, security, and safety equipment. Building and facilities managers are also installing utilization sensors to manage their spaces more efficiently.

Main benefits of data driven buildings:

  • Operational efficiency
  • Use data for better design
  • Better workspace experience for employees

Changing workforce

Recently we helped a company design a prototype of a desk sensor monitoring system. Because so many of their people were working from home, they wanted to accurately measure peak demand during the day to see if they could save 10-20% of their desk space. Goals for the system were:

  • Monitor desk occupancy anonymously.
  • Minimize installation and deployment costs: rely on solutions simple enough that existing non-expert personnel could be trained to deploy them.
  • Minimize day-to-day maintenance: this drove strategies for long battery life, among others.
  • Design a deployment process that ensured the install team could easily add sensor location metadata, to allow for rich reporting and analysis once the IoT sensor network was operational.
  • Limit the IT resources needed for deployment.
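
As a sketch of the peak-demand calculation behind that target (the sample layout is assumed for illustration), the raw desk samples can be rolled up into the maximum number of desks occupied at the same moment:

    import pandas as pd

    # Hypothetical samples: one row per desk per polling interval,
    # with occupied = 1 when the desk sensor detected someone.
    samples = pd.read_csv("desk_occupancy.csv", parse_dates=["timestamp"])

    # Desks in use at each sampling instant, then the busiest moment of the study.
    desks_in_use = samples.groupby("timestamp")["occupied"].sum()
    peak_demand = int(desks_in_use.max())

    total_desks = samples["desk_id"].nunique()
    print(f"Peak demand: {peak_demand} of {total_desks} desks "
          f"({peak_demand / total_desks:.0%}); the rest are candidates for desk sharing")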

The phased approach works best

First, we looked at many sensors, evaluating quality, signal-to-noise ratio and power consumption. It’s always a good idea to get a handful of different types of sensors and try them out at a very small scale. We chose an infrared sensor with good battery lifetime and a single LoRa gateway that could support all the floors and provide the connection to the cloud.

Next we did a full end-to-end test, where we hooked up 5-10 sensors completely to a cloud infrastructure, all the way through the connectivity gateway. Now we had real data flowing into the infrastructure and could verify that the queries and analytics were feasible. This step makes sure everything works as planned and that you will get all the data you need.

Once you’re happy with the proof of concept phase, it is time for the real pilot phase. Instead of having just a handful of working sensors, now you’ll hook up an entire floor, or a street, or whatever your use case might be. It should be somewhere between thirty and a hundred sensors. At this point you can ensure that the sensors work at scale and the gateway can handle the load. Typically we see customers running these pilots for a month or two to get a good feel for how the sensors will perform in a production situation.

After the pilot phase, you should have enough data to verify network performance and your choices for sensors and gateways. Now you can plan the full deployment in detail. It’s been our experience, based on a number of customer installations, that the most successful IoT networks follow these steps in a phased implementation approach.

The technology at the silicon, software, and system level continues to evolve rapidly and our aim is to reduce the time to go live and minimise risk. The internet of things is a nebulous term that includes quite a lot of specialised skillsets such as sensor manufacturing, network design, data analysis, etc.

In order to make projects successful, we have taken the approach of building many hardware, installation and network provider partnerships, and relationships to help customers succeed as opposed to trying to do it all ourselves. We have been working with customers to develop methods to lower the sensor density and in turn lower the cost of projects whilst still getting comparable accuracy.

Contact us if you would like assistance on sensor selection, network design, or planning a proof of concept deployment.

Getting to Grips With IoT Network Technologies

How sensors communicate with the internet is a fundamental consideration when conceiving of a connected project. There are many different ways to connect your sensors to the web, but how do you know which is best for your project?

Having just spent the better part of a week researching these new network technologies, I have put together this brief guide outlining the key aspects to focus on for an optimal IoT deployment:

Advanced radio technology

  • Deep indoor performance – networks utilising sub-GHz ISM (industrial-scientific-medical) frequency bands such as LoRaWAN, NWave and Sigfox are able to penetrate the core of large structures and even subsurface deployments.
  • Location aware networking – a variety of networks are able to track remote sensors even without the use of embedded GPS modules.
  • Supporting sensors moving between hubs – with advanced handoff procedures and innovative network topologies, mobile sensors can move around an area and remain in contact with core infrastructure without disrupting data transmission. Intelligent node handoff is also crucial for reducing packet loss: if the connection to one hub is hampered by passing through particularly chatty radio waves, the node can switch to a better placed hub to relay its crucial payload.
  • Interference resistance – the capability of a network to cleave through radio traffic and interference that would ordinarily risk data loss.

Low energy profiling

  • Device modes – LoRaWAN is a great case in point, with three classes of edge node: the first, Class A, allows a brief downlink window after each uplink, i.e. after having sent a message the sensor listens for new instructions; a Class B node adds scheduled downlink slots, so the device checks in at certain points; and the last, Class C, listens for downlink messages from LoRaWAN hubs at all times. The latter burns considerably more power.
  • Asynchronous communication – this enables sensors to communicate data in dribs and drabs where possible; services do not need to wait for each other, thereby reducing power consumption.
  • Adaptive data rates (ADR) – depending on the quality of signal and attenuation, modern networks are able to dynamically allocate data rate depending on interference, distance to hub etc. This delivers real scalability benefits, frees up space on the radio spectrum (spectrum optimisation) and improves overall network reliability.

Security

  • Authentication – maintains data integrity by ensuring the sensor which is publishing that mission critical data really is that sensor and not an impostor node. Ensures information privacy.
  • End to end encryption (E2E) – prevents tampering and maintains system integrity.
  • Integrated security – good network security avoids potential breaches and doesn’t place the onus on costly, heavily encrypted message payloads.
  • Secure management of security keys – either written remotely on the initial install or embedded at manufacture, security keys are fundamental to system security. ZigBee’s recent security issue shows how not to manage security keys, by sending them unencrypted over-the-air to devices on an initial install.
  • Receipt acknowledgement – ensures mission critical data is confirmed received by network or device.

Advanced network design

  • Full bidirectional comms – enables over the air (OTA) updates, allowing operators to push new firmware or system updates to thousands of remotely deployed sparse sensors at the push of a button. This is critical to a dynamic and responsive network. As with the device modes mentioned previously, bidirectionality allows deployed devices to function as actuators and take action (close a gate, set off a fire alarm etc.) rather than acting only as one-way sensors publishing to a server.
  • Embedded scalability and consistent QoS – as load increases on a network so too does the capacity of the network. This takes the form of adaptive data rates, prevention of packet loss by interference and channel-blocking, the ability to deploy over-the-air updates and ensuring the capability to add nodes, hubs and maintain existing assets without impacting on overall network service, perhaps through automatic adaptation.

There are also a number of legal, cost, market and power focused aspects worth considering that I shall not cover here. But, critically, it’s worth mentioning that the majority of these technologies operate on ISM (industrial-scientific-medical) frequency bands, and as a result are unlicensed. These bands are regulated and there are rules; however, anyone operating on these bands can do so without purchasing a license. Notably, you don’t have sole ownership of a slice of the spectrum and you don’t get exclusive access. Therefore, with a variety of other vendors blasting away across the radio waves, these technologies encounter significantly more interference than the licensed spectrum. However, the new networks – LoRa, Sigfox, NWave etc. – are based on protocols and technologies designed to better sort through this noisy environment, grab a channel and send a message.

Understanding that the airwaves are a chaotic mess underlines the importance placed on features such as adaptive data rates, node handoff and power saving methods such as asynchronous communication. Wired networks do not have to consider such things. But for most it’s not just a case of who shouts loudest wins. The majority of wireless protocols ‘play nice’ opting for a polite “listen, then talk” approach, waiting for a free slot in the airwaves before sending their message.

Some protocols, such as Sigfox, forego such niceties and adopt a shout loud, shout longer approach, broadcasting without listening. A typical LoRaWAN payload takes a fraction of a second to transmit; Sigfox, by comparison, sends messages 3-4 seconds in length. Because it broadcasts without listening, Sigfox must operate under severe duty cycle limitations, which translate into a limited number of messages sent per device per day and severe data rate limitations.
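
A back-of-the-envelope calculation shows why (the 1% figure is a typical duty-cycle limit for the European 868 MHz sub-bands, and the message durations are the rough figures quoted above; exact limits vary by band and region):

    # Maximum messages per device per day under a duty-cycle limit:
    # allowed airtime per day divided by the airtime of one message.
    DUTY_CYCLE = 0.01            # 1% of airtime, typical for EU 868 MHz sub-bands
    SECONDS_PER_DAY = 24 * 60 * 60

    for name, message_seconds in [("LoRaWAN (short payload)", 0.2), ("Sigfox", 3.5)]:
        allowed_airtime = DUTY_CYCLE * SECONDS_PER_DAY
        max_messages = int(allowed_airtime // message_seconds)
        print(f"{name}: roughly {max_messages} messages per device per day at most")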

These choices also translate into varying costs, and critically, into battery life limitations and gains, the crux of any remote deployment.

See this link for a matrix of the major technologies currently vying for network domination.