Part of creating a new insight into ‘Connected Environments’ (more on that term in future posts) is understanding the function, design and nature of devices. The enclosure of a device is all-important and, in the world of the Internet of Things, often overlooked.
Before delving into casing sensors it’s worth taking a step back to look at the kits available online and how they can be adapted to a new form. As such we have used the Pimoroni radio kit (it’s a great kit) as a first example:
The case was redesigned to create a small, easy-to-use radio with decent sound. It uses the PhatBeat board from Pimoroni, which can be purchased either alone or as part of a Pi Radio kit (full links are below via Thingiverse). It will of course work with any other audio output from a Raspberry Pi Zero.
The speaker has been replaced with a surface transducer, allowing a smaller case as well as making use of whatever surface the radio is placed upon to enhance the sound. The transducer sits at the bottom of the unit as a small metal disc; placing it on a wooden surface creates the deepest sound, a sound beyond the size of the device.
The buttons are ‘mini arcade’ buttons, aimed at simplifying the operation of the radio – it features play/pause (yellow), next station (blue), volume up (red) and volume down (green).
There is no power-off button in the current version as it’s built to be left powered for instant audio (although the feature is built into the PhatBeat so can be easily added).
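The button-to-action mapping above can be sketched in software. The real kit is driven by Pimoroni’s own radio software, so the class, action names and station list below are purely illustrative assumptions about how such a four-button controller might behave, not the kit’s actual code or wiring:

```python
# Hypothetical sketch of the radio's four-button control scheme.
# The real Pi Radio uses Pimoroni's own software; the action names and
# station list here are illustrative assumptions, not the kit's code.

ACTIONS = {
    "yellow": "play_pause",
    "blue": "next_station",
    "red": "volume_up",
    "green": "volume_down",
}

class Radio:
    def __init__(self, stations):
        self.stations = stations
        self.index = 0          # current station
        self.volume = 50        # percent
        self.playing = False

    def handle(self, colour):
        """Dispatch a button press and return the current station."""
        action = ACTIONS[colour]
        if action == "play_pause":
            self.playing = not self.playing
        elif action == "next_station":
            self.index = (self.index + 1) % len(self.stations)
        elif action == "volume_up":
            self.volume = min(100, self.volume + 5)
        elif action == "volume_down":
            self.volume = max(0, self.volume - 5)
        return self.stations[self.index]

radio = Radio(["BBC 6 Music", "Radio 4", "Jazz FM"])
radio.handle("yellow")        # toggles play
print(radio.handle("blue"))   # → Radio 4
```

On the actual hardware each colour would be bound to a GPIO pin interrupt; the dictionary-dispatch pattern keeps the mapping in one obvious place.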
Full details on how to make it, along with the 3D case to print are now available on Thingiverse.
Do let us know if you make one…

The Weather Flow ‘Smart Weather Station’ is arguably one of the most innovative weather sensors on the market. Launched via a Kickstarter campaign in 2017, the system is now shipping to backers and will be made available for general sale shortly.

The system currently consists of two hardware modules – the Air and the Sky – with a third unit, the Breathe, focusing on air quality, arriving at a future date. The Air (pictured above left) measures temperature, pressure, humidity and lightning. The Sky unit (above middle) measures wind (via ultrasonic sensors), rain (via a haptic sensor), solar radiance and solar UV.
Sensor data refreshes every 3 seconds, viewable either via the Weather Flow app (see smartweather.weatherflow.com/share/2701/grid for our data feed) or via dashboards powered by various 3rd party applications.

Placing temperature and humidity sensors in the field is not as easy as simply putting a device outside. The unit needs to be suitably shielded from the sun and rain, as well as mounted at a set height (1.25m) to reduce heating from the ground. High-end stations often use a ‘Stevenson screen’ to shield instruments and to comply with international measurement standards.

Such screens are expensive to buy and often impractical for home-based weather stations, so we decided to model and 3D print our own shield for the Weather Flow Air.
Modelled in Fusion 360, it is designed to be made on a standard 3D printer (we used the Ultimaker 3), apart from the screws required to fix it to a post.

The model consists of 6 separately designed parts – top and bottom mounts, middle sections, rods, nuts and spacing washers (again all printed) – and is designed to be easy to print and assemble. Everything slides into place without any need for gluing or fixing.

Each part was sanded (1000 and 400 grit), primed and sprayed with gloss white paint – although this is not essential, it can simply be left as printed. The Weather Flow Air sits inside, and the parts act as a radiation shield against the sun.

All the parts are available on Thingiverse. Weather Flow makes its data available via UDP, opening up opportunities to link to systems such as Node-RED or any number of data displays/home hub devices. We will have more on that in future posts.
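The UDP route is straightforward to prototype: the Weather Flow hub broadcasts JSON observation messages on the local network. The sketch below listens for those broadcasts and pulls temperature, pressure and humidity out of ‘obs_air’ messages; the port number (50222) and field positions follow Weather Flow’s published UDP reference, but check the current documentation before relying on them:

```python
# Minimal sketch of listening for Weather Flow's UDP broadcasts on the
# local network. Port 50222 and the field positions in the "obs" array
# follow Weather Flow's published UDP reference at the time of writing;
# verify against the current docs before building on this.
import json
import socket

def parse_air(message: str):
    """Extract pressure, temperature and humidity from an 'obs_air' message."""
    data = json.loads(message)
    if data.get("type") != "obs_air":
        return None
    obs = data["obs"][0]
    return {"pressure_mb": obs[1], "temp_c": obs[2], "humidity_pct": obs[3]}

def listen(port: int = 50222):
    """Print every Air observation broadcast by the hub."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    while True:
        payload, _addr = sock.recvfrom(1024)
        reading = parse_air(payload.decode())
        if reading:
            print(reading)
```

The same loop could just as easily republish readings to MQTT or feed a Node-RED flow instead of printing them.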

Btw – Digital Urban is back; it’s been a long time since the last post. With a focus on connected environments, sensors and sensor validation, data visualisation and 3D systems, it’s good to be back online – after a long stint as Head of Department (and various other things) at the Centre for Advanced Spatial Analysis, University College London.
The increasing availability of bandwidth, along with advances in computer hardware and Internet services, is making it possible to stream HD content live from multiple places. Traditionally the first port of call for urban data is a web page, and with that comes various issues of compatibility and the ability to communicate live urban data within a single interface. The CASA city dashboard (citydashboard.org) was developed over 4 years ago and represented the state of play in live data at the time.
Yet city data should perhaps be a true ‘window on the world’, and with this comes the need for streaming video with the potential to overlay data. As ever on Digital Urban we use our live weather feed for prototyping systems, as it’s a dataset that is close to hand and also provides a range of display options.
Our weather dashboard provides data updates every 2 seconds from our Davis Pro station on the roof of Tottenham Court Road, London. It includes live graphs and gauges (via Highcharts) as well as a historical view of the data, yet the traditional webcam view of the actual physical conditions behind the data has always been lacking. Streaming HD video content onto the web, 24 hours a day and in a format accessible across multiple platforms, has until now been problematic. This is where YouTube Live comes in. Currently in beta, the system allows a simple feed from a webcam (in our case a Logitech 930e) to be streamed live, along with additional overlays to provide realtime condition updates. It is not a case of simply pointing your camera at YouTube, however: the encoding needs to be carried out on the host machine and then streamed to YouTube for distribution. There are a number of options (see YouTube Encoding); having tried them all out we settled on Xsplit as the best current system:

Encoding and Overlaying Data
Our stream runs at 720p, 30fps, with an HTML overlay taken directly from the main weather landing page. This allows the current conditions to be viewed in realtime, and the move to 30fps and HD makes a notable difference – it creates a window into the world, updating in realtime and providing a smooth, natural view of the city. You can view the stream directly below:
The stream is aimed at fullscreen viewing – 720p has its limitations but is a reasonable balance for now. The system accepts a 1080p feed, but the encoding machine takes a notable hit on processing power, so with a 24/7 feed we opted for the balance of a smooth stream and medium demands on machine capacity. Bandwidth is of course all-important; the feed comes direct from a home-based fibre connection offering 500Mb uploads over wifi and unlimited data. It is this option that is opening up the ability to create data windows on the world and perhaps a true view of the city in realtime.
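A quick back-of-the-envelope calculation shows why the 500Mb upload leaves ample headroom. The 4 Mb/s figure below is an assumed typical bitrate for a 720p30 H.264 stream, not a measured figure from our encoder:

```python
# Back-of-the-envelope headroom check for the 24/7 stream.
# 4 Mb/s is an assumed typical bitrate for a 720p30 H.264 feed,
# not a measured figure from the actual encoder.
stream_mbps = 4
upload_mbps = 500             # fibre upload quoted above

headroom = upload_mbps / stream_mbps
# Mb/s → GB per month: divide by 8 for MB/s, scale to 30 days, /1000 for GB.
monthly_gb = stream_mbps / 8 * 60 * 60 * 24 * 30 / 1000

print(f"{headroom:.0f}x headroom, ~{monthly_gb:.0f} GB uploaded per month")
# → 125x headroom, ~1296 GB uploaded per month
```

Even at an assumed 1080p bitrate of around 8 Mb/s, bandwidth would not be the constraint – which is why the limiting factor in practice is the encoding machine’s CPU, not the connection.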
Particle Flow is a versatile, powerful particle system for Autodesk’s 3ds Max. It employs an event-driven model, using a special dialog called Particle View, allowing you to combine individual operators that describe particle properties such as shape, speed, direction, and rotation over a period of time into groups called events. Each operator provides a set of parameters, many of which you can animate to change particle behaviour during the event. As the event transpires, Particle Flow continually evaluates each operator in the list and updates the particle system accordingly.

pFlow 3ds Max
To achieve more substantial changes in particle properties and behaviour, you can create a flow. The flow sends particles from event to event using tests, which let you wire events together in series. A test can check, for example, whether a particle has passed a certain age, how fast it’s moving, or whether it has collided with a deflector. Particles that pass the test move on to the next event, while those that don’t meet the test criteria remain in the current event, possibly to undergo other tests. The simple example pictured above details a pFlow dialogue determining the birth of particles linked to a target geometry. The particles can subsequently be baked (using pFlow Baker) into an animation timeline for simple output via .fbx, allowing import into external systems such as Unity or Lumion.
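The event-and-test wiring described above can be illustrated conceptually. The toy Python below is not the 3ds Max API – the `Event` class and `step` function are invented for illustration – but it captures the idea: particles remain in an event until a test passes, then migrate to the next event in the flow:

```python
# Toy analogue of Particle Flow's event/test wiring – not the 3ds Max
# API, just the concept: particles sit in an event until its test
# passes, then move on to the next event in the flow.

class Event:
    def __init__(self, name, test=None, next_event=None):
        self.name = name
        self.test = test              # predicate deciding who moves on
        self.next_event = next_event  # where passing particles go
        self.particles = []

def step(event, dt=1.0):
    """Age each particle; those passing the test migrate downstream."""
    staying = []
    for p in event.particles:
        p["age"] += dt
        if event.test and event.next_event and event.test(p):
            event.next_event.particles.append(p)
        else:
            staying.append(p)
    event.particles = staying

# Wire two events in series: particles older than 3 leave "birth".
death = Event("death")
birth = Event("birth", test=lambda p: p["age"] > 3, next_event=death)
birth.particles = [{"age": 0.0} for _ in range(5)]

for _ in range(4):
    step(birth)

print(len(birth.particles), len(death.particles))  # → 0 5
```

In Particle Flow itself the test would be an Age Test, Speed Test or Collision test operator wired between events in the Particle View dialog; the migration logic is the same.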
The clip above illustrates the pFlow system imported into Lumion with the addition of a scene created in CityEngine.