Land Use Policy

Applying cybernetic principles to the co-creation of spaces

By online planning, papers, planning

Another new paper, written with a number of colleagues from The Bartlett School of Planning –

The paper is based on the experience of creating and piloting a functioning ‘Incubator’ crowdsourcing platform for designing public spaces in an estate regeneration project in South London. The paper uses a cybernetics framework to analyse and present the way the platform itself was created and how issues of effectiveness, efficiency and equity were dealt with. It explores the generic qualities of the interface and reviews applications of variety reduction in established crowdsourcing (CS) models. It briefly presents the legal and socio-spatial parameters (such as property rights) associated with the creation of the Incubator platform, as well as the generic rules applicable to human-spatial relationships, based on studies exploring human-spatial interactions. Practical constraints including costs, catchments, life-span and meaningful feedback are examined, followed by a discussion of the social and political limitations associated with this form of public participation.

It can be read in full in Land Use Policy – Designing an incubator of public spaces platform: Applying cybernetic principles to the co-creation of spaces. https://sciencedirect.com/science/article/pii/S0264837722002149

Neopixel Barometer

Open Weather Map NeoPixel Barometer – Open Gauges

By Blog, Making, Open Gauges
Open Weather Map Barometer

The Open Gauges project aims to allow open-source data gauges to be built, modified, and viewed as both physical (3D printed) and digital gauges. Depending on the user’s preference, the models can be made to run from any online data source with a data feed – from weather data (air pressure, temperature, wind speed, etc.) through to air quality gauges, noise meters, energy monitors and more.

Part of the initial release, from the Connected Environments Team at The Bartlett Centre for Advanced Spatial Analysis, University College London, and alongside the more traditional ‘dial style’ gauges, is our new Neopixel Barometer, updated for Open Weather Map. Back in October we published the Weather Flow version; this new, open-source version is specifically designed to use the free Open Weather Map API, making it easier to use.

Designed to be as simple as possible, it is powered by a Raspberry Pi and uses the data feed from the Open Weather Map One Call API, with data available worldwide according to your choice of location. You could choose to display local barometric pressure or have a series of them on display showing locations around the world. Each gauge updates every 5 minutes, flashing a green pixel to note successful data collection and red for an unsuccessful request.
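The fetch cycle itself is only a few lines. Below is a minimal sketch of it – the coordinates, API key and endpoint version are placeholders, and the full display logic lives in the script in the Open Gauges GitHub repository.

# Minimal sketch of the 5-minute fetch cycle. The API key, location and
# endpoint version are placeholders - see the repository script for the
# full barometer logic.
import time
import requests

API_KEY = "YOUR_OPENWEATHERMAP_KEY"   # placeholder - use your own key
LAT, LON = 51.5246, -0.1340           # example location (London)
URL = ("https://api.openweathermap.org/data/2.5/onecall"
       f"?lat={LAT}&lon={LON}&units=metric&appid={API_KEY}")

while True:
    try:
        current = requests.get(URL, timeout=10).json()["current"]
        print("Sea level pressure:", current["pressure"], "hPa")  # success
    except Exception as err:
        print("Fetch failed:", err)                               # failure
    time.sleep(300)                                               # 5 minutes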

Full code and files can be found in the Open Gauges Github Repository.

Data Source

The barometer uses the One Call API from Open Weather Map, provided as JSON.

Data displayed

The Neopixel Barometer displays the current sea level air pressure (mb) and the current pressure trend – Rising, Steady or Falling.

The data updates every five minutes, with a sweep of blue/yellow NeoPixels on power-up. The pressure trend is calculated in the Python script, as it is not part of the API. As such, it takes 3 hours to calibrate – ‘Rising’ is shown initially, changing to the current trend once 3 hours of data have been downloaded.
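The idea behind the trend is simply to compare the latest reading with the one from roughly three hours earlier. A minimal sketch of that calculation is below – the 1 hPa threshold is an assumption for illustration; the repository script defines the actual logic.

# Illustrative trend calculation: compare the latest reading with the one
# from roughly three hours ago. The 1 hPa threshold is an assumption.
from collections import deque

history = deque(maxlen=36)            # 36 readings x 5 minutes = 3 hours

def pressure_trend(latest_hpa):
    history.append(latest_hpa)
    if len(history) < history.maxlen:
        return "Rising"               # default until 3 hours of data exist
    change = latest_hpa - history[0]
    if change > 1:
        return "Rising"
    if change < -1:
        return "Falling"
    return "Steady"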

3D printed model

The main barometer markers – i.e. STORM, FAIR, CHANGE – as well as the numbers (950, 960, etc.) are provided as separate .stl files to 3D print. This allows each marker to be aligned easily with the correct pixel on the NeoPixel strip.

The conditions come as a single section, again to be aligned once the NeoPixel strip is mounted, and the Trend titles are also provided. We also provide the end caps for the acrylic tube (optional, see below).
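Because the printed markers line up with individual pixels, the script only has to turn a pressure reading into a pixel index. A minimal sketch of that mapping is below – the 950–1050 hPa scale across the strip is an assumption for illustration; check the repository for the scale the printed markers actually use.

# Map a pressure reading onto one of the 144 pixels, assuming a linear
# 950-1050 hPa (millibar) scale across the strip (illustrative values only).
NUM_PIXELS = 144
P_MIN, P_MAX = 950, 1050

def pressure_to_pixel(pressure_hpa):
    clamped = max(P_MIN, min(P_MAX, pressure_hpa))
    fraction = (clamped - P_MIN) / (P_MAX - P_MIN)
    return round(fraction * (NUM_PIXELS - 1))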

Wood

The NeoPixel strip can be mounted either onto a thin strip of wood, approximately 125 cm long by 4.5 cm wide, using the fixings that come with the strip, or onto a wider block. The text and numbers are 3D printed and glued onto the wood. It is a standard wood strip that most DIY/hardware stores stock. Wood was chosen for mounting to allow flexibility – i.e. mount it however you like.

As an update to this post (June 22nd, 2022), we now include mounting ‘feet’ for a tabletop horizontal display – as illustrated below, angled at 30 degrees to provide a clear viewing angle of the air pressure.

Digital Barometer Angled Stand

Acrylic Tube

For this updated version we adapted the model to allow the additional use of a widely available 1 m x 28 mm acrylic tube, into which the LED strip can be mounted (we used a piece of conduit to straighten the LED strip). This gives the barometer a more ‘finished’ look and provides more of a nod towards the mercury barometers of old.

Hardware

The hardware has been selected to be as low cost as possible –

  • A Raspberry Pi – We used the Raspberry Pi Zero W
  • 1 Meter 144 Addressable Neopixel Strip (NeoPixel/WS2812/SK6812 compatible) – Example here from The PiHut

It is made to be mounted either vertically or horizontally – the 3D model above details the make (click and drag to examine the model/zoom in). The tabletop version with 30-degree angled legs can now be viewed directly on Sketchfab.

Code and library

The full code, 3D printing files, etc. are provided on the GitHub page, which also includes the other Open Gauges to 3D print and make.

Libraries used

  • requests
  • json
  • time
  • neopixel
  • board
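As a hint at how the neopixel and board libraries fit together, a minimal initialisation sketch is below – the GPIO pin, pixel count, brightness and status-pixel index are assumptions; see the repository script for the settings actually used.

# Minimal use of the neopixel/board libraries listed above: light a green
# status pixel on success or a red one on failure. Pin, pixel count and
# status-pixel index are assumptions - see the repository script.
import board
import neopixel

pixels = neopixel.NeoPixel(board.D18, 144, brightness=0.2, auto_write=False)
STATUS_PIXEL = 0

def show_status(ok):
    pixels[STATUS_PIXEL] = (0, 255, 0) if ok else (255, 0, 0)
    pixels.show()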

Digital model

The model is also provided in Fusion 360 for any edits to wording, sizing etc (note the Pi is not included due to separate licensing).

Leonardo, MIT Press

Mobile Communications Technologies in Tree Time: The Listening Wood 

By IOT, papers

This article presents a practice-led investigation by a cross-disciplinary team of artists and computer scientists into the potential for mobile and digital communications technologies to engage visitors to London’s Hampstead Heath with the histories of its veteran urban trees. Focusing on the application of Internet of Things (IoT) technologies within the arboreal environment for the digital poetic walk, The Listening Wood, it considers the reciprocal impact of “tree time” on the development of “slow tech.”

You can read the paper via: Lovett, L., Hay, D., Hudson-Smith, A. & de Jode, M. (2020). Mobile Communications Technologies in Tree Time: The Listening Wood. Leonardo, 54, 1–2. doi:10.1162/leon_a_02006.

Star Trails

AllSky Camera Project – 3D Printed Case, All-Day Timelapses and Home Assistant for the Pi HQ Camera

By Blog, Making

AllSky is an outstanding package for the Raspberry Pi, allowing the capture of long exposures using either the Raspberry Pi HQ camera with a fisheye lens or the more specialised ZWO astronomy cameras. All it needs is a casing, and while there are options to use drainpipes etc., there did not seem to be a 3D printed option – so, while laid up with Covid, we fired up Fusion 360 and made a solution with the aim of being as simple as possible to make. In addition, we have linked the output of AllSky to IPTimelapse on Windows for additional image stacking and data overlays from Weather Underground, which are then merged into all-day timelapses and fed into Home Assistant, opening up the option of both day and night captures of the sky.

Our 3D printed case is rendered below:

AllSky Case – Fusion 360 Render

Using the system it is easy to produce the output below – an all-night image built from 40-second exposures using the Raspberry Pi HQ Camera.

AllSky Star Trail

Hardware Required 

  • Raspberry Pi 4 + 32 GB SD card – from PiHut
  • Pi HQ Camera – from PiHut
  • 180-degree fisheye lens – see the excellent low-cost option from PiHut
  • 4 inch/100 mm CCTV dome – from Amazon
  • Rain-X spray (optional – Amazon)
  • Small tube of sealant – from Amazon
  • 3D printer (we use a Prusa Mk3 but any 3D printer will do)

The case consists of two parts: the top, holding the Pi Camera, fisheye lens and the 4 inch dome, and the bottom, holding the Raspberry Pi. The top simply slots into the bottom – a tight fit, with a ridge around the edge to take a run of sealant.

AllSky Camera Case Parts

The dome also slots in with sealant applied to waterproof the fit – giving the final case, pictured right.

AllSky Camera 3D Printed Case

The STL files to 3D print the case can be freely downloaded from the Prusa Printables site.

Software Set Up

The set-up and install of AllSky is well covered on its Wiki and GitHub pages, although we found the odd gotcha when running through the instructions for the current version with the Pi HQ Camera. The main steps are to install the Raspberry Pi operating system, install the main AllSky package via the command line, install the graphical interface and then set up the web server.

The whole thing takes around 30 minutes, providing you with arguably the best and most configurable option out there to photograph and timelapse the sky. Currently on version 8.0.3, the software is under active development, so as ever there are a few moments of trial and error to get the correct settings. The main takeaway is to go with the default options for the Raspberry Pi HQ Camera set-up and, in the ‘Editor’ section, include the line CAPTURE_EXTRA_PARAMETERS=”-daymean 0.7 -nightmean 0.3″. This is highlighted in the wiki but can easily be missed; without it our system would not produce any usable images.

Timelapses

While AllSky is primarily designed for capturing the night sky and producing star trails and a nightly timelapse, it will also capture images during the day, which can be sent to additional software to produce daytime timelapses. There are many options at this point, but over the years we have found that IPTimelapse (Windows) is one of the best options out there for long-term, reliable timelapse production.

It also includes the ability to overlay weather data from Weather Underground and, in addition, to stack images, producing brighter star images than AllSky currently offers. The addition of IPTimelapse is not required, and indeed adding in a Windows machine can complicate matters, but it offers enough extra to make the steps worthwhile if you can. AllSky can capture a live image every set number of seconds and output it to a local file – http://allsky.local/current/tmp/image.jpg – and this runs day and night. IPTimelapse simply listens for an update to this file, stacks images, overlays the weather data, and at the end of the day outputs its own timelapse to save locally or FTP to a website.
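IPTimelapse does this watching for us on Windows; purely as an illustration of the pattern, a minimal Python polling sketch that grabs each new frame from the AllSky URL might look like the following (the poll interval and output folder are arbitrary choices, not part of either tool).

# Illustrative polling of the AllSky live image; IPTimelapse does the real
# work here, this only shows the pattern. Interval and folder are arbitrary.
import time
from datetime import datetime
from pathlib import Path
import requests

IMAGE_URL = "http://allsky.local/current/tmp/image.jpg"
OUT_DIR = Path("frames")
OUT_DIR.mkdir(exist_ok=True)
last_bytes = None

while True:
    frame = requests.get(IMAGE_URL, timeout=10).content
    if frame != last_bytes:                         # only save new frames
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        (OUT_DIR / f"{stamp}.jpg").write_bytes(frame)
        last_bytes = frame
    time.sleep(30)                                  # poll every 30 seconds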

AllSky to IPTimelapse

The two images above show the output of AllSky versus AllSky plus IPTimelapse with its star image stacking – in this case four images. Stacking produces notably brighter images, although with each exposure at 40 seconds and then stacked there is slight movement in the image. Note also the timestamp and weather data overlay added at the bottom of the image.
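IPTimelapse handles the stacking itself; for anyone curious about the principle, a lighten-blend stack of a handful of frames can be sketched with Pillow. This is a hypothetical example of the general technique, not how IPTimelapse implements it.

# Illustrative lighten-blend stacking of a few frames with Pillow; this is
# the general principle only, not how IPTimelapse implements it.
from pathlib import Path
from PIL import Image, ImageChops

def stack_frames(paths):
    stacked = Image.open(paths[0]).convert("RGB")
    for path in paths[1:]:
        frame = Image.open(path).convert("RGB")
        stacked = ImageChops.lighter(stacked, frame)   # keep brightest pixels
    return stacked

frames = sorted(Path("frames").glob("*.jpg"))[-4:]     # last four captures
if frames:
    stack_frames(frames).save("stacked.jpg")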

We sprayed the dome with Rain-X, allowing water to bead and run off. The video below shows an all-day timelapse with a mix of rain and snow, clearing in the last part to show the stars.

As a final step, we then feed all of this into Home Assistant – our current dashboard is pictured below, allowing the sky image to be viewed in the context of local environmental data.

Home Assistant with AllSky

The 3D files take approximately 8 hours in total to print using 0.3 mm draft mode. Do let us know any thoughts in the comments – we look forward to seeing any AllSky images you produce.
