Yet city data should perhaps be a true ‘window on the world’, and with this comes the need for streaming video with the potential to overlay data. As ever on digital urban we use our live weather feed for prototyping systems, as it’s a data set that is easy to hand and also provides a range of display options.
Our weather dashboard provides data updates every 2 seconds from our Davis Pro station on the roof, just off Tottenham Court Road, London. It includes live graphs and gauges (via Highcharts) as well as a historical view of the data, yet the traditional webcam view of the actual physical conditions has always been lacking. Streaming HD video content onto the web, 24 hours a day and in a format accessible across multiple platforms, has until now been problematic. This is where YouTube Live comes in. Currently in beta, the system allows a simple feed from a webcam (in our case a Logitech 930e) to be streamed live, along with additional overlays to provide realtime condition updates. It is not a case of simply pointing your camera at YouTube, however: the encoding needs to be carried out on the host machine and then streamed to YouTube for distribution. There are a number of options (see YouTube Encoding); having tried them all out, we settled on XSplit as the best current system:
Our stream runs at 720p, 30fps, with an HTML overlay taken directly from the main weather landing page. This allows the current conditions to be viewed in realtime. The move to 30fps and HD makes a notable difference – it creates a window into the world, updating in realtime and providing a smooth, natural view of the city. You can view the stream directly below:
The stream is aimed at fullscreen viewing – 720p has its limitations but represents the best current balance. The system accepts a 1080p feed, but the encoding machine takes a notable hit on processing power. With a 24/7 feed we opted for the balance of a smooth stream and moderate demands on machine capacity. Bandwidth is of course all-important: the feed comes direct from a home-based fibre connection, offering 500Mb uploads over wifi and unlimited data. It is this kind of connection that is opening up the ability to create data windows on the world and perhaps a true view of the city in realtime.
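As a rough sanity check on those trade-offs, a common rule of thumb estimates H.264 bitrate from resolution and frame rate. This is only a sketch – the 0.1 bits-per-pixel factor below is an assumed ballpark figure, not XSplit’s actual encoder setting:

```python
def estimate_bitrate_mbps(width, height, fps, bits_per_pixel=0.1):
    """Rule-of-thumb H.264 bitrate: pixels per frame x fps x bits per pixel."""
    return width * height * fps * bits_per_pixel / 1_000_000

print(round(estimate_bitrate_mbps(1280, 720, 30), 1))   # 720p30  -> 2.8 Mbps
print(round(estimate_bitrate_mbps(1920, 1080, 30), 1))  # 1080p30 -> 6.2 Mbps
```

Either figure sits comfortably within a fibre upload connection; on a 24/7 feed the practical bottleneck at 1080p is encoding CPU load rather than bandwidth.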
To achieve more substantial changes in particle properties and behaviour, you can create a particle flow. The flow sends particles from event to event using tests, which let you wire events together in series. A test can check, for example, whether a particle has passed a certain age, how fast it’s moving, or whether it has collided with a deflector. Particles that pass the test move on to the next event, while those that don’t meet the test criteria remain in the current event, possibly to undergo other tests. The simple example pictured above shows a pFlow dialogue determining the birth of particles linked to a target geometry. The particles can subsequently be baked (using pFlow Baker) into an animation timeline for simple output via .fbx, allowing import into external systems such as Unity or Lumion.
The clip above illustrates the pFlow system imported into Lumion with the addition of a scene created in CityEngine.
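The event-and-test logic behind a particle flow can be sketched outside 3ds MAX as a tiny simulation. The event names, the age threshold and the dictionary-based particles below are invented for illustration; pFlow’s real operators are far richer:

```python
class Event:
    """A pFlow-style event: holds particles until they pass its test."""
    def __init__(self, name, test=None):
        self.name = name
        self.test = test          # predicate deciding which particles move on
        self.particles = []
        self.next_event = None    # events wired together in series

    def update(self):
        if self.test is None or self.next_event is None:
            return
        # Particles that pass the test move to the next event;
        # the rest remain here, possibly to undergo other tests.
        passed = [p for p in self.particles if self.test(p)]
        for p in passed:
            self.particles.remove(p)
            self.next_event.particles.append(p)

# Wire two events in series with a simple age test.
birth = Event("birth", test=lambda p: p["age"] > 10)
drift = Event("drift")
birth.next_event = drift

birth.particles = [{"age": a} for a in (5, 12, 30)]
birth.update()
print(len(birth.particles), len(drift.particles))  # 1 2
```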
Although I like to use SketchUp for 3D modelling wherever possible, 3ds MAX was much better suited to this project as it allowed me to quickly generate the city scene, model the roller coaster track, and animate the path of the ride, all in one software package. After generating the urban scene I used the car from the Asset Store roller coaster as a guide for modelling my track in the correct proportions.
The path of the ride through the city was modelled using Bezier splines, first in the Top view to get the rough layout and then in the Front and Left views to ensure the path would clear the buildings in my scene. The experience needed to be comfortable for users who may not have experienced virtual reality before, so it was agreed to exclude loop-the-loops on this occasion. It was also important to avoid bends that would be too sharp for the roller coaster to realistically follow. Once I was happy with the path I welded all the vertices in my splines so that the path could be used to animate the movement of the roller coaster car along the track later.
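For readers unfamiliar with Bezier splines, each segment of such a path is defined by two endpoints and two control handles. A minimal sketch of evaluating one cubic segment (the coordinates here are invented, standing in for a plan-view layout):

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier segment at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(
        u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

# A gentle bend in plan view (x, y), as laid out in the Top viewport.
start, c1, c2, end = (0, 0), (10, 0), (20, 10), (30, 10)
print(cubic_bezier(start, c1, c2, end, 0.5))  # (15.0, 5.0) - the curve's midpoint
```

Moving the control handles (`c1`, `c2`) tightens or relaxes the bend, which is exactly the adjustment needed to keep the curves gentle enough for the coaster to follow.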
Next, sections of track were added to the path I’d created using the 3ds MAX PathDeform (WSM) modifier. As the name suggests, this modifier deforms selected geometry to follow a chosen path. Using it massively simplified the process by allowing my pre-made sections of track to be offset along the length of the path and then stretched, rotated and twisted to fit together as seamlessly as possible. This was the most intricate and time-consuming part of the project.
In order to minimise the potential for motion sickness with the Oculus Rift I was careful to keep the rotation of the track as close to the horizontal plane as possible. Supporting struts were then arrayed along the path of the track and positioned in order to anchor it to the rest of the scene. When I was satisfied, a ‘Snapshot’ was made of the geometry in 3ds MAX to create a single mesh ready for export to Unity. At this point the path-deformed sections of track could be deleted, as Unity does not recognise the modifier.
To create the movement of the roller coaster car along the track, a 3ds MAX dummy helper was constrained to the path I’d created earlier. This generated starting and ending key frames on the animation timeline. The roller coaster car model was then placed on the track and linked to the dummy helper. It is possible in 3ds MAX to have the velocity and banking of the dummy calculated automatically, but I found that this did not give a realistic feel. Instead I controlled both by editing the animation key frames, using a camera linked to the dummy for reference. This was time-intensive but gave a better result. The city scene, roller coaster and animation were exported as a single FBX, which is the preferred import format for 3D geometry in Unity.
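The idea behind hand-editing those key frames can be sketched as simple interpolation: each key frame pairs a time with a position along the path (0 at the start, 1 at the end), and spacing the position values unevenly slows or speeds the car. The key frame values below are invented for illustration:

```python
# Hypothetical key frames: (time in seconds, position along the path 0..1).
# Bunching positions together slows the car; spreading them out speeds it up.
KEYFRAMES = [(0.0, 0.0), (2.0, 0.1), (3.0, 0.5), (5.0, 1.0)]

def position_at(t):
    """Linear interpolation between the surrounding key frames."""
    if t <= KEYFRAMES[0][0]:
        return KEYFRAMES[0][1]
    for (t0, p0), (t1, p1) in zip(KEYFRAMES, KEYFRAMES[1:]):
        if t0 <= t <= t1:
            return p0 + (p1 - p0) * (t - t0) / (t1 - t0)
    return KEYFRAMES[-1][1]

print(round(position_at(2.5), 2))  # 0.3 - the car accelerates between 2s and 3s
```

3ds MAX eases between key frames with smoother curves than the linear blend above, but the principle of shaping velocity by repositioning keys is the same.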
Having completed the track and animated the car it was time to assemble the final scene in Unity. First I generated a terrain using a great plugin called World Composer. This enables you to import satellite imagery and terrain heights from Bing maps to give your backdrops a high degree of realism.
The urban scene and roller coaster were then imported and a skybox and directional light were added. The scene was completed with various assets from the Unity Asset Store including skyscrapers, roof objects, vehicles, idling characters and a flock of birds.
To prepare the Oculus Rift integration, the OVR camera controller asset from Oculus was placed inside and parented to the roller coaster car. In my initial tests with the Asset Store roller coaster I’d found that the OVR camera would drift from the forward-facing position. This would disorientate the user and contribute to motion sickness. As a quick fix, I parented a cube to the front of the roller coaster car, turned off rendering of the cube so it would be invisible, and set the camera controller to follow the cube.
In order to ensure the best possible virtual experience it is really important to keep the rendered frames per second as high as possible. As the Oculus Rift renders two cameras simultaneously, one for each eye, you need to aim for 60 fps in Unity to ensure the user can expect a frame rate of 30 fps.
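Put in terms of per-frame time budgets, the arithmetic is stark. A quick sketch (the halving is a rough model of stereo rendering doubling the camera work, not an exact account of the Rift pipeline):

```python
def frame_budget_ms(target_fps):
    """Time available to render one frame at the target rate."""
    return 1000.0 / target_fps

budget = frame_budget_ms(60)
per_eye = budget / 2  # stereo rendering roughly doubles the camera work
print(round(budget, 1), round(per_eye, 1))  # 16.7 8.3
```

With only around 8 ms of effective budget per eye view, optimisations such as occlusion culling and baked shadows stop being optional niceties.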
In order to achieve this I took advantage of occlusion culling in Unity Pro which prevents objects being rendered when they are outside the camera’s field of view or obscured by other objects.
I also baked the shadows for all static objects in the scene, saving them as textures and sparing the processor from calculating them dynamically. The only objects casting dynamic shadows are the roller coaster car and the animated characters.
Finally, two simple JavaScript files were added: the first started the roller coaster and played a roller coaster sound file when the ‘S’ key was pressed, and the second closed the application when the ‘Esc’ key was pressed.
The reception of the CASA Urban Roller Coaster ride at Grand Designs Live was fantastic and I’m really pleased to have participated. It was a great project to work on and an excellent opportunity to learn new techniques in 3ds MAX and Unity. Having my first VR roller coaster under my belt I’m looking forward to building another truly terrifying one when I get the time, hopefully for the Oculus Rift DK2 which has just arrived at CASA.
On a last note I’d like to thank Tom Hoffman of Lake Erie Digital, whose excellent YouTube tutorials on creating roller coasters in 3ds MAX provided a great guide through the most difficult part of this challenging project.
You can follow Oliver’s latest work at http://virtualarchitectures.wordpress.com/
Using our current favourite Internet of Things service – If This Then That – the front light in the radio can be linked to any number of data feeds (see our post on IFTTT, Netatmo & Philips Hue: Linking Data to Lighting); at the moment it changes colour according to the outside temperature. The movie below shows the link to the Philips Hue and the iPhone BBC Radio App (ignore the cat, it decided to take part in every example I filmed):
APPLY NOW FOR SEPTEMBER ENTRY
As Course Director, I am pleased to announce the new MSc and MRes in Smart Cities, here at The Bartlett Centre for Advanced Spatial Analysis. Both the MSc and MRes offer an innovative and exciting opportunity to study at UCL, with key training for a new career in the emerging smart cities market. Smart cities research focuses on how computers, data, and analytics – the models and predictions built from them – are being embedded into cities. Cities are currently being extensively wired, generating new kinds of control and new kinds of services, which in turn produce massive data streams – ‘big data’. To make sense of this new world we need powerful analytics, along with a new skill set to understand and lead this new field.
The Bartlett Centre for Advanced Spatial Analysis (CASA) is one of the leading forces in the science of cities, generating new knowledge and insights for use in city planning, policy and design and drawing on the latest geospatial methods and ideas in computer-based visualisation and modelling.
CASA’s focus is to be at the forefront of what is one of the grand challenges of 21st Century science: to build a science of cities from a multidisciplinary base, drawing on cutting edge methods, and ideas in modeling, complexity, visualisation and computation. Our current mix of architects, geographers, mathematicians, physicists, urban planners and computer scientists make CASA a unique department within UCL.
Our vision is to be central to this new science, the science of smart cities, and relate it to city planning, policy and architecture in its widest sense.
EDUCATIONAL AIMS OF THE PROGRAMME
The MSc Smart Cities and Urban Analytics comprises 180 credits, which can be taken full-time over 12 months or on a flexible modular basis of up to 5 years’ duration (full details of the course structure can be found here). If taken full time over one year, the following structure is followed:
Smart Systems Theory
The module provides a comprehensive introduction to a theory and science of cities. Many different perspectives developed by urban researchers, systems theorists, complexity theorists, urban planners, geographers and transport engineers will be considered, such as spatial interactions and transport models, urban economic theories, scaling laws and the central place theory for systems of cities, growth, migration, etc., to name a few. The course will also focus on physical planning and urban policy analysis as has been developed in western countries during the last 100 years.
The class runs during term one, for two hours per week
Assessment is by coursework (2,500 – 3,000 words)
This module will empower you with essential mathematical techniques to be able to describe quantitatively many aspects of a city. You will learn various methodologies, from traditional statistical techniques, to more novel approaches, such as complex networks. These techniques will focus on different scales and hierarchies, from the micro-level, e.g. individual interactions, to the macro-level, e.g. regional properties, taking into account both discrete and continuous variables, and using probabilistic and deterministic approaches. All these tools will be developed within the context of real world applications.
The class runs during term one, for three hours per week (a one-hour lecture followed by a two-hour practical)
Assessment is by coursework (2,500 – 3,000 words) and via an exam
This class runs during term two, for one and a half hours per week
Assessment is by coursework (2,500 – 3,000 words)
Spatial Data Theory, Storage and Analysis
This module introduces you to the tools needed to manipulate large datasets derived from Smart Cities data, from sensing, through storage, to approaches to analysis. You will be able to capture and build data structures and perform SQL and basic queries in order to extract key metrics. In addition, you will learn how to use external software tools, such as R, Python, etc., in order to visualise and analyse the data. These database and statistical tools will be complemented by artificial intelligence and pattern detection techniques, in addition to new technologies for big data.
The class runs during term two, for two hours per week
Assessment is by project output (5,000 – 6,000 words)
The module provides the key skills required to construct and apply models in order to simulate urban systems. These are key in the development of smart cities technologies. You will learn different approaches, such as land-use transport interaction models, cellular automata, agent-based modelling, etc., and realise how these are fashioned into tools that are applicable in planning support systems, and how they are linked to big data and integrated data systems. These models will be considered at different time scales, such as short-term modelling, e.g. diurnal patterns in cities, and long term models for exploring change through strategic planning.
The module runs during term two, for two hours per week
Assessment is by coursework (2,500 – 3,000 words)
This dissertation marks the culmination of your studies and gives you the opportunity to produce an original piece of research making use of the knowledge gathered in the lectures. You will be guided throughout this challenge by your supervisor and with the support of the Course Director, and together you will decide the subject of research. This enterprise will enable you to create a unique, individual piece of work with an emphasis on data collection, analysis and visualisation linked to policy and social science oriented applications.
Assessment is by a 10,000 – 12,000 word dissertation.
The teaching staff are world leaders in the field, from Professor Mike Batty MBE through to Sir Alan Wilson – you can find out full details and staff profiles at our sub-site http://mscsmartcities.org/
Ideally, you will already have a Bachelor’s degree in an appropriate subject such as Geography, GIS, Urban Planning, Architecture, Computer Science, Civil Engineering, Economics or a field related to the Built Environment, though other subjects will be considered, especially if your personal statement demonstrates a keen interest and convinces us you should be given a place. You’ll need to have obtained a 2.2 (or international equivalent) to join the MSc.
If you do not have a Bachelor’s degree, we can still consider you if you have a professional qualification and at least three years’ relevant experience, so don’t be put off applying if you fall into this category: the greater the mix of students we have on each course, the more interesting the seminars and discussions will be.
HOW TO APPLY
You can apply for this course online or find out more at http://www.mscsmartcities.org – If you have any questions, you can contact our Teaching and Learning Administrator, Lisa Cooper. The course brochure can be downloaded in PDF format (5Mb).
Once the bulbs are connected to the network there are a variety of third-party services that can be used to control them, either as a group or individually. One of our current favourites here at CASA is If This Then That (IFTTT), a service that allows the creation of simple ‘recipes’ linking popular online systems via a series of rules. Among the services available on IFTTT are the Netatmo Weather Station and the Philips Hue.
We are using the Netatmo in CASA as part of our forthcoming office data dashboard; it monitors internal temperature, humidity, sound levels and carbon dioxide, with an external unit providing temperature and humidity readings.
IFTTT is an emerging service and as such has a number of limitations. We had to create a recipe for each temperature change, rather than combining them into one logical statement. IFTTT also only checks the readings from external services every 15 minutes, making it unsuitable for rapidly changing data. Finally, there do seem to be a few bugs: the hex colour does not seem to map directly to the Hue light, so a few tweaks are required to gain the correct colour output.
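The one-recipe-per-band limitation is easy to see in code. A single function could express the whole mapping; the temperature thresholds and hex colours below are hypothetical, not the actual recipe values used at CASA:

```python
# Hypothetical temperature bands and hex colours - the real recipe
# thresholds and Hue colours may differ.
BANDS = [
    (0,  "#0000FF"),   # below 0 C: blue
    (10, "#00FFFF"),   # 0-10 C: cyan
    (20, "#FFFF00"),   # 10-20 C: yellow
    (30, "#FF8000"),   # 20-30 C: orange
]

def colour_for_temperature(temp_c):
    """One rule replacing a separate IFTTT recipe per temperature band."""
    for upper, colour in BANDS:
        if temp_c < upper:
            return colour
    return "#FF0000"   # 30 C and above: red

print(colour_for_temperature(15))  # #FFFF00
```

On IFTTT each of those five bands had to become its own recipe, since a recipe can only express a single “if this then that” rule.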
Despite a few limitations, it does open up a number of possibilities; the next step is to look into how to link in other data feeds. This is arguably where the power of IFTTT comes into play: its ability to link a number of services and then control external hardware makes it a perfect opening into linking data to the Internet of Things.
The whole system could arguably be built using an Arduino board or Raspberry Pi linked to LEDs at a much lower cost. The linking of consumer-grade units to data is, however, perhaps a step towards smart objects for everyone. The whole set-up took under two hours to get up and running, including setting up the IFTTT recipes and linking to the Philips Hue. Edit (20/2/2014) – the light is now installed inside an old ship’s lamp, with the colour changing according to the outside temperature.
It is probably a niche market, but the linking of data to lighting is an intriguing development. From making the lights blink when carbon dioxide levels in the office rise, through to changing colours according to a Twitter or RSS feed, we are excited by the possibilities.
The web-based weather dashboard can be viewed at http://www.casa.ucl.ac.uk/weather/colours.html – data updates every 3 seconds. We will be mounting the ‘weather bulb’ in a small globe and placing it in the corner of the office. The Philips Hue starter pack comes with three bulbs, and up to 50 lights can be linked to each bridge. With the ability to link each bulb, or a group of bulbs, to almost any data feed, the office is about to become a colourful place…
Of note in the literature is Mike Batty’s new book on The Science of Cities, along with new papers in the CASA Working Paper Series – now up to number 194 and all available to download. We have launched a new MSc in Smart Cities and Urban Analytics, as well as an MRes in Smart Cities to run alongside the current MRes in Advanced Spatial Analysis and Visualisation, opening up a number of new routes to gain a Masters Degree at CASA. As a note to the season, our Christmas Movie is below:
The project is building a series of architectural interfaces in East London and Nottingham neighbourhoods, which use broadcast media and interactive technologies to enhance real-world connections, add value to the daily experience of the urban environment, and foster community participation and ownership of urban space. In this project we are seeking to involve local organisations and residents as key partners in the creative development process.
It is one of our favourite projects out of the current Research in the Wild themes. The Screens in the Wild team includes makers, architects, HCI designers, computer scientists, anthropologists, developers, artists and curators.
Find out more at http://www.screensinthewild.org/
As part of a wider segment on the BBC’s One Show, featuring a 4-minute documentary about pigeons by Mike Dilger and Adam Rogers, the Pigeon Sim was wired up for a fly-through by comedian Jack Dee. The clip below shows the system in action and our attempt at making sense of the whole thing on live TV:
The One Show was hosted by Chris Evans and Alex Jones and attracts audience ratings in excess of 4 million. The joys of live TV and technology should not be underestimated – it worked, but with a full studio, getting the Kinect to recognise a comedian in a pigeon outfit is challenging to say the least…
You can fly Pigeon Sim yourself at the Almost Lost exhibition in Wellington Arch, London, from December 4th.
The UK’s Future Cities movement is set to receive a major boost, thanks to a new initiative by the Future Cities Catapult and Level39, Europe’s largest accelerator space for financial, retail and future city technologies, based at Canary Wharf.
Launched in March this year, the Future Cities Catapult is an independent innovation centre set up by the Technology Strategy Board (TSB). It aims to help UK businesses develop cutting-edge, high value urban solutions and then sell them to the world. The Catapult will unite business, city governments, innovators and academia to run a series of large-scale pilot projects that address how cities can become more economically active, while lowering their environmental footprint and moving towards a low-carbon economy.
The Future Cities Catapult is prioritising the ‘Future Cities Financing’ aspect of their work programme to address the need for large-scale investment into the sector. Growing cities represent a great opportunity to develop innovative technologies that have the potential to provide a major boost to the UK economy. More than £6.5tn will be invested globally in city infrastructure over the next 10 to 15 years. Of this, the accessible market for companies developing future city systems is estimated to be worth £200bn a year by 2030. Critical to the flourishing of future cities is greater access to finance for integrated city systems (not simply the traditional model of investment in specific projects in areas such as transport, energy and health).
The Future Cities Catapult will focus on developing and leveraging finance for cohesive city systems as a key area of activity in future. This work has begun by tasking Level39 with pulling together experts from within its extensive finance and investor networks to raise awareness of investment opportunities and draw upon their knowledge to debate the merits of the current or potential new funding models.
Peter Madden, Chief Executive, Future Cities Catapult, said: “In the future, the majority of people will live in cities. There are huge benefits to people and opportunities for business in making cities better places to live, but at present we lack the financial mechanisms to unlock these opportunities.
“Future Cities Financing will explore the challenges, opportunities and risk profiles for different kinds of investment. By engaging with the right financial stakeholders, we can also amplify the message that investment in smart cities is a new and highly valuable asset class.”
Eric Van Der Kleij, Head of Level39 at Canary Wharf Group plc says:
“We are pleased to be supporting the Future Cities Catapult by creating the right blend of industry experts to debate the crucial issue of future cities financing. The built environment at Canary Wharf is one of the most technologically advanced in London and we will continue this journey at Level39 by facilitating new innovation in future cities technology. Our location alongside the capital’s leading financial district and extensive contacts, which include finance specialists, fund managers and investors, will enable us to help unlock vital sources of funding.”
The clip below provides a good insight into Level39 – having been there a couple of times, we can safely say it’s an exciting place:
Finally, details on what the Future Cities Catapult is are highlighted in the movie below:
You can find out more about the Future Cities Catapult at https://futurecities.catapult.org.uk and Level39 at http://level39.co/level39/introduction/