
Just Weather for the Pebble Watch: Making a Data Watch Face

By Apps

It’s an exciting time to be a Pebble fan. After years of being kept alive by the dedicated Rebble community, the Pebble is officially back. The new Pebble 2 Duo (the black-and-white model) is officially shipping to the first backers, with the high-resolution color Pebble Time 2 set to follow.

So, I decided to make a watch face of my own.

 

Introducing ‘Just Weather’

I wanted a face that was clean, digital, and gave me all the key data at a glance, formatted to look great on the 144×168 screen of the Pebble 2. I call it “Just Weather.”

It uses the free Open-Meteo API to pull in a ton of useful, hyperlocal data right to your wrist:

 

    • Current Location (from your phone’s GPS)
    • Temperature
    • Current Conditions (“Partly Cloudy,” “Rain,” etc.)
    • Barometric Pressure & 3-Hour Trend
    • Wind Speed & Precipitation
    • …and of course, the time!
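For context, Open-Meteo reports current conditions as numeric WMO weather codes, so a face like this has to map a code such as 2 to a string like “Partly Cloudy”. Here is a minimal JavaScript sketch of the request URL and that mapping; the function names, the exact query parameters, and the (deliberately simplified) code ranges are my illustration, not necessarily what the app’s source does:

```javascript
// Build an Open-Meteo request URL for the fields the face shows.
// Parameter names follow the public Open-Meteo docs; the app's
// actual query may differ.
function buildWeatherUrl(lat, lon) {
  var fields = [
    "temperature_2m",
    "weather_code",
    "surface_pressure",
    "wind_speed_10m",
    "precipitation"
  ].join(",");
  return "https://api.open-meteo.com/v1/forecast" +
    "?latitude=" + lat + "&longitude=" + lon +
    "&current=" + fields;
}

// Translate a WMO weather code into a short display string.
// This is a partial mapping (codes 1 and 2 are lumped together,
// snow showers are ignored) just to show the idea.
function describeWeatherCode(code) {
  if (code === 0) return "Clear";
  if (code <= 2) return "Partly Cloudy";
  if (code === 3) return "Overcast";
  if (code === 45 || code === 48) return "Fog";
  if (code >= 51 && code <= 57) return "Drizzle";
  if (code >= 61 && code <= 67) return "Rain";
  if (code >= 71 && code <= 77) return "Snow";
  if (code >= 80 && code <= 82) return "Showers";
  if (code >= 95) return "Thunderstorm";
  return "Unknown";
}
```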

 

 

Built with GitHub Copilot

The best part is that the new Pebble development workflow is incredibly modern. I was able to build this using the CloudPebble IDE, which now integrates directly with VS Code in the browser.

This meant I could use emerging tools like GitHub Copilot to help generate the code and work through the trickiest parts, like making direct HTTPS requests to the weather API, which (after a lot of testing!) turned out to be possible from the phone-side JavaScript app.

After getting the data, the final step was tweaking the C code to make sure the layout wasn’t clipped and all the information fit perfectly on the 144×168 screen. It’s now compatible with watches in the Pebble family, from the original Pebble Time (color) to the new Pebble 2 Duo and the upcoming Pebble Time 2.

 

Pebble Watch Face

 

Available Now

This project took around six hours, with the main sticking point being that Copilot did not know how to make HTTP requests from a watch face. It took me down a lot of rabbit holes, and in the end the answer was a simpler call for fetching the data: XMLHttpRequest. Once this worked, it all fell into place, and it was simply a case of asking Copilot to add the data fields, do the geocoding, and then take a step back and explain how the code actually works.
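The XMLHttpRequest call itself is only a few lines. Here is a sketch of the phone-side fetch, plus the kind of pure helper that could drive the 3-hour pressure trend; the function names and the ±1 hPa threshold are illustrative assumptions, not lifted from the actual source:

```javascript
// Phone-side (PebbleKit JS) fetch using plain XMLHttpRequest, the
// "simpler call" that finally worked. fetchWeather() and onWeather
// are illustrative names, not necessarily those in the real code.
function fetchWeather(url, onWeather) {
  var xhr = new XMLHttpRequest();
  xhr.onload = function () {
    onWeather(JSON.parse(xhr.responseText));
  };
  xhr.open("GET", url);
  xhr.send();
}

// Pure helper: classify the 3-hour pressure trend shown on the face.
// Takes two hPa readings; the +/-1 hPa dead band is an assumption.
function pressureTrend(nowHpa, threeHoursAgoHpa) {
  var delta = nowHpa - threeHoursAgoHpa;
  if (delta > 1) return "rising";
  if (delta < -1) return "falling";
  return "steady";
}
```

In a real PebbleKit JS app, the parsed values would then be pushed over to the C side of the watch face with `Pebble.sendAppMessage()`.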

If you’re like me and just want a simple, data-rich weather face, please give it a try…

 

    • Download ‘Just Weather’ from the Rebble Appstore
    • Check out the source code on GitHub for the latest updates – it now includes its own settings page (it’s become a proper watch face app)…

 

 


Cities in the Metaverse: Spatial Computing, Digital Twins, Avatars, Economics and Digital Habitation on the New Frontier

By Books and Papers

We are pleased to announce the publication of our new book, Cities in the Metaverse: Spatial Computing, Digital Twins, Avatars, Economics and Digital Habitation on the New Frontier. You can get 20% off with the code 25ESA4 via Routledge (it’s also available via all good bookshops, Amazon, etc.). The book offers a comprehensive exploration of the intersection between urban environments and digital realms.

Cities in the Metaverse Book

The book examines:

    • The size and shape of cities in the Metaverse
    • Spatial computing and its impact on digital interaction
    • Digital twins in urban planning and management
    • Avatars and social dynamics in virtual spaces
    • Economic models emerging in the Metaverse
    • The influence of gaming on immersive digital landscapes
    • The impact of Artificial Intelligence on the Metaverse
    • Potential futures of digital habitation

Drawing from architecture, computer science, urban planning, geography, social studies, and economics, the authors provide a multidisciplinary analysis of how virtual cities are shaping our digital future. Using key case studies, they trace the evolution from early cyberspace concepts to current spatial computing technologies, offering insights into both historical context and future possibilities. The book addresses key questions about the opportunities and challenges presented by metaverse technologies, including issues of accessibility, creativity, and the future of humans and artificial intelligence co-existing, side by side, in digital spaces. It serves as a practical guide, equipping readers with the knowledge they need to navigate the future of urban life in virtual environments with a thought-provoking examination of how we might build, inhabit, and govern cities in the Metaverse.

The authors explore the concept of digital twins, demonstrating how these virtual replicas of physical spaces can revolutionise urban planning and management. They delve into the social aspects of the metaverse, examining how avatars shape our interactions and relationships in digital realms. Economic considerations are central to the book’s ethos, with an analysis of emerging models that often leverage blockchain technologies. The book addresses the challenges, potential pitfalls and ethical considerations in creating and inhabiting digital cities. At the same time, it takes a step back and examines already abandoned digital worlds, offering lessons from past attempts at creating virtual spaces.

Essential reading for urban planners, geographers, economists, technologists, policymakers, and anyone interested in the future of cities and digital interaction, Cities in the Metaverse provides a balanced, informed perspective on this rapidly evolving field.

Authors:

Andrew Hudson-Smith is a Professor of Digital Urban Systems at the Centre for Advanced Spatial Analysis (CASA), University College London. He focuses on real-time data, virtual environments and the Internet of Things within the urban environment. He is also an elected Fellow of the Royal Society of Arts, a Fellow of the Academy of Social Sciences and a Fellow of the Royal Geographical Society. Socials: @digitalurban

Duncan Wilson is a Professor of Connected Environments at the Centre for Advanced Spatial Analysis (CASA), University College London. His research focuses on how emerging technologies, such as connected sensors and cognitive computing, can augment our understanding of the built and natural environment. He has over 25 years of experience in industry and is a Fellow of the Royal Society of Arts. Socials: @djdunc

Valerio Signorelli is a Lecturer in Connected Environments at the Bartlett Centre for Advanced Spatial Analysis (CASA), University College London. He holds an MSc in Architecture and Urban Design and a PhD in Territorial Design and Government from the Department of Architecture and Urban Studies at the Politecnico di Milano (Italy). He is also a Fellow of the Royal Society of Arts. Socials: @ValeSignorelli
