A little less than a month ago the Open Data Institute launched Open Data Certificates, a new scheme to promote open datasets and help build trust between data producers and data consumers. Essentially, data producers fill in a detailed online survey for a particular dataset, covering various topics such as legal information (rights, licensing, privacy), practical information (findability, accuracy, quality, guarantees), technical information (locations, formats, trust) and social information (documentation, […]
In my last post I mentioned a talk I gave about my work for BerlinOnline on the Berlin Data Portal. Several months have passed since then (so quickly!), and yesterday we were finally able to launch the portal’s new version at daten.berlin.de (aka “Offene Daten Berlin”)! Compared to the upheaval in the open data community that the launch of the federal German data portal caused, the re-launch of the Berlin […]
This week, I was invited to give a talk at a workshop on Open Data and Open Government at the Innovationsforum Semantic Media Web in Berlin. There were short talks on different aspects of the topic; I was asked to provide a local Berlin perspective (slides at the bottom of this post) based on my work on the Berlin Open Data Portal, which I have recently started doing on behalf of BerlinOnline.
When talking about the foundation necessary to build smarter, digital cities, the focus is often on ubiquitous broadband, wireless access or sensors - in other words, hardware. However, another important ingredient is data, ideally of the open flavour.
In this post, we're going to use SPARQL to derive additional data from the dataset we created in the previous post. That way, we can enrich our dataset by making some implicit information explicit.
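To give a flavour of what "making implicit information explicit" can look like, here is a minimal SPARQL 1.1 Update sketch that materialises an inverse relationship. The prefix and property names (`ex:partOf`, `ex:contains`) are hypothetical placeholders, not the vocabulary actually used in the sticker dataset:

```sparql
# Hypothetical sketch: the ex: vocabulary is illustrative only.
PREFIX ex: <http://example.org/stickers/>

# If a sticker is recorded as part of an album, explicitly add
# the inverse triple stating that the album contains the sticker.
INSERT {
  ?album ex:contains ?sticker .
}
WHERE {
  ?sticker ex:partOf ?album .
}
```

The same pattern works with `CONSTRUCT` instead of `INSERT` if you want to derive the new triples into a separate graph rather than writing them back into the dataset.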
This is the first in a four-part series of posts about how to do a simple Linked Data project, using EURO2012 Panini stickers as an (admittedly somewhat silly) example. It will illustrate all the steps, from finding and converting the source data, to linking and enriching it with the wider Web of Data, to finally creating a little app on top of the data.