Mapping Data – Open standards and transparency are key to value
I have been using GPS to record my biking for a few years now. It started in 2010, when I found Google’s MyTracks app while preparing for my first STP ride. The app was simple to use, and I liked that I could export track data. That export capability prompted me to try another program, TopoFusion. The limits of the MyTracks app didn’t matter, because TopoFusion offered more features: it gave me a way to compare my rides and get a better visual of the routes I planned. The one feature I never tapped into, though it was available, was heart rate measurement. Both MyTracks and TopoFusion support heart rate and cadence data, but only through a limited set of third-party hardware. This limitation is less noticeable with TopoFusion, since its table data can be modified.
Early last fall, a coworker recommended a phone app called Strava. Its remarkable quality was the ability to compare tracks with other riders, and I also found its elevation profiling better presented. I haven’t used the program to the same extent as MyTracks, mainly because I haven’t done as many rides as I would like, and partly because I’ve been testing other logging apps. Even with my limited experience with Strava, I will say its best feature is how the data is structured and presented.
That turns out to be a solid observation, given the recent news that Strava is a participant in the Open Trail System Specification, commonly known as OpenTrails. The focus of the OpenTrails project is to make the data structure transparent to app developers. This is key because proprietary data structures require massaging before they can be used. Transparency ensures that a report that works with one source will work with another, and it makes it possible to aggregate data from any third-party source that follows the open standard. TriMet’s IT Manager of GIS, Bibiana McHugh, proved the point: she opened TriMet’s data to developers, and TriMet was rewarded with apps that present that data in ways never imagined, minus the cost of hiring in-house developers.
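The payoff of a shared structure is easy to see in code. As a minimal sketch: OpenTrails builds on GeoJSON, so a script written against one agency’s feed should work unchanged on another’s. The feature below is hand-made for illustration, and the property names (`name`, `id`) are my own stand-ins, not taken from the specification itself.

```python
# Sketch: code written against one open GeoJSON feed works on any other
# feed that follows the same structure. The data below is a hand-made
# example; the property names are illustrative, not from the OpenTrails spec.
import json

AGENCY_A = json.dumps({
    "type": "FeatureCollection",
    "features": [{
        "type": "Feature",
        "properties": {"name": "Springwater Corridor", "id": "1"},
        # GeoJSON coordinates are [longitude, latitude]
        "geometry": {"type": "LineString",
                     "coordinates": [[-122.65, 45.46], [-122.60, 45.46]]},
    }],
})

def trail_names(geojson_text):
    """List trail names from any feed sharing this open structure."""
    data = json.loads(geojson_text)
    return [f["properties"]["name"] for f in data["features"]]

print(trail_names(AGENCY_A))  # -> ['Springwater Corridor']
```

Because the structure is published, the `trail_names` function never needs per-agency massaging; that is the whole argument for transparency in one small function.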
The amount of data being collected by organizations like TriMet and Strava is staggering. Making it available to the public is the only way to quantify its value. It still isn’t clear how developers will tap into this raw resource and refine it into a commodity that is in demand, but it is clear that hindering development will diminish whatever value the data holds. This raises a number of questions.
How will sensor development fit into this? What is the significance of a complex sensor array on a single platform compared to a dispersed sensor cloud on mobile devices? Where should the effort go: data storage, collection, structuring, processing, or reporting? Is it too late to get in on the game at some or all of these levels?
I don’t think it’s too late. But waiting to see what happens next, with the comfort of hindsight, means giving up a front-row seat. There is a common theme across most of these systems: the API. Strava and MyTracks offer APIs. TopoFusion does not, but the layers it features are built on APIs; one example is OpenStreetMap. Having these APIs available means the game isn’t over. The potential for logging custom sensors and combining the results with data already gathered has never been greater, and it shows no sign of tapering off soon. In fact, Garmin is making headway into extensible sensor logging by expanding on the GPX format, and it offers several API options for developers to build on.
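To make the GPX-extension idea concrete, here is a small sketch of pulling heart-rate samples out of a track, assuming the points carry Garmin’s TrackPointExtension, which is the common way sensor readings ride along inside GPX. The GPX string below is a hand-made example, not real ride data.

```python
# Sketch: extracting heart-rate samples from GPX track points that use
# Garmin's TrackPointExtension. The GPX string is a hand-made example.
import xml.etree.ElementTree as ET

GPX_SAMPLE = """<?xml version="1.0"?>
<gpx version="1.1" xmlns="http://www.topografix.com/GPX/1/1"
     xmlns:gpxtpx="http://www.garmin.com/xmlschemas/TrackPointExtension/v1">
  <trk><trkseg>
    <trkpt lat="47.6062" lon="-122.3321">
      <extensions><gpxtpx:TrackPointExtension>
        <gpxtpx:hr>128</gpxtpx:hr>
      </gpxtpx:TrackPointExtension></extensions>
    </trkpt>
    <trkpt lat="47.6063" lon="-122.3318">
      <extensions><gpxtpx:TrackPointExtension>
        <gpxtpx:hr>134</gpxtpx:hr>
      </gpxtpx:TrackPointExtension></extensions>
    </trkpt>
  </trkseg></trk>
</gpx>"""

NS = {
    "gpx": "http://www.topografix.com/GPX/1/1",
    "gpxtpx": "http://www.garmin.com/xmlschemas/TrackPointExtension/v1",
}

def heart_rates(gpx_text):
    """Return (lat, lon, bpm) tuples for every point with a heart-rate sample."""
    root = ET.fromstring(gpx_text)
    samples = []
    for pt in root.iterfind(".//gpx:trkpt", NS):
        hr = pt.find(".//gpxtpx:hr", NS)
        if hr is not None:
            samples.append((float(pt.get("lat")),
                            float(pt.get("lon")),
                            int(hr.text)))
    return samples

print(heart_rates(GPX_SAMPLE))
# -> [(47.6062, -122.3321, 128), (47.6063, -122.3318, 134)]
```

The same pattern generalizes: because GPX extensions are namespaced XML, a custom sensor can define its own namespace and log alongside position data without breaking any existing GPX reader.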
Having this resource available to anyone with the wherewithal to refine it is comparable to the change set in motion by those who first took sulfur-smelling tar from the ground and turned it into something the world demanded.