High Earth Orbit


02 Mar 2012
Arlington, VA

FourSquare and OpenStreetMap

Published in OpenStreetMap, Technology  |  3 Comments


Earlier this week FourSquare announced that it has switched its website maps from Google Maps to OpenStreetMap data hosted by MapBox. Part of a growing trend of broader adoption, the move marks the utility and success of OpenStreetMap. It is also another milestone in the recent switch2osm campaign, which gained momentum after Google began requiring paid licensing for high-volume usage of the once completely free Google Maps.

Currently the switch is only for the website, which I admit I have used fewer than a dozen times; the mobile application will still use the native Google Maps libraries. There are a number of valid reasons for this, not least of which is that Google is not yet charging for mobile maps usage, though I imagine it is only a matter of time before they do, and it will also take time for developers to build comparable mobile mapping libraries for OpenStreetMap.

Value of the Basemap

There are several intriguing aspects of this announcement, as well as the reaction to it. First, the change of basemap, while intriguing to the geospatial and data communities, is likely highly irrelevant to most FourSquare users. Would there have been much news had the switch been to Microsoft Bing Maps? Probably not. The interest is clearly driven by the community, and general good will, of the OpenStreetMap project. Each adoption by a major company further affirms its value and solidifies its continuity as organizations build their own businesses with OpenStreetMap as a core component.

Second, there have been a number of companies whose primary, or recent, goal has been to be a trusted provider of OpenStreetMap basemaps. CloudMade, started by OpenStreetMap founder Steve Coast, was created for exactly this purpose. Additionally, MapQuest has been using OpenStreetMap as a tactic to increase adoption of their long-standing mapping platform, as well as to insure themselves against likely increasing commercial data provider costs. However, it was a very recent technology, albeit from a longer-established company, that ended up providing the OpenStreetMap basemap for FourSquare.

Development Seed’s MapBox is a truly compelling creation of technology and innovation. They have done extremely well adopting best-of-breed software, Mapnik, along with the development team that built it, and they combined it with new technology, Node.js, to make it fast and to offer a differentiating, compelling story for developers. Technical details aside, the design and thought put into the representation of OpenStreetMap was clearly a key differentiator in FourSquare choosing MapBox to serve its OpenStreetMap tiles. I’ll also add that Development Seed is a local DC and East Coast company – something I don’t doubt was interesting to the New York-based FourSquare in pushing against the typical Silicon Valley technology scene.

Of course, in the end it is just a basemap. This is the background canvas that contains the actually valuable information that FourSquare has gathered and users engage with. The switch from Google Maps to OpenStreetMap does not in any way change the value or usage of the FourSquare application and community. Technically there is no real difference – it’s possible to restyle most any basemap today, and I imagine the switch from one provider to another was a relatively trivial code switch. FourSquare, or others, could just as easily switch to a new basemap if it was important to them as a business or their community.
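To illustrate why such a switch can be a relatively trivial code change, here is a minimal sketch. All slippy-map providers serve tiles through the same z/x/y addressing scheme, so moving between them is often little more than swapping a URL template. The provider names and URL templates below are illustrative only, not the actual endpoints FourSquare used:

```javascript
// Tile URL templates keyed by provider. Swapping providers is a
// one-line change because all slippy maps share the z/x/y scheme.
const providers = {
  // Illustrative templates; real endpoints vary by provider.
  osm: "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
  mapbox: "https://tiles.example-mapbox-host.com/{z}/{x}/{y}.png",
};

// Build the concrete tile URL for a provider and tile coordinate.
function tileUrl(provider, z, x, y) {
  return providers[provider]
    .replace("{z}", z)
    .replace("{x}", x)
    .replace("{y}", y);
}
```

Changing the basemap then reduces to passing `"mapbox"` instead of `"osm"`; the rest of the mapping code never notices.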

More than a Basemap

What I am most excited about, and believe FourSquare has an almost unique potential to enable, is the adoption of OpenStreetMap as more than just the canvas for visualizing check-ins and user activity. OpenStreetMap’s true value is that it is an open, editable, relational database of geographic data – the basemap is merely one way to access the information. What makes OpenStreetMap the future of location data is that the information can only get better, become up to date more quickly, and be more representative of the unique and varied views of a person’s place.

Several years ago Dennis and I had a conversation, just after the initial launch of FourSquare, about the potential of using OpenStreetMap. At the beginning, FourSquare only worked in specific cities, and in considering how to expand it everywhere the options were between starting with a blank database and starting with an OpenStreetMap-populated dataset. The tremendous potential, obviously, was having the then-nascent community of FourSquare users using and updating OpenStreetMap data. Unfortunately, for usability and, I assume, business reasons (e.g. building your own database that you can own), FourSquare didn’t adopt OpenStreetMap at that time.

However, imagine if FourSquare adopted just this technique and leveraged their millions of users to improve the OpenStreetMap database. OpenStreetMap itself suffers from the common platform issue of trying to be everything to everyone. This makes it confusing for new users who want to contribute to know where to begin. They may just want to add the road in front of their house – or the park down the street and the great coffee shop they frequent. Unfortunately, the interface for these activities often requires an understanding of British terminology for places and presents an overwhelming choice of categories, tags, and drawing options.
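For a flavor of that vocabulary, here is roughly what those two casual contributions look like as OSM-style tag sets. The `highway` and `amenity` keys are real OpenStreetMap vocabulary; the names are hypothetical:

```javascript
// OSM tags are free-form key/value pairs, and the vocabulary reflects
// British usage that can confuse new contributors.
const roadInFrontOfMyHouse = {
  highway: "residential", // "highway" means any road, not a motorway
  name: "Elm Street",     // hypothetical name
};

const coffeeShop = {
  amenity: "cafe",        // a US "coffee shop" maps to the "cafe" tag
  cuisine: "coffee_shop",
  name: "Corner Coffee",  // hypothetical name
};
```

A newcomer who just wants to say "this is a coffee shop" has to discover that the right key is `amenity` and the right value is `cafe` – exactly the kind of friction a focused application could hide.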

FourSquare, by contrast, is forced to be simple and focused. Users quickly engage and disengage with the application, which should capture the data and reflect it back to the user for verification. Because my activity is being tracked, FourSquare can know that I’m on foot in the US in an urban area, so don’t start by showing me hiking trails or highways; show me restaurants and relevant places of interest – allowing me to dive deeper if I want to, but making it simple for the casual user to improve the data. I believe that only through simple, focused user applications will OpenStreetMap broadly enter common use and be able to reach the long tail of location data.
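A rough sketch of what such context-aware filtering might look like. The categories and context labels here are entirely hypothetical – neither FourSquare’s nor OpenStreetMap’s actual taxonomy – but they capture the idea of narrowing choices by what the app already knows:

```javascript
// Each category lists the contexts in which it is worth offering.
const ALL_CATEGORIES = [
  { tag: "restaurant",   contexts: ["urban"] },
  { tag: "cafe",         contexts: ["urban"] },
  { tag: "hiking_trail", contexts: ["rural"] },
  { tag: "motorway",     contexts: ["driving"] },
];

// Return only the category tags relevant to the user's context,
// instead of OpenStreetMap's full vocabulary.
function relevantCategories(context) {
  return ALL_CATEGORIES
    .filter((c) => c.contexts.includes(context))
    .map((c) => c.tag);
}
```

A pedestrian in a city would be offered restaurants and cafes first, with everything else a deliberate tap away.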

Of course, this assumes FourSquare – specifically the investors and board – don’t see their user-collected place data as a key, protected dataset. There have been plenty of POI-selling companies in a dying market. There are now businesses such as Factual, and still CloudMade, that are focused on making this data openly available – though positioning themselves as brokers of the data.

Despite crossing numerous impressive adoption hurdles over its seven-plus years of development, OpenStreetMap is still a young project. Its adoption by FourSquare is indeed another momentous occasion that heralds optimism that it will continue to grow. And as companies like Development Seed, CloudMade, MapQuest, and others adopt OpenStreetMap as a core part of their business – providing not just services but truly engaging with the community and providing focused context and value – OpenStreetMap will only get better.


27 Oct 2011

Google Maps Terms of Service and Pay

Published in Google, Mapstraction  |  5 Comments


Today Google announced that they are enforcing free usage limits on the Google Maps API. Beyond the free limit of 25,000 views per day, sites will have to pay $4 per 1,000 views. Google will automatically charge your credit card based on these usage fees, and it’s not clear if you can set a “cut-off” limit or if it will bring surprises similar to overseas cell charges.
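To make the scale concrete, here is a back-of-the-envelope calculation of the announced terms. I’m assuming simple linear overage pricing; actual billing tiers may differ:

```javascript
// Announced terms: 25,000 free map views per day,
// then $4 per 1,000 views beyond that.
const FREE_VIEWS_PER_DAY = 25000;
const PRICE_PER_1000_VIEWS = 4;

// Daily cost in dollars for a given number of map views.
function dailyCost(views) {
  const overage = Math.max(0, views - FREE_VIEWS_PER_DAY);
  return (overage / 1000) * PRICE_PER_1000_VIEWS;
}
```

Under these assumptions a site serving 100,000 map views a day would owe $300 per day – roughly $9,000 a month – which is exactly the kind of bill that arrives just as a site becomes successful.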

I find this a bit of a surprising action from Google. In 2005 they changed the mapping and geospatial web by providing a powerful, easy-to-use API (eventually) and a primarily free-of-charge slippy-map platform. The term “Google Map” became synonymous with being able to pan and zoom through the entire world without any reloading of the page or poor user experience. Since then, millions of sites have used Google Maps to provide simple map views and location services. Presumably this information has been of huge value to Google in understanding interest, spatial context, and generally bringing eyeballs to Google tools and content.

Google has also worked to monetize maps, often subtly through sponsored map markers, and other times more directly through in-map ads. Each of these decisions brought discussion and dissent, but it was difficult to argue with the fact that the tool was still free to use. Google has clearly invested real value, in both content and engineering, into Google Maps. The quality of geocoding, data availability, and power of the API have always been extremely capable and arguably best of breed.

Now, with a very direct pay requirement being imposed, adoption of Google Maps will change dramatically. Developers will have to consider very carefully how they will afford the potential – and, optimistically, likely – fees the service will incur as their sites become successful.

Fortunately, there are still a few really good alternatives for developers who can’t afford the usage fees. MapQuest has really embraced the future of open mapping by supporting and integrating OpenStreetMap into their sites. Microsoft Bing Maps is very capable, and there are many more options – not least of which is a developer “rolling their own”.

This interesting change by Google also validates abstraction libraries such as Mapstraction, which provides a common API so a developer can easily switch between map provider libraries without having to rewrite their code – a rewrite that would likely cost much more in the short term than paying the usage fees. On GeoCommons we use ModestMaps to be able to switch to any map data provider service.
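The abstraction-library idea can be sketched like this – a toy adapter pattern, not Mapstraction’s actual API, with the adapter behavior stubbed out for illustration:

```javascript
// Per-provider adapters behind one common interface. Here each
// adapter just returns a descriptive string; in a real library it
// would call the provider's own JavaScript SDK.
const adapters = {
  google: { setCenter: (lat, lon) => `google:center=${lat},${lon}` },
  osm:    { setCenter: (lat, lon) => `osm:center=${lat},${lon}` },
};

class AbstractMap {
  constructor(provider) {
    this.adapter = adapters[provider];
  }
  // Delegate to whichever provider adapter was selected.
  setCenter(lat, lon) {
    return this.adapter.setCenter(lat, lon);
  }
}
```

Switching from `new AbstractMap("google")` to `new AbstractMap("osm")` is a one-word change; every call site using `setCenter` and the rest of the common API is untouched.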

I’m very interested to see the general developer reaction to this change.


30 Aug 2011
Chicago, IL

Geospatial Preservation at Society of American Archivists

Published in Conference, Data


Cross-posted from the GeoIQ Blog

Last week I participated in a panel with spatial archival experts at the Society of American Archivists. Led by Butch Lazorchak of the Library of Congress, and joined by Steve Morris from GeoMAPP and John Faundeen from USGS, the panel was a full-spectrum discussion of “Geospatial Data Preservation” – ranging from the Library of Congress’s $10 million acquisition of, and access to, the famous Waldseemüller 1507 map Universalis Cosmographia of ‘America’, to USGS’s environmental conditions for storing historic satellite imagery, to GeoMAPP’s work in gathering time-stamped state geospatial data. Butch in particular provided an inspiring overview of what’s special about spatial: density of data, representation vs. data, and the difficulty of capturing the interactivity of more modern digital maps.

The archivists were a new community to me – people who are passionate about capturing and storing data, often until the end of time! But they also vary in their core missions, often diverging on the utility of the captured data and information. Very few seem to be thinking about archives as a useful resource today, focusing only on long-term storage and eventual access to the data by some unknown entity. As one member of GeoMAPP put it: “All of the Archives are storing this superseded GIS data in dark archives and aren’t really providing access to the datasets and don’t have web mapping interfaces.”

Clearly, we think a bit differently about archiving – choosing to focus foremost on access to data, which will result in improved archiving, distribution, and analysis of its utility and benefit. My presentation, Maps as Narratives: Making Spatial Archives Accessible, focused on the concept that maps have been, and increasingly are, a vital resource for people in their daily lives and work. By providing users tools to access and use historic and realtime data, we can then capture this data and provide it to other users and data repositories.

With internet feeds and social media in particular, we can’t easily predict what data will be useful. Neogeographers create visualizations of Twitter streams, photos, FourSquare check-ins, and friend locations. How do we know which of these are the modern correspondence of a future US President or global business leader? Through easy mechanisms for sharing data and maintaining links, we can begin tracking this information in its varied forms, providing better insight and archiving of data for later reuse, whether tomorrow or in 100 years.

Geospatial Archiving – Society of American Archivists


29 Apr 2011

Endeavour Shuttle Launch STS-134

Published in Space  |  2 Comments


I was fortunate enough to be selected to attend the #NASATweetup to see the last launch of Space Shuttle Endeavour – STS-134. Along with 150 other lucky selectees – including @dens, the Obamas, Gabby Giffords, Seth Green, LeVar Burton, and numerous inspiring astronauts – we’ll be at the countdown clock with a front-row seat for the second-to-last launch of the entire shuttle program.

Endeavour is carrying the Alpha Magnetic Spectrometer to the International Space Station, which will perform some inspiring science in the search for dark matter. There’s also a host of spiders, aggressive bacteria, and other science experiments that will be run on the ISS. I’ll have more photos and stories up soon.


04 Feb 2011
San Jose, CA

Automatic Road Detection – the Good and the Bad

Published in OpenStreetMap  |  2 Comments



Yesterday Steve Coast announced that Bing had released a new tool for automatic road detection using satellite imagery. The concept is definitely interesting, as it provides a way to rapidly generate road data over the entire globe without the need for manual tracing.

However, I remarked that it was particularly interesting that Steve was working on this. Several years ago, when OpenStreetMap was still an ambitious but unproven concept, many people argued that road detection was a useful, and perhaps necessary, mechanism for actually capturing all the road data. Steve was quite adamant that while it was possible – and he demonstrated it – it wouldn’t work, for other reasons.

OpenStreetMap is more than just a set of lines that render to nice maps. It is a topologically connected, classified and attributed, labeled network of geographic entities. Each road consists of intersections, road classifications, names, speed limits, overpasses, and lanes. OpenStreetMap provides a very rich set of linked geographic data.
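A toy illustration of that topology, simplified from OSM’s actual node/way model (the coordinates and names below are made up):

```javascript
// Nodes are shared points; ways reference node IDs rather than raw
// coordinates, so two roads that meet at the same node are
// topologically connected, and tags carry the classification.
const nodes = {
  1: { lat: 38.90, lon: -77.03 },
  2: { lat: 38.91, lon: -77.03 }, // shared intersection node
  3: { lat: 38.91, lon: -77.04 },
};

const ways = [
  { id: 10, nodeIds: [1, 2], tags: { highway: "residential", name: "A St" } },
  { id: 11, nodeIds: [2, 3], tags: { highway: "residential", name: "B St" } },
];

// Two ways intersect if they share a node ID.
function sharesIntersection(wayA, wayB) {
  return wayA.nodeIds.some((id) => wayB.nodeIds.includes(id));
}
```

Lines traced automatically from imagery carry none of this: no shared nodes, no names, no classification – which is exactly what the detection debate is about.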

And beyond the data, it has built a community of invested members who carefully capture, annotate, and cultivate the data in OpenStreetMap. This means the data is not just captured, but also (ideally) updated and maintained with new information, changes, and other entities such as parks, buildings, bus stops, and more.

So Steve convincingly pointed out that while automatic road identification was interesting, it would circumvent these other benefits of what OpenStreetMap was building: rich connected data, and a community of volunteers who would build and maintain the dataset. Road detection tends to generate a large amount of data in areas where no one is actively working on the data, so you can gain what appears to be good coverage but with limited local knowledge of intersections, names, and other metadata.

I don’t think these are insurmountable problems. Capturing GPS data can be tedious, inaccurate, or not readily possible in remote areas. Road detection can provide this data, and users can work afterwards to improve it, either remotely or with even simpler mobile tools that let a user annotate features without having to capture the entire road geometry.

So my comment the other day was about pointing out an interesting change in message and strategy. I applaud the work of Steve and the Bing team in developing new tools, but there are many other pieces that warrant consideration. Steve himself often asked whether the bulk import of the TIGER/Line data was good or bad for the US community. In the end, I believe it was the right thing: it provided a canvas of open data that demonstrated to skeptics that OpenStreetMap was viable and valuable.

Now that OpenStreetMap has become increasingly adopted by the world’s largest providers and users of data, it is time to evaluate new tactics for gathering and maintaining data. However, this can’t come at the expense of what made OpenStreetMap a success over the past 5+ years.

