Designing indicator reports for Web 2.0
Designing indicator reports that are effective and accessible to a variety of audiences requires considerable creativity, even when the report is produced only in paper form. The Internet, however, has made this process considerably more complicated. In the past, the favoured approach seems to have been to produce the paper report and then post it on a web site. However, as I have said before, the age of the publish-and-browse Internet is coming to an end. The new web, sometimes referred to as Web 2.0, presents a vastly different model that offers unprecedented opportunities for collaboration amongst self-organizing communities of interest. Indicator processes need to take advantage of this new Web.
In that light, I have jotted down a few points that I think those developing indicator reports may find useful, particularly with respect to the use of the Internet as a tool to increase citizen participation in decision making around sustainable development - one of the most commonly stated goals of indicator frameworks. In many instances I've tried to note the implications for government forest services, which often produce indicator reports on sustainable forest management.
1. Data is the next Intel Inside
Lesson 1: Users want control over how data is presented. Try as hard as we might to present data in a meaningful way, new functionality on the internet allows others to take data and re-package it in new, innovative ways that may prove to be more meaningful to our intended audience.
The recent introduction of Google Maps provides a living laboratory for presenting third-party data in new and innovative ways to specific communities of interest. Google's lightweight programming model has led to the creation of numerous value-added services in the form of mashups that link Google Maps with other internet-accessible data sources. Paul Rademacher's housingmaps.com, which combines Google Maps with Craigslist (craigslist.org) apartment rental and home purchase data to create an interactive housing search tool, is the pre-eminent example of such a mashup. Other sites, like gapminder.org, take complicated time series data on national economic and well-being statistics provided by the United Nations Development Programme and present it in interactive and easy-to-understand ways (see the Gapminder interactive tool at http://tools.google.com/gapminder/). In the realm of forestry data, the Active Fire Mapper mashup (http://firecenter.berkeley.edu/cafiremap/gmap_html/gmodis.html) maps fire hotspots in California using Google Maps and publicly available wildfire data from the US Forest Service. Even though the US Forest Service produces its own maps of hotspots, this mashup adds the functionality of Google Maps, like the ability to zoom in, accurate street maps, and satellite imagery. New do-it-yourself mashup applications like YouMap now make it relatively easy for novices to re-display data in new ways. For example, in about 5 minutes I was able to produce a map (http://www.polleto.com/YouMap/index.html?lng=en&map=sbridge) that uses Google Maps to display data from indicator 5.2.1 of the Canadian Council of Forest Ministers' Criteria and Indicators report on forest area under Aboriginal and non-Aboriginal tenure in Canadian provinces.
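To make the mechanics of a mashup concrete, here is a minimal sketch of the data-preparation side: turning a tabular indicator dataset into a geo-keyed GeoJSON structure that a mapping tool like Google Maps could consume. The province figures and coordinates below are illustrative placeholders, not the actual CCFM indicator 5.2.1 values.

```python
import json

# Hypothetical indicator records: (province, latitude, longitude, value).
# Real values would come from the published indicator dataset.
records = [
    ("British Columbia", 53.7, -127.6, 12345),
    ("Ontario", 50.0, -85.0, 6789),
]

# GeoJSON is a widely supported interchange format for mapping tools.
features = [
    {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {"province": name, "forest_area_kha": value},
    }
    for name, lat, lon, value in records
]

geojson = {"type": "FeatureCollection", "features": features}
print(json.dumps(geojson, indent=2))
```

The point is not the few lines of code, but that once data is published in a machine-readable form like this, anyone can re-present it without asking permission.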
Other applications allow users to alter web sites to meet their needs. For example, Greasemonkey allows users to write simple scripts that alter the web pages they visit - changing the way a site looks, removing advertisements, or even retrieving data from other sites to make two sites more interconnected.
At present, these mashups are mostly innovative experiments done by hackers. But entrepreneurial activity follows close behind. By making it easier for end users to customize the way data is represented, data suppliers, like government forest services, can promote their data and improve its accessibility and usability.
Lesson 2: Through applications like their mapping services, Google and others seem to be striving not only to be the way users search for information, but also the principal way that users retrieve and use data.
By making their applications highly customizable by the end user, and by working with data suppliers, Google and others are trying not only to facilitate finding information, but to be the way that users retrieve and use data. There are many examples on the Internet where control over data can lead to market control, particularly if the data are expensive to create or amenable to increasing returns via network effects. Look at the copyright notices at the base of every map served by MapQuest, maps.yahoo.com, maps.msn.com or maps.google.com, and you'll see the line "Maps copyright NavTeq, TeleAtlas". NavTeq made substantial investments in its data, and is now the sole source of data in systems whose software infrastructure is largely open source. There is no reason governments could not do the same thing with their very expensive forestry information. Google is already working with governments and others to do just this: it is working with local transit authorities to allow users to plan transit trips using Google Transit (http://www.google.com/transit), and with local taxi companies to allow users to find taxis near their location with Google Ride Finder (http://labs.google.com/ridefinder). Interestingly, in some cases it is Google users themselves who petition authorities to make data available to Google.
For competitive advantage, governments that supply forest data must seek to leverage their own unique, hard-to-recreate sources of data. For reporting on indicators of sustainable forest management in Canada, much of the data is produced, at great expense, by the federal, provincial, or territorial governments. Others could not hope to recreate, for example, a province's forest resource inventory. It can only be a matter of time before Google and others come looking to make this data available to end users via their applications. Like NavTeq, governments may be able to license their data for use by other applications.
Lesson 3: You can be the authoritative source of information by producing datasets that others cannot afford or are unable to recreate. But to bring people back to your web site, you need to continuously upgrade, enhance, and extend the data.
Online booksellers like Amazon.com, Barnesandnoble.com, and chapters.ca all sell the same product and derive their original database from the ISBN registry provider R.R. Bowker. How is it, then, that these booksellers can differentiate themselves from each other, and how has Amazon.com become such a dominant player? The answer lies in Amazon's relentless enhancement of the data, adding publisher-supplied data such as cover images, tables of contents, indexes, and sample material. Even more importantly, they harnessed their users to annotate the data, such that after ten years, Amazon, not Bowker, is the primary source for bibliographic data on books - a reference source for scholars and librarians as well as consumers. Effectively, Amazon "embraced and extended" their data suppliers.
To be competitive, forest data suppliers need to relentlessly enhance their data, constantly updating it, but also harnessing users to annotate it and add extra value.
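What "harnessing users to annotate the data" might look like can be sketched in a few lines: authoritative indicator records stay fixed, while user-contributed notes accumulate alongside them and are merged into the published view. The indicator title, value, and user names below are invented for illustration.

```python
from collections import defaultdict

# Authoritative indicator data (values here are hypothetical).
indicators = {
    "5.2.1": {"title": "Forest area by tenure", "value": 12345},
}

# User annotations are stored separately, so the base data stays authoritative.
annotations = defaultdict(list)

def annotate(indicator_id, user, note):
    """Attach a user-contributed note to an indicator."""
    annotations[indicator_id].append({"user": user, "note": note})

annotate("5.2.1", "alice", "Tenure categories changed definition in 2004.")
annotate("5.2.1", "bob", "Compare with the provincial inventory figures.")

# The published view combines the official value with user-added context.
view = dict(indicators["5.2.1"], notes=annotations["5.2.1"])
print(view)
```

The design choice mirrors the Amazon example: the supplier's record is never overwritten, but every reader benefits from the context other users have layered on top of it.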
2. Users add value
The key to competitive advantage in internet applications is the extent to which users add their own data to that which you provide. Therefore: Don't restrict your "architecture of participation" to software development. Involve your users both implicitly and explicitly in adding value to your application. As described above, Amazon sells the same products as competitors such as Barnesandnoble.com, and they receive the same product descriptions, cover images, and editorial content from their vendors. But Amazon has made a science of user engagement. They have an order of magnitude more user reviews and invitations to participate in varied ways on virtually every page - and, even more importantly, they use user activity to produce better search results. While a Barnesandnoble.com search is likely to lead with the company's own products, or sponsored results, Amazon always leads with "most popular", a real-time computation based not only on sales but other factors that Amazon insiders call the "flow" around products. With an order of magnitude more user participation, it's no surprise that Amazon's sales also outpace its competitors'.
Indicator web sites are no different: they need to find ways for users to participate and add value. That could mean allowing users to add comments on the indicators, or even their own data or analysis. Other users will find value in this additional content, increasing the value of the site, its transparency and credibility, and its usage.
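A "flow"-style ranking like Amazon's can be sketched very simply: order indicator pages by a score built from user activity rather than by a fixed editorial order. The page names, counts, and weights below are invented purely to illustrate the idea, not taken from any real site.

```python
# Hypothetical activity log for three indicator pages.
activity = {
    "forest-area": {"views": 500, "comments": 12, "downloads": 40},
    "employment": {"views": 900, "comments": 3, "downloads": 10},
    "biodiversity": {"views": 200, "comments": 30, "downloads": 5},
}

# Assumed weights: a comment signals more engagement than a page view.
WEIGHTS = {"views": 1, "comments": 10, "downloads": 5}

def flow_score(stats):
    """Combine activity counts into a single engagement score."""
    return sum(WEIGHTS[kind] * count for kind, count in stats.items())

# Rank pages by descending engagement, so "most popular" leads.
ranked = sorted(activity, key=lambda page: flow_score(activity[page]), reverse=True)
print(ranked)
```

The weights are the editorial decision here; the ranking itself is then recomputed automatically as users generate activity, which is exactly the byproduct effect the lesson describes.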
3. Network Effects by Default
Only a small percentage of users will go to the trouble of adding value to your application. Therefore: Set inclusive defaults for aggregating user data as a side-effect of their use of the application. An indicator web site needs to be designed so that it gets better the more people use it, and so that users, pursuing their own selfish interests, build collective value as a byproduct.
One example of this might be to track the keywords users search for on an indicator report web site. By looking for relationships among the words a user searches for, it may be possible to suggest other pages within the site that might be of interest.
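The idea above can be sketched as a small co-occurrence model: log the terms each visitor searches for within a session, count which terms appear together, and suggest the most frequent companions of the current term. The session data here is invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Hypothetical search sessions: each list is one visitor's search terms.
sessions = [
    ["forest area", "tenure", "aboriginal"],
    ["forest area", "tenure"],
    ["employment", "forest area"],
]

# Count how often each pair of terms occurs in the same session.
cooccur = Counter()
for terms in sessions:
    for a, b in combinations(sorted(set(terms)), 2):
        cooccur[(a, b)] += 1

def related(term, top=2):
    """Terms most often searched in the same session as `term`."""
    scores = Counter()
    for (a, b), count in cooccur.items():
        if a == term:
            scores[b] += count
        elif b == term:
            scores[a] += count
    return [t for t, _ in scores.most_common(top)]

print(related("forest area"))
```

Note that no user ever explicitly "contributes" anything here - the collective value is aggregated as a side-effect of ordinary searching, which is the inclusive default the lesson calls for.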
4. Some Rights Reserved
Intellectual property protection limits re-use and prevents experimentation. Therefore: When benefits come from collective adoption, not private restriction, make sure that barriers to adoption are low. Follow existing standards, and use licenses with as few restrictions as possible. Design for "hackability" and "remixability."
5. The Perpetual Beta
When devices and programs are connected to the internet, applications are no longer software artifacts, they are ongoing services. Therefore: Don't package up new features into monolithic releases, but instead add them on a regular basis as part of the normal user experience. Engage your users as real-time testers, and instrument the service so that you know how people use the new features. For indicator reporting, that may mean producing a paper report periodically, but continuously updating and improving the web site with new information - respond to user comments, redesign the presentation of indicators if necessary.
6. Cooperate, Don't Control
Web 2.0 applications are built of a network of cooperating data services. Therefore: Offer web services interfaces and content syndication, and re-use the data services of others. Support lightweight programming models that allow for loosely-coupled systems.
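One lightweight way to offer the content syndication mentioned above is to publish indicator updates as an RSS feed that other sites and feed readers can consume. A minimal sketch using Python's standard library follows; the update titles and feed URL are hypothetical.

```python
import xml.etree.ElementTree as ET

# Hypothetical indicator updates to syndicate: (title, description).
updates = [
    ("Indicator 5.2.1 revised", "Tenure data updated with new figures."),
    ("New employment series", "Quarterly forest-sector employment added."),
]

# Build a bare-bones RSS 2.0 document.
rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Forest indicator updates"
ET.SubElement(channel, "link").text = "http://example.org/indicators"
for title, description in updates:
    item = ET.SubElement(channel, "item")
    ET.SubElement(item, "title").text = title
    ET.SubElement(item, "description").text = description

feed = ET.tostring(rss, encoding="unicode")
print(feed)
```

Because RSS is a shared, loosely-coupled format, the publisher does not need to know or control who consumes the feed - exactly the cooperate-don't-control posture this lesson recommends.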
7. Software Above the Level of a Single Device
The PC is no longer the only access device for internet applications, and applications that are limited to a single device are less valuable than those that are connected. Therefore: Indicator reports should be designed from the get-go to integrate services across handheld devices, PCs, and internet servers. It should be as easy for a user to find out the area of forest or the employment in the forest industry from their cell phone or PDA as it is from their work computer.
For more information on how the evolving nature of the Internet could affect indicator reporting, check out the 'paper' "What is Web 2.0" by Tim O'Reilly - http://www.oreillynet.com/lpt/a/6228. Many of the ideas presented here have been adapted from that paper.