Orchestrating streams of data from across the Internet

The liveblog was a revelation for us at the Guardian. The sports desk had been doing them for years, experimenting with different styles, methods and tones. Then, about three years ago, the news desk started using them liberally, to great effect.

I think it was Matt Wells who suggested that perhaps the liveblog was *the* network-native format for news. I think that’s nearly right…though it’s less the ‘format’ of a liveblog than the activity powering the page that demonstrates where news editing in a networked world is going.

It’s about orchestrating the streams of data flowing across the Internet into something compelling, in one form or another. One way to render that data is the liveblog. Another is a map with placemarks. Another is an RSS feed. A stream of tweets. Storify. Etc.

I’m not talking about Big Data for news. There is certainly a very hairy challenge in big data investigations and intelligent data visualizations to give meaning to complex statistics and databases. But this is different.

I’m talking about telling stories by playing DJ to the beat of human observation pumping across the network.

We’re working on one such experiment with a location-tagging tool we call FeedWax. It creates location-aware streams of data for you by looking across various media sources including Twitter, Instagram, YouTube, Google News, Daylife, etc.

The idea with FeedWax is to unify various types of data through shared contexts, beginning with location. These sources may only have a keyword to join them up or perhaps nothing at all, but when you add location they may begin sharing important meaning and relevance. The context of space and time is natural connective tissue, particularly when the words people use to describe something may vary.
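To make that concrete, here’s a rough sketch (not FeedWax itself, just an illustration of the principle) of how two items from different sources with no keywords in common can still be joined up by proximity in space and time. The item shapes and the thresholds are made up for the example.

```javascript
// Illustration only: join items from different feeds that were created
// near each other in space and time. The item shape and the thresholds
// are assumptions, not FeedWax internals.

// Great-circle distance in kilometres between two lat/long points (haversine).
function distanceKm(a, b) {
  const toRad = d => d * Math.PI / 180;
  const R = 6371; // Earth radius in km
  const dLat = toRad(b.lat - a.lat);
  const dLon = toRad(b.lon - a.lon);
  const h = Math.sin(dLat / 2) ** 2 +
            Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Two items share context if they happened within a couple of kilometres
// and a few hours of each other.
function sharesContext(a, b, maxKm = 2, maxHours = 6) {
  const hoursApart = Math.abs(a.time - b.time) / (1000 * 60 * 60);
  return distanceKm(a, b) <= maxKm && hoursApart <= maxHours;
}

// A tweet and an Instagram photo with no keywords in common still join up
// once location and time are taken into account.
const tweet = { source: 'twitter', lat: 51.5072, lon: -0.1276, time: Date.parse('2012-10-08T10:00:00Z') };
const photo = { source: 'instagram', lat: 51.5101, lon: -0.1340, time: Date.parse('2012-10-08T11:30:00Z') };
console.log(sharesContext(tweet, photo)); // true
```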

We’ve been conducting experiments in orchestrated stream-based and map-based storytelling on n0tice for a while now. When you start crafting the inputs with tools like FeedWax you have what feels like a more frictionless mechanism for steering the flood of data that comes across Twitter, Instagram, Flickr, etc. into something interesting.

For example, when the space shuttle Endeavour flew its last flight and subsequently labored through the streets of LA there was no shortage of coverage from on-the-ground citizen reporters. I’d bet not one of them considered themselves a citizen reporter. They were just trying to get a photo of this awesome sight and share it, perhaps getting some acknowledgement in the process.

You can see the stream of images and tweets here: http://n0tice.com/search?q=endeavor+OR+endeavour. And you can see them all plotted on a map here: http://goo.gl/maps/osh8T.

Interestingly, the location of the photos gives you a very clear picture of the flight path. This is crowdmapping without requiring that anyone do anything they wouldn’t already do. It’s orchestrating streams that already exist.

This behavior isn’t exclusive to on-the-ground reporting. I’ve got a list of similar types of activities in a blog post here which includes task-based reporting like the search for computer scientist Jim Gray, the use of Ushahidi during the Haiti earthquake, the Guardian’s MPs Expenses project, etc. It’s also interesting to see how people like Jon Udell approach this problem with other data streams out there such as event and venue calendars.

Sometimes people refer to the art of code and code-as-art. What I see in my mind when I hear people say that is a giant global canvas in the form of a connected network, rivers of different colored paints in the form of data streams, and a range of paint brushes and paint strokes in the form of software and hardware.

The savvy editors in today’s world are learning from and working with these artists, using their tools and techniques to tease out the right mix of streams to tell stories that people care about. There’s no lack of material or tools to work with. Becoming network-native sometimes just means looking at the world through a different lens.

Rethinking news for a world of ubiquitous connectivity

I gave a presentation on the implications of ubiquitous connectivity for journalism at the Rethinking Small Media event held at the University of London yesterday. The slides are here:

[slideshare id=14634695&doc=smallmedia-mattmcalister-121008074438-phpapp01]

By the time I finished talking, I realized that the point I really wanted to make was how important it is that we move the dialog away from an us-versus-them view of small and big media. Fracturing a community that is mostly full of people trying to do good in the world is not helpful, even if the definitions and methods of doing good vary.

The more important issue is about protecting the open public space we call the Internet.

As the network takes on more and more nodes, more streams of nonhuman data, new connected devices, etc., we must work harder to ensure that the interests making all these things possible are aligned with the principles that made the Internet such valuable infrastructure for people across the globe.

But, in the meantime, there are certainly some tangible things people from both small and big media can do to point in the right direction.

The list includes atomizing information, adding more context such as time and space, linking it, making it developer-friendly, and sharing it openly with partners, among other things.

Calling your web site a ‘property’ deprives it of something bigger

The BBC aired another history-of-London documentary the other night, a sort of people’s perspective on how the character of the city has changed over time, obviously inspired by Danny Boyle’s Opening Ceremony at the Olympics.

Some of the sequences were interesting to me, particularly as a foreigner – the gentrification of Islington, the anarchist squatters in Camden, the urbanization of the Docklands, etc. – a running theme of haves vs have-nots.

It’s one of a collection of things inspiring me recently, including a book called ‘The Return of the Public’ by Dan Hind (a sort of extension of the Dewey v Lippmann debates), what’s going on with n0tice, such as Sarah Hartley’s adaptation of it called Protest Near You and the dispatch-o-rama hack, and, of course, the Olympics.

I’m becoming reinvigorated and more bullish on where collective action can take us.

At a more macro level these things remind me of the need to challenge the many human constructs and institutions that are reflections of the natural desire to claim things and own them.

Why is it so difficult to embrace a more ‘share and share alike’ attitude?  This is as true for children and their toys as it is for governments and their policies.

The bigger concern for me, of course, is the future of the Internet and how media and journalism thrive and evolve there.

Despite attempts by its founders to shape the Internet so it can’t be owned and controlled, there are many who have tried to change that both intentionally and unwittingly, occasionally with considerable success.

How does this happen?

We’re all complicit.  We buy a domain. We then own it and build a web site on it. That “property” then becomes a thing we use to make money.  We fight to get people there and sell them things when they arrive.  It’s the Internet-as-retailer or Internet-as-distributor view of the world.

That’s how business on the Internet works…or is it?

While many have made that model work for them, it’s my belief that the property model is never going to be as important or meaningful or possibly as lucrative as the platform or service model over time. More specifically, I’m talking about generative media networks.

Here are a few different ways of visualizing this shift in perspective (more):

Even if it works commercially, the property model is always going to be in conflict with the Internet-as-public-utility view of the world.

Much as with Britain’s privately owned public spaces, many worry that the Internet-as-public-utility will be ruined or, worse, taken from us over time by commercial and government interests.

Playing a zero sum game like that turns everyone and everything into a threat.  Companies can be very effective at fighting and defending their interests even if the people within those companies mean well.

I’m an optimist in this regard.  There may be a pendulum that swings between “own” and “share”, and there are always going to be fights to secure public spaces.  But you can’t put the Internet genie back in the bottle.  And even if you could it would appear somewhere else in another form just as quickly…in some ways it already has.

The smart money, in my mind, is where many interests are joined up regardless of their individual goals, embracing the existence of each other in order to benefit from each other’s successes.

The answer is about cooperation, co-dependency, mutualisation, openness, etc.

We think about this a lot at the Guardian. I recently wrote about how it applies to the recent Twitter issues here. And this presentation by Chris Thorpe from back in 2009 on how to apply it to the news business is wonderful:

Of course, Alan Rusbridger’s description of a mutualised newspaper in this video is still one of the strongest visions I’ve heard for a collaborative approach to media.

The possibility of collective action at such an incredible scale is what makes the Internet so great.  If we can focus on making collective activities more fruitful for everyone then our problems will become less about haves and have-nots and more about ensuring that everyone participates.

That won’t be an easy thing to tackle, but it would be a great problem to have.

Dispatchorama: a distributed approach to covering a distributed news event

We’ve had a sort of Hack Week at the Guardian, or “Discovery Week”. So, I took the opportunity to mess around with the n0tice API to test out some ideas about distributed reporting.

This is what it became (best if opened in a mobile web browser):

http://dispatchorama.com/



It’s a little web app that looks at your location and then helps you to quickly get to the scene of whatever nearby news events are happening right now.

The content is primarily coming from n0tice at the moment, but I’ve added some tweets with location data. I’ve looked at some GeoRSS feeds, but I haven’t tackled those yet. It should also include only things from the last 24 hours. Adding more feeds and tuning the timing will help it feel more ‘live’.
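The core of it is something like the sketch below, which is simplified and not the actual code that’s on GitHub; the report fields (lat, lon, time, headline) are hypothetical placeholders. It reads the browser’s location, then keeps only the reports that are nearby and from the last 24 hours.

```javascript
// Simplified sketch of the core idea, not the dispatchorama code on GitHub.
// Crude flat-earth distance approximation, fine for "is this nearby?" checks.
function roughKm(a, b) {
  const kmPerDegree = 111;
  const dLat = (b.lat - a.lat) * kmPerDegree;
  const dLon = (b.lon - a.lon) * kmPerDegree * Math.cos(a.lat * Math.PI / 180);
  return Math.sqrt(dLat * dLat + dLon * dLon);
}

function showNearbyReports(reports) {
  navigator.geolocation.getCurrentPosition(pos => {
    const here = { lat: pos.coords.latitude, lon: pos.coords.longitude };
    const dayAgo = Date.now() - 24 * 60 * 60 * 1000;
    reports
      .filter(r => r.time >= dayAgo && roughKm(here, r) <= 5) // last 24h, within ~5km
      .forEach(r => console.log(r.headline, r.lat, r.lon));
  }, err => console.error('could not read location:', err.message));
}
```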

The concept here is another way of thinking about the impact of binding the digital and physical worlds together. Being able to understand the signals coming out of networked media is increasingly important. By using the context that travels with bits of information to inform your physical reality, you can be quicker to respond, more insightful about what’s going on and more proactive in your participation.

I’m applying that idea to distributed news events here, things that might be happening in many places at once or a news event that is moving around.

In many ways, this little experiment is a response to the amazing effort of the Guardian’s Paul Lewis and several other brave reporters covering last year’s UK riots.

There were 2 surprises in doing this:

  1. Location-based tweets on Twitter are really all over the place and not helpful. You really have to narrow your source list to known Twitter accounts to get anything good, but that kind of defeats the purpose.
  2. I haven’t done a ton of research yet, but there seems to be a real lack of useful GeoRSS feeds out there. What happened? Did the failure of RSS readers kill the GeoRSS movement? What a shame. That needs to change.

The app uses the n0tice API, jQuery Mobile, Google’s location APIs, and a few snippets picked off Stack Overflow. It’s on GitHub here:
https://github.com/mattmcalister/dispatchorama/

The power of collective research, task-based investigations and swarm intelligence

In January 2007 a well-known computer scientist named Jim Gray was lost at sea off the California coast on his way to the Farallon Islands.

It was a moment that many will remember either because Jim Gray was a big influence personally or professionally or because the method of the search for him was a real eye opener about the power of the Internet.  It was a group task-based investigation of epic proportions using the latest and greatest technology of the day.

I didn’t know him, but I will never forget what happened.  Not only did the Coast Guard’s air and surface search cover 40,000 square miles, but a distributed army of 12,000 people scanned NASA satellite imagery covering 30,000 square miles.  We all used Amazon’s Mechanical Turk to flip through tiles looking for a boat that would’ve been about 6 pixels in size.

They attacked the search in some phenomenal ways.  Here is Werner Vogels’ public call for help. You can also go back and read the daily search logs posted by his friends on the blog here.  Both Wired and the New York Times covered this incredible drama in detail.

Since then we’ve seen the Internet come to the rescue or at least try to make a difference using similar crowdmapping techniques.  Perhaps the most powerful example is the role crisis mappers and the Ushahidi platform played in the major Haiti earthquake in 2010.

But it’s not just in crises that these technologies serve a public good.  We’ve seen these swarming techniques applied in a range of ways for journalism and many other activities on the Internet.

Perhaps the gold standard for collective investigative reporting is the MPs Expenses experiment by Simon Willison at the Guardian where 170,000 documents were reviewed by 15,000 people in the first 80 hours after it went live.  The Guardian has deployed its readers to uncover truth in a range of different stories, most recently with the Privatised Public Spaces story.  We’ve also looked at crowdmapping broadband speeds across the UK, and Joanna Geary’s ‘Tracking the Trackers‘ project uncovered some fascinating data about the worst web browser cookie abusers.

Last year Germany’s defense minister Karl-Theodor zu Guttenberg, a man once considered destined for an even larger role in government, was forced to resign from his post as a result of allegations that he plagiarized his doctoral thesis.  The allegations were shown to be true by a group of people working collectively on the investigation using a site called GuttenPlag Wiki.

ProPublica is a real pioneer in collective reporting and data journalism.  For example, their 2010 investigation into which politicians were given Super Bowl tickets provided a wonderful window into the investigative process.  And the Stimulus Spotcheck project invited people to assess whether or not the 2009 stimulus package in the US was in fact having an impact.

Also, Kevin Anderson reminded me of http://www.ipaidabribe.com, which tracks local corruption, and http://oilreporter.org/, which came out of the Gulf of Mexico oil spill in 2010 and helps people report wildlife damage, share photos, etc.

Of course, swarming projects can have a range of different intentions, and if one were to try and count them I would bet only a small percentage are high impact journalistic endeavors.

Andy Baio is a pioneer of this kind of concept, sometimes curating data that already exists and sometimes inspiring a crowdsourced investigation.  For example, his “Girl Turk” collective research uncovered an exhaustive list of the artists and tracks sampled on Girl Talk’s Feed the Animals album.

The big advertising brands intuitively understand the power of swarming intelligence, too, as they see it as a way to use their loyal customers to help them acquire new customers or to at least build a stronger direct relationship with a large group of people.  This is essentially the pitch once used by MySpace and adopted by Facebook, Twitter and Google +…Step 1: create a brand page where people can congregate, Step 2: inspire people to do something collectively that spreads virally.

The technologies that make these group tasks possible are getting easier and more accessible all the time. The wiki format works great for some projects.  DocumentCloud is a tremendous platform.   Google Docs are providing a lot of power for collective investigations, as we’ve discovered several times on the Guardian’s Datablog. And, of course, crowdmapping can be done with little technical intervention using Ushahidi and n0tice.

Of course, you can’t discount the power of the social networks as distribution platforms and amplifiers for group-based investigations.  Creating the space for swarming activity is one thing, but getting the word out is a role that Facebook and Twitter are very good at playing.  It’s a perfect marriage, in many ways.

An army of helpers may be accessible in other ways, too.

Amanda Michel, who famously drove the Off The Bus campaign at HuffPo (more on that below), produced a guide to “Using Amazon’s Mechanical Turk for Data Projects” while at ProPublica, in which she describes how they hired workers to complete short, simple tasks.

But I imagine that the next wave of activity will arise as some of the human patterns of group tasks inspire more sustainable technology platforms.  As Martin Kotynek and ‘PlagDoc’ acknowledge in their wonderful report “Swarm of thoughts” there’s a need for some sort of centralized research platform so this kind of activity is easier to trigger and run with.

Perhaps it’s a matter of identifying a few very specific collective research concepts that work and fueling ongoing community activity around those ideas.  Citizen journalism, for example, is an obvious activity where communities are forming.

CNN’s iReport has a ready-built citizen journalist network incentivized by exposure on cnn.com, and the n0tice platform can enable citizen-powered crowdmapping activity for a range of different projects and get exposure and distribution across different platforms.  Both are capable of serving an ongoing role as useful every-day citizen journalism services that can crank up the volume on a particular issue when the appropriate moment arises.

Platforms can create some ongoing momentum, but so can issues.

Off The Bus was an 18-month HuffPo initiative where readers and staff covered the US elections collaboratively from their own communities. The project had the additional benefit of generating insights that turned into larger editorial investigations such as the Superdelegate Investigation, a report on the Evangelical Vote and the Political Campaign HQ crowdmapping project.  Ryan Tate’s book The 20% Doctrine goes into some detail about Off The Bus, how it developed, and how Amanda managed it all.

I suspect that a whole class of swarming intelligence projects is starting to bubble up that may only appear when the human story, the technology, and the amplifier join up and create a perfect storm.

In the end, it comes down to projects that resonate with people on a personal level.

Though Jim Gray was never found, the thinking about how to conduct the search amongst the leaders of the crowd at the time could not have been more cogent.  The instructions for participants were inspiring, detailing a simple task and the result of completing it:

You will be presented with 5 images. The task is to indicate any satellite images which contain any foreign objects in the water that may resemble Jim’s sailboat or parts of a boat. Jim’s sailboat will show up as a regular object with sharp edges, white or nearly white, about 10 pixels long and 4 pixels wide in the image. If in doubt, be conservative and mark the image. Marked images will be sent to a team of specialists who will determine if they contain information on the whereabouts of Jim Gray. Friends and family of Jim Gray would like to thank you for helping them with this cause.

It’s conceivable that the most important thing social media has accomplished over the last 3-5 years is that it has unlocked the natural desire people have to impact what’s happening in the world in a way they may not have felt empowered to do for decades.

Now it’s simply a matter of joining up the technologies in ways that enable those ideas to come to life.


 

A List of Collective Investigations

Below are some of the projects mentioned above and several others that have been sent to me.  I’ve included a few things that aren’t journalism investigations but are worth a closer look simply because they can be instructive.

  • Tenacious Search: the log of the extensive search by the San Francisco police, the Coast Guard and Jim Gray's friends and family for Jim and his sailboat, Tenacious, off the California coast, written to keep the volunteer community informed of where the search stood.
  • Crisis mapping brings online tool to Haitian disaster relief effort: how Patrick Meier and colleagues mobilized an online tool created by a Kenyan lawyer in South Africa within hours of learning about the Haiti earthquake.
  • The Brian Lehrer Show – Are You Being Gouged? (WNYC): listeners were asked to check the price of milk, lettuce and beer at their local grocery stores to compare costs across New York neighborhoods.
  • Investigate your MP's expenses (the Guardian): 458,832 pages of expenses documents opened up for readers to review, with a running tally of pages reviewed and pages still to go.
  • Privately owned public space: where are they and who owns them? (the Guardian): an investigation into the creeping privatisation of public space, from Canary Wharf to Granary Square at King's Cross.
  • Broadband Britain: how fast is your connection? (the Guardian): a crowdsourced map of advertised versus real broadband speeds, highlighting the best and worst-served communities and broadband blackspots.
  • Tracking the trackers: help us reveal the unseen world of cookies (the Guardian and Mozilla): who are the companies behind cookies and web trackers, and what are they doing with our data?
  • GuttenPlag Wiki: a collaborative documentation of the plagiarism in Guttenberg's dissertation; anyone is invited to contribute, and every addition and change in the wiki is transparent and logged.
  • ProPublica's Super Bowl Blitz: Which Congressmen Are Getting Super Bowl Perks?: a call-by-call accounting of which members of Congress were offered Super Bowl tickets.
  • I Paid a Bribe: uncovering the market price of corruption in India by asking people to share their stories of bribes and corruption in Indian bureaucracy and civic agencies.
  • Deepwater Oil Reporter Crowdsourcing Platform: an open data-sharing initiative for reporting the impact of the Gulf of Mexico oil spill; all data reported on Oil Reporter is public.
  • Girl Turk: Mechanical Turk Meets Girl Talk's "Feed the Animals" (Waxy.org): Andy Baio's crowdsourced effort to identify the hundreds of songs sampled on the album.
  • HuffPost Launches OffTheBus Citizen Journalism Project Ahead of 2012 Elections: an invitation to readers frustrated with how the national media cover politics to report on the elections themselves.
  • Netflix Prize: a $1M prize for substantially improving the accuracy of Netflix's movie-preference predictions, awarded in September 2009 to the team "BellKor's Pragmatic Chaos".
  • HerdictWeb: a Berkman Center project (a portmanteau of 'herd' and 'verdict') gathering users' reports from around the world to determine the verdict of the herd on web accessibility.
  • The High Price of Creating Free Ads (New York Times): on brands like H. J. Heinz inviting the public to create their commercials and picking the best in a contest.
  • SpotCrime Crime Map: city and county crime maps showing incident data down to neighborhood level, with crime alerts and reports.
  • The Peer to Patent Project – Community Patent Review: in June 2007 the United States Patent and Trademark Office opened the patent examination process to online public participation for the first time.
  • Prize4Life: using powerful incentive prizes to attract new people and drive innovation toward treatments and a cure for ALS.
  • FixMyStreet: report a local problem by entering a nearby postcode, locating it on a map and adding details; the report is sent to the council on your behalf.
  • Reporting Recipe: Using Amazon's Mechanical Turk for Data Projects (ProPublica): a guide to hiring Turk workers for the tedious work of assembling datasets.
  • HuffPost's OffTheBus Superdelegate Investigation: hundreds of readers profiled the superdelegates likely to decide the Democratic nomination for president.
  • The Political Campaign HQ Next Door: OffTheBus Special Ops Photographs: readers photographed state campaign headquarters around the nation.
  • Introducing Stimulus Spot Check (ProPublica): readers helped check whether the 2009 US stimulus package was actually having an impact.
  • WNYC – Mapping the Storm Clean-up: readers and listeners reported whether their streets had been plowed, producing daily maps of plowed and unplowed streets.
  • How Do You Feel About the Economy? (NYTimes.com): an interactive asking readers to submit, once a day, the word that best describes their current mood.
  • Adjunct Project: crowdsourced data from and for the growing number of graduate degree holders who are unemployed or underemployed.
  • The Scrapbook – POPS Report: Tell Us About New York City's Privately-Owned Public Spaces: WNYC's Brian Lehrer Show and The New York World collaborating to map and report on New York City's privately owned public spaces and figure out how public they really are.

An open community news platform: n0tice.com

The last several weeks I’ve been working on a new project, a SoLoMo initiative, as John Doerr or Mary Meeker would call it.

One of those places (noticeboard photo by Jer*ry)

It’s a mobile publishing platform that resembles a community notice board.  It’s called n0tice*:

http://n0tice.com.

After seeing Google’s “News near you” service announced on Friday I thought it was a good time to jump into the conversation and share what I’m up to.  Clearly, there are a lot of people chasing the same or similar issues.

First, here’s some background.  Then I’ll detail what it does, how it works, and what I hope it will become.

What is n0tice?

It began as a simple hack day project over a year ago.  I was initially just curious about how location worked on the phone.  At first I thought that was going to be beyond me, and then Simon Willison enlightened me to the location capabilities inherent in modern web browsers. There are many solutions published out there. Here’s one.
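A minimal version of that browser location lookup looks roughly like this:

```javascript
// Ask the browser for the user's position; the user is prompted for
// permission before anything is returned.
if ('geolocation' in navigator) {
  navigator.geolocation.getCurrentPosition(
    pos => console.log('lat:', pos.coords.latitude, 'lon:', pos.coords.longitude),
    err => console.error('location unavailable:', err.message)
  );
} else {
  console.log('This browser does not support geolocation.');
}
```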

It took about half a second to go from working out how to identify a user’s location to realizing that this feature could be handy for citizen reporters.

Around the same time there was a really interesting little game called noticin.gs going around which was built by Tom Taylor and Tom Armitage, two incredibly talented UK developers.  The game rewarded people for being good at spotting interesting things in the world and capturing a photo of them.

Ushahidi was tackling emergency response reporting. And, of course, Foursquare was hitting its stride then, too.

These things were all capturing my imagination, and so I thought I would try something similar in the context of sharing news, events and listings in your community.

Photo by Roo Reynolds

However, I was quite busy with the Guardian’s Open Platform, as the team was moving everything out of beta, introducing some big new services and infusing it into the way we operate.  I learned a lot doing that which has informed n0tice, too, but it was another 12 months before I could turn my attention back to this project.  It doesn’t feel any less relevant today than it did then. It’s just a much more crowded market now.

What does n0tice do?

The service operates in two modes – reading and posting.

n0tice.com - what's near you now

When you go to n0tice.com it will first detect whether or not you’re coming from a mobile device.  It was designed for the iPhone first, but the desktop version is making it possible to integrate a lot of useful features, too.

(Lesson:  jQuery Mobile is amazing. It makes your mobile projects better faster. I wish I had used it from day one.)

It will then ask your permission to read your location.  If you agree, it grabs your latitude and longitude, and it shows you what has been published to n0tice within a close radius.

(Lesson: It uses Google Maps and their geocoder to get the location out of the browser, but then it uses Yahoo!’s geo services to do some of the other lookups since I wanted to work with different types of location objects.  This combination is clunky and probably a bad idea, but those tools are very robust.)

You can then zoom out or zoom in to see broader or more precise coverage.

Since it knows where you are already, it’s easy to post something you’ve seen near you, too.  You can actually post without being logged in, but there are some social incentives to encourage logged in behavior.

Like Foursquare’s Mayor analogy, n0tice has the ‘Editor’ badge.

The first person to post in a particular city becomes the Editor of that city.  The Editor can then be ousted if someone completes more actions in the same city or region.
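Roughly, the mechanic works like the sketch below; the data shapes are made up purely for illustration.

```javascript
// Illustrative sketch only, with made-up data shapes: whoever has
// completed the most actions in a city currently holds the Editor badge.
function currentEditor(actions, city) {
  const counts = new Map();
  for (const a of actions) {
    if (a.city === city) counts.set(a.user, (counts.get(a.user) || 0) + 1);
  }
  let editor = null;
  let best = 0;
  for (const [user, n] of counts) {
    if (n > best) { editor = user; best = n; }
  }
  return editor; // null if nobody has acted in that city yet
}

const actions = [
  { user: 'amy', city: 'London' }, // amy posts first and becomes Editor...
  { user: 'bob', city: 'London' },
  { user: 'bob', city: 'London' }, // ...until bob completes more actions
];
console.log(currentEditor(actions, 'London')); // 'bob'
```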

It was definitely a challenge working out how to make sensible game mechanics work, but it was even harder finding the right mix of neighborhood, city, country, lat/long coordinates so that the idea of an ‘Editor’ was consistent from place to place.

London and New York, for example, are much more complicated given the importance of their neighborhoods and the poorly defined boundaries between them.

(Lesson: Login is handled via Facebook. Their platform has improved a lot in the last 12 months and feels much more ‘give-and-take’ than just ‘take’ as it used to. Now, I’m not convinced that the activities in a person’s local community are going to join up naturally via the Facebook paradigm, so it needs to be used more as a quickstart for a new service like this one.)

The ‘Editor’ mechanics are going to need a lot more work.  But what I like about the ‘Editor’ concept is that we can now start to endow more rights and privileges upon each Editor when an area matures.

Perhaps Editors are the only ones who can delete posts. Perhaps they can promote important posts. Maybe they can even delegate authority to other participants or groups.

Of course, quality is always an issue with open communities. Having learned a few things about crowdsourcing activities at the Guardian now, there are some simple triggers in place that should make it easier to surface quality should the platform scale to a larger audience.

For example, rather than comments, n0tice accepts ‘Evidence’.

You can add a link to a story, post a photo, or embed a video or even a Storify feed to improve the post.

Also, the ratings aren’t merely positive/negative.  They ask if something matters, if people will care, and if it’s accurate. That type of engagement may be expecting too much of the community, but I’m hopeful it will work.

Of course, all this additional interactivity is only available on the desktop version, as the mobile version is intended to serve just two very specific use cases:

  1. getting a snapshot of what’s happening near you now
  2. posting something you’ve seen quickly and easily

How will n0tice make money?

Since the service is a community notice board, it makes sense to use an advertising model that people already understand in that context: classifieds.

Anyone can list something they’re trying to sell on n0tice for free.  Then they can buy featured promotional positions based on how large the area is in which they want their item to appear and for how long they want it to be seen there.

(Lesson: Integrating PayPal for payments took no time at all. Their APIs and documentation feel a little dated in some ways, but just as Facebook is fantastic as a quickstart tool for identity, PayPal is a brilliant quickstart for payments.)

Promotion on n0tice costs $1 per 1 mile radius per day. That’s in US dollars.
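So the cost is simply the radius in miles multiplied by the number of days; a trivial sketch of the stated rule:

```javascript
// The pricing rule as stated above: $1 (US) per mile of radius per day.
function promotionCostUsd(radiusMiles, days) {
  return radiusMiles * days; // $1 × miles × days
}

console.log(promotionCostUsd(1, 1)); // 1  -> $1 for a 1-mile radius for a day
console.log(promotionCostUsd(3, 7)); // 21 -> $21 for a 3-mile radius for a week
```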

While we’re still getting the word out and growing the community, $1 will buy you a featured spot that lasts until more people come along and start buying up availability.

But there’s a lot we can do with this framework.

For example, I think it would make sense that a ‘Publisher’ role could be defined much like the ‘Editor’ for a region.

Perhaps a ‘Publisher’ could earn a percentage of every sale in a region.  The ‘Publisher’ could either earn that privilege or license it from us.

I’m also hopeful that we can make some standard affiliate services possible for people who want to use the ad platform in other apps and web sites across the Internet.  That will only really work if the platform is open.

How will it work for developers and partners?

The platform is open in every way.

There are both read and write APIs for it.  The mobile and desktop versions are both using those APIs, in fact.

The read API can be used without a key at the moment, and the write API is not very complicated to use.

So, for example, here are the 10 most recent news reports with the ‘crime’ tag in machine-readable form:

http://n0tice.com/api/readapi-reports.php?output=xml&tags=crime&count=10
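A quick way to try it is a browser-side fetch like the one below; the `<report>` element name is a guess, so inspect the actual XML and adjust the selector, and depending on the API’s cross-origin settings you may need to make the request from a server instead.

```javascript
// Browser-side sketch of calling the read API above.
const url = 'http://n0tice.com/api/readapi-reports.php?output=xml&tags=crime&count=10';

fetch(url)
  .then(res => res.text())
  .then(body => {
    const doc = new DOMParser().parseFromString(body, 'application/xml');
    // 'report' is an assumed element name; check the real response first.
    doc.querySelectorAll('report').forEach(report => {
      console.log(report.textContent.trim().slice(0, 80));
    });
  })
  .catch(err => console.error('read API request failed:', err));
```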

The client code for the mobile version is posted on Github with an open license (we haven’t committed to which license, yet), though it is a few versions behind what is running on the live site.  That will change at some point.

And the content published on n0tice is all Creative Commons Attribution-Share Alike so people can use it elsewhere commercially.

The idea in this approach to openness is that the value is in the network itself, the connections between things, the reputation people develop, the impact they have in their communities.

The data and the software are enablers that create and sustain the value.  So the more widely used the data and software become the more valuable the network is for all the participants.

How scalable is the platform?

The user experience can scale globally given it is based on knowing latitude and longitude, something treated equally everywhere in the world.  There are limitations with the lat/long model, but we have a lot of headroom before hitting those problems.

The architecture is pretty simple at the moment, really.  There’s not much to speak of in terms of directed graphs and that kind of thing, yet.  So the software, regardless of how badly written it is, which it most definitely is, could be rewritten rather quickly.  I suspect that’s inevitable, actually.

The software environment is a standard LAMP stack hosted on Dreamhost which should be good enough for now.  I’ve started hooking in things like Amazon’s CloudFront, but it’s not yet on EC2.  That seems like a must at some point, too.

The APIs should also help with performance if we make them more cacheable.
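For instance, a response that declares itself cacheable can be reused by browsers and any CDN sitting in front of the API; a minimal sketch in Node.js (not the n0tice codebase, just the general idea):

```javascript
// Minimal sketch of a cacheable API response: tell clients and any CDN
// in front of the server that the payload can be reused for 60 seconds.
const http = require('http');

http.createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'application/json',
    'Cache-Control': 'public, max-age=60' // safe to cache for a minute
  });
  res.end(JSON.stringify({ reports: [] })); // placeholder payload
}).listen(8080);
```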

The biggest performance/scalability problem I foresee will happen when the gaming mechanics start to matter more and the location and social graphs get bigger.  It will certainly creak when lots of people are spending time doing things to build their reputation and acquire badges and socialize with other users.

If we do it right, we will learn from projects like WordPress and turn the platform into something that many people care about and contribute to.  It would surely fail if we took the view that we can be the only source of creative ideas for this platform.

To be honest, though, I’m more worried about the dumb things like choking on curly quotes in users’ posts and accidentally losing users’ badges than I’m worried about scaling.

It also seems likely that the security model for n0tice is currently worse than the performance and scalability model. The platform is going to need some help from real professionals on that front, for sure.

What’s the philosophy driving it?

There’s most definitely an ideology fueling n0tice, but it would be an overstatement to say that the vision is leading what we’re doing at the moment.

In its current state, I’m just trying to see if we can create a new kind of mobile publishing environment that appeals to lots of people.

There’s enough meat to it already, though, that the features are very easy to line up against the mission of being an open community notice board.

Local UK community champion Will Perrin said it felt like a “floating cloud of data that follows you around without having to cleave to distribution or boundary.”

I really like that idea.

Taking a wider view, the larger strategic context that frames projects like this one and things like the Open Platform is about being Open and Connected.  Recently, I’ve written about Generative Media Platforms and spoken about Collaborative Media.  Those ideas are all informing the decisions behind n0tice.

What does the future look like for n0tice?

The Guardian Media Group exists to deliver financial security for Guardian News and Media.

My hope is that we can move n0tice from being a hack to becoming a new GMG business that supports the Guardian more broadly.

The support n0tice provides should come in two forms: 1) new approaches to open and collaborative journalism and 2) new revenue streams.

It’s also very useful to have living projects that demonstrate the most extreme examples of ‘Open and Connected’ models.  We need to be exploring things outside our core business that may point to the future in addition to moving our core efforts where we want to go.

We spend a lot of time thinking about openness and collaboration and the live web at the Guardian.  If n0tice does nothing more than illustrate what the future might look like then it will be very helpful indeed.

However, the more I work on this the more I think it’s less a demo of the future and more a product of the present.

Like most of the innovations in social media, the hard work isn’t the technology or even the business model.

The most challenging aspect of any social media or SoLoMo platform is making it matter to real people who are going to make it come alive.

If that’s also true for n0tice, then the hard part is just about to begin.

 


* The hack was originally called ‘News Signals’.  But after trying and failing to convince a few people that this was a good idea, including both technical people and potential users, such as my wife, I realized the name really mattered.

I’ve spent a lot of time thinking about generative media platforms, and the name needed to reflect that goal, something that spoke to the community’s behaviors through the network. It was supposed to be about people, not machines.

Now, of course, it’s hard to find a short domain name these days, but digits and dots and subdomains can make things more interesting and fun anyhow. Luckily, n0tice.com was available…that’s a zero for an ‘o’.

Behind the scenes of the Open Platform’s evolution

When I came to the Guardian two years ago, I brought with me some crazy California talk about open strategies and APIs and platforms. Little did I know the Guardian already understood openness. It’s part of its DNA. It just needed new ways of working in an open world.

Last week, The Guardian’s approach to openness and mutualisation took a giant step forward when we brought the Open Platform out of Beta.

It’s a whole new business model with a new technology infrastructure that is already accelerating our ambitions.

I’ll explain how we got to this point, but let me clarify what we just announced:

  • We’ve implemented a tiered access model that I think is a first in this space. We have a simple way to work with anyone who wants to work with us, from hobbyist to large-scale service provider and everything in between.
  • We’ve created a new type of ad network with 24/7 Real Media’s Open AdStream, one where the ads travel with the content that we make available for partners to reuse.
  • That ad network is going to benefit from another first which is Omniture analytics code that travels with the content, as well.
  • License terms that encourage people to add value are rare. Using many of the open license principles, we developed T&Cs that will fuel new business, not stop it.
  • Hosted in the cloud on Amazon EC2, the service scales massively. There are no limits to the number of customers we can serve.
  • The API uses the open source search platform Solr which makes it incredibly fast, robust, and easy for us to iterate quickly.
  • We introduced a new service for building apps on our network called MicroApps. Partners can create pages and fully functional applications on guardian.co.uk.

We’re using all the tools in the Open Platform for many of our own products, including the Guardian iPad app, several digital products and more and more news apps that require quick turn-around times and high performance levels.

Open Platform: Build applications with the Guardian

There’s lots of documentation on the Open Platform web site explaining all this and more, but I figured I could use this space to give a picture of what’s been happening behind the scenes to get to this point.

It’s worth noting that this is far from the full picture of all the amazing stuff that has been happening at the Guardian the past 12 months. These are the things that I’ve had the pleasure of being close to.

Beginning with Beta

First, we launched in Beta last year. We wanted to build some excitement around it via the people who would use it first. So, we unveiled it at a launch event in our building to some of the smartest and most influential London developers and tech press.

We were resolute in our strategy, but when you release something with unknown outcomes and a clear path to chaos, people get uneasy. So, we created hurdles just large enough to keep it from exploding, but gave those who used it a wide enough berth to take it to its extreme and demonstrate its value.

It worked. Developers got it right away and praised us for it. They immediately started building things using it (see the app gallery). All good signs.

Socializing the message

We ran a Guardian Hack Day and started hosting and sponsoring developer events, including BarCamp, Rewired State, FOWA, dConstruct, djugl, Music Hack Day, ScaleCamp, etc.

Next, we knew the message had to reach their bosses soon, and their bosses’ bosses. So, we aimed right for the top.

Industry events can be useful ways to build relationships, but Internet events have been really lacking in meaning. People who care about how the Internet is changing the world, and who are also actively making that change happen, were the types of people we needed to build a long-term dialog with.

So, we came up with a new kind of event: Activate Summit.

The quality of the speakers and attendees at Activate was incredible. Because of those people the event has now turned into something much more amazing than what we initially conceived.

Nick Bostrom’s darkly humorous analysis of the likelihood of human extinction as a result of technology haunts me frequently still, but the event also celebrates some brilliant ways technology is making life better. I think we successfully tapped into some kind of shared consciousness about why people invest their careers into the Internet movement…it’s about making a difference.

Developers, developers, developers!

Gordon Brown was wise in his decision to put Tim Berners-Lee and Nigel Shadbolt on the task of opening government data. But they knew enough to know that they didn’t know how to engage developers. Where did they turn for help? The Guardian!

We couldn’t have been more excited to help them get data.gov.uk out the door successfully. It was core to what we’re about. As Free Our Data champion Charles Arthur joked on the way to the launch presentation, “nice of them to throw a party for me.”

We gave them a platform to launch data.gov.uk in the form of developer outreach, advice, support, event logistics, a nice building, etc., but, again, the people involved made the whole effort much more impressive than any contribution we made to it.

Tom Taylor’s Postcode Paper, for example, was just brilliant on so many levels. The message for why open government data could not have been clearer.

Election data

Then when the UK election started to pick up some momentum, we opened up the Guardian’s deep politics database and gave it a free-to-use API. We knew we couldn’t possibly cover every angle of the election and hoped that others could use the Guardian’s resources to engage voters. We couldn’t have asked for a better example of that than Voter Power.

A range of revenue models

All along there were some interesting things happening more behind the scenes, too.

The commercial team was experimenting with some new types of deals. Our ad network business grew substantially, and we added a Food Ad Network and a Diversity Network to our already successful Green Ad network.

It was clear that there was also room for a new type of ad network, a broader content-targeted ad network. And better yet, if we could learn about what happens with content out across the web then we might have the beginnings of a very intelligent targeting engine, too.

24/7 Real Media’s Open Ad Stream and Omniture were ready to help us make this happen. So, we embedded ads and analytics code with article content in the Content API. We’ve launched with some house ads to test it out, but we’re very excited by the possibilities when the network grows.

The Guardian’s commercial teams, including Matt Gilbert, Steve Wing, Dan Hedley and Torsten de Reise, also worked out a range of different partnerships with several Beta customers including syndication, rev share on paid apps, and rev share on advertising. We’re scaling those models and working out some new ones, as well.

It became obvious to everyone that we were on to something with a ton of potential.


Rewriting the API for scale

Similarly, the technology team was busily rebuilding the Content API the moment we realized how big it needed to be.

In addition to supporting commercial partners, we wanted to use it for our own development. The new API had to scale massively, it had to be fast, it had to be reliable, it had to be easy to use, and it had to be cheap. We used the open source search platform Solr hosted on Amazon’s EC2. API service management was handled by Mashery.
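For a rough sense of what sits underneath an API like that, a Solr query looks something like the sketch below; the host, core name and field names here are invented placeholders, not the Guardian’s actual schema.

```javascript
// Rough illustration of a search against a Solr index of the kind the
// rebuilt Content API sits on top of. Host, core name ('content') and
// field names are invented placeholders.
const params = new URLSearchParams({
  q: 'body:"open platform"', // full-text query on a hypothetical 'body' field
  rows: '10',                // page size
  wt: 'json'                 // ask Solr to respond with JSON
});

fetch(`http://localhost:8983/solr/content/select?${params}`)
  .then(res => res.json())
  .then(data => {
    data.response.docs.forEach(doc => console.log(doc.id));
  })
  .catch(err => console.error('solr query failed:', err));
```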

The project has hit the desk of nearly every member of the development team at one point or another. Here are some of the key contributions. Mat Wall architected it. Graham Tackley made Mat’s ideas actually work. Graham and Stephen Wells led the development, while Francis Rhys-Jones and Daithi O’Crualaoich wrote most of the functions and features for it. Martyn Inglis and Grant Klopper handled the ad integration. The wonderful API Explorer was written by Francis, Thibault Sacreste and Ken Lim. Matthew O’Brien wrote the Politics API. The MicroApps framework included all these people plus basically the entire team.

Stephen Dunn and Graham Tackley provided more detail in a presentation to the open source community in Prague at Lucid Imagination’s Solr/Lucene EuroCon event.

The application platform we call MicroApps

Perhaps even more groundbreaking than all this is the MicroApp framework. A newspaper web site that can run 3rd party apps? Yes!

MicroApps makes the relationship between the Guardian and the Internet feel like a two-way, read-write, permeable membrane rather than a broadcast tower. It’s a very tangible technology answer to the openness vision.

You can learn more by reading 2 excellent blog posts about MicroApps. Dan Catt explains how he used MicroApps for Zeitgeist. Since most of the MicroApps that exist today are hosted on Google AppEngine, the Google Code team published Chris Thorpe’s insights about what we’re doing with MicroApps on their blog.

The MicroApps idea was born out of a requirement to release smaller chunks of more independent functionality without affecting the core platform…hence the name “MicroApps”. Like many technology breakthroughs, the thing it was intended to do becomes only a small part of the new world it opens up.

Bringing it all together

At the same time our lead software architect Mat Wall was formulating the MicroApp framework, the strategy for openness was forming our positioning and our approach to platforms:

…to weave the Guardian into the fabric of the Internet; to become ‘of’ the Web, not just ‘on’ the Web

The Content API is a great way to Open Out and make the Guardian meaningful in multiple environments. But we also knew that we had to find a way to Open In, or to allow relevant and interesting things going on across the Internet to be integrated sensibly within guardian.co.uk.

Similarly, the commercial team was looking to experiment with several media partners who are all thinking about engagement in new ways. What better way to engage 36M users than to offer fully functional apps directly on our domain?

The strategy, technology and business joined up perfectly. A tiered business model was born.

The model

Simon Willison was championing a lightweight keyless access level from the day we launched the Beta API. We tested keyless access with the Politics API, and we liked it a lot. So, that became the first access tier: Keyless.

We offered full content with embedded ads and analytics code in the next access level. We knew getting API keys was a pain. So, we approved keys automatically on signup. That defined the second tier: Approved.

Lastly, we combined unfettered access to all the content in our platform with the MicroApp framework for building apps on the Guardian network. We made this deep integration level available exclusively for people who will find ways to make money with us. That’s the 3rd tier: Bespoke. It’s essentially the same as working in the building with our dev team.
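A rough sketch of how those tiers might be distinguished server-side is below; the tier names come from the model above, while everything else is invented for illustration.

```javascript
// Hypothetical illustration of the three access tiers described above.
const BESPOKE_PARTNERS = new Set(['partner-123']); // made-up partner key

function accessTier(apiKey) {
  if (!apiKey) return 'keyless';                      // lightweight access, no key needed
  if (BESPOKE_PARTNERS.has(apiKey)) return 'bespoke'; // unfettered access plus MicroApps
  return 'approved';                                  // full content with embedded ads + analytics
}

console.log(accessTier());              // 'keyless'
console.log(accessTier('abc'));         // 'approved'
console.log(accessTier('partner-123')); // 'bespoke'
```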

We weren’t precisely clear on how we’d join these things up when we conceived the model. Not surprisingly, as we’ve seen over and over with this whole effort, our partners are the ones who are turning the ideas into reality. Mashery was already working on API access levels, and suddenly the last of our problems went away.

The tiers gave some tangible structure to our partner strategy. The model felt like it just started to create itself.

Now we have lots of big triangle diagrams (see below) and grids and magic quadrants and things that we can put into presentation slides that help us understand and communicate how the ecosystem works.

Officially opening for business

Given the important commercial positioning now, we decided that the launch event had to focus first and foremost on our media partners. We invited media agencies and clients into our offices. Adam Freeman and Mike Bracken opened the presentation. Matt Gilbert then delivered the announcement and gave David Fisher a chance to walk through a deep dive case study on the Enjoy England campaign.

There was one very interesting twist on the usual launch event idea which was a ‘Developer Challenge’. Several members of the development team spent the next 24 hours answering briefs given to us by the media partners at the event. It was run very much like a typical hack day, but the hacks were inspired by the ideas our partners are thinking about. Developer advocate Michael Brunton-Spall wrote up the results if you want to see what people built.

Here is the presentation we gave at the launch event:


(Had we chosen a day to launch other than the same day that Google threw a press release party I think you’d already know all this.)

Do the right thing

Of all the things that make this initiative as successful as it is, the thing that strikes me most is how engaged and supportive the executive team is. Alan Rusbridger, Carolyn McCall, Tim Brooks, Derek Gannon, Emily Bell, Mike and Adam, to name a few, are enthusiastic sponsors because this is the right thing to do.

They created a healthy environment for this project to exist and let everyone work out what it meant and how to do it together.

Alan articulated what we’re trying to do in the Cudlipp lecture earlier this year. Among other things, Alan’s framework is an understanding that our abilities as a major media brand and those of the people formerly known as the audience are stronger when unified than they are when applied separately.

Most importantly, we can afford to venture into open models like this one because we are owned by the Scott Trust, not an individual or shareholders. The organization wants us to support journalism and a free press.

“The Trust was created in 1936 to safeguard the journalistic freedom and liberal values of the Guardian. Its core purpose is to preserve the financial and editorial independence of the Guardian in perpetuity, while its subsidiary aims are to champion its principles and to promote freedom of the press in the UK and abroad.”

The Open Platform launch was a big day for me and my colleagues. It was a big day for the future of the Guardian. I hope people also see that it was a major milestone toward a brighter future for journalism itself.

Defining online media platforms

My thinking about what platforms and ecosystems look like in the online media world seems to evolve constantly. It has certainly become clearer, bigger and more nuanced over the last two years or so, but the language to describe a media platform still feels very unfinished.

It may be that the word ‘platform’ itself is throwing things off, conjuring conflicting ideas of what a platform is. If you ask 10 people to define a platform, you’ll get 10 different answers.

You can think of a platform in several different ways. There’s the functional role it serves. You can talk about the ecosystem around it. Some explain it in terms of the pieces and how they interact. It can also be an abstract concept or more of a strategic view of things.

Blogger (and conspiracy theorist) Kid Mercury has a paper of sorts defining platforms in terms of the strategies that they serve and their relationship to products, particularly in the Web 2.0 context:

“Products are “things” (goods, services, experiences, etc) that are sold; platforms are the intermediaries that deliver products. True to Yin/Yang form, the two are complementary yet opposing as well; platforms cannot exist without products, and products need platforms to be put into context and to be found.”

Wikipedia has several entries on platforms including “computing platforms”, “political platforms” and “platform shoes”:

“Platform often describes the set of hardware components that make up the computer itself, that the software is written to target (often just described as “written for an architecture”).”

Auto manufacturers initiated platform strategies in the 1960s and ’70s to improve several aspects of their processes:

“Vehicle platform-sharing combined with advanced and flexible-manufacturing technology enables automakers to sharply reduce product development and changeover times, while modular design and assembly allow building a greater variety of vehicles from one basic set of engineered components. Many vendors refer to this as product or vehicle architecture.”

Similarly, John Hagel and John Seely Brown noted how platform design in auto manufacturing in Asia has enabled amazing efficiencies in their 2006 paper “Connecting Globalization & Innovation: Some Contrarian Perspectives”:

“Honda’s share of Vietnam’s motorcycle market, for instance, dropped from nearly 90 percent in 1997 to 30 percent in 2002. Japanese companies complain about the “stealing” of their designs, but the Chinese have redefined product architectures in ways that go well beyond copying, by encouraging significant local innovation at the component and subsystem level.”

David S. Evans, Andrei Hagiu and Richard Schmalensee go into great detail about various approaches to platforms taken by companies like Microsoft and Apple in their book “How Software Platforms Drive Innovation and Transform Industries.” They explained several models for how platform businesses work:

“In a multisided strategy, the software platform mainly facilitates interactions between the sides of the platform (particularly applications vendors and end users). In a single-sided (or merchant) strategy, the platform either produces the complementary products itself or buys them and resells them to end users.”

The book jumps from strategy down to specific detail such as this bit about how to develop relationships with customers:

“In the case of the software platforms we have examined, even the weakest relationships are far deeper than the arm’s-length relationships one sees in many one-sided industries. Software platforms can’t have direct relationships with the thousands of small developers, hardware makers, and peripheral device makers. Yet they document and make APIs available to developers, provide interface information to hardware and peripheral makers, and make sure their platforms have the relevant drivers for the peripherals. And they develop relationships through large developer conferences and small focus groups that bring some of these smaller players together. At the other extreme, software platforms often have deep relationships with several larger partners. These relationships involve regular exchange of information and joint work on defining new standards and specifications. They may also involve joint investments in product development or marketing.”

Thomas Eisenmann of Harvard Business School has done some interesting work addressing the network effects of the modern media platform in his paper “Platform-Mediated Networks: Definitions and Core Concepts”. He uses Visa’s and Microsoft’s Xbox platforms to compare and contrast different methods for creating network effects. He talks about how a jointly sponsored platform like Visa’s leverages more complicated but more scalable relationships, versus a single-proprietor platform like Microsoft’s, which has lots of dependencies.

[Diagram: Visa vs Xbox platform ecosystem]

He builds on this view in his follow-up work with Geoffrey Parker and Marshall Van Alstyne for MIT. In “Network Platforms – Core Concepts” he frames network effects as a necessary component of a platform:

“A “Network platform” is defined by the subset of components used in common across a suite of products (Boudreau, 2006) that also exhibit network effects. Value is exchanged among a triangular set of relationships including users, component suppliers (co-developers), and platform firms.”

They spell out some of the trickier issues that platform organizations need to consider, such as channel conflict:

“Platform providers must determine how much of the value created through network interactions they should seek to capture and from which users […] A bigger network served by a single platform can create more value in aggregate, but users may worry that a dominant platform provider will extract too much value. Likewise, when the participation of a few large users is crucial for mobilizing a network (e.g., movie studios vis-a-vis new DVD formats), conflict over the division of value between platform providers and “marquee” users is common […] Controlling most of a multi-billion dollar business is better than controlling all of a million dollar business.”

Just having a platform and a platform strategy is not enough. There are organizational requirements that make it possible to drive a platform business. Kid Mercury adds some insight into the ways in which organizations must institutionalize capability creation to serve platforms rather than functional hierarchy to serve products:

“There can be no long value chains where each employee is a rung in a ladder, with all the value ultimately flowing to the top. Such hierarchical organizations are essentially immobile by design; they are not capable of creating new capabilities because everyone in the vertical hierarchy is participating in a way that only serves the existing value chain. This is great for incremental innovations, as such a structure essentially institutionalizes the process of adding more value to existing value chains. It is not so effective, though, for creating new value chains.”

I’m looking for more views on platform strategies and ecosystems, so please comment below if you have any favorites. I haven’t found as much dialog and literature as I’d like.

I suppose all this goes hand-in-hand with some of the discussion around designing for growth. Maybe I’m alone here, but I’m more interested than ever before in this old magazine idea.

Local community data reporting

EveryBlock has taken a very data-intensive look at local news reporting. As founder Adrian Holovaty explains:

“An overall goal of EveryBlock is to point you to news near your block. We’ve been working hard to do a good job of this so far by accumulating public records, cataloging newspaper stories and pulling together various other geographic information from the Web.”

This generally takes the form of raw data points placed on maps. They recently rolled out a variation on the theme by using topic-specific data, which adds more context to the local news reporting idea.

“A week or so ago, 15 people were arrested on bribery charges as part of a federal probe into corruption in Chicago city government. We’ve analyzed U.S. Attorney Patrick J. Fitzgerald’s complaint documents and cataloged the specific addresses mentioned within. On the project’s front page, you can view every location we found, along with a relevant excerpt from the complaint. You can sort this data in various ways, including a list and map of all the alleged bribe locations.”

This is the type of value that’s otherwise missing from the experience. Rather than providing a mostly pure research tool, the site now offers some insight and perspective with an editorial view on the data. In this case, the data tells a story that might otherwise seem a little distant until you see how the issue may in fact be a very real one right in your backyard, so to speak.
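As a rough illustration of the kind of pipeline behind a project like that, the sketch below pulls address-like strings out of document text, hands them to a stubbed geocoder and writes rows a map page could render as placemarks. The regular expression, file names and function names are hypothetical stand-ins of mine, not EveryBlock’s actual code.

```python
import csv
import re

# Hypothetical pattern for street addresses mentioned in a complaint document.
ADDRESS_RE = re.compile(
    r"\d{1,5}\s+(?:[NSEW]\.?\s+)?[\w\s]+?(?:Street|St|Avenue|Ave|Boulevard|Blvd|Road|Rd)\b"
)

def extract_addresses(text):
    """Return the address-like strings found in a block of text."""
    return [match.group(0).strip() for match in ADDRESS_RE.finditer(text)]

def geocode(address):
    """Stub geocoder; a real version would call out to a geocoding service."""
    return None  # keeps the sketch self-contained, with no network call

def build_placemarks(documents):
    """Turn raw document text into rows a map page could render as placemarks."""
    rows = []
    for doc_id, text in documents.items():
        for address in extract_addresses(text):
            rows.append({"document": doc_id, "address": address, "latlng": geocode(address)})
    return rows

if __name__ == "__main__":
    docs = {"complaint-1": "...the payment was delivered to 123 N Example Street in exchange for..."}
    with open("placemarks.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["document", "address", "latlng"])
        writer.writeheader()
        writer.writerows(build_placemarks(docs))
```

The interesting editorial work sits on top of that output: sorting, excerpting and framing the points so they tell a story rather than just decorating a map.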

But it occurred to me that the community is probably even better able to capture and share this level of useful insight. It would be really neat to see EveryBlock open up the reporting and mapping process so that anyone with an interest in exposing the trends in their neighborhood or elsewhere has a platform to do so.

[Chart: Average payment (€) by Area]
Similar to the way Swivel allows you to collect data in spreadsheet form, visualize it, and then share it much as Flickr and YouTube let you share media, EveryBlock could provide an environment for individuals to do the reporting in their neighborhood that matters to them. The wider community could then benefit from the work of a few, and suddenly you have a really powerful local news vehicle.

This isn’t necessarily in contrast to the approach Outside.in has taken by aggregating shared information from around the web, but it certainly puts some structure around it in a way that may be necessary.

Managing a community is a very different problem than aggregating and presenting useful local data. But I wonder if it’s a necessary next step to get both of these fledgling but very forward-thinking local media services closer to critical mass.

Interactive journalism: An amazing homicide mashup

I had the pleasure of interviewing Sean Connelly and Katy Newton for YDN Theater recently with YDN videographer Ricky Montalvo. They created the amazing (and award-winning) crime data mashup Not Just A Number in partnership with The Oakland Tribune.

After getting tired of watching the homicide count for 2006 climb higher and higher, they decided to humanize the issue and talk to the families of the victims directly. They wanted to expose the story beneath the number and give a platform upon which the community could make the issue real.

Statistics can tell effective stories, but death and loss reach emotional depths beyond the power of any numerical exploration.

Sean and Katy posted recordings of the families talking about the sons, daughters, sisters and brothers they lost. They integrated family photos, message boards, articles and more along with the interactive homicide map on the site to round out the experience, making it much more human than the traditional crime data mashup.

Here is the video (7 min.):

I also asked them if they had trouble getting data to make the site, and they said the Oakland Tribune staff were very supportive. There weren’t any usable open data sets coming out of the city, so they had to collect and enter everything themselves.

This, of course, is a very manual process. Given the challenge of getting the data, Sean and Katy didn’t see how the idea could possibly scale beyond the city of Oakland.
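One speculative way to lower that barrier would be a dead-simple shared record format that volunteers in any city could fill in and publish. The schema below is my own sketch of what that might look like, not anything Sean and Katy actually built.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class HomicideRecord:
    # A speculative minimal schema; the field names are my suggestion,
    # not the data model behind Not Just A Number.
    victim_name: str
    date: str            # ISO 8601, e.g. "2006-07-14"
    city: str
    latitude: float
    longitude: float
    story_url: str = ""  # link to interviews, photos or local coverage

def is_publishable(record: HomicideRecord) -> bool:
    """A record needs enough detail for both the map and the human story."""
    return bool(record.victim_name and record.date and record.city)

records = [
    HomicideRecord("(name withheld)", "2006-07-14", "Oakland", 37.8044, -122.2712),
]

# Plain JSON keeps the barrier low for contributors in other cities.
print(json.dumps([asdict(r) for r in records if is_publishable(r)], indent=2))
```

Something that simple could be collected in a spreadsheet and merged centrally, which is really the point: the hard part isn’t the format, it’s finding people in each city willing to do the collecting.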

Somebody needs to take that on as a challenge.

I’m hopeful that efforts like Not Just A Number and the Open Government Data organization will be able to surface why it’s important for our government to open up access to the many data repositories they hold. And if the government won’t do it, then it should be the job of journalists and media companies to surface government data so that people can use it in meaningful ways.

This is a great example of how the Internet can empower people who otherwise have no voice or audience despite having profound stories to tell.