Mobilising the web of feeds

I wrote this piece for the Guardian’s Media Network on the role that RSS could play now that the social platforms are becoming more difficult to work with. GeoRSS, in particular, has a lot of potential given the mobile device explosion. I’m not necessarily suggesting that RSS is the answer, but it is something a lot of people already understand, and it could help unify the discussion around sharing geotagged information feeds.


Powered by Guardian.co.uk. This article titled “Mobilising the web of feeds” was written by Matt McAlister for theguardian.com on Monday 10th September 2012 16.43 UTC

While the news that Twitter will no longer support RSS was not really surprising, it was a bit annoying. It served as yet another reminder that the Twitter-as-open-message-utility idea that many early adopters of the service loved was in fact going away.

There are already several projects intending to disrupt Twitter, mostly focused on the idea of a distributed, federated messaging standard and/or platform. But we already have such a service: an open standard adopted by millions of sources; a federated network of all kinds of interesting, useful and entertaining data feeds published in real-time. It’s called RSS.

There was a time when nearly every website was RSS-enabled, and a cacophony of Silicon Valley startups fought to own pieces of this new landscape, hoping to find dotcom gold. But RSS didn’t lead to gold, and most people stopped doing anything with it.

Nobody found an effective advertising or service model (except, ironically, Dick Costolo, CEO of Twitter, who sold Feedburner to Google). The end-user market for RSS reading never took off. Media organisations didn’t fully buy into it, and the standard took a backseat to more robust technologies.

Twitter is still very open in many ways and encourages technology partners to use the Twitter API. That model gives the company much more control over who is able to use tweets outside of the Twitter-owned apps, and it’s a more obvious commercial strategy that many have been asking Twitter to adopt for a long time now.

But I think we’ve all made a mistake in the media world by turning our backs on RSS. It’s understandable why it happened. But hopefully those who rejected RSS in the past will see the signals demonstrating that an open feed network is a sensible thing to embrace today.

Let’s zoom out for context first. Looking at the macro trends in the internet’s evolution, we can see one or two clear winners as more information and more people appeared on the network in waves over the last 15 years.

Following the initial explosion of new domains, Yahoo! solved the need to surface only the websites that mattered through browsing. When the Yahoo! directory became saturated, Google surfaced the pages that mattered within those websites through search. When Google became saturated, Facebook and Twitter surfaced the things that mattered on those pages by connecting people.

Now that the social filter is saturated, what will be used next to surface things that matter out of all the noise? The answer is location. It is well understood technically. The software-hardware-service stack is done. The user experience is great. We’re already there, right?

No – most media organisations still haven’t caught up. There’s a ton of information not yet optimised for this new view of the world and much more yet to be created. This is just the beginning.

Do we want a single platform to be created that catalyses the location filter of the internet and mediates who sees what and when? Or do we want to secure forever a neutral environment where all can participate openly and equally?

If the first option happens, as historically has been the case, then I hope that position is taken by a force that exists because of, and remains reliant on, the second option.

What can a media company do to help make that happen? The answer is to mobilise your feeds. As a publisher, being part of the wider network used to mean having a website on a domain that Yahoo! could categorise. Then it meant having webpages on that website optimised for search terms people were using to find things via Google. And more recently it has meant providing sharing hooks that can spread things from those pages on that site from person to person.

Being part of the wider network today suddenly means all of those things above, and, additionally, being location-enabled for location-aware services.

It doesn’t just mean offering a location-specific version of your brand, though that is certainly an important thing to do as well. The major dotcoms use this strategy increasingly across their portfolios, and I’m surprised more publishers don’t do this.

More importantly, though, and this is where it matters in the long run, it means offering location-enabled feeds that everyone can use in order to be relevant in all mobile clients, applications and utilities.

Entrepreneurs are all over this space already. Pure-play location-based apps can be interesting, but many feel very shallow without useful information. The iTunes store is full of travel apps, reference apps, news, sports, utilities and so on that are location-aware, but they are missing some of the depth that you can get on blogs and larger publishers’ sites. They need your feeds.

Some folks have been experimenting in some very interesting ways that demonstrate what is possible with location-enabled feeds. Several services, such as Flipboard, Pulse and now Prismatic, offer popular mobile reading apps that pull in RSS feeds, and they are well placed to turn those into location-based news services.

Perhaps a more instructive example of the potential is hypARlocal, the augmented reality app from Talk About Local. It pulls location-aware content out of geoRSS feeds published by hyperlocal bloggers around the UK and by the citizen journalism platform n0tice.com.
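It takes very little for a client to consume a feed like that. As a rough sketch (the feed item and its coordinates below are invented for illustration, not from a real hyperlocal blog), here is how a reader might pull the GeoRSS “simple” `georss:point` element out of an RSS item using only the Python standard library:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 document carrying a GeoRSS "simple" point.
# The story and coordinates are invented for illustration.
FEED = """<?xml version="1.0"?>
<rss version="2.0" xmlns:georss="http://www.georss.org/georss">
  <channel>
    <title>Hyperlocal example</title>
    <item>
      <title>Road closure on the high street</title>
      <link>http://example.com/road-closure</link>
      <georss:point>51.5074 -0.1278</georss:point>
    </item>
  </channel>
</rss>"""

GEORSS = "{http://www.georss.org/georss}"  # namespace-qualified tag prefix

def geotagged_items(feed_xml):
    """Yield (title, lat, lon) for every item that carries a georss:point."""
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        point = item.find(GEORSS + "point")
        if point is not None and point.text:
            lat, lon = (float(v) for v in point.text.split())
            yield item.findtext("title"), lat, lon

for title, lat, lon in geotagged_items(FEED):
    print(title, lat, lon)
```

That’s the whole barrier to entry: one extra element per item, latitude then longitude, and any existing RSS pipeline becomes location-aware.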

But it’s not just the entrepreneurs that want your location-enabled feeds. Google Now for Android notifies you of local weather and sports scores along with bus times and other local data, and Google Glass will be dependent on quality location-specific data as well.

Of course, the innovations come with new revenue models that could get big for media organisations. They include direct sales, advertising and syndication models, to name a few, but have a look at some of the startups in the rather dense ‘location’ category on Crunchbase to find commercial innovations too.

Again, this isn’t a new space. Not only has the location stack been well formed, but there are also a number of bloggers who have been evangelising location feeds for years. They already use WordPress, which automatically pumps out RSS. And many of them also geotag their posts today using one of the many useful WordPress mapping plugins.

It would take very little to reinvigorate a movement around open location-based feeds. I wouldn’t be surprised to see Google prioritising geotagged posts in search results, for example. That would probably make Google’s search on mobile devices much more compelling, anyhow.

Many publishers and app developers, large and small, have complained that the social platforms are breaking their promises and closing down access, becoming enemies of the open internet and being difficult to work with. The federated messaging network is being killed off, they say. Maybe it’s just now being born.

Media organisations need to look again at RSS, open APIs, geotagging, open licensing and better ways of collaborating. You may have abandoned RSS in the past, but it would have you back in a heartbeat. And if RSS is insufficient, then any location-aware API standard could be the meeting place where we rebuild the open internet together.

It won’t solve all your problems, but it could certainly solve a few, including new revenue streams. And it’s conceivable that critical mass around open location-based feeds would make the internet a stronger force for us all, protected from nascent platforms whose future selves may not share the vision that got them off the ground in the first place.


guardian.co.uk © Guardian News & Media Limited 2010


Dispatchorama: a distributed approach to covering a distributed news event

We’ve had a sort of Hack Week at the Guardian, or “Discovery Week”. So, I took the opportunity to mess around with the n0tice API to test out some ideas about distributed reporting.

This is what it became (best if opened in a mobile web browser):

http://dispatchorama.com/



It’s a little web app that looks at your location and then helps you to quickly get to the scene of whatever nearby news events are happening right now.

The content primarily comes from n0tice at the moment, but I’ve added some tweets with location data. I’ve looked at some geoRSS feeds, but I haven’t tackled those yet. It should also include only things from the last 24 hours. Adding more feeds and tuning the timing will help it feel more ‘live’.
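The app itself is JavaScript, but the two filters it applies – near you, and from the last 24 hours – are simple to sketch. Here is a rough Python illustration (the `nearby_recent` helper, the report structure and the sample radius are my invention for this sketch, not the app’s actual code):

```python
import math
from datetime import datetime, timedelta, timezone

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_recent(reports, here_lat, here_lon,
                  radius_km=10, max_age=timedelta(hours=24), now=None):
    """Keep reports posted within max_age and radius_km of here, nearest first."""
    now = now or datetime.now(timezone.utc)
    kept = [
        (haversine_km(here_lat, here_lon, r["lat"], r["lon"]), r)
        for r in reports
        if now - r["time"] <= max_age
    ]
    return [r for d, r in sorted(kept, key=lambda x: x[0]) if d <= radius_km]
```

Everything else – geolocating the reader, pulling the reports in – is plumbing around that core filter-and-sort.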

The concept here is another way of thinking about the binding effect between the digital and physical worlds. Being able to understand the signals coming out of networked media is increasingly important. By using the context that travels with bits of information to inform your physical reality, you can respond more quickly, understand what’s going on more deeply and participate more proactively.

I’m applying that idea to distributed news events here, things that might be happening in many places at once or a news event that is moving around.

In many ways, this little experiment is a response to the amazing effort of the Guardian’s Paul Lewis and several other brave reporters covering last year’s UK riots.

There were two surprises in doing this:

  1. Location-based tweets are really all over the place and not very helpful. You have to narrow your source list to known Twitter accounts to get anything good, but that kind of defeats the purpose.
  2. I haven’t done a ton of research yet, but there seems to be a real lack of useful geoRSS feeds out there. What happened? Did the failure of RSS readers kill the geoRSS movement? What a shame. That needs to change.
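The workaround for the first point amounts to an allowlist filter: drop anything that isn’t geotagged or doesn’t come from a source you trust. A sketch, with account names and the tweet structure invented for illustration:

```python
# Hypothetical allowlist of trusted accounts (invented names, not real handles).
TRUSTED = {"local_reporter", "council_alerts"}

def trusted_geotagged(tweets, trusted=TRUSTED):
    """Keep only tweets that carry coordinates and come from a trusted account."""
    return [
        t for t in tweets
        if t.get("coordinates") is not None and t["user"] in trusted
    ]
```

It works, but it turns an open, serendipitous stream back into a hand-curated one – which is exactly the purpose-defeating trade-off described above.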

The app uses the n0tice API, jQuery Mobile, Google’s location APIs and a few snippets picked off Stack Overflow. It’s on GitHub here:
https://github.com/mattmcalister/dispatchorama/

Local news is going the wrong way

Google’s new Local News offering misses the point entirely.

As Chris Tolles points out, Topix.net and others have been doing exactly this for years. Aggregating information at the hyperlocal level isn’t just about geotagging information sources. Chris explains why they added forums:

“…there wasn’t enough coverage by the mainstream or the blogosphere…the real opportunity was to become a place for people to publish commentary and stories.”

He shouldn’t worry about Google, though. He should worry more about startups like Outside.in, which upped the ante by adding a slightly more social and definitely more organic experience to the idea of aggregating local information.

Yet information aggregation still only dances around the real issue.

People want to know what and who are around them right now.

The first service that really nails how we identify and surface the things that matter to us when and where we want to know about them is going to break ground in a way we’ve never seen before on the Internet.

We’re getting closer and closer to being able to connect the 4 W’s: Who, What, Where and When. But those things aren’t yet connecting to expose value to people.

I think a lot of people are still too focused on how to aggregate and present data to people. They expect people to do the work of knowing what they’re looking for, diving into a web page to find it and then consuming what they’ve worked to find.

There’s a better way. When services start mixing and syndicating useful data from the 4 W vectors then we’ll start seeing information come to people instead.

And there’s no doubt that big money will flow with it.

Dave Winer intuitively noted, “Advertising will get more and more targeted until it disappears, because perfectly targeted advertising is just information. And that’s good!”

I like that vision, but there’s more to it.

When someone connects the way information surfaces for people and the transactions that become possible as a result, a big new world is going to emerge.

Building community is hard

Jay Rosen has an interesting post on the failure of AssignmentZero, an effort to build a publicly funded crowdsourced news organization.

Among the many lessons, he keeps coming back to motivation and incentive.

“A well managed project correctly estimates what motivates people to join in, what the various rewards are for participants, and where the practical limits of their involvement lie.

…amateur production will never replace the system of paid correspondents. It only springs to life when people are motivated enough to self-assign and follow through.”

The idea wasn’t fundamentally broken, in my mind. Crowdsourced news is very powerful. As Derek Powazek said,

“At its best, crowdsourcing is about expanding the walls of the newsroom to the internet, giving an opportunity to people with real experience to share their expertise. This is a point that’s often lost on people who are just looking to make a quick buck on Web 2.0.”

More than anything else, I suspect that AssignmentZero failed because there weren’t any readers. Motivation wouldn’t have been a problem with a NYTimes-sized audience.

To date, I’ve never seen a better explanation of the motivations in collaborative online experiences than Yochai Benkler’s paper called Coase’s Penguin. One of my favorite excerpts from that is where he warns against paying for contributions from the community:

“An act of love drastically changes meaning when one person offers the other money at its end, and a dinner party guest who will take out a checkbook at the end of dinner instead of bringing flowers or a bottle of wine at the beginning will likely never be invited again.”

There are as many motivations as there are contributors in a shared media project. What holds them together is more art than science. Some of that art includes good timing and luck. But it also requires a unique kind of commitment and salesmanship from the leaders of the project.

I’ve begun to wonder if the tipping point happens when the confluence of the community size, the ROI to the contributors and the depth of the trust relationship with the company or the brand creates more value than the sum of the parts. Maybe the science of collaboration services can be found by quantifying the meaning of the relationships between those elements: size, cost, benefit and trust.

Or it could also be that the secret sauce inside the Craig Newmarks, Stewart Butterfields and Jimmy Waleses of the world is much more complicated and nuanced than anyone realizes.

A human-powered relevance engine for Internet startup news

Here’s a fun experiment in crowdsourcing. I’ve been getting overwhelmed by all the startup news coming out of the many sources tracking the interesting ideas and new companies hunting for Internet gold. Many of these companies are really smart. Many are just, well, gold diggers.


And with so many ways to track new and interesting companies, I’ve lost the ability to identify the difference between companies that are actually attacking a problem that matters and companies that are combining buzzwords in hopes of getting funding or getting acquired or both.

There must be a way to harness the collective insight of people who are close to these companies or the ideas they embody to shed light on what’s what. Maybe there’s a way to do that using Pligg.

While shaking my head in a moment of disappointment and a little bit of jealousy at all the new dotcom millionaires/billionaires, the word “flipbait” crossed my mind. I looked to see if the domain was available, and sure enough it was. So, I grabbed the domain, installed Pligg and there it is.

It should be obvious, but the idea is to let people post news of new Internet startups and let the community decide whether something is important or not. If I’m not the only one thinking about this, then I can imagine it becoming a really useful resource for gaining insight into the barrage of headlines filling up my feed reader each day.

And if it doesn’t work, I’ll share whatever insight I can glean into why the concept fails. There will hopefully at least be some lessons in this experiment for publishers looking to leverage crowdsourcing in their media mix.