Making smaller things have bigger meaning

The atomization of everything digital is a wonderful direction of travel, one that seems to create more and more opportunity the deeper we go.

It’s only going to accelerate, really.  More and more raw data is being published.  Short-form dialog is proliferating.  More apps on more devices are responding to ever-smaller information signals.

The problem is that this massive growth of small things also creates challenges: meaning gets lost in the detail.

By deconstructing and isolating everything we understand, from data in a news article to the exact position of a device on the planet, we can then assemble new views of the world and reinvent knowledge itself.  It’s heady stuff when you start seeing how time and space converge onto small points.

But globalizing small things also creates imbalances. It means that the weight of the world’s attention can crush unstable information.  It means chunky and complicated ideas have to compete with individual, often out-of-context data points in the same environments.  And small things can be elevated to have more meaning than they deserve.

Glenn Beck famously uses such tactics by saying things like ‘fear is up 6%.’

The atomization of everything often seems to happen at the expense of context. That isn’t good. Atomization and context should at least co-exist if not actually reinforce each other.

I was reminded of how important it is to develop context when Joris Luyendijk, a Dutch reporter and author, visited the Guardian the other day to talk about what he’s working on.

Joris has been applying some interesting approaches to reporting, collaborating very explicitly with experts to educate himself, and therefore his readers, on big themes.  He’s asking the question, “Is the electric car a good idea?”  The collaborative process he’s using is fueling a community of shared interest whose members include thought leaders, scientists, officials and challengers, alongside an increasingly engaged community of more peripheral readers.

He needed to step out of the news cycle in order to do the work properly.  Joris said that competing at the pace of news means that reporting must focus on the changes happening in the world, the abnormalities.  The variance becomes more important than the purpose of reporting something. The result is a news popularity contest.

We saw this with the US midterm elections: the witchcraft story squeezed out slower-paced topics such as repealing the healthcare law.

News should be more than an expression of variance from normality.  News is not a changelog.

Computers are complicit here. They are brilliant at finding variance in streams of data. The Google News algorithm is a great example of how effective machines can be at discovering and amplifying new information. But when a machine-driven system becomes successful at amplifying small things, new machines will be built to create small things purely in order to get amplified.
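To make that limitation concrete, here is a minimal sketch (in Python, with made-up window and threshold values) of the kind of variance detection such systems rely on. It surfaces whatever deviates from a rolling baseline, and it knows nothing about why a value moved, only that it moved:

```python
from collections import deque
import statistics

def variance_alerts(stream, window=50, threshold=3.0):
    """Flag values that deviate sharply from the recent baseline.

    A naive z-score detector: anything more than `threshold` standard
    deviations from the rolling mean gets surfaced as "news". Note what
    is missing: any notion of why the value moved.
    """
    recent = deque(maxlen=window)
    for value in stream:
        if len(recent) >= 2:
            mean = statistics.fmean(recent)
            stdev = statistics.pstdev(recent) or 1e-9  # avoid divide-by-zero
            if abs(value - mean) / stdev > threshold:
                yield value  # amplified purely on deviation, not meaning
        recent.append(value)

# A flat stream with one spike: only the spike counts as "newsworthy".
print(list(variance_alerts([10.0] * 60 + [90.0] + [10.0] * 10, window=20)))
# -> [90.0]
```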

For example, 70 Holdings is an SEO business that targets Google News through a network of blogs.  It simply produces content designed to attract attention. The company elicits “clicks and ad impressions on content simply because it ranks among the highest–and supposedly most trustworthy–results on Google News,” according to CNET. And this is not much different from what Demand Media is up to.

That kind of ecosystem fools itself into thinking that it informs people or that it understands intent, but all it really does is direct click traffic, casting a huge net in the hope of catching a few fish.

What it fails to understand is that the signals it uses to interpret intent, variances in data flow, carry no awareness of the context of the activity the machines observe.

Sir Tim Berners-Lee suggests journalists need to surface the stories in the data:

“Journalists need to be data-savvy. It used to be that you would get stories by chatting to people in bars, and it still might be that you’ll do it that way sometimes.

But now it’s also going to be about poring over data and equipping yourself with the tools to analyse it and picking out what’s interesting. And keeping it in perspective, helping people out by really seeing where it all fits together, and what’s going on in the country.”

Experts, inside sources, stories, commenters, readers and even data are all going to benefit from the existence of one another, and from the new knowledge each contributes, when they are connected via context, a theme, an idea.  And if the human inputs to an idea all benefit from each other’s existence, then the story will find itself at the center of a new kind of network effect.

Consequently, the business models around network effects can be incredibly powerful.

News then becomes connective tissue for people who share an interest in an idea.

Some view linked data as that connective tissue, with news as a transport vehicle for ideas to spread.  Zemanta and Storify both tackle the problem this way.  Zemanta uses linked data to find related context from around the web as you write a blog post.  Storify helps you connect the things you write with the things people are posting on Twitter and YouTube.
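As a rough illustration of the linked-data approach (this is not Zemanta’s actual pipeline, just a sketch of the general idea), here is a small Python example that pulls concepts related to a topic from DBpedia’s public SPARQL endpoint:

```python
import requests

DBPEDIA_SPARQL = "https://dbpedia.org/sparql"

def related_concepts(resource, limit=10):
    """Return English labels of concepts linked to a DBpedia resource,
    e.g. related_concepts("Electric_car")."""
    query = f"""
    SELECT DISTINCT ?label WHERE {{
      dbr:{resource} dbo:wikiPageWikiLink ?related .
      ?related rdfs:label ?label .
      FILTER (lang(?label) = "en")
    }} LIMIT {limit}
    """
    response = requests.get(
        DBPEDIA_SPARQL,
        params={"query": query, "format": "application/sparql-results+json"},
        timeout=10,
    )
    response.raise_for_status()
    bindings = response.json()["results"]["bindings"]
    return [b["label"]["value"] for b in bindings]

# Topics a writer could weave in as context around the electric car question.
print(related_concepts("Electric_car"))
```

The value isn’t any single result but the graph of connections: each linked concept is a doorway to more context around the idea being written about.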

The fact that the Internet makes it possible to connect to people around the world so easily should mean that it’s easier to engage with things that matter to us, but it often feels like the opposite is happening.

The noise makes us numb.

We need to amplify meaning when it matters. We need to value small things on some sort of understandable scale.

Without these dynamics, we will lose the forest for the trees and find the flood of media in the world overwhelming and increasingly useless to us. That’s already happening to many.

While social filters are helping with this problem, they are also atomizing relationships and creating even more noise.

In a recent blog post, “The False Question Of Attention Economics”, Stowe Boyd wrote about the need to innovate around our relationship to information rather than give up and drown in it:

“I suggest we just haven’t experimented enough with ways to render information in more usable ways, and once we start to do so, it will likely take 10 years (the 10,000 hour rule again) before anyone demonstrates real mastery of the techniques involved.

Instead, I suggest we continue experimenting, cooking up new ways to represent and experience the flow of information, our friends’ thoughts, recommendations, and whims, and the mess that is boiling in the huge cauldron we call the web.”

Our world would be much worse off if the flow of information slowed down or reversed.  There’s so much to be gained still.

I think the solution, rather, is to fuel meaning and understanding by directing atomization toward a purpose, giving it context, and framing it in a space that makes it matter to people.
