Two cultures of information in big organisations

In a prescient, wide-ranging essay from 1997 [1], Phil Agre reminds us that computers were originally put to industrial use in order to “automate a kind of factory work whose raw material happens to be information rather than anything physical or tangible.” Information was the ‘stuff’ that made this new part of the industrial process work. In the early days of ‘information processing’, information was “treated as a tangible, measurable substance, but only in a minimal sense does the information seem to be about anything.”

This might seem like an alien view to most people now, at a time when computing devices of all kinds – iPads, iPhones, laptops – are bursting with meaning (‘content’, as those who feed it to us call it), in the form of news, gossip, images, and videos. But within large organisations that live or die by their ability to process information, this way of thinking still exists, and the fact that it still exists is (I will claim) at the heart of those organisations’ struggles to embrace the ‘Big Data’ revolution.

One of my first jobs was at Orange, the UK mobile network. My remit was to develop online services for the new connected generation of customers. Near the top of the wish list of services was online billing: being able to see your bill, compare it with past bills, perform call analysis, and so on. In my first meeting with the billing team, who built the system that very efficiently produced and sent out accurate paper bills to millions of customers each month, I found out that the digital version of each customer’s bill stored in our databases was, in fact, an image file: a picture of the bill that had been sent to the printer. No structured customer billing data was retained in the bill image at all. It was almost completely useless for my project.

The system that treats a bill as an image is a perfect example of a product of a computing culture that treats information as mere stuff. There was an information-processing goal to be achieved, that of getting bills out to customers on time, and a set of raw materials to be assembled quickly and accurately, and the system was built to achieve that goal. It’s unlikely that many people in the team that built the billing system worried about the meaning of the information that was being moved around and re-assembled. That was for someone else to think about. (Today, with image processing technology being what it is, we would probably have thought about recovering meaning from the images, but that would have been too costly at the time, even if it had been feasible.)
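To make the ‘recovering meaning’ idea concrete: the work would amount to running each bill image through an OCR engine and then parsing the resulting text back into structured records. Here is a minimal sketch in Python of the parsing half, assuming the OCR output is already in hand; the line format and field names are hypothetical, not Orange’s actual bill layout:

```python
import re
from dataclasses import dataclass

@dataclass
class CallRecord:
    number: str
    duration_secs: int
    cost_pence: int

# Hypothetical OCR output from a scanned bill image.
ocr_text = """
01234 567890   02:15   £0.45
07700 900123   00:40   £0.12
"""

# One itemised call per line: number, duration (mm:ss), cost (£p.pp).
LINE = re.compile(
    r"(?P<number>\d{5}\s\d{6})\s+"
    r"(?P<mins>\d{2}):(?P<secs>\d{2})\s+"
    r"£(?P<pounds>\d+)\.(?P<pence>\d{2})"
)

def parse_bill(text: str) -> list[CallRecord]:
    """Turn OCR'd bill text back into structured call records."""
    return [
        CallRecord(
            number=m.group("number"),
            duration_secs=int(m.group("mins")) * 60 + int(m.group("secs")),
            cost_pence=int(m.group("pounds")) * 100 + int(m.group("pence")),
        )
        for m in LINE.finditer(text)
    ]

records = parse_bill(ocr_text)
```

The irony, of course, is that all of this effort would merely reconstruct data the billing system had in structured form before it rendered the image.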

This culture stands in contrast to the culture of Big Data that we’ve all been hearing so much about lately, which treats information as an interesting object of study in its own right. The same information that the company uses as a raw material is something to be sequestered and interrogated for meaning by new systems and processes: insights into customers or the organisation itself. Consultancy firms have been quick to jump on the Big Data bandwagon, and herald a new world of understanding that’s within the grasp of any company that runs on information. But, of course, the underlying impulse to study data in its own right has been around for a long time: analysts have always been interested in getting hold of the data used to power the organisation in order to study it; what’s different now is the sheer volume of information that is becoming available for study, the competitive gains or losses at stake, and the investment in technology that is needed.

The challenge that many organisations now face, without even knowing that they face it, is that of reconciling these two very different cultures. All large organisations will have whole teams who embody the information-as-raw-material culture. Most will have teams that embody the information-as-object-of-study culture. Where the two cultures co-exist, mutual distrust and suspicion are almost certainly getting in the way of productive collaboration.

Those who treat information as stuff are suspicious of those who want to develop ways to analyse it. The information isn’t produced by ‘their’ systems with analysis in mind. Thinking about how it might be prepared for analysis means thinking about its meaning, rather than its use within their system, which is an alien perspective for them to take. Preparing information for analysis will involve all sorts of changes to their business processes, which will, they worry, put those processes at risk. Even if those obstacles are surmounted, there is a risk that the information will be misinterpreted, with consequences that come back to bite. The whole thing is easily seen as a lot of trouble.

Conversely, those who treat information as an object of study find it exasperating that their colleagues can’t just open up their data to them, or hand it over, so they can start finding things out. The data is being used every day: isn’t it just a matter of siphoning it off, copying it into the cloud, putting it on a server? It doesn’t matter if it’s just a one-off – anything would be a start. Making the process sustainable is something to worry about later.

I’m caricaturing here, but the underlying sentiments should be familiar to people on both sides of this cultural divide. These differences in outlook are paralleled by differences in tools and skills and ways of doing things that only serve to underline the sense of difference.

A key question – though by no means the most ardently contested – on which divisions between the two cultures come to the fore is how long information should be retained. For the information-as-stuff tribe, the instinct is usually to keep it as long as operationally and legally required and no longer. The other tribe’s instinct is to argue for information to be kept for as long as legally permissible: who knows what value it might hold in future?

As with any attempt at cultural rapprochement, the best starting point is to acknowledge differences and for each party to come to an understanding of the opposing worldview. This is going to be harder for older companies with seasoned IT teams as they come to terms with the need to compete using insight. But studying organisational data plays such a critical role now in driving competitive advantage that if they aren’t able to make the necessary accommodations between cultures, they will fail.

[1] Agre, P. E. (1997) Beyond the Mirror World: Privacy and the Representational Practices of Computing. In Agre, P. E. and Rotenberg, M. (Eds) *Technology and Privacy: The New Landscape*. MIT Press: Cambridge, MA.