Hard data in a soft world

THERE’S SOMETHING MAGICAL ABOUT THE COASTLINE: THAT PLACE WHERE THE MOUNTAINS MEET THE SEA, WHERE THE CONSTANT BATTLE BETWEEN THE STRENGTH OF THE ROCKS AND THE GREATEST FORCES OF THE SEA PLAYS OUT WITH SPECTACULAR RESULTS.

In 2017 I celebrated 30 years of working with data. A lot has changed in the world of data over that time: continuing advances in data technology mean that data is faster, bigger and more granular now, and our ability to combine, analyse and monetise data continues to grow. But there is something fundamental about data that has not changed and, I suspect, never will.

At its lowest level data is a series of binary bits, each either 1 or 0 – there is no grey area – data is hard. Of course, we soften data with scale; we put lots of those bits together to build big, complex data structures that represent the greyness, the fuzziness and the complexity of the world. Data modelling has become more sophisticated. Our ability to model the world’s messy reality is more detailed, nuanced and flexible. But these models remain rigid; data is still hard. Data is like rock.

Data exists to represent the world, whether through the operation of some sort of process or transaction, or by providing a basis for analysis and greater understanding. But the world that data attempts to represent is full of wonder. It contains limitless complexity and unpredictability; it is fluid and ever-changing. The world does not conform to rational ideals or absolute predictability. The world is the sea.