Julian Valkieser shares his thoughts with us about “Benefits of Big Data” in this blog post for our Emerging Fellows program. The views expressed are those of the author and not necessarily those of the APF or its other members.
In my previous articles, I have already mentioned the power of Big Data. My blog colleague Jason picked up the theme and added his own thoughts. In his last article, he showed wonderfully how technology has already overturned and renewed business models and efficiency in other sectors. Something similar could happen in the field of futures studies and industry foresight as well.
Now, there are foresight methods that work well, or even best, with uncertainty. Delphi interviews, for instance, are planned precisely: interviewees are pre-selected. But this does not mean that the resulting statements can be treated as hard facts about a future reality. And they should not be. That is the exciting thing about scenarios: they offer a way to stimulate the imagination and to derive recommendations for action.
But again, one tries to keep the “cone of plausibility” as narrow as possible (see Jason’s blog). You look for particular experts. You push particular issues. This is done in order to build the scenario on reasonable grounds.
Now you can imagine how neutral subjective responses to subjective questions really are. Anyone who has read “Thinking, Fast and Slow” by Daniel Kahneman knows what I mean. And right here, data comes into play. Information could passively express the motives and interests of groups. I have already indicated this in my last article.
In that article, I noted that you can only get the most out of Big Data if you apply prediction to a trigger event. One extracts motives and interests from Big Data for one or more so-called trigger events. These are events that can be predicted relatively easily in the near future on the basis of data, because the circumstances are (or should be) less complex. Based on these trigger events, you can create a scenario. In principle, this is nothing new; only the underlying information is extracted from Big Data instead of from interviews and subjective insights.
Let’s take an example. A major mobile phone company has 50 million customers. Each customer carries a switched-on phone and moves with it every day between different radio towers (see triangulation). Let us suppose further that the company receives 20–100 pieces of movement information per customer. Provided the company may cache this information for a longer period of time, the result is a huge amount of data about how people move, how long they stay in which locations, and so on. Of course, each individual might now worry about privacy. But the individual is not of interest here; it is about the mass.
Imagine what could be done with this information once it is available. Road authorities could optimize logistics. Infrastructure projects could be optimized. Where should the new stadium be built? How wide should the highway be? How many trains must be scheduled on this track?
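As a rough sketch of the kind of aggregation such movement data makes possible: the ping records, tower names, and counting logic below are invented for illustration and stand in for what a carrier's far richer dataset would contain. The idea is simply that aggregate counts, not individual trajectories, are what planners would use.

```python
from collections import Counter, defaultdict
from datetime import datetime

# Hypothetical ping records: (customer_id, tower_id, timestamp).
# A real carrier would hold billions of these; three customers suffice here.
pings = [
    ("a", "tower_north",  datetime(2015, 3, 2, 8, 0)),
    ("a", "tower_center", datetime(2015, 3, 2, 9, 0)),
    ("a", "tower_center", datetime(2015, 3, 2, 17, 0)),
    ("b", "tower_center", datetime(2015, 3, 2, 9, 30)),
    ("b", "tower_south",  datetime(2015, 3, 2, 19, 0)),
    ("c", "tower_center", datetime(2015, 3, 2, 10, 0)),
]

def busiest_towers(pings, top=3):
    """Count distinct customers seen per tower -- a crude proxy for footfall."""
    seen = defaultdict(set)
    for customer, tower, _ in pings:
        seen[tower].add(customer)
    counts = Counter({tower: len(customers) for tower, customers in seen.items()})
    return counts.most_common(top)

print(busiest_towers(pings))
```

Counting distinct customers rather than raw pings avoids over-weighting phones that simply report more often; answering questions like “where should the stadium go?” would then mean comparing such footfall counts across candidate locations and times of day.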
In a growing urban environment, where sheer masses of people are moving, all these data are exciting as the basis for trigger events and scenarios.
And finally, I have another wonderful example of these ideas. Eric Fischer evaluated geotagging data from photo cameras. He compared where locals and tourists take pictures in cities around the world and displayed this information on maps.
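One plausible heuristic for separating the two groups, shown here as a minimal sketch with invented photo records and an assumed 30-day threshold (not necessarily Fischer's exact method), is to look at how long a photographer keeps taking pictures in the same city: locals shoot over many months, tourists over a few days.

```python
from datetime import date

# Hypothetical photo records: (photographer_id, city, date_taken).
photos = [
    ("p1", "Berlin", date(2014, 1, 5)),
    ("p1", "Berlin", date(2014, 6, 20)),
    ("p2", "Berlin", date(2014, 8, 1)),
    ("p2", "Berlin", date(2014, 8, 4)),
]

def classify(photos, city, min_span_days=30):
    """Label each photographer in a city 'local' or 'tourist' by the
    time span between their first and last photo taken there."""
    spans = {}
    for pid, c, d in photos:
        if c != city:
            continue
        first, last = spans.get(pid, (d, d))
        spans[pid] = (min(first, d), max(last, d))
    return {
        pid: "local" if (last - first).days >= min_span_days else "tourist"
        for pid, (first, last) in spans.items()
    }

print(classify(photos, "Berlin"))
```

Mapping each group's photo locations separately then yields exactly the kind of contrast Fischer visualized: the places tourists photograph versus the places only locals go.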