Views on Customer Insight

A practical view on gaining real insight to drive better decision making


Are You Really Ready For Big Data?

This post is brought to you by Tim Spooner, Managing Director at Consumer Cloud.

The amount of data in our world has been exploding. Companies capture trillions of bytes of information about their customers, suppliers, and operations, and millions of networked sensors are being embedded in the physical world in devices such as mobile phones and automobiles, sensing, creating, and communicating data. Multimedia content and individuals with smartphones on social networking sites will continue to fuel exponential growth.

However, alongside this explosion of data, the ability to store, aggregate, and combine data and then use the results to perform deep analyses has become ever more accessible. General advancements in processing speed, memory capacity, digital storage, and cloud computing continue to lower costs and other technology barriers. For less than $600, an individual can purchase a disk drive with the capacity to store all of the world’s music. Lastly, the means to extract insight from data are also markedly improving, as software able to apply increasingly sophisticated techniques combines with growing computing horsepower.

The interest in “Big Data” is justified because it potentially provides a new, more cost-effective and efficient way of identifying insight that enables improved business decision-making and performance. But many of the considerations and lessons from past decades are as applicable, if not more so, to Big Data. Therefore the brands that use these well-established principles to decide whether or not to use Big Data systems and tools, what to use them on, and how best to leverage the learning from them will benefit the most.

So what are the key considerations and challenges that organisations face when considering whether to invest in Big Data projects?

  1. Advanced Analysis Techniques Don’t Mean Advanced Analysis; Analysis for analysis’ sake is clearly pointless. Companies need to think carefully about whether analysis is required and what the benefits are likely to be before undertaking it. Benefits accrue from analysis that delivers insight that can drive performance improvements and/or competitive advantage. Whilst data is a critical ingredient, the most effective approach to delivering meaningful insight is to identify the business opportunity first and then determine how analysis and modelling can help improve performance.


  2. Garbage In Garbage Out; There is an unfortunate tendency for Big Data fans and technology providers to position the approach almost as a panacea and imply that it somehow removes the need to think about the data being loaded, its accuracy and its format. The term “unstructured” is imprecise and doesn’t account for the many varying and subtle structures typically associated with Big Data types.


Big Data may well have different data types within the same set that do not share the same structure; Big Data is therefore probably better termed “multi-structured”, as it could include text strings, documents of all types, audio and video files, metadata, web pages, email messages, social media feeds, form data, and so on. The consistent trait of these varied data types is that the data schema isn’t known or defined when the data is captured and stored. But don’t assume that this does away with the need to prepare the data first – it is well accepted for traditional analytics and data mining projects that 80% of the analysis time is spent not analysing but preparing the data. Big Data practitioners already consistently report that 80% of the effort involved in dealing with data is cleaning it up in the first place, as Pete Warden observes in his Big Data Glossary: “I probably spend more time turning messy source data into something usable than I do on the rest of the data analysis process combined.”
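To make the “multi-structured” point concrete, here is a minimal Python sketch of the kind of clean-up work behind those 80% figures. All of the field names, formats and records are hypothetical – the point is simply that each source needs mapping onto one agreed schema before any analysis can start.

```python
from datetime import datetime

# Hypothetical raw records pulled from three different sources: note the
# inconsistent field names, date formats, currency symbols and whitespace.
raw_records = [
    {"customer": " Alice ", "signup": "2013-04-02", "spend": "120.50"},
    {"cust_name": "BOB", "signup_date": "02/04/2013", "spend": "£95"},
    {"customer": "Carol", "signup": "April 2, 2013", "spend": None},
]

DATE_FORMATS = ("%Y-%m-%d", "%d/%m/%Y", "%B %d, %Y")

def parse_date(text):
    """Try each known date format in turn; return None if nothing matches."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(text, fmt).date()
        except ValueError:
            continue
    return None

def clean(record):
    """Map one messy source record onto a single agreed schema."""
    name = (record.get("customer") or record.get("cust_name") or "").strip().title()
    signup = parse_date(record.get("signup") or record.get("signup_date") or "")
    spend_raw = record.get("spend")
    spend = float(spend_raw.lstrip("£$")) if spend_raw else None
    return {"customer": name, "signup": signup, "spend": spend}

cleaned = [clean(r) for r in raw_records]
```

Trivial as it looks, every extra source multiplies this mapping work – which is exactly why practitioners report spending most of their time here.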

Moreover, when analysing data it is important to understand how it was generated in order to judge whether it is actually useful. For example, one might expect whether or not a consumer has subscribed to an e-newsletter to help explain their relative level of engagement with a brand – but if, on checking how the data was collected, we establish that certain consumers were automatically registered, then this field is unlikely to be useful in analysis. Information in its raw form is too diffuse to be usable. It must be refined and concentrated – much like crude oil.

  3. Does Your Company Naturally Collect All Customer Purchase Moments?; Many companies cannot realistically take advantage of sophisticated analysis, Big Data or not, because they simply don’t have access to information on the majority of customers and their purchasing. Amazon knows exactly who its customers are, what they buy, when, and how much they pay for it as a natural by-product of the way it does business. But many companies simply don’t have this information, and therefore much of their potential use of Big Data will be limited to either performance improvements or non-purchase-related interactions with the brand. The Database Marketing, Relationship Marketing and CRM pioneers either missed or chose to ignore this key fact, and to date much of the Big Data movement seems to be falling into the same trap.


  4. Consolidation of Multiple Data Sets; For many years now, those companies which have understood the need to think creatively and cast the net as wide as possible when sourcing data to support analysis and insight projects have generally benefited more from information. This requires looking not just at merging multiple data sets from within an organisation’s existing information systems but also at external data sources. One of the most significant changes driving the need for, and interest in, Big Data is the plethora of new data sources such as social media, machine sensors and publicly available data collected and published on the internet. When planning Big Data analysis projects, companies need to think carefully about what data is likely to be useful, then identify what they have and where (and if) they can source any missing data externally. Clearly, if key data is missing and either cannot be sourced or would cost too much to source, this should affect the decision as to whether or not to proceed with the analysis in the first place.
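As an illustration of the consolidation step, the sketch below joins a hypothetical internal customer extract to an equally hypothetical external attribute feed on a shared customer id, and measures how much of the customer base the external source actually covers – the kind of gap check that should feed the go/no-go decision above.

```python
# Internal transactional extract, keyed on customer id (all data invented).
internal = {
    "C001": {"orders": 12, "last_order": "2013-06-01"},
    "C002": {"orders": 3,  "last_order": "2013-05-14"},
    "C003": {"orders": 7,  "last_order": "2013-06-20"},
}

# External attributes (e.g. a social media feed), keyed on the same id –
# in practice, matching keys across sources is itself a major task.
external = {
    "C001": {"sentiment": 0.8},
    "C003": {"sentiment": -0.2},
}

merged = {}
missing = []
for cust_id, attrs in internal.items():
    row = dict(attrs)
    ext = external.get(cust_id)
    if ext is None:
        missing.append(cust_id)   # flag customers we cannot enrich
        row["sentiment"] = None
    else:
        row.update(ext)
    merged[cust_id] = row

coverage = 1 - len(missing) / len(internal)  # share of customers enriched
```

If coverage comes back low and the gap cannot be filled affordably, that is a strong signal to rethink the project before committing to it.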


  5. Ensure Insight Can Be Leveraged; Almost as much thought and effort needs to go into planning how to implement the insights as into setting up the analysis project in the first place. This requires a realistic assessment of delivery capabilities – most notably ensuring that, whilst the models themselves may be quite sophisticated, their outputs are relatively simple to understand and use for the staff expected to work with them. It is therefore generally a good idea to separate full day-to-day responsibility for implementation from the analysts/data scientists that built the models. Visual displays and dashboards are generally a good way to deliver the information to front-line managers and staff.


Of equal importance is the need to align people’s job roles and performance appraisal with the required objectives. Just as siloed data hampers the ability to derive benefit from data insight projects (including Big Data), so can traditional function-based organisational structures if key objectives aren’t aligned and managed across functional areas. Again, this challenge is not new. Consider, for example, a sales call centre where a key metric for personnel is almost always how quickly they can finish the call and how many sales per hour/day/week they can make. From an insight point of view, however, it may be critical to understand what led to the sale. Asking sales-focused call centre staff to spend an extra few minutes per call collecting this information accurately invariably fails, and no matter how much training and education is undertaken, this isn’t likely to change. What is required is a change to the measurement process linked to their job description and performance evaluation – in many cases this even needs to be integrated into their remuneration package.

  6. Big Data Analysis Will More Often Than Not Tell You What Is Happening, But Not Necessarily Why; Companies should be prepared to spend a potentially significant amount of effort understanding the output of certain analysis projects. In some cases it may be necessary to conduct specific research to understand why a sequence of events is happening.


  7. Are You Really Ready for Big Data?; Key factors influencing the likely success of any Big Data initiative are corporate culture and existing expertise.


Benefiting from Big Data means investing in teams with the relevant skill set – data scientists. Good data scientists need to combine mathematical strength, programming ability and scientific instinct with, increasingly, strong commercial awareness. These experts need to be given sufficient freedom to explore a number of different hypotheses whilst remaining integrated into the business as a whole. Critically, companies will need to surround them with an organisational willingness to understand and use data for advantage.

As mentioned already, careful consideration needs to be given to aligning individuals’ and departments’ roles, responsibilities and remuneration with company-wide objectives.

There is also the consideration of IT, and of changing traditional views about where a company’s data sits. It is almost guaranteed that, in order to take advantage of Big Data solutions and manage requirements and delivery over time in a cost-effective and efficient manner, organisations will need to leverage cloud-based services in some way, shape or form. This means a company being prepared and able to have its key customer and transactional information residing in physical storage outside the organisation – yes, it can and must be secured/protected, but many companies (and particularly their IT departments) have often had a far more myopic view of where their key company data resides and why.

And, perhaps most importantly, the push needs to come from senior executives, who need to be prepared to switch from decisions based purely on past experience to decisions led by the key insights delivered from the data.

In all analysis, whether Big Data or not, it is important to make sure you are realistic in your assessment of where you are as an organisation in relation to delivering meaningful insight from data. Ensure you have buy-in and real support at a senior level and invest in the right types of people. But most importantly, don’t assume every analysis project is necessarily a Big Data one and don’t assume that you have the right data to deliver meaningful insight.



Big Data – New Buzzword Same Old Challenges

This post is brought to you by Tim Spooner, Managing Director at Consumer Cloud.

“The modern age has a false sense of security because of the great mass of data at its disposal. But the valid issue is the extent to which man knows how to form and master the material at his command.”

You could be forgiven for thinking this quote was penned recently by one of the chief luminaries of our day, warning about the perils of assuming that merely having lots of data stored on a massively parallel software solution running on tens, hundreds, or even thousands of servers somewhere in the cloud means you’ll make smarter business decisions for the benefit of the bottom line. And they would be right to try to get the message across that simply collecting, storing and analysing Big Data sets won’t necessarily lead to any insight, let alone commercial advantage or betterment of the world in which we live. Worse still, without careful consideration and rigorous assessment to identify true insight, it is all too easy to find misleading patterns in the data and mistake correlation for causation – the scope for finding misleading insights by definition increases with the sheer scale and coverage of the data being analysed.
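The scale effect is easy to demonstrate. The sketch below is illustrative only – pure simulated noise, no real data – yet with 1,000 candidate explanatory variables that are random by construction, at least one will almost always show a strong “correlation” with the target:

```python
import random

random.seed(42)

def corr(xs, ys):
    """Pearson correlation, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

n_points, n_features = 20, 1000
target = [random.gauss(0, 1) for _ in range(n_points)]
# 1,000 candidate "explanatory" variables, all pure noise by construction.
features = [[random.gauss(0, 1) for _ in range(n_points)]
            for _ in range(n_features)]

best = max(abs(corr(f, target)) for f in features)
print(f"strongest 'correlation' found among pure noise: {best:.2f}")
```

The more variables you trawl, the more impressive the best spurious result looks – which is exactly why scale makes rigorous assessment more important, not less.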

But perhaps the most interesting thing to me about this quote is that it is attributed to none other than Johann Wolfgang Goethe – in 1832! The problem has been the same for a good many years now. Whilst the scale may have changed, the challenges remain the same – it’s how we master the opportunity and leverage the mass of data at our disposal that will determine whether we derive any real benefit.


Statisticians have spent centuries working tirelessly to understand what problems lie ahead and how they can be solved, or at least minimised. With more data, faster processing and better analysis tools, all at an ever lower unit cost, it would be a grave mistake to assume that this either solves those problems or renders the theory irrelevant. It has not and will not.

“There are a lot of small data problems that occur in big data,” says statistician David Spiegelhalter. “They don’t disappear because you’ve got lots of the stuff. They get worse.”

In business, as increasingly in our personal lives, we have become a quick-fix society. We buy ready-cooked meals rather than going to all the hassle of cooking – and if we can’t even be bothered to go to the supermarket to pick up a ready meal and pop it into the oven, we get a takeaway or eat out at one of the ever-increasing number of cafés and restaurant chains. In business we increasingly look for quick, hassle-free ways to make our working days easier, and seem to have confused the potential power of computers and the ever-multiplying data being generated with some sort of magic wand that will make instant, profitable and flawless business-changing decisions. In fact, without stopping to ask the right questions, collect the right data, shape it in the right way, analyse it in a robust and meaningful manner, and then consider the whys behind the whats, all this does is increase the speed and sheer scale of the disasters we can unleash on the business. Remember, to err is human, but to really screw things up you need a computer!

Big Data – AKA The Emperor’s New Clothes


This post is brought to you by Tim Spooner, Managing Director at Consumer Cloud.

I’ve just read yet another article on the new, revolutionary approach that is Big Data. Big ticket, definitely; Big Egos, invariably; Big load of B*&&*!ks, more accurately.

Is it just me that’s sick and tired (I don’t think so, given I’m not that bright) of countless articles espousing the virtues of this wonderfully spun suit – cut from the finest cloth, using gold yarn, and perfectly put together by master craftsmen – articles that seem unashamedly to set out simply to justify big-ticket consultancy, technology and other such services, presenting them as not only business-changing but almost shaming those that can’t see it and don’t get involved as dim-witted luddites?


Almost more laughable is that the majority of so-called Big Data examples clearly aren’t Big Data at all. For example, I’ve just read one such article which cited the story of a New Jersey healthcare provider who was able to identify that 1 percent of patients were accounting for 33 percent of visits. No shit, Sherlock! This hardly requires a Big Data solution by any stretch of the imagination. In fact, in this case it is doubtful it even needed a single customer view crafted from multiple, disparate data sources – I would have thought a bit of trivial analysis of the operational system storing patient details and visits could have delivered this nugget, or maybe even a simple tabulation in early prehistoric code written on a Sinclair ZX Spectrum!
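For the record, here is roughly what that “analysis” amounts to – a simple tabulation over a hypothetical visit log (the numbers below are invented to reproduce the article’s headline figures) using nothing but the Python standard library:

```python
from collections import Counter

# Hypothetical visit log: one patient id per visit. One heavy user with
# 49 visits, plus 99 patients with a single visit each.
visits = ["P001"] * 49 + ["P%03d" % i for i in range(2, 101)]

counts = Counter(visits)
total_visits = sum(counts.values())
total_patients = len(counts)

# Share of visits attributable to the single busiest patient (1% of patients).
top = counts.most_common(1)[0][1]
share = top / total_visits
print(f"{1 / total_patients:.0%} of patients account for {share:.0%} of visits")
# → 1% of patients account for 33% of visits
```

A dozen lines of counting – no cluster, no data scientists, no Big Data budget required.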

It was Einstein who said, “Not everything that can be counted counts, and not everything that counts can be counted.” It seems to me that much of Big Data theory and discussion to date ignores this core truth. All too many Big Data projects, like all too many data analysis projects before the term Big Data was even in our lexicon, assume that if you throw everything at it, the answer will miraculously reveal itself. Nonsense: if the data that matters isn’t within what you are analysing then you’ll not get the answer – or worse still, you’ll get a wrong one that seems convincing due to the sheer volume of data that led to it being identified. Just because certain data exists does not necessarily mean it matters or can help in answering your key business questions.

As always, you need to think carefully about what you want to know and why. Then you can start to understand what data might actually be useful in finding the answers – or at least giving you some useful pointers that can be explored further. After this you can establish whether you actually have the data that counts, whether and how you can get it if you don’t, and then assess whether you can realistically (within an appropriate time and budget) progress any further. All too often – whilst it doesn’t sound as glamorous, doesn’t fill some vociferous technology providers’ pockets and doesn’t seem, well, so sexy – the answers are relatively easy to find. I’m reminded of the infamous story of when NASA started sending astronauts into space and quickly discovered that ball-point pens would not work in zero gravity. To combat this problem, NASA scientists spent a decade and $12 billion developing a pen that writes in zero gravity, upside-down, on almost any surface including glass, and at temperatures ranging from below freezing to over 300°C… The Russians simply used a pencil!

I’d be interested in anyone’s views on this increasingly irritating subject, and in seeing whether we can start a more realistic, worthwhile discussion around data.