
Chatting to colleagues, customers and competitors about their big data projects, I’m starting to sense disillusionment in the field.  Reading through the news websites, too, questions are starting to be asked.  Is big data analytics all it’s cracked up to be? Why am I not getting the value I was promised from my investment in big data?  Why am I still not able to get access to the insight I need?  And how come others seem to be able to drive commercial advantage when I seem to be no further forward?

It doesn’t have to be this way, though.  The value of the big data revolution is still there to be had.  The promise of taking broad varieties of data types, managing that data and then deriving the kinds of insights that facilitate real business value remains true.  However, as with so many things in life, you have to take the right approach.  It is the failure to tackle the big data challenge in the right way that leaves so many organisations without the results they want.  One statistic I heard recently suggests that only 15 per cent of big data projects ever reach production deployment.  Success can be achieved, though, and I have five simple steps to help you get to value…

1) A solution in search of a problem

As with most new technologies, a high percentage of organisations adopt big data without first having identified a business problem it can solve.  Time and time again we see this happen in the technology industry: the latest ‘hot thing’ triggers a rush to adopt it without taking the time to really determine the goal.  This is not actually a big data problem, but rather a simple business strategy issue.  When I speak to potential customers, I often begin by asking ‘what problem are you really trying to address?’ Often, they struggle to answer it cogently!

Learning one – Organisations that have been successful with their big data components and projects first identified a business problem they wanted to address, and only then determined the appropriate technology to solve it.

2) No such thing as a free lunch

One of the more beguiling aspects of the big data revolution is that much of the technology available to manage the ever-swelling pool of data appears to be free!  What’s not to like?  But here’s the rub: just because the software itself costs nothing to license does not make it free, cheap or easy to install, configure and operate.  The tools themselves may be free, but the skills required to install, configure, debug, manage and develop with them are hard to find, and expensive when you do find them.  Furthermore, open-source big data components tend to lack the breadth and depth of Operations and Maintenance (O&M) capabilities that the vast majority of traditional platforms have had for decades.  This puts an additional burden on IT resources to manage and monitor — organisations (and auditors) don’t really care that a tool is open source; they care that the data is governed, secured and so on.

Learning two – While free may be tempting, take a step back and compare the total costs and benefits of open source versus an enterprise-grade solution. You will likely find that the enterprise solution is the more cost-effective and painless route to value in the long run.

3) Integration is the name of the game

The proliferating big data tools and products are most valuable when they are used appropriately – take a ‘horses-for-courses’ approach to building out your big data solution, but recognise that these tools are not, on their own, a magic bullet. Instead, they are parts of a larger, more complex ecosystem that is becoming the new blueprint for enterprises – the Logical Data Warehouse (LDW).  The same applies to data science and discovery tools; they most certainly have a place in the modern environment. However, they are not a replacement for the Enterprise Data Warehouse, as some of those vendors would lead you to believe – unless you want a data junkyard.

Learning three – Understand that you may need a ‘pick and mix’ approach when defining your LDW ecosystem: to be successful, you have to focus on turning a collection of open-source technologies into a functioning, scalable architecture.

4) The difficult ‘final mile’

A lot of the new technologies, such as Spark and Hadoop, are appealing on the surface and relatively straightforward to implement in a development, sandbox or even test environment.  However, transitioning from test/QA to production generally demands a level of governance and DevOps capability that these tools tend to lack.  This makes the final hurdle a relatively difficult step for organisations to take, especially large enterprises, which still have governance, control and auditing requirements regardless of the underlying technologies being used.

Learning four – The shift into production requires careful planning, the right skills and the necessary investment in order to make your big data project a success.  Take the time to work out your strategy up front for this part of the project; it will be a worthwhile investment.

5) The resource crunch

It is certainly true that traditional approaches to analysing and deriving value from data usually cannot be applied to many of the new types of data being ingested.  As a result, organisations need to adjust their culture and processes to accommodate the new paradigms available.  But this can be tough when, over the past decade or so, many IT departments have downsized and eliminated their integration and architectural expertise.  Asking people to do new things when they are already very stretched is often a recipe for disaster.  For example, hand-writing the relevant code and then updating and maintaining it as the underlying APIs change, or as new and better components come to market, puts incredibly high demands on internal resources.  The truth is, most organisations are simply not equipped to handle this rate of change in the underlying technology – and consequently they fail.

Learning five – For organisations constrained by resources or money, maintenance costs for big data projects can be substantial and in some cases prohibitively expensive.  Look, therefore, for innovative approaches, such as metadata-driven solutions, that can mitigate much of the time, cost and risk of this scenario.
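To make the metadata-driven idea concrete, here is a minimal, hypothetical sketch (not any particular vendor’s product): table definitions are held as plain metadata, and the load code is generated from that metadata. When the underlying platform or API changes, you update the generator once rather than rewriting every hand-coded pipeline. All names below (`TABLES`, `generate_load_sql`, the example table) are illustrative assumptions.

```python
# Hypothetical metadata-driven load-code generation: table definitions
# live as data, and the SQL is rendered from them rather than hand-written.

TABLES = {
    "stage_customer": {
        "source": "crm.customers",
        "columns": ["customer_id", "name", "email", "updated_at"],
        "load_type": "incremental",   # or "full"
        "key": "customer_id",
    },
}

def generate_load_sql(table_name: str, meta: dict) -> str:
    """Render a load statement from metadata instead of hand-writing it.

    If the target platform's SQL dialect changes, only this generator
    changes -- the metadata describing the tables stays the same.
    """
    cols = ", ".join(meta["columns"])
    if meta["load_type"] == "incremental":
        # Merge new rows from the source into the target on the key column.
        return (
            f"MERGE INTO {table_name} t USING {meta['source']} s "
            f"ON t.{meta['key']} = s.{meta['key']} "
            f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({cols})"
        )
    # Full reload: straight insert-select of every column.
    return f"INSERT INTO {table_name} ({cols}) SELECT {cols} FROM {meta['source']}"

sql = generate_load_sql("stage_customer", TABLES["stage_customer"])
print(sql)
```

The design point is the separation: the metadata captures *what* the warehouse contains, while the generator captures *how* the current platform loads it, so a component swap touches one function instead of every pipeline.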

Given my role and where I work, it disappoints me that the lustre of Big Data is starting to be tarnished by the failure of too many organisations to realise its value.  But I believe that it is attainable for many of those organisations that are currently struggling.  And certainly, the five steps I’ve outlined are as good a place as any to start.  The Big Data revolution remains one of promise and hope. Approach it the right way and you’ll benefit accordingly.

Neil Barton, Chief Technology Officer, WhereScape

Image source: Shutterstock/wk1003mike

Published via ITProPortal on Sept 1, 2017.