
5 Steps to Making Big Data Investment Work for You

Date: 27 November 2017 Author: Neil Barton

Published via DataInformed on Oct 27, 2017.

A sense of disillusionment is creeping into the field of big data, and many people are starting to question the validity and usefulness of big data analytics. The questions being brought to the table more frequently revolve around:

  • Getting the value promised for the investment in big data
  • Getting access to the right insights to provide business value
  • Why some organizations can drive commercial advantage from these insights while others struggle to move the needle forward

Despite the frequent confusion and questions, it’s important to realize that big data still holds a lot of value. The promise of taking a broad variety of data types, managing that data and then deriving the kind of insights that can facilitate real business value remains true.

As with so many things in life, taking the right approach is vital. Failing to tackle big data challenges in the right way leads many organizations to miss the mark and get results that do not affect the business. It is rumored that only 15 percent of big data projects end in production deployment. In spite of this, success can be achieved, and these five steps can put you much closer to ensuring it.

  1.    Don’t End Up with a Solution in Search of a Problem

Like most technologies, there tends to be a high percentage of people willing to adopt the technology without first identifying a business problem that can be solved by it. Time and time again this happens in the technology industry; the latest “hot thing” results in a rush toward adopting it without taking the time to really determine the goal.

This is not actually a big data problem, but rather, a simple business strategy issue. When thinking about implementing a new technology, you should always have a discussion about what problem your organization wants to solve with the technology. If your organization cannot find a concise answer to this question, it should reevaluate the need for the technology in the first place.

Organizations that have been successful with big data components and projects have first identified a business problem that they want to address and only then determined the appropriate technology to solve it.

  2.    There’s No Such Thing as a Free Lunch

One of the more beguiling aspects of the big data revolution is that much of the technology available to manage the ever-swelling pool of data appears to be free. But here’s the catch: Just because the software itself is free, that does not make it free, cheap or easy to install, configure and operate. The tools themselves may be free, but the skills required to manage them are hard to come by and expensive when you do find them.

Additionally, open source big data components tend to lack the breadth and depth of operations and maintenance capabilities that the vast majority of traditional platforms have had for decades. This situation puts an additional burden on IT resources to manage and monitor these open source big data components, especially as organizations and auditors don’t really care that they are open source; they only care that the data is governed and secured.

While free may be tempting, take a step back and compare the total costs and benefits of open source versus an enterprise-grade solution. You will likely find that the enterprise solution is the most cost-effective, painless way to ensure business value in the long run.

  3.    Integration Is the Name of the Game

The proliferation of big data tools and products is most valuable when they are used appropriately – take a horses-for-courses approach to building out your big data solution, but recognize that it is not, on its own, a magic bullet. Instead, it is just one part of a larger, more complex ecosystem that is becoming the new blueprint for enterprises – the Logical Data Warehouse (LDW).

The same applies to data science and discovery tools; they most certainly have a place in the modern environment. However, they are not a replacement for the Enterprise Data Warehouse, as some of those vendors would lead you to believe – unless you want a data junkyard.

Understand that you may need a pick-and-mix approach when defining your LDW ecosystem: to be successful, you have to focus on turning a collection of open source technologies into a functioning, scalable architecture.

  4.    Be Prepared for the Difficult “Final Mile”

A lot of the new technologies, such as Spark and Hadoop, are appealing on the surface and relatively straightforward to implement in a development, sandbox or even test environment. However, transitioning from test and quality assurance (QA) to production generally entails a level of governance and DevOps capabilities that these tools tend to lack. This makes the final mile a relatively difficult journey for organizations to make, especially large enterprises that still have governance-, control- and auditing-related requirements, regardless of the underlying technologies being used.

The shift into production requires careful planning, the right skills and the necessary investment to make your big data project a success. Take the time to work out your strategy up front for this part of the project; it will be a worthwhile investment.
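To make the governance gap concrete, a pre-production gate can be expressed as a simple checklist that a deployment must pass before promotion. The following is a hypothetical sketch, not any particular tool: the control names (audit logging, access control, lineage tracking, monitoring) are illustrative stand-ins for whatever your auditors actually require.

```python
# Hypothetical sketch of a "final mile" promotion gate: before a big data
# deployment moves from test/QA into production, verify that the governance
# controls auditors care about are actually enabled. The control names here
# are illustrative assumptions, not a standard.

REQUIRED_CONTROLS = {
    "audit_logging",      # who touched what data, and when
    "access_control",     # authentication and authorization in place
    "lineage_tracking",   # where each data set came from
    "monitoring",         # operational alerting on jobs and pipelines
}

def production_ready(deployment_config: dict) -> list:
    """Return the sorted list of required controls still missing or disabled."""
    enabled = {name for name, on in deployment_config.items() if on}
    return sorted(REQUIRED_CONTROLS - enabled)

# A typical sandbox setup: easy to stand up, but not ready for the final mile.
sandbox = {"audit_logging": False, "access_control": True}
print(production_ready(sandbox))
```

A gate like this makes the planning discussed above explicit: the gaps it reports are exactly the items to budget skills and investment for before the production push.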

  5.    Be Ready for the Resource Crunch

It is certainly true that traditional approaches to analyzing and deriving value from data cannot usually be applied to a lot of the new types of data being ingested. As a result, organizations need to adjust culture and processes to accommodate the new paradigms that are available. But this can be tough when, over the past decade or so, many IT departments have downsized and eliminated integration and architectural expertise. Asking people to do new things when they are already very stretched is often a recipe for disaster. The truth is that most organizations are simply not equipped to handle fast-paced change to the underlying technology and consequently fail.

For organizations that are constrained by resources and money, maintenance costs for big data projects can be substantial and, in some cases, prohibitively expensive. Look, therefore, for innovative approaches, such as metadata-driven solutions that can mitigate much of the time, cost and risk of this scenario.
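What "metadata-driven" can mean in practice: instead of hand-writing and maintaining DDL for every table, the definitions live as metadata and the SQL is generated from it. The sketch below is a minimal, hypothetical illustration of that idea; the metadata shape and the `generate_ddl` helper are assumptions for the example, not any vendor's actual format.

```python
# Hypothetical sketch of a metadata-driven approach: table definitions are
# data, and the DDL is generated from them. Changing the metadata (adding a
# column, renaming a table) regenerates the SQL, rather than requiring
# hand-edited scripts scattered across the project.

TABLE_METADATA = {
    "customer": {
        "columns": [
            ("customer_id", "INTEGER"),
            ("name", "VARCHAR(100)"),
            ("created_at", "TIMESTAMP"),
        ],
        "primary_key": "customer_id",
    },
}

def generate_ddl(table_name: str, metadata: dict) -> str:
    """Build a CREATE TABLE statement from the metadata entry for one table."""
    spec = metadata[table_name]
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in spec["columns"])
    pk = spec.get("primary_key")
    pk_clause = f",\n  PRIMARY KEY ({pk})" if pk else ""
    return f"CREATE TABLE {table_name} (\n  {cols}{pk_clause}\n)"

print(generate_ddl("customer", TABLE_METADATA))
```

The payoff is exactly the mitigation described above: maintenance effort shifts from many hand-written scripts to one generator and a metadata store, which reduces the time, cost and risk of keeping a large environment consistent.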

It is disappointing to see the luster of big data being tarnished by the failure of too many organizations to realize its value. But that value is attainable for many of the organizations that are currently struggling. By taking a step back and following these five steps, organizations can lay a new foundation and tap into the promise of the big data revolution.

Neil is the Chief Technology Officer at WhereScape and leads the long-term architecture and technology vision for the company’s software products. Neil has held a variety of roles over the past 20 years, including positions at Oracle Australia and Sequent Computer Systems, focused on software architecture, data warehousing and business intelligence. He is a co-inventor of three US patents related to business intelligence software solutions.