
5 Steps to Improve your Business's Data Analytics Time to Value

Date: 03 November 2017 | Author: Neil Barton

Published via VMblog on Nov 3, 2017 (View the article at VMblog).

With IT departments facing increasing pressure to deliver insights that businesses can use to make smarter decisions, 'time to value' (or TTV) is quickly becoming the most crucial metric they are judged on. Nowhere has this become more apparent than in investments in data analytics capabilities. Looked at logically, successful companies only invest in technology to make themselves more efficient or to drive a better strategic - often transformational - outcome.

Logically, the sooner a project is successful, the better the return on investment will be. Yet too many organizations either struggle to evaluate TTV with any confidence or simply assume that they must passively accept the timeline set by technology suppliers. This does not have to be the case: in just five easy steps, any organization can improve the TTV of its data analytics projects.

Get serious about the cloud as a means to drive better TTV

Most organizations have learned through experience that the time, cost and expertise required to procure, install and configure systems can be eye-watering. It can take weeks and significant cost just to turn the lights on before deployment even begins, yet this is still often how people insist on implementing solutions. Under certain circumstances this may be the right way to go, but cloud infrastructure and platforms now provide an obvious way to significantly shorten the TTV of new projects, especially for organizations whose data originates in the cloud or can easily be moved there.

The beauty of cloud platforms is that they offer a much lower barrier to entry: costs are lower, environments take far less time to stand up, and capacity can be scaled easily when needed.

This results in a much more flexible ecosystem from which to innovate. Organizations can start small and scale up compute as workload and data requirements increase, which is particularly important as the amount of data being captured by organizations, and subsequently needed for analysis, continues to expand.
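To make that elasticity concrete, here is a minimal sketch in Python, assuming a Snowflake-style virtual warehouse and a database cursor obtained elsewhere; the warehouse name and the sizing thresholds are purely illustrative.

    # Minimal sketch of elastic compute scaling (illustrative thresholds).
    def pick_warehouse_size(queued_queries: int) -> str:
        """Map the current workload to a warehouse size."""
        if queued_queries > 50:
            return "LARGE"
        if queued_queries > 10:
            return "MEDIUM"
        return "SMALL"

    def scale_warehouse(cursor, queued_queries: int, warehouse: str = "ANALYTICS_WH") -> None:
        """Resize the (hypothetical) warehouse to match demand; the platform
        applies the change online, so capacity grows only when workload does."""
        size = pick_warehouse_size(queued_queries)
        cursor.execute(f"ALTER WAREHOUSE {warehouse} SET WAREHOUSE_SIZE = '{size}'")

Starting small and scaling only when the workload demands it keeps the barrier to entry low without capping future growth.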

Often the fear of creating complexity by mixing on-premises and cloud solutions prevents organizations from embracing the cloud, but many vendors now manage hybrid environments. This capability splits workloads between on-premises and cloud components, presents them to IT as a single, cohesive environment, and manages the movement of data between the respective components as needed.

Automate your data analytics

Many organizations are stuck in a time warp when it comes to managing data analytics. Throughout the Extract, Transform, Load or Extract, Load, Transform (ETL/ELT) stages of data analytics, manual processes - which cost money and time - remain prevalent. These manual development routines can be the greatest inhibitor to TTV: they are slow, tedious and error-prone if rushed and not validated. In addition, documentation is generally not done, or if it is, it is incomplete and quickly becomes out of date, which impacts future usage.

This makes no sense when robust, automated solutions are available that use best-practice methodologies to generate the ELT code while ensuring it is optimized for the platform on which it will be deployed.
This approach can reduce time - and risk - by up to 90%, allowing companies to develop at a pace that the business needs, while also ensuring that the generated code will be consistent, robust and well-documented.
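To illustrate the idea (this is a simplified sketch, not any particular vendor's engine), the example below shows how a single declarative mapping can generate both the load SQL and its documentation in one pass, so the two cannot drift apart; the table and column names are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class ColumnMapping:
        source: str     # column in the staging table
        target: str     # column in the warehouse table
        transform: str  # SQL expression applied during the load

    def generate_elt(source_table, target_table, mappings):
        """Return (sql, documentation) generated from the same metadata."""
        select_list = ",\n    ".join(
            m.transform.format(col=m.source) + " AS " + m.target for m in mappings
        )
        sql = (
            f"INSERT INTO {target_table} ({', '.join(m.target for m in mappings)})\n"
            f"SELECT\n    {select_list}\nFROM {source_table};"
        )
        doc = "\n".join(
            f"- {target_table}.{m.target} <- {m.transform.format(col=m.source)}"
            for m in mappings
        )
        return sql, doc

    # One mapping definition drives both the code and the documentation.
    sql, doc = generate_elt(
        "stage_orders", "dim_orders",
        [ColumnMapping("order_id", "order_key", "{col}"),
         ColumnMapping("order_ts", "order_date", "CAST({col} AS DATE)")],
    )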

And continue to automate, wherever you can

Organizations should make a commitment to automating the entire lifecycle of managing a data infrastructure - not just the ETL code generation. Whether it is on-premises, cloud, or a mix, the focus should be on reducing the time, cost and risk in all phases of the lifecycle to maximize TTV.

However, the deployment of new or changed components between environments, such as development, test and production, is still frequently done manually, slowing down TTV in the process. Ensuring that the respective schema changes are made and the code is deployed correctly can be quite involved, and fairly risky if it requires a human to perform the tasks manually. Automating this aspect of the lifecycle both speeds up the time it takes to deploy changes and reduces the risk during deployment.
More organizations are looking towards this level of automation as they work to apply software development best practices and methodologies to data infrastructure environments. Automating as much as possible is a good maxim to adopt to improve TTV.
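As a minimal sketch of what automating that deployment step can look like, the example below assumes versioned migration scripts on disk and a standard DB-API connection; the schema_version tracking table and the file layout are illustrative assumptions.

    import pathlib

    def applied_versions(cursor) -> set:
        # Versions already recorded in the (assumed) schema_version table.
        cursor.execute("SELECT version FROM schema_version")
        return {row[0] for row in cursor.fetchall()}

    def deploy(conn, migrations_dir: str = "migrations") -> None:
        # Apply any migration script not yet recorded, in filename order,
        # committing after each one so a failure leaves a known state.
        cursor = conn.cursor()
        done = applied_versions(cursor)
        for script in sorted(pathlib.Path(migrations_dir).glob("*.sql")):
            version = script.stem
            if version in done:
                continue
            cursor.execute(script.read_text())  # assumes one statement per script
            cursor.execute(
                "INSERT INTO schema_version (version) VALUES (%s)", (version,)
            )  # placeholder style depends on the driver
            conn.commit()

The same pattern extends to regenerating and redeploying ELT code, so that promoting a change from test to production becomes a repeatable, audited operation rather than a manual checklist.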

Don't let data infrastructure lag behind the pace of business change

Business needs are changing faster than ever in today's dynamic and fragmented economy, where competitors can appear or disappear overnight. With that pressure comes the expectation that an organization's data infrastructure can change just as fast. Understanding quickly what needs to change, as well as the potential impact on downstream consumers of the data, is therefore imperative to managing the risk and time associated with evolving the existing environment. If the environment is not well documented or does not lend itself to clear lineage and impact analysis, the risk increases further.

Organizations should ensure their systems support detailed lineage and impact analysis, providing a clear line of sight into how the environment can be successfully enhanced in the future. Invest in automation solutions that allow for rapid code generation, deliver full impact analysis for potential changes, and automate code regeneration when changes are made. This significantly reduces both the time it takes to make changes and the risk of doing so.
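As a simple illustration of impact analysis, the sketch below walks a lineage graph of upstream-to-downstream dependencies to list everything affected by a proposed change; the graph and the table names are hypothetical.

    from collections import deque

    def impacted(lineage: dict, changed: str) -> list:
        """Breadth-first walk of the downstream dependencies of a changed object."""
        seen, queue, order = set(), deque([changed]), []
        while queue:
            node = queue.popleft()
            for child in lineage.get(node, []):
                if child not in seen:
                    seen.add(child)
                    order.append(child)
                    queue.append(child)
        return order

    lineage = {
        "stage_orders": ["dim_orders"],
        "dim_orders": ["fact_sales", "rpt_orders_daily"],
        "fact_sales": ["rpt_revenue_monthly"],
    }
    print(impacted(lineage, "stage_orders"))
    # ['dim_orders', 'fact_sales', 'rpt_orders_daily', 'rpt_revenue_monthly']

With that list in hand, the affected code can be regenerated and retested automatically rather than discovered in production.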

Always make sure that business needs and IT capabilities are aligned

The business tells IT what it wants. IT builds a solution that addresses the request, only to find, when presenting the work back to the business, that what was built does not match what is actually needed. This is an all too familiar circumstance for IT teams, and it comes with a massive impact on TTV.

Collaborative and iterative development between IT and the business can significantly reduce TTV by ensuring IT's work delivers the expected value to the business. Using automation to engage in rapid prototyping and collaborative discussion early on in a project can bring IT and the business closer together to detect nuances in vision and gaps between business needs and technical capability.
Automation solutions that use a metadata framework can shrink TTV by easing the transition from prototype to a robust, on-target solution. Additionally, using rapid prototyping to provide the business an earlier view of the project's progression can bolster trust and the relationship between the two groups - an additional benefit that can pay dividends well into the future.
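One way to picture that transition (hypothetical names, not a specific product's behaviour): the same metadata-driven definition can be rendered first as a lightweight view for collaborative review and later as the persisted production object.

    def prototype_sql(name: str, query: str) -> str:
        # Fast to create and change during collaborative review with the business.
        return f"CREATE OR REPLACE VIEW proto_{name} AS\n{query};"

    def production_sql(name: str, query: str) -> str:
        # The same logic, persisted once the business signs off on the prototype.
        return f"CREATE TABLE {name} AS\n{query};"

    query = "SELECT region, SUM(amount) AS revenue FROM fact_sales GROUP BY region"
    print(prototype_sql("revenue_by_region", query))
    print(production_sql("revenue_by_region", query))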
As Gartner has recently reinforced, organizations are in a race to significantly reduce the time it takes to turn their technology investments into value. Those that are fastest with TTV will create enormous competitive advantage by turning data into insights and, by doing so, will turn ideas into commercial success. Every organization that heeds these five steps will have a better chance in this race and will be well positioned to speed up TTV - as everyone in business knows, this is a race worth winning.


About the Author

Neil is the Chief Technology Officer for WhereScape and leads the long-term architecture and technology vision for the company's software products.

Neil has held a variety of roles over the past 20 years, including positions at Oracle Australia and Sequent Computer Systems, focused on software architecture, data warehousing and business intelligence.

He is a co-inventor of three US patents related to business intelligence software solutions.