
The Pitfalls Of Business Intelligence Project Management

Written by CloverDX | May 24, 2019

The power of business intelligence project management, and the solutions involved, shouldn’t be underestimated.

Ultimately, ingesting data from across your organization - as well as analyzing and reporting on it - can lead to a more informed management team, better business decisions and an improved bottom line.

But BI and analytics projects can be anything but straightforward. And if the data you end up with is bad, it can affect every decision you make about your business.

So, whether you’re looking to overhaul your company’s entire BI process, or just make improvements to your analytics to get more reliable information, make sure you’re aware of these pitfalls of analytics projects.

Business Intelligence Project Management: What Not To Do

1. Ignoring data preparation

Your analytics and your data visualization are only as good as the data they’re based on. Too often we see people spend their money on visualization software without thinking about how they plan to prepare the data that will feed their new tool.

If you have low quality data, your pretty charts aren’t going to give you valuable information, and you’re certainly not going to get a good return on the investment you made in your data visualization solution.

Make sure you plan exactly what business questions you want answered, and what data you’ll need to get there. Effort spent at the start of the process cleaning and transforming your data will pay off in your BI results.
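As a simple illustration, a first pass at data preparation might look something like the sketch below. It assumes Python with pandas and a hypothetical customers.csv export; the file, column names and rules are placeholders for whatever your own business questions require.

```python
import pandas as pd

# Hypothetical raw export; replace with your own source and columns
raw = pd.read_csv("customers.csv")

# Standardize obvious formatting issues before they reach the BI tool
raw["email"] = raw["email"].str.strip().str.lower()
raw["country"] = raw["country"].str.upper()

# Drop exact duplicates and rows missing the fields your reports depend on
prepared = (
    raw.drop_duplicates()
       .dropna(subset=["customer_id", "signup_date"])
)

# Enforce types so date-based charts aggregate correctly
prepared["signup_date"] = pd.to_datetime(prepared["signup_date"], errors="coerce")

prepared.to_csv("customers_prepared.csv", index=False)
```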

2. Thinking that self-service analytics is best

Self-service analytics, when provisioned properly, provides access to the broadest universe of data in an organization. This is helpful: if an analyst needs any piece of data, they can expect to find it somewhere.

However, this can become problematic as the number of datasets, and their complexity, increases. The challenge then becomes finding the right data and understanding the relationships between the many datasets. Beyond a certain level of complexity, self-service environments can be very time-consuming and frustrating to work with.

In such cases, the best approach is to ensure that all raw data is transformed, prepared and staged in a data store that provides friendly and efficient access to analysts.
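To make that staging step concrete, the minimal sketch below loads prepared data into a store that analysts can query directly. It assumes Python, pandas and SQLite purely for illustration; in practice the target would more likely be a data warehouse, and the file and table names are hypothetical.

```python
import sqlite3

import pandas as pd

# Hypothetical dataset produced by an upstream preparation step
prepared = pd.read_csv("customers_prepared.csv")

# Stage it where analysts can query it directly (SQLite stands in for a warehouse)
with sqlite3.connect("analytics_staging.db") as conn:
    prepared.to_sql("customers", conn, if_exists="replace", index=False)

    # Analysts work against the staged table rather than the raw exports
    summary = pd.read_sql(
        "SELECT country, COUNT(*) AS customers FROM customers GROUP BY country", conn
    )
    print(summary)
```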

3. Starting with a complicated solution

Don’t run before you can walk. In general, starting with a complicated solution opens you up to a greater risk of failure. We recommend that you go the opposite direction and start small.

Starting small generally means that rather than worrying about an expensive BI or analytical tool, you create reports that help the business team to understand their data. Once you outgrow your reports you can then start looking at more innovative ways to drive insights about your company.

If you make data the focal point from the ground up, you will be in a better position of understanding the data you currently manage and be able to drive better insights from that data.

4. Choosing the wrong tool/solution

Picking the wrong tool for the job will not only set you back in terms of time, but it will also set you back on the bottom line. Each tool/solution is different, so you need to fully understand the problem you are looking to solve and the outcome you are hoping to achieve.

You wouldn’t use a screwdriver to hammer a nail into a piece of wood, would you? Well, you could, but it’s probably not the most efficient tool if you need to do it 100 times.


5. New systems, applications and technologies

In today’s world, businesses are constantly looking to optimize their systems to understand their data, expose it flexibly and extract maximum value from it.

As a result, it is important to factor in these eventualities and be as prepared as possible to deal with:

  • Short-term requests from management
  • System and data structure changes that might be forced on you in the future
  • New systems coming onstream that need to be incorporated

6. Not understanding your infrastructure

Most data integration projects involve IT and require a thorough understanding of your infrastructure. There are many moving parts to a project, and these all need to be understood and planned in order to avoid unforeseen delays mid-project.

You should especially consider areas such as:

  • Appropriately configured firewall rules that allow connection to external data sources
  • Suitable access permissions to all systems
  • Knowledge of existing data structures, especially those in other domains or departments
  • Impact of temporary unavailability of source data on your ingestion processes (see the retry sketch after this list)
  • Knowledge of governance and compliance processes, such as password rotations, and how they impact ingestion
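On the unavailability point in particular, ingestion jobs need to cope with sources that are briefly unreachable, for example during maintenance windows or credential rotations. A minimal retry-with-backoff sketch, in Python with the requests library and a hypothetical source URL, might look like this:

```python
import time

import requests

SOURCE_URL = "https://example.internal/api/orders"  # hypothetical source endpoint

def fetch_with_retry(url, attempts=5, base_delay=2.0):
    """Retry a temporarily unavailable source before failing the ingestion run."""
    for attempt in range(1, attempts + 1):
        try:
            response = requests.get(url, timeout=30)
            response.raise_for_status()
            return response.json()
        except requests.RequestException as exc:
            if attempt == attempts:
                raise  # give up and let the scheduler alert on the failure
            delay = base_delay * 2 ** (attempt - 1)
            print(f"Source unavailable ({exc}); retrying in {delay:.0f}s")
            time.sleep(delay)

orders = fetch_with_retry(SOURCE_URL)
```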

7. Not understanding your data quality

Data quality is one of the most underestimated properties of data. Data in any system that has been in production for a while can have all sorts of data quality issues. These can range from simple issues such as typos through to missing or invalid data.

Data owners often have only a vague idea about the overall quality of their data and its impact on subsequent data-oriented processes. While they will clearly understand the more obvious issues, they may be completely unaware of more complex or legacy problems.

It's a very good idea to do a full data quality evaluation of your production data early on in a data-centric project. This can be complicated by security restrictions, but in our experience, data in a test environment never fully captures the depth and complexity of the issues that appear in production systems.
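A data quality evaluation does not have to be elaborate to be useful. The first-pass profile sketched below, in Python with pandas over a hypothetical extract, already surfaces the null rates, duplicates and out-of-range values that tend to surprise data owners; the file and column names are assumptions.

```python
import pandas as pd

# Hypothetical production extract; in practice it may need anonymizing first
df = pd.read_csv("orders_extract.csv")

# Per-column profile: types, how much is missing, how many distinct values
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_pct": (df.isna().mean() * 100).round(1),
    "distinct": df.nunique(),
})
print(profile)

# Cross-cutting checks that per-column stats miss
print("duplicate rows:", df.duplicated().sum())
if "order_total" in df.columns:
    print("negative totals:", (df["order_total"] < 0).sum())
```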


8. Inadequate system knowledge

Data applications, especially mission-critical ones, tend to be in production for extended periods of time as companies typically do not want to invest in technology unless absolutely necessary.

This means that institutional knowledge of applications is often lost due to inadequate documentation or staff turnover.

A lack of system knowledge can negatively affect data migration or integration projects due to over-optimistic estimates or data mapping issues that only manifest themselves in later stages of the project.

As such, a lack of knowledge can be very expensive, and poor estimates or deficient data mapping exercises can lead to costly project restarts and substantial budget overruns.

9. Custom-coded solutions

Companies regularly take a coding approach when working with data. This can work perfectly well in the short term or for simpler projects. However, there are some important considerations over time.

  • As the amount of code grows, maintainability becomes a serious challenge
  • Logging is typically an afterthought, so when issues arise there is no diagnostic information just when you need it most (see the sketch after this list)
  • Integration with new technology is slow to implement
  • Performance is rarely a consideration in the early days, but years later it often is, and serious bottlenecks can develop
  • Performance bottlenecks can require a full refactoring that can take a great deal of time
  • Bugs can and will happen, and debugging can be hard and cause downtime or interruptions
  • Developers carry a lot of knowledge in their heads and when they leave they take it with them
  • Code is often poorly documented and depends on institutional knowledge for maintenance; when those familiar with the codebase leave, the company is left with major maintenance headaches
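To make the logging point concrete, here is a minimal sketch of what building diagnostics in from day one can look like. It uses Python's standard logging module and a hypothetical transformation step rather than any particular framework.

```python
import logging

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s %(message)s",
)
log = logging.getLogger("ingest.orders")

def transform(records):
    """Hypothetical transformation step that records what it did and what it skipped."""
    good, skipped = [], 0
    for rec in records:
        try:
            good.append({"id": int(rec["id"]), "total": float(rec["total"])})
        except (KeyError, ValueError) as exc:
            skipped += 1
            log.warning("skipping record %r: %s", rec, exc)
    log.info("transformed %d records, skipped %d", len(good), skipped)
    return good

transform([{"id": "1", "total": "9.99"}, {"id": "2"}])
```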

10. Changing requirements or scope

It is not uncommon for the project scope to change during a project’s lifetime, which is why so many IT projects end up over budget.

One of the most common reasons for this is that the involved applications or data sources are not necessarily fully understood or taken into account during the analysis phase. This is compounded by the fact that documentation of the systems involved is often incomplete, leading to guesses rather than estimates of what is required to implement the business logic. Such guesses are normally overly optimistic.

The best way to avoid this is to be thorough and honest during the design phase and to ensure that all stakeholders are invited to contribute to the scope discussions. While this can be difficult and time-consuming in larger organizations, the benefits can easily outweigh the expense of a slightly longer design phase.

Knowing the pitfalls to look out for when planning your business intelligence project will help you avoid them. Ultimately, avoiding the pitfalls will mean you have a better chance of delivering a successful, profitable project.  
