STORY ON BUSINESS INTELLIGENCE – Part-4

After covering the factors pushing BI and the benefits of BI in Part 1, Part 2 and Part 3, it is time to look at the future. Did you know that in-memory analytics tools can be invaluable? Read on to learn more!

Are BI applications moving to the Cloud in the near future?

Cloud analytics is certainly a very attractive proposition, as it allows BI infrastructure (quite a considerable portion of the solution investment) to be shared across various customers.

There can be a certain amount of reticence on the part of organizations, especially larger ones, about putting their information databases on the cloud, but for SMEs this is a very acceptable proposition.

About in-memory analytics: the Big Data problem leads to big opportunities

The rise of the three V’s of Big Data, i.e. volume, velocity and variety, has become a big problem for banks. Though the issue has been high on the agenda of many banks over the past few years, the problem has only been worsened by exponential data volume growth, the need for real-time analytics, the requirement for improved decision-making capability and the push to reduce dependency on IT. Traditional business intelligence (BI) tools are unable to keep pace with these banking needs because of limitations such as data latency.

If you follow the trends in the BI space, you’ll notice that many analysts and vendors talk about in-memory technology. In-memory analytics is the talk of the town for multiple reasons. Speed of querying, reporting and analysis is just one; flexibility, agility, deeper insight and rapid prototyping are others. The fundamental idea of in-memory BI technology is the ability to perform real-time calculations without having to perform slow disk operations during the execution of a query.

In-memory technologies load the entire dataset into Random Access Memory (RAM) instead of reading it from disk (as in traditional BI). This removes the need to access the disk to run queries and yields a tremendous performance advantage, since query response times are far lower when the data is already in RAM. This approach also means less development time spent on data modeling, query analysis, cube building and table design, resulting in quicker implementation.
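To make the idea concrete, here is a minimal sketch in Python (using pandas and SQLite purely for illustration; it is not any vendor's implementation) that runs the same aggregation once through an on-disk database and once against data already held in RAM:

# A toy comparison: the same aggregation against an on-disk store and against RAM.
import sqlite3
import time
import pandas as pd

# Hypothetical sample data: one million transaction rows.
df = pd.DataFrame({
    "branch": ["north", "south", "east", "west"] * 250_000,
    "amount": range(1_000_000),
})

# Disk-based path: persist the table, then query it through the database engine.
con = sqlite3.connect("transactions.db")
df.to_sql("transactions", con, if_exists="replace", index=False)

t0 = time.perf_counter()
disk_result = pd.read_sql("SELECT branch, SUM(amount) FROM transactions GROUP BY branch", con)
print("disk-based query:", time.perf_counter() - t0, "seconds")

# In-memory path: the dataset already sits in RAM, so the same aggregation
# needs no disk I/O at query time.
t0 = time.perf_counter()
ram_result = df.groupby("branch")["amount"].sum()
print("in-memory query :", time.perf_counter() - t0, "seconds")

con.close()

The exact timings will vary from machine to machine, but the shape of the result is the point: the in-memory path skips disk I/O entirely at query time.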

Apart from growing data volumes, other factors have also compelled banks to adopt in-memory technology. Hardware innovations such as multi-core architectures, parallel servers and increased memory capacity, and software innovations such as column-centric databases, compression techniques and smarter handling of aggregate tables, have all contributed to the demand for in-memory products. Time and money are other strong influences: traditional BI takes comparatively longer to implement, whereas in-memory processing comes at a lower cost.
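As a rough illustration of the column-centric idea (a toy Python sketch, not how any particular product lays out its data), a column store keeps each field in its own contiguous array, which also makes simple compression schemes such as run-length encoding very effective on repetitive columns:

from itertools import groupby

# Row-oriented layout: each record keeps all of its fields together.
rows = [
    ("2024-01-01", "savings", 120.0),
    ("2024-01-01", "savings", 75.5),
    ("2024-01-01", "current", 310.0),
    ("2024-01-02", "current", 42.0),
]

# Column-oriented layout: each field becomes its own array, so a query that
# touches only 'amount' never has to read dates or account types.
columns = {
    "date": [r[0] for r in rows],
    "account_type": [r[1] for r in rows],
    "amount": [r[2] for r in rows],
}

def run_length_encode(values):
    # Collapse consecutive repeated values into (value, count) pairs.
    return [(value, len(list(group))) for value, group in groupby(values)]

print(run_length_encode(columns["account_type"]))
# [('savings', 2), ('current', 2)] -- low-cardinality columns shrink dramatically.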

Just so it is clear: the concept of in-memory business intelligence is not new. It has been around for many years. The only reason it has become widely known recently is that it was not feasible before 64-bit computing became commonly available. Before 64-bit processors, the maximum amount of RAM a computer could utilize was barely 4GB, which is hardly enough to accommodate even the simplest of multi-user BI solutions. Only when 64-bit systems became cheap enough did it become possible to consider in-memory technology as a practical option for BI.
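The 4GB ceiling follows directly from address width; a quick back-of-the-envelope calculation shows why 64-bit systems changed the picture:

# Bytes addressable by a 32-bit pointer versus a 64-bit pointer.
addressable_32_bit = 2 ** 32
addressable_64_bit = 2 ** 64

print(addressable_32_bit / 2 ** 30, "GiB")  # 4.0 GiB -- the old ceiling
print(addressable_64_bit / 2 ** 60, "EiB")  # 16.0 EiB -- effectively unlimited for BI workloads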

To put it another way, in-memory analytics tools can be invaluable. The technology allows large amounts of data to be analyzed in real time, or ‘on the fly’ as needed, ensuring issues are caught early and rectified or accounted for swiftly. This makes the business more reliable for its customers and prevents last-minute penalties in scenarios that are often avoidable. Quoting Gartner, “new capabilities are evolving in technology areas that take advantage of in-memory/high-speed analytical processing, extremely large datasets and cloud-based platform (providing infrastructure elasticity and easier integration/on boarding capabilities).”

Banks seem to be well aware of the benefits of in-memory analytics. A number of banks abroad are already leveraging it for market risk management, front office support, credit valuation adjustment, client reporting, fraud management and compliance, payments and more. While BI players like Microstrategy, QlikView, TIBCO Spotfire and Tableau Software have already made their mark in this space, Ramco is also entering it with ‘Minnal’, a scalable, powerful, flexible and lightweight in-memory application framework.

However, there are implications of keeping the data model in memory, such as having to reload it from disk into RAM every time the machine reboots, and not being able to use that machine for anything other than the particular data model in question because its RAM is fully consumed. In-memory solutions can also mean more servers in the database system. Consequently, there are RAM and server hardware acquisition costs, implementation costs, and power/cooling and administration costs involved. Hence, the approach should be to enable in-memory analytics selectively, so as to derive optimum benefit from RAM speed. These solutions also demand strong security, as large amounts of data can be downloaded by end users to their desktops, creating the threat of data being compromised.
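One practical way to approach that trade-off is a simple sizing check before committing to an in-memory deployment; the sketch below uses entirely hypothetical figures:

# Estimate whether a dataset fits comfortably within a server's RAM budget.
def fits_in_memory(row_count, avg_bytes_per_row, ram_gib, headroom=0.5):
    # Allow the raw data to use at most `headroom` of the available RAM,
    # leaving the rest for indexes, intermediate results and the OS.
    dataset_bytes = row_count * avg_bytes_per_row
    budget_bytes = ram_gib * 2 ** 30 * headroom
    return dataset_bytes <= budget_bytes

# Example: 500 million transaction rows at roughly 200 bytes each on a 256 GiB server.
print(fits_in_memory(row_count=500_000_000, avg_bytes_per_row=200, ram_gib=256))  # True (~93 GiB of data)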

In-memory analytics is identified as a strong – if not the strongest – solution and is likely to become the predominant architecture for tackling Big Data in the coming years. For this reason, the most innovative financial institutions are already adopting faster analytics, allowing them to remain compliant in this tough regulatory climate while gaining ground on the competition. The future certainly looks bright and promising. In-memory technology in itself is not a driver of BI growth, but it certainly cannot be ignored!