I have written a few posts on how in-memory solutions are being adopted in industries such as financial services, travel & logistics, and big pharma to overcome many kinds of Big Data challenges. Here I’d like to share an independent report from a leading analyst, Aberdeen Group, that reinforces the view that in-memory computing infrastructure is essential for successful Big Data initiatives—especially those related to volume and velocity.
In its report “In-memory Computing: Lifting the Burden of Big Data,” Aberdeen arrived at some exciting conclusions about the link between in-memory computing and Big Data success. The report examined 196 organizations worldwide that are currently dealing with Big Data, of which 33 reported implementing in-memory computing solutions. These companies represent a broad cross-section of industries and sizes, and manage business data ranging from a handful of terabytes to multiple petabytes. In aggregate, the organizations that implemented in-memory computing were able to process over 3x the data at over 100x the speed compared with organizations that had not. Interestingly, overall employee satisfaction at these organizations also increased, contributing to better business growth and results.
At Terracotta, we have observed similar results first-hand at organizations that have implemented our own in-memory solutions. The key differences between the report’s data set and ours are that our deployments have been even more varied (in terms of enterprise size and breadth of verticals) and have typically been completed in 60 to 90 days.
With data growth accelerating, in-memory computing is attracting increased attention as a key strategy for making the most of Big Data. To learn more, download Aberdeen’s report.