In-Memory Analytics is
an emerging technology that offers solutions to help Business
Intelligence (BI) teams make faster, smarter decisions. In-memory processing stores
all of the data and calculations in RAM. With storage in RAM, firms get
faster processing and quicker access to the data and
calculations that answer BI questions, and they can translate those answers
into informed decisions. By bypassing slower
queries and indexes stored on hard disks, and by exploiting the growing
capabilities of the PC, businesses can use their processing power more
efficiently [1]. Until recently this was not practical: most PCs were
32-bit and could address at most 4GB of memory, which severely limited
in-memory analytics. Modern PCs are 64-bit and
can support up to 1TB of memory, which allows large amounts of data to
be cached entirely in the computer's RAM. [2]
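To make the idea concrete, here is a minimal sketch (not any vendor's product or API) using Python's built-in SQLite: the same table is built once on disk and once entirely in RAM, and the same aggregate query is timed against each. The table and column names are made up for the example, and on a modern OS the file-system cache can narrow the gap, so treat the timings as illustrative only.

```python
# Illustrative sketch of the in-memory idea: keep the working dataset in
# RAM so analytic queries avoid disk I/O. Table/column names are invented.
import os
import sqlite3
import tempfile
import time

ROWS = 200_000

def build(conn: sqlite3.Connection) -> None:
    # Load a synthetic "sales" table into whichever store backs this connection.
    conn.execute("CREATE TABLE sales (region INTEGER, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        ((i % 10, float(i % 997)) for i in range(ROWS)),
    )
    conn.commit()

def timed_query(conn: sqlite3.Connection):
    # A typical BI-style aggregate: total sales amount.
    start = time.perf_counter()
    (total,) = conn.execute("SELECT SUM(amount) FROM sales").fetchone()
    return total, time.perf_counter() - start

# On-disk database: queries may have to touch the hard disk.
with tempfile.TemporaryDirectory() as tmp:
    disk = sqlite3.connect(os.path.join(tmp, "sales.db"))
    build(disk)
    disk_total, disk_secs = timed_query(disk)
    disk.close()

# In-memory database: the same data cached entirely in RAM.
mem = sqlite3.connect(":memory:")
build(mem)
mem_total, mem_secs = timed_query(mem)
mem.close()

assert disk_total == mem_total  # same answer, different storage medium
print(f"disk: {disk_secs:.4f}s  memory: {mem_secs:.4f}s")
```

The point of the sketch is the design choice, not SQLite itself: commercial in-memory platforms apply the same principle at far larger scale, with columnar layouts and compression layered on top.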
According to
Gartner's Hype Cycle for Emerging Technologies report [3], published in August
2013, this technology could reach its plateau in less than two years. On the
hype cycle it is shown as currently emerging from the Trough of
Disillusionment. Many of the articles I researched appear to have been written
while in-memory analytics was in the Peak of Inflated Expectations. As
in-memory analytics matures, the expectations around it
will become more realistic. This will be a positive development for the field,
as resources will be dedicated to real-world solutions rather than the pipe
dreams of the tech world.
[Image: Gartner Hype Cycle for Emerging Technologies, 2013 (source: http://www.gartner.com/newsroom/id/2575515)]
Many
companies have started offering solutions for in-memory or flash analysis,
including IBM, SAS, SAP, and Oracle. SAP and Oracle
currently stand out as having the most robust products. SAP sells HANA (High-Performance
Analytic Appliance), while Oracle markets Big Data and Exalytics. These
solutions are exciting, and the space is becoming increasingly
competitive. SAP is currently winning the memory war, reporting that HANA is
scalable to 16TB. I found this buyer's guide write-up by Drew Robb to be
an informative read. Robb goes in depth to compare many of the most prominent
in-memory analytics products in the marketplace today. [4] http://www.enterpriseappstoday.com/business-intelligence/in-memory-analytics-buyers-guide-oracle-big-dataexalytics-appliances-vs.-sap-hana.html
There are some caveats
to keep in mind as we climb out of the Trough of Disillusionment.
This isn't going to be a silver bullet for all companies; in fact, it is
safe to say that this is not for everyone. As more users seek to
access large amounts of data, RAM will need to be increased, and although its
cost is declining it still isn't free. Companies that need answers fast, so that
they can implement time-critical decisions, will see a
greater return than larger, slower-moving companies that have the luxury of
examining their decisions with a fine-toothed comb. If you are a large company
that cannot immediately act on the data, then you will not benefit from the time
savings gained from in-memory analytics. However, if you are a firm
that profits from every second you are ahead of your competitors, then
in-memory analytics should be explored and invested in.