In-memory databases improve the performance of analytics and transaction processing. But the high investment must be justified.
Ten years ago, it would have been unimaginable that enterprise database management systems (DBMS) would run entirely in memory. In recent years, however, RAM prices have continued to fall, and we have reached a point where in-memory is no longer prohibitively expensive. The lower prices have opened up new ways to configure database systems that take advantage of main memory.
It’s no longer just startup companies that turn to in-memory computing to meet high performance demands. Leading database and software vendors also promote database technologies that support in-memory computing. Major vendors such as IBM, Oracle, SAP and Teradata invest heavily in courting established companies that are considering integrating in-memory databases into their own IT systems.
In-memory databases accelerate application performance in two ways. First, keeping data in memory rather than on slower disks minimizes, if not eliminates, database query latency. Second, the database architecture uses the available memory more efficiently.
For example, many in-memory technologies use a column-based rather than a row-based layout. Column values compress better, and queries run faster because all the values of a column can be scanned in one pass.
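A minimal sketch of that idea (illustrative only, not any vendor's actual storage engine): the same table stored row-wise and column-wise, with a simple run-length encoding to show why a column of similar values compresses so well. The table data and the `run_length_encode` helper are invented for the example.

```python
# The same table, stored two ways.
rows = [
    ("2024-01-01", "DE", 100),
    ("2024-01-01", "DE", 120),
    ("2024-01-01", "FR", 90),
    ("2024-01-02", "DE", 110),
]

# Column-wise layout: one contiguous list per attribute.
columns = {
    "date":    [r[0] for r in rows],
    "country": [r[1] for r in rows],
    "amount":  [r[2] for r in rows],
}

def run_length_encode(values):
    """Collapse adjacent duplicates into (value, count) pairs."""
    encoded = []
    for v in values:
        if encoded and encoded[-1][0] == v:
            encoded[-1] = (v, encoded[-1][1] + 1)
        else:
            encoded.append((v, 1))
    return encoded

# Repeated values in a single column collapse to a few pairs;
# interleaved row data would not.
print(run_length_encode(columns["date"]))
# -> [('2024-01-01', 3), ('2024-01-02', 1)]

# An aggregate query scans only the one column it needs,
# not the full rows:
print(sum(columns["amount"]))  # -> 420
```

The same compression trick applied to the row-wise list would find almost no adjacent duplicates, which is the intuition behind columnar in-memory stores.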
Does in-memory pay off?
It is hard to argue against faster applications and a more efficient data organization. But when should IT and data management consultants recommend in-memory? In other words, when do the demands of transaction processing and business analytics justify the investment in technology, resources, and new expertise that an in-memory framework requires?
The practical side of this question means weighing the need for improved database performance against the cost of implementing an in-memory platform. Although the price of RAM has dropped significantly, systems with large amounts of memory are still no bargain.
That is especially true compared with database servers that rely primarily on hard disks. Management will probably be surprised when it sees the invoice for in-memory technology. For an in-memory database to pay off, you need to find applications whose characteristics make the most of the technology.
You can find the answer by assessing whether the enterprise actually needs more data processing capacity. The business value that results from reduced database response times also plays a role.
Consider the following supply chain example from the perspective of in-memory software: real-time analysis of a variety of data streams can lead to faster routing and delivery decisions, so that goods are in the right place at the right time. These streams include, for example, inventory data from a warehouse, the locations of retailers, information about goods in transit on lorries or trains, and traffic updates and weather forecasts. If this increases sales, it can justify an investment in in-memory technology.
Look carefully before buying
Be sure to look at the general characteristics of your business. In-memory databases pay off if one or more of the following statements describe your work environment:
Open to New IT Investments
Management must be confident about investing in hardware with enough memory to meet the application’s data processing needs. Scaling systems to support in-memory computing costs more than a database server with many hard drives.
Analytically Agile
In-memory systems support the reporting and analytics applications that drive business processes and outcomes, so end users can make decisions faster. For example, if you could update sales forecasts hourly instead of weekly, you could build near-real-time product pricing models, which in turn has a positive impact on profitability. But that only holds if the changed prices can also be communicated quickly.
Support for mixed environments
If transaction and analysis applications can access the same database, real-time analysis becomes possible. With a conventional relational database, however, resource conflicts can lead to performance problems, primarily because of the latencies incurred when accessing data stored on hard drives. With an in-memory configuration, latency is a minor problem.
Knowing the data
In-memory technology can also be a valuable tool if most database queries touch only a small part of the database. According to a whitepaper from data warehouse and database vendor Teradata, 43 percent of queries touch only one percent of the available data, and nearly 92 percent of queries access no more than 20 percent of it. If you identify this frequently used, so-called “hot” data and keep it in main memory, response times can be shortened drastically.
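The hot-data idea can be sketched as a small in-memory cache sitting in front of slower disk storage. This is a hypothetical illustration, not any specific product's caching layer; the `HotDataCache` class, its `fetch_from_disk` callback, and the skewed access pattern are all invented for the example.

```python
from collections import OrderedDict

class HotDataCache:
    """Keep the most recently used rows in memory; fall back to disk."""

    def __init__(self, fetch_from_disk, capacity=1000):
        self.fetch = fetch_from_disk   # callback for cold (uncached) reads
        self.capacity = capacity
        self.cache = OrderedDict()     # key -> row, in LRU order
        self.hits = self.misses = 0

    def get(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)     # mark as recently used
            self.hits += 1
            return self.cache[key]
        self.misses += 1
        row = self.fetch(key)               # slow path: read "from disk"
        self.cache[key] = row
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
        return row

# Skewed access pattern: most queries hit a handful of hot keys.
disk = {k: f"row-{k}" for k in range(100)}
cache = HotDataCache(disk.__getitem__, capacity=10)
for _ in range(50):
    for k in (1, 2, 3):          # "hot" keys, queried over and over
        cache.get(k)
cache.get(99)                    # occasional cold read
print(cache.hits, cache.misses)  # -> 147 4
```

With the skewed distribution the Teradata numbers describe, almost all reads are served from memory even though the cache holds only a fraction of the data.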
If a company’s business processes benefit from real-time capabilities, mixed workloads, and significantly faster reporting and analysis, then the company is a candidate for an in-memory database. In most cases, you should weigh an investment in in-memory software against your IT budget and business goals. That includes understanding how key areas of enterprise performance can be improved by faster transaction processing, as well as by the reports and short-notice query results that in-memory data processing makes possible.