Businesses large and small build mobile apps, web applications, and IoT projects to power their IT infrastructures. To achieve the scalability and speed that mission-critical systems demand, many of them turn to in-memory computing (IMC). It is therefore no surprise that in-memory computing platforms are trending. Moreover, IMC technology has evolved toward memory-centric architectures, which deliver greater ROI and more flexibility across different data sets.
The limitations of disk-based platforms became apparent back in the second half of the 20th century. On transactional databases, analytical workloads degraded database performance, which created the need for separate analytical databases.
In this decade, businesses have accelerated their processes to kick off a wide range of digital transformation initiatives, meet real-time regulatory requirements, and deploy omnichannel marketing strategies. However, ETL pipelines introduce latency that rules out real-time analysis and action. In-memory computing solutions that support HTAP (hybrid transactional/analytical processing) address this by analyzing and processing data in real time, directly against the live transactional data.
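The contrast with ETL can be sketched in a few lines: in an HTAP-style store, transactional writes and analytical queries run against the same live, in-memory data set, so no copy-and-transform step sits between them. The `HybridStore` class and its method names below are purely illustrative, not any vendor's API.

```python
from collections import defaultdict
from threading import Lock

class HybridStore:
    """Toy HTAP illustration: transactional writes and analytical
    reads share one in-memory data set, with no ETL in between."""

    def __init__(self):
        self._rows = {}   # order_id -> (customer, amount)
        self._lock = Lock()

    def insert_order(self, order_id, customer, amount):
        # Transactional path: a point write under a lock.
        with self._lock:
            self._rows[order_id] = (customer, amount)

    def revenue_by_customer(self):
        # Analytical path: aggregate directly over the live data.
        totals = defaultdict(float)
        with self._lock:
            for customer, amount in self._rows.values():
                totals[customer] += amount
        return dict(totals)

store = HybridStore()
store.insert_order(1, "acme", 100.0)
store.insert_order(2, "acme", 50.0)
store.insert_order(3, "globex", 75.0)
print(store.revenue_by_customer())  # aggregates reflect the writes immediately
```

In a disk-based setup, the aggregation would instead run on an analytical database populated by a periodic ETL job, so its answer would lag the transactions.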
In the past, RAM was costly and servers were considerably slower, so the ability to cache data in RAM and process it quickly enough to remove latency was limited. Distributed computing strategies, such as deploying in-memory data grids across commodity servers, allowed scaling of the available CPU and RAM, but RAM remained expensive.
With time, however, RAM costs have decreased. In addition, 64-bit processors and richer APIs have enabled in-memory data grids to integrate with existing applications and data layers, offering high availability, scalability, and in-memory speeds.
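The scaling idea behind an in-memory data grid can be shown with a toy hash-partitioned store: each key is deterministically assigned to one node, so adding commodity servers grows the pool of RAM and CPU together. The `DataGrid` class and node names are hypothetical, sketched only to illustrate the partitioning.

```python
import hashlib

class DataGrid:
    """Toy in-memory data grid: keys are hash-partitioned across
    nodes, so each commodity server holds a slice of the data."""

    def __init__(self, node_names):
        self.nodes = {name: {} for name in node_names}  # per-node RAM
        self.names = list(node_names)

    def _owner(self, key):
        # Stable hash of the key picks the owning node.
        digest = hashlib.md5(str(key).encode()).hexdigest()
        return self.names[int(digest, 16) % len(self.names)]

    def put(self, key, value):
        self.nodes[self._owner(key)][key] = value

    def get(self, key):
        return self.nodes[self._owner(key)].get(key)

grid = DataGrid(["node-a", "node-b", "node-c"])
for i in range(100):
    grid.put(f"user:{i}", {"id": i})
# The 100 entries are spread across the three nodes' memory.
```

Real grids add replication and rebalancing on top of this placement scheme; a plain modulo hash is the simplest possible stand-in for those mechanisms.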
In parallel, in-memory databases came into production as replacements for earlier disk-based databases. Despite the progress these steps represented, they unintentionally added fragmentation and complexity to the in-memory computing market.
More recently, in-memory databases, in-memory data grids, machine learning, streaming analytics, ANSI-99 SQL, and ACID transactions have converged into single, reliable IMC platforms. These platforms are easier to deploy and use than the point solutions that each offered a single product capability, and they have significantly cut implementation and operating expenses. They have also driven a dramatic shift toward scaling out and speeding up existing applications, and toward designing new applications on memory-centric architectures, across industries such as healthcare, retail, SaaS, software, and the Internet of Things.
How In-Memory Computing Solved Real-World Issues
Sberbank, the largest Russian bank, struggled with digital transformation in the past. The bank wanted to support mobile and online banking, store 1.5 petabytes of data for real-time access, and serve its 135 million customers with a very large number of transactions every second. It also required ACID transactions to monitor and track activity, and singled out high availability as a core requirement. Using in-memory computing, the bank built a modern web-scale infrastructure of 2,000 nodes. Experts reckon that in-memory computing lets this infrastructure compete with the best supercomputers in the world.
Similarly, Workday is one of the best-known enterprise cloud vendors in the HR and finance market. It serves close to 2,000 customers, a significant portion of whom belong to the Fortune 500 and Fortune 50, and employs around 10,000 people. To deliver its SaaS solutions, Workday relies on IMC platforms to process more than 185 million transactions daily.
A crucial restriction of early in-memory computing solutions was that all of the data had to somehow “fit” in memory. Because keeping data in RAM is more expensive than storing most of it on disk, businesses usually opted against holding their entire data set in memory. Memory-centric architectures eliminate this issue: they can draw on other memory and storage media such as 3D XPoint, Flash memory, SSDs, and related technologies. The idea behind a memory-centric architecture is simply “memory-first”: recent and important data is kept in memory and on disk simultaneously, so in-memory speed can be attained, while the data set is allowed to exceed the amount of RAM. As a result, the complete data set can reside on disk, and the system still delivers robust performance by processing data in memory when possible and against the underlying disk store when necessary.
Keep in mind that this is different from caching disk-based data in memory. Because the data set may exceed available memory, companies can arrange their data so that the entire set resides on disk while the most valuable, frequently accessed data is also held in memory, and less critical data stays on disk alone. In this way, memory-centric architectures let companies improve performance while reducing infrastructure expenses.
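A minimal sketch of this memory-first layout, with plain Python dicts standing in for the RAM and disk tiers (the `MemoryCentricStore` class and all names are illustrative assumptions, not a real product's API):

```python
from collections import OrderedDict

class MemoryCentricStore:
    """Illustrative memory-first store: every write lands in both tiers,
    the full data set lives in the disk tier (a dict standing in for a
    disk store), and only the hottest entries stay in a bounded RAM tier."""

    def __init__(self, ram_capacity):
        self.ram = OrderedDict()   # hot tier, LRU-evicted
        self.disk = {}             # full data set (disk stand-in)
        self.capacity = ram_capacity

    def _touch(self, key, value):
        # Mark a key most-recently-used; evict the coldest if RAM is full.
        self.ram[key] = value
        self.ram.move_to_end(key)
        if len(self.ram) > self.capacity:
            self.ram.popitem(last=False)

    def put(self, key, value):
        self.disk[key] = value     # memory-first: disk always has a copy
        self._touch(key, value)

    def get(self, key):
        if key in self.ram:        # hit: served at in-memory speed
            self.ram.move_to_end(key)
            return self.ram[key]
        value = self.disk[key]     # miss: fall back to the disk tier
        self._touch(key, value)    # promote it into memory
        return value

store = MemoryCentricStore(ram_capacity=2)
for i in range(5):
    store.put(f"k{i}", i)
# The data set (5 entries) exceeds the RAM tier (2 entries),
# yet every key remains readable and hot keys are served from memory.
```

Unlike a plain cache in front of a disk database, the write path here treats memory and disk as peers: both tiers are updated together, and the RAM tier is simply the bounded hot subset of one data set.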
Essentially, a memory-centric architecture removes the need to wait for RAM to be reloaded after a reboot. Depending on network speed and database size, these delays can consume a great deal of time and violate SLAs. When the system can compute on data directly from disk while it is still warming up and reloading memory, recovery is fast: performance at first matches a disk-based system, then quickly improves as data is reloaded into memory, until all operations once again run at in-memory speeds.
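That recovery behavior can be sketched as a read-through between tiers: after a restart the RAM tier starts empty, reads are answered from the surviving disk tier immediately, and each read re-warms memory as a side effect. The `RecoveringStore` class and its names are hypothetical, used only to make the warm-up visible.

```python
class RecoveringStore:
    """Sketch of post-reboot behavior in a memory-centric design:
    the disk tier survives the restart, the RAM tier starts empty,
    and reads lazily repopulate memory while staying available."""

    def __init__(self, disk):
        self.disk = dict(disk)   # tier that survived the restart
        self.ram = {}            # empty right after reboot

    def get(self, key):
        if key in self.ram:
            return self.ram[key], "memory"
        value = self.disk[key]   # still answerable during warm-up
        self.ram[key] = value    # lazy re-warm as a side effect
        return value, "disk"

store = RecoveringStore({"a": 1, "b": 2})
print(store.get("a"))   # first read after restart comes from the disk tier
print(store.get("a"))   # the same key is now served from memory
```

The second tag flipping from "disk" to "memory" is the warm-up the text describes: availability is immediate, and speed converges to in-memory performance as keys are touched.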