In-memory processing speeds up query response times, significantly improving the end user's experience. Using in-memory processing for analysis is not new; only recently, however, has the promise become a practical reality, thanks to the mainstream adoption of 64-bit architectures and the larger addressable memory space they enable. The adoption of in-memory applications will continue to gain momentum due to falling memory prices, faster processors, and the ability to hold more data in memory. Given this rapidly changing infrastructure landscape, it is now realistic to analyze very large data sets entirely in memory.

Using arcplan's Shared Query Cache for Performance

Many BI products feature in-memory capabilities that deliver high performance to business users but do not address security: when data is served from the cache, the security mechanism of the underlying database no longer checks privileges. arcplan has applied a security concept to its shared query cache function, delivering stronger security than other in-memory vendors' solutions. arcplan's shared query cache enforces Role-Based Access Control (RBAC) on cached data, so cached results are shared only among users with identical roles.
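
To make the role-based sharing model concrete, the following Python sketch shows one way a cache could be keyed by both the query and the user's role. The class and names used here (SharedQueryCache, get_or_fetch, the role strings) are illustrative assumptions, not arcplan's actual API.

    from typing import Any, Callable, Dict, Tuple

    class SharedQueryCache:
        """Illustrative in-memory query cache keyed by (role, query).

        Results are shared only among users who hold an identical role,
        so a cache hit never bypasses the role-based privileges that the
        underlying database would otherwise enforce.
        """

        def __init__(self) -> None:
            self._cache: Dict[Tuple[str, str], Any] = {}

        def get_or_fetch(self, role: str, query: str,
                         fetch: Callable[[str], Any]) -> Any:
            # The cache key includes the user's role, so two users can
            # share a cached result only if their roles are identical.
            key = (role, query)
            if key not in self._cache:
                # Cache miss: run the query against the database under
                # the caller's role and keep the result in memory.
                self._cache[key] = fetch(query)
            return self._cache[key]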

In general, in-memory processing gives customers faster analyses without requiring specific OLAP cubes and queries to be developed in advance. Even when cubes or OLAP systems are in place, pre-loaded in-memory data sets can still speed up repetitive user queries. In addition to faster response times for users, administrators benefit from lower database system load and network managers see less traffic.
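
As a usage sketch building on the hypothetical SharedQueryCache above, two users who share the same role reuse one cached result, so the repeated query never reaches the database, while a user with a different role triggers a separate fetch under that role's privileges. The role names and the run_on_database stand-in are assumptions for illustration only.

    cache = SharedQueryCache()

    def run_on_database(sql: str) -> list:
        # Stand-in for a real round trip to the underlying database.
        print("hitting database:", sql)
        return [("Q1", 1200), ("Q2", 1350)]

    sql = "SELECT quarter, revenue FROM sales"

    cache.get_or_fetch("sales_manager", sql, run_on_database)     # database is queried
    cache.get_or_fetch("sales_manager", sql, run_on_database)     # served from cache
    cache.get_or_fetch("regional_analyst", sql, run_on_database)  # different role: new query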


arcplan Shared Query Cache

This white paper explains arcplan's Shared Query Cache, which brings role-based security to in-memory processing.