Introduction to Caching – A Performance Booster

Overview: Performance is critical for any application, whether it is stand-alone, web-based, distributed, or enterprise level. If an application performs poorly, the business it supports suffers. Caching is one of the most important techniques to consider. The purpose of this article is to explore different caching concepts and their implementations.

Rather than dwelling on caching theory, we will move quickly through the different caching strategies, which should help you select the right one for your own application.

What is caching?

First, a brief and simple explanation of caching, aimed at readers who are new to the topic and want to start implementing it.

Experienced developers, please bear with us. You can also add value by sharing your knowledge in the comment section below; valuable notes will be incorporated into the article.

Caching is a concept that can be implemented in different ways. It means temporarily storing the most frequently used data in a memory location or buffer. Caching is always a short-term, temporary process: it reduces the number of hits to the original storage and therefore improves performance.
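The idea can be sketched in a few lines of Java: a map sits in front of a slow backing store, and a hit counter shows that repeated reads no longer touch the store. All names here (the store, the key format) are hypothetical and purely for illustration.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal illustration of caching: an expensive lookup is performed once,
// stored in a map, and served from memory on every later request.
public class CacheDemo {
    static int storeHits = 0;                       // hits to the "original storage"
    static final Map<String, String> cache = new HashMap<>();

    // Simulates a slow backend lookup (e.g. a database or remote service).
    static String loadFromStore(String key) {
        storeHits++;
        return "value-for-" + key;
    }

    // Checks the cache first; only falls through to storage on a miss.
    static String get(String key) {
        return cache.computeIfAbsent(key, CacheDemo::loadFromStore);
    }

    public static void main(String[] args) {
        get("user:42");
        get("user:42");
        get("user:42");
        System.out.println("storage hits: " + storeHits); // 1, not 3
    }
}
```

Three reads of the same key cost only one trip to the backing store; the other two are served from local memory.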

The implementation of caching, however, can involve a lot of complexity. We will also highlight those areas in the following sections.

Must read – How does the Java Caching System (JCS) work?

Caching strategies

Here we discuss the caching mechanisms most widely used in the industry. Beyond the following list there are a few other approaches as well, but we focus on the major implementations.

  • Scaling (horizontal and vertical): In vertical scaling, more resources such as RAM, CPU, and disk are added to a single machine, which increases the memory available for caching. In horizontal scaling, more machines are added, which improves caching efficiency across the cluster. Both help with application-level caching only.
  • In-process: In this type of caching, objects are stored in the same instance as the application, so there is a strong binding between the cached objects and the application instance: they share the same local memory. This is the simplest way to implement caching, but if there are multiple instances of the application, keeping their caches synchronized becomes a big challenge.
  • In-memory distributed: This is an external caching implementation with a separate cache server. Application instances are not tightly coupled to the cached objects (unlike in-process caching); instead, each application instance uses a cache client to access the cache server and interact with the data. The data is stored as key/value pairs, and this type of cache is typically deployed in a clustered environment that presents a single logical view.

But how does the application know where the cached data lives in a clustered environment? The answer is that caching clients use hashing algorithms to determine each key's location.
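That key-to-node lookup can be sketched as follows. Real distributed-cache clients typically use consistent hashing so that adding or removing a node remaps only a small fraction of keys; the simple modulo scheme and node addresses below are illustrative assumptions, not a real client library.

```java
import java.util.List;

// Sketch of how a distributed-cache client picks a server for a key
// without asking a central directory: hash the key, map the hash to a node.
public class NodeLocator {
    // Hypothetical cache-server addresses, for illustration only.
    static final List<String> nodes =
        List.of("cache-1:11211", "cache-2:11211", "cache-3:11211");

    // The same key always hashes to the same node, so every application
    // instance independently agrees on where a given entry lives.
    static String nodeFor(String key) {
        int index = Math.floorMod(key.hashCode(), nodes.size());
        return nodes.get(index);
    }

    public static void main(String[] args) {
        System.out.println("user:42 -> " + nodeFor("user:42"));
    }
}
```

Because the mapping is deterministic, no instance needs to ask "where is this key?" at runtime; the hash itself is the answer.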

  • In-memory database: In-memory databases are quite popular for performance improvement. The data is stored in RAM rather than on disk, often in a compressed format, and it can still be accessed using SQL.
  • Level 1 cache: An L1 cache (as in Hibernate, for example) is specific to a session and exists only as long as the session does. All transactional data belonging to a session is stored within that session and cannot be shared across sessions.
  • Level 2 cache: An L2 cache, on the other hand, is scoped to the session factory. It can be accessed by all sessions created under that factory and is destroyed when the session factory is closed. This cache sits one level above the L1 cache.
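The L1/L2 relationship can be sketched like this. These class and method names are illustrative, not a real Hibernate API: each session keeps a private L1 map that dies with the session, while a shared L2 map outlives individual sessions and is visible to all of them.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hedged sketch of two-level caching: L1 is per-session, L2 is shared.
public class TwoLevelCache {
    // L2: shared across all sessions (session-factory scope).
    static final Map<String, String> l2 = new ConcurrentHashMap<>();

    static class Session {
        // L1: private to this session; discarded when the session ends.
        private final Map<String, String> l1 = new HashMap<>();

        String get(String key) {
            String v = l1.get(key);                // 1. try L1 first
            if (v == null) {
                v = l2.get(key);                   // 2. fall back to L2
                if (v != null) l1.put(key, v);     //    promote into L1
            }
            return v;
        }

        void put(String key, String value) {
            l1.put(key, value);
            l2.put(key, value);                    // write through to L2
        }
    }

    public static void main(String[] args) {
        Session s1 = new Session();
        s1.put("emp:1", "Alice");
        // A brand-new session has an empty L1 but still finds the entry in L2.
        Session s2 = new Session();
        System.out.println(s2.get("emp:1"));       // Alice
    }
}
```

The second session never saw the write, yet it gets a hit, which is exactly what the factory-level scope of an L2 cache buys you.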


Challenges: As discussed above, caching can be simple or complex in nature. A simple cache is easy to implement, but a complex one takes time, effort, and careful design.

Following are some of the common challenges faced by developers and architects.

  • Synchronization of cached data in a distributed environment
  • Proper eviction policy
  • Proper refreshing mechanism
  • Performance and memory usage
  • Concurrency issues
  • Replication
  • Reliability, availability and scalability
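To make one of these challenges concrete, here is one common answer to the eviction-policy question: a bounded LRU (least recently used) cache. Java's standard `LinkedHashMap` supports this directly via access ordering and `removeEldestEntry`; the capacity of 2 below is just for demonstration.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A small LRU cache: once size exceeds capacity, the least recently
// used entry is evicted automatically on the next insertion.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LruCache(int capacity) {
        super(16, 0.75f, true);   // true = access order, required for LRU
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict once we exceed capacity
    }

    public static void main(String[] args) {
        LruCache<String, Integer> cache = new LruCache<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");           // touch "a" so "b" becomes the eldest
        cache.put("c", 3);        // evicts "b"
        System.out.println(cache.keySet()); // [a, c]
    }
}
```

Real cache servers offer several policies (LRU, LFU, TTL-based expiry, and so on); choosing and tuning one for your access pattern is exactly the design effort the list above refers to.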


Conclusion: In this article we explored different aspects of caching, covering various caching strategies and where each one fits. There are other approaches as well, but the basic concepts remain the same as discussed above. We hope this article helps you understand caching.

Written by Kaushik (Founder and CEO), a Technical Architect by profession with more than 20 years of experience in the IT industry. Passionate about the technology world and interested in software design, open source technologies, Big Data, AI, and technology consulting. He has been teaching and mentoring IT professionals for more than 12 years, and is also involved in online/offline training, interviewing, consulting, mentoring, and coaching.

If you have any query, please email me; I would love to hear from you.
