Cache Memory

Cache

A cache is a hardware or software component that helps serve data which is either frequently requested or expensive to compute. The cache stores the computed response, which saves us from repeating the expensive operation.
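
For example, a tiny in-memory memoization sketch (the map-backed cache and the expensiveSquare method below are purely illustrative, not any particular cache product):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal sketch: memoizing an expensive computation in an in-memory map.
// The first call pays the cost; repeated calls are served from the cache.
public class SimpleCache {
    private static final Map<Integer, Long> CACHE = new ConcurrentHashMap<>();

    static long expensiveSquare(int n) {
        // computeIfAbsent runs the expensive work only on a cache miss
        return CACHE.computeIfAbsent(n, k -> {
            try { Thread.sleep(100); } catch (InterruptedException ignored) { } // simulate slow work
            return (long) k * k;
        });
    }

    public static void main(String[] args) {
        System.out.println(expensiveSquare(12)); // slow: computed and stored
        System.out.println(expensiveSquare(12)); // fast: served from the cache
    }
}
```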

Cache Invalidation & Eviction

Invalidation: The data in the cache does not stay there forever; it is volatile, which means the data can be removed/invalidated.

Cache invalidation is needed because the data you keep in your cache will change at some point in time. When the data changes, you need to update the cache, and the process of updating the cache and removing the old cached value is called cache invalidation.

TTL (Time To Live) is used for managing our cached values: we can set a limited time for which a value stays in the cache. For example, if you do an operation and store its value in the cache with a TTL of 8 minutes, then doing the same operation later might produce a response different from the previous one, so the older cached value should be cleared once the TTL expires.
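
A minimal TTL sketch could look like this (TtlCache and Entry are hypothetical names used only for illustration; in practice you would usually rely on something like Redis EXPIRE or a caching library rather than hand-rolling it):

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Each entry remembers when it expires; an expired entry is treated as a miss.
public class TtlCache<K, V> {
    private record Entry<T>(T value, Instant expiresAt) { }

    private final Map<K, Entry<V>> store = new ConcurrentHashMap<>();
    private final Duration ttl;

    public TtlCache(Duration ttl) { this.ttl = ttl; }   // e.g. Duration.ofMinutes(8)

    public void put(K key, V value) {
        store.put(key, new Entry<>(value, Instant.now().plus(ttl)));
    }

    public V get(K key) {
        Entry<V> e = store.get(key);
        if (e == null || Instant.now().isAfter(e.expiresAt())) {
            store.remove(key);   // lazily drop the stale value
            return null;         // miss: caller recomputes and puts the fresh value
        }
        return e.value();
    }
}
```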

Cache Eviction: Each cache has a limit on the number of keys it can store. For example, if the cache has a limit of 1000 keys and it already holds 1000, then when a new key has to be added, an old key has to be evicted so the new one can be stored. Choosing which key to evict can be done in multiple ways:

FIFO (First In, First Out), LRU (Least Recently Used), LFU (Least Frequently Used).
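
For example, an LRU cache with a 1000-key limit can be sketched with LinkedHashMap's access order (a common Java idiom, shown purely as an illustration):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Once the map holds `capacity` entries, adding a new key evicts the
// least recently used one automatically.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LruCache(int capacity) {
        super(16, 0.75f, true);   // accessOrder = true: get() refreshes recency
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the oldest entry when over the limit
    }
}

// Usage: new LruCache<String, String>(1000);
```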

Cache Patterns:

Cache-Aside Strategy/Pattern – In this pattern the cache only ever talks to the application; it never talks to the DB directly.

The scenario is:

```
Client —> Server/App —> DB
              |
            Cache
```

Here the client sends the request to the server/application, which interacts with the DB and gives the response back to the client. At that point the server/app stores the response value in the cache, so the next time a request comes from the client, the server/app checks the cache to see whether the value is there or not: if the value is available it serves the response to the client from the cache, otherwise it fetches it from the DB.

The cache never talks to the DB; only the server/application talks to the DB.

The problem here is that if a new value is written to the DB, we have to rely on the TTL to remove the older cached value; only then will the client get the updated data, otherwise it keeps getting the older value stored in the cache.
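
A rough cache-aside sketch (the Map below just stands in for a real cache such as Redis, and fetchUserFromDb is a hypothetical placeholder for the actual DB query):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Cache-aside: the application talks to the cache AND to the DB;
// the cache itself never talks to the DB.
public class CacheAsideExample {
    private final Map<String, String> cache = new ConcurrentHashMap<>();

    public String getUser(String userId) {
        String cached = cache.get(userId);       // 1. check the cache first
        if (cached != null) {
            return cached;                       // hit: the DB is never touched
        }
        String fromDb = fetchUserFromDb(userId); // 2. miss: the APP queries the DB
        cache.put(userId, fromDb);               // 3. store it for the next request
        return fromDb;
    }

    private String fetchUserFromDb(String userId) {
        return "user-record-" + userId;          // placeholder for a real DB call
    }
}
```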

Advantage:

Even if the cache fails or goes down, the application can still keep serving data by going to the DB directly.

Disadvantage:

You have to decide between keeping a long expiry and continuing to serve possibly stale data, or putting update logic inside your application code so that whenever data is updated in the DB it is refreshed in the cache as well.

Read-Through Strategy/Pattern – In this pattern the cache sits between the application and the DB.

Client —> App/Server —> Cache —> DB

In this pattern the app/server does not talk to the DB; only the cache talks to the DB.

A request comes from the client and is sent to the app/server, which sends it to the cache; on a miss the cache searches the DB, and once the data is found the response is shared with the cache, which shares it with the app/server and then with the client.
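
One way to sketch read-through is with a loading cache, where the cache layer itself owns the DB lookup (Caffeine is an assumed dependency here, and fetchUserFromDb is again a hypothetical placeholder):

```java
import java.time.Duration;
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.LoadingCache;

// Read-through: the application only ever calls users.get(id);
// the loader (i.e. the DB query) runs inside the cache layer on a miss.
public class ReadThroughExample {
    private final LoadingCache<String, String> users = Caffeine.newBuilder()
            .maximumSize(1_000)
            .expireAfterWrite(Duration.ofMinutes(8))
            .build(this::fetchUserFromDb);   // loader runs only on cache misses

    public String getUser(String userId) {
        return users.get(userId);            // the app never talks to the DB directly
    }

    private String fetchUserFromDb(String userId) {
        return "user-record-" + userId;      // placeholder for a real DB query
    }
}
```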

Advantage:

This kind of pattern works well for read-heavy workloads.

Disadvantage:

The data models of the cache and the DB have to be similar.

Cache failure results in system failure.

Write-Through Strategy/Pattern – Similar to Read-Through.

Client —> App/Server —> Cache —> DB

The application reads from the cache and writes to the cache; it is the cache's responsibility to update the data in the DB and fetch the data from the DB.
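
A minimal write-through sketch, assuming writeToDb and readFromDb are stand-ins for real DB access:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Write-through: every write is persisted to the DB synchronously before the
// call returns, and the cache keeps the fresh copy. The extra DB round trip
// on each write is the latency cost mentioned below.
public class WriteThroughCache {
    private final Map<String, String> store = new ConcurrentHashMap<>();

    public void put(String key, String value) {
        writeToDb(key, value);    // 1. persist first (synchronously)
        store.put(key, value);    // 2. then keep the fresh copy in the cache
    }

    public String get(String key) {
        return store.computeIfAbsent(key, this::readFromDb); // read-through on a miss
    }

    private void writeToDb(String key, String value) { /* real DB write goes here */ }
    private String readFromDb(String key) { return "db-value-" + key; }
}
```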

Latency is a disadvantage, since every write has to reach both the cache and the DB before it is acknowledged.

Write-Around Strategy/Pattern – Here the application/server writes directly to the DB with no need for the cache, but while reading it goes to the cache first and then to the DB.

Client —> App/Server —> Cache —> DB

Used when there is a smaller number of reads and a larger number of writes.
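
A minimal write-around sketch (again with hypothetical writeToDb/readFromDb placeholders for the real DB calls):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Write-around: writes skip the cache and go straight to the DB; only reads
// populate the cache, so a freshly written key is a miss until it is read.
public class WriteAroundExample {
    private final Map<String, String> cache = new ConcurrentHashMap<>();

    public void save(String key, String value) {
        writeToDb(key, value);    // the write goes around the cache
        cache.remove(key);        // drop any stale copy so the next read refetches
    }

    public String get(String key) {
        return cache.computeIfAbsent(key, this::readFromDb); // reads fill the cache
    }

    private void writeToDb(String key, String value) { /* real DB write goes here */ }
    private String readFromDb(String key) { return "db-value-" + key; }
}
```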

Write-Back Strategy/Pattern – In this case all the write requests coming from the app/server are kept in the cache and the request is acknowledged right away; after some time, all those writes collected in the cache are written to the DB.

So basically we are batching the write requests from the app/server, keeping them in the cache, and writing them to the DB after some time.
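
A rough write-back sketch, assuming a 10-second flush interval and a hypothetical flushBatchToDb bulk write (a crash before a flush loses whatever is still buffered):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Write-back (write-behind): writes are acknowledged as soon as they reach the
// in-memory map; a background task batches them to the DB later.
public class WriteBackExample {
    private final Map<String, String> dirty = new ConcurrentHashMap<>();
    private final ScheduledExecutorService flusher = Executors.newSingleThreadScheduledExecutor();

    public WriteBackExample() {
        flusher.scheduleAtFixedRate(this::flush, 10, 10, TimeUnit.SECONDS);
    }

    public void put(String key, String value) {
        dirty.put(key, value);        // fast path: respond without touching the DB
    }

    private void flush() {
        if (dirty.isEmpty()) return;
        Map<String, String> batch = new ConcurrentHashMap<>(dirty);
        flushBatchToDb(batch);        // one bulk write instead of many small ones
        batch.forEach(dirty::remove); // remove(key, value): keeps entries updated mid-flush
    }

    private void flushBatchToDb(Map<String, String> batch) { /* real bulk DB write */ }
}
```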

Advantage:

Useful for write-heavy workloads.

A database failure can be tolerated for as long as the cache keeps holding the buffered data.

Used internally by various databases in their own implementations.

Disadvantage:

Cache failure results in system failure, and any writes not yet flushed to the DB are lost.

Where is My Cache?

Caching can happen at the browser level, proxy level, application level, outside the application, or within the DB.
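
For example, browser and proxy level caching is usually driven by an HTTP header that the server sets (sketched here with the jakarta.servlet API as an assumed environment; the 8-minute max-age simply mirrors the TTL example earlier):

```java
import jakarta.servlet.http.HttpServletResponse;

// The Cache-Control header tells the browser (and any proxy/CDN in between)
// that it may reuse this response for up to 480 seconds.
public class BrowserCacheHeader {
    public static void addCachingHeader(HttpServletResponse response) {
        response.setHeader("Cache-Control", "public, max-age=480");
    }
}
```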
