Querify Question Shop: Explore Expert Solutions and Unique Q&A Merchandise

Querify Question Shop: Explore Expert Solutions and Unique Q&A Merchandise Logo Querify Question Shop: Explore Expert Solutions and Unique Q&A Merchandise Logo

Querify Question Shop: Explore Expert Solutions and Unique Q&A Merchandise Navigation

  • Home
  • About Us
  • Contact Us
Search
Ask A Question

Mobile menu

Close
Ask a Question
  • Home
  • About Us
  • Contact Us
Home/ Questions/Q 5045

Querify Question Shop: Explore Expert Solutions and Unique Q&A Merchandise Latest Questions

Author
  • 61k
Author
Asked: November 27, 2024

Implementing Caching Strategies: Techniques for High-Performance Web Apps


Caching Strategies? What's that?

Note: in case you haven't read the previous blog, Deep Dive into Caching: Techniques for High-Performance Web Apps, start there.

Before we go deep, let's understand the common write/read policies:

  1. Write Through: Data is written to the cache and the backing store/DB simultaneously (in parallel).

  2. Write Around: Data is written only to the backing store/DB, not the cache.

  3. Write Behind (Write Back): Data is written to the cache first, then to the backing store/DB in the background.

  4. Read Through: Data is written to the backing store/DB; the first time it is read, it is fetched and placed in the cache. This makes the first read slow, but subsequent reads are fast.

Each of the policies above has advantages and disadvantages.

In the case of a distributed/microservice architecture, caching is spread out further based on the scale of the whole system, and other techniques such as sharding come into play. I'll write about this in another blog.

Problem

Ahh, but when to use which write policy?

Let's understand some use cases for each of them

1. Write-Through Policy

  • Data Consistency : When strong consistency between the cache and the underlying data store is required. Any data written to the cache is immediately available in the backing store.

  • Simple Implementation : Easy to implement and understand since every write operation is propagated to the underlying data store.

  • Read-Heavy Workloads : Suitable for scenarios where read operations are more frequent than write operations, as the data in the cache is always consistent with the data store.

Examples:

  • Session Management : In web applications, session data needs to be consistent and immediately available across multiple nodes.

  • Configuration Data : Configuration settings that are frequently read but rarely changed.
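To make the write path concrete, here is a minimal write-through sketch in JavaScript. The `db` Map stands in for a real backing store, and the class name is my own, purely for illustration:

```javascript
// Minimal write-through cache sketch. `db` is a plain Map standing in
// for a real backing store/DB (an assumption for illustration).
class WriteThroughCache {
  constructor(db) {
    this.cache = new Map();
    this.db = db;
  }

  // Every write goes to BOTH the cache and the backing store,
  // so the two layers are always consistent.
  set(key, value) {
    this.cache.set(key, value);
    this.db.set(key, value);
  }

  // Reads hit the cache first and fall back to the store on a miss.
  get(key) {
    return this.cache.has(key) ? this.cache.get(key) : this.db.get(key);
  }
}
```

Because `set` touches both layers before returning, a read always sees the latest value, at the cost of a slower write.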

2. Write Around Policy

  • Write-Heavy Workloads : Suitable for applications with frequent writes and less frequent reads, reducing the number of write operations to the cache.

  • Cold Data : Ideal for scenarios where data is not frequently accessed after being written. The cache is not burdened with rarely accessed data.

Examples:

  • Bulk Data Imports : Applications that periodically import large datasets where the data is not immediately needed for reading.

  • Logging Systems : Systems that write log data directly to storage but only occasionally read the data for analysis.
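A write-around cache can be sketched the same way: writes bypass the cache entirely and go straight to the store (again, a plain Map standing in for a real DB):

```javascript
// Minimal write-around cache sketch. Writes go "around" the cache,
// directly to the backing store; the cache fills only on reads.
class WriteAroundCache {
  constructor(db) {
    this.cache = new Map();
    this.db = db;
  }

  set(key, value) {
    this.db.set(key, value);   // write bypasses the cache
    this.cache.delete(key);    // invalidate any stale cached copy
  }

  get(key) {
    if (!this.cache.has(key)) {
      // cache miss: load from the store and cache for next time
      this.cache.set(key, this.db.get(key));
    }
    return this.cache.get(key);
  }
}
```

Note the `delete` in `set`: without invalidation, a read after an update could serve the old cached value.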

3. Write Behind (Write Back) Policy

  • Performance : Improves write performance by quickly acknowledging write operations and deferring the actual write to the data store.

  • Batch Processing : Suitable for scenarios where data can be written in batches to the underlying store, reducing the write load.

  • Data Freshness : Suitable when immediate consistency is not critical, and slight delays in data propagation to the data store are acceptable.

Examples:

  • User Activity Logging : Applications that log user actions where the logs are periodically flushed to the database.

  • E-commerce : Shopping cart data that is written to the cache for quick access and periodically synchronized with the database.
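A write-behind sketch looks like this. In a real system the flush would run on a timer or a queue; here it is an explicit method, and the `db` Map again stands in for a real store:

```javascript
// Minimal write-behind (write-back) cache sketch. Writes touch only
// the cache; dirty keys are persisted to the store later in one batch.
class WriteBehindCache {
  constructor(db) {
    this.cache = new Map();
    this.db = db;
    this.dirty = new Set();  // keys not yet persisted
  }

  set(key, value) {
    this.cache.set(key, value);  // fast: only the cache is touched
    this.dirty.add(key);         // remember to persist later
  }

  // Batch-persist all pending writes. In production this would be
  // triggered periodically or by queue size, not called by hand.
  flush() {
    for (const key of this.dirty) {
      this.db.set(key, this.cache.get(key));
    }
    this.dirty.clear();
  }

  get(key) {
    return this.cache.get(key);
  }
}
```

The trade-off is visible in the sketch: between `set` and `flush`, the store lags behind the cache, which is exactly the eventual-consistency window described above.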

4. Read Through

  • Lazy Loading : Useful for loading data on demand, caching it only when it is actually needed.

  • Read-Heavy Workloads : Suitable for applications where read operations significantly outnumber write operations, and data needs to be quickly accessible after the first access.

Examples:

  • Product Catalogs : E-commerce applications where product details are read frequently but updated infrequently.

  • Content Management Systems (CMS): Systems where articles or media are read frequently after the initial publication.
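Read-through can be sketched with a loader function that the cache calls on a miss. The loader here is a plain function standing in for a DB query:

```javascript
// Minimal read-through cache sketch. Callers only ever talk to the
// cache; the cache itself loads missing values from the store.
class ReadThroughCache {
  constructor(loadFn) {
    this.cache = new Map();
    this.load = loadFn;  // e.g. a DB query; a plain function here
  }

  get(key) {
    if (!this.cache.has(key)) {
      // first access is slow (hits the store); later ones are fast
      this.cache.set(key, this.load(key));
    }
    return this.cache.get(key);
  }
}
```

The loader runs exactly once per key, which is the lazy-loading behaviour described above.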

When to select what

Need Consistency

  • Write Through : Ensures strong consistency as data is written to both the cache and the store simultaneously.

  • Write Around : Can lead to stale cache data until the data is read and cached.

  • Write Behind : Provides eventual consistency with potential lag between cache and store.

  • Read Through : Ensures data is cached on first access, potentially leading to stale data if not frequently updated.

Need Performance

  • Write Through : This can be slower for write operations due to double writes (cache and store).

  • Write Around : Reduces write load on the cache, faster write operations.

  • Write Behind : Improves write performance, but read operations may suffer if cache and store are not in sync.

  • Read Through : Fast read operations after initial cache miss, good for read-heavy scenarios.

Need Simplicity

  • Write Through : Simple to implement and ensures immediate consistency.

  • Write Around : Simple for write operations but requires cache management for reads.

  • Write Behind : More complex due to the need for asynchronous write handling and potential consistency issues.

  • Read Through : Straightforward for reads, requires handling of initial cache misses.

It depends on the use case you are solving.

Now let's go deeper into the eviction strategies used while implementing a cache.

Developers: "I commonly use the OG, the LRU cache, most of the time."

LRU (Least Recently Used) is a popular caching strategy, but it's not always the best fit for every use case. There are several alternative caching strategies, each with its own strengths and suitable scenarios.

There are many strategies, each with its own use cases. Here's a quick rundown:

  • LRU : Best for scenarios where the most recently accessed items are most likely to be accessed again soon.

  • LFU : Best when the access frequency is a good predictor of future accesses.

  • FIFO : Simple, best when the oldest data is the least useful.

  • Random Replacement (RR) : Simple, good for unpredictable access patterns.

  • Time-To-Live (TTL) : Best for time-sensitive data that becomes stale after a certain period.

  • Adaptive Replacement Cache (ARC) : Adapts well to changing access patterns, more complex.

  • Least Recently/Frequently Used (LRFU) : Balances between recency and frequency, tunable.

  • Segregated LRU (SLRU) : Useful for multi-segment caches with different types of data.

  • Most Recently Used (MRU) : Useful in specific scenarios where the most recent data is less useful.

  • Clock Algorithm : A variant of LRU that approximates its behavior using a circular buffer (clock) and a use bit for each page.

  • 2Q (Two Queue) : Balances recency and frequency with separate queues.
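Of these, LRU is simple enough to sketch in a few lines of JavaScript, leaning on the fact that a `Map` iterates its keys in insertion order:

```javascript
// Minimal LRU cache sketch. Re-inserting a key moves it to the
// "most recently used" end of the Map's iteration order, so the
// first key in iteration order is always the least recently used.
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
  }

  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key);      // move key to the most-recent position
    this.map.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // evict the least recently used key (first in iteration order)
      const oldest = this.map.keys().next().value;
      this.map.delete(oldest);
    }
  }
}
```

Both `get` and `set` count as "use" here, so a key you keep reading survives eviction even if it was written long ago.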

Some frameworks, such as Django and Spring Boot, support caching strategies like this out of the box.

If you've reached here, thanks for reading! I hope this blog helped you understand caches.

What did we learn

  1. Which write/read policy to use when implementing caching.
  2. How to avoid overloading the cache by using various eviction strategies.

Follow me for more interesting blogs; it motivates me to write more.

Here are my socials: LinkedIn

Tags: beginners, javascript, programming, webdev
© 2022 Querify Question. All Rights Reserved

Insert/edit link

Enter the destination URL

Or link to existing content

    No search term specified. Showing recent items. Search or use up and down arrow keys to select an item.