
Efficient Caching Techniques in iOS: From Beginner to Advanced

Adarsh Ranjan, iOS Engineer
February 20, 2025

Caching is a crucial technique for optimizing application performance by reducing the need to recompute or refetch data repeatedly. In this blog, we’ll explore efficient caching techniques in iOS, progressing from beginner to advanced levels, and understand why NSCache is often the preferred choice.

What is Caching?

Caching is the process of storing data temporarily in memory to improve performance. It helps reduce the latency of fetching data and minimizes redundant computations or network calls.

Beginner Level: Using a Dictionary for Caching

The simplest way to implement caching in iOS is by using a dictionary to store key-value pairs.

How it Works:

var cache: [String: Any] = [:]
// Add data to cache
cache["user1"] = "John Doe"
// Retrieve data
if let user = cache["user1"] {
    print("Cached user: \(user)")
}

Limitations:

  • Manual Memory Management: A dictionary does not automatically release memory when the system is under pressure.
  • No Automatic Eviction Policy: The cache grows indefinitely unless old data is explicitly removed.

When to Use:

Suitable for small and simple use cases where you can manually manage memory and eviction.
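Even at this level, a little structure helps. The sketch below (a hypothetical `SimpleCache` type, assuming a simple first-in-first-out eviction policy) shows how much bookkeeping you take on yourself when a plain dictionary is the backing store:

```swift
// A minimal dictionary-backed cache with a manual count limit.
// Everything NSCache does automatically -- limits, eviction -- must be coded by hand here.
struct SimpleCache<Key: Hashable, Value> {
    private var storage: [Key: Value] = [:]
    private var insertionOrder: [Key] = []
    let countLimit: Int

    init(countLimit: Int) {
        self.countLimit = countLimit
    }

    mutating func set(_ value: Value, forKey key: Key) {
        if storage[key] == nil {
            insertionOrder.append(key)
        }
        storage[key] = value
        // Manual eviction: drop the oldest entry once the limit is exceeded.
        while storage.count > countLimit, let oldest = insertionOrder.first {
            insertionOrder.removeFirst()
            storage.removeValue(forKey: oldest)
        }
    }

    func value(forKey key: Key) -> Value? {
        storage[key]
    }
}

var simpleCache = SimpleCache<String, String>(countLimit: 2)
simpleCache.set("John Doe", forKey: "user1")
simpleCache.set("Jane Doe", forKey: "user2")
simpleCache.set("Jim Doe", forKey: "user3") // "user1" is evicted
```

Note that nothing here reacts to system memory pressure; that is exactly the gap NSCache fills in the next section.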

Intermediate Level: Using NSCache

NSCache is a thread-safe, system-managed caching mechanism that is more efficient than a dictionary for caching in iOS.

Advantages of NSCache:

  • Automatic Memory Management: The system automatically removes objects from the cache when memory is low.
  • Thread-Safe: You can safely use NSCache from multiple threads without additional synchronization.
  • Eviction Policy: Supports eviction based on cost and count.

How to Use NSCache:

class DataCache {
    private let cache = NSCache<NSString, NSData>()
    func setObject(_ object: Data, forKey key: String) {
        cache.setObject(object as NSData, forKey: key as NSString)
    }
    func getObject(forKey key: String) -> Data? {
        return cache.object(forKey: key as NSString) as Data?
    }
}
let dataCache = DataCache()
let sampleData = Data(repeating: 0, count: 1_000)
dataCache.setObject(sampleData, forKey: "sampleKey")
if let cachedData = dataCache.getObject(forKey: "sampleKey") {
    print("Retrieved cached data: \(cachedData.count) bytes")
}

Customizing NSCache:

  • Set Count Limit:
cache.countLimit = 100 // Maximum 100 items
  • Set Total Cost Limit:
cache.totalCostLimit = 10 * 1024 * 1024 // 10 MB
  • Assign Cost to Items:
cache.setObject(sampleData as NSData, forKey: "key", cost: sampleData.count)

When to Use:

Use NSCache for a scalable, system-friendly caching mechanism with minimal maintenance.
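Beyond limits, NSCache also lets you observe evictions through the NSCacheDelegate protocol. A minimal sketch (the `EvictionLogger` name is illustrative, not a standard type):

```swift
import Foundation

// A delegate that logs whenever NSCache is about to evict an object.
final class EvictionLogger: NSObject, NSCacheDelegate {
    func cache(_ cache: NSCache<AnyObject, AnyObject>, willEvictObject obj: Any) {
        print("Evicting: \(obj)")
    }
}

let observedCache = NSCache<NSString, NSData>()
let logger = EvictionLogger()
observedCache.delegate = logger // note: the delegate is not retained by the cache
observedCache.countLimit = 2
observedCache.setObject(Data(repeating: 0, count: 10) as NSData, forKey: "item")
```

This is useful for debugging memory behavior, but keep in mind that NSCache's limits are hints, not strict guarantees, so eviction timing is up to the system.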

Advanced Level: Custom Caching Solutions

For more control over caching policies, you can implement a custom Least Recently Used (LRU) cache or combine in-memory caching with disk caching.

Custom LRU Cache:

An LRU cache evicts the least recently used item when the cache exceeds its capacity. Production implementations typically pair a dictionary with a doubly linked list so every operation is O(1); the example below instead tracks recency with an ordered key array, which is simpler to read but makes each access O(n).

Implementation:

class LRUCache<Key: Hashable, Value> {
    private struct CacheNode {
        let key: Key
        var value: Value
    }
    private var capacity: Int
    private var cache: [Key: CacheNode]
    private var keysOrder: [Key]
    init(capacity: Int) {
        self.capacity = capacity
        self.cache = [:]
        self.keysOrder = []
    }
    func set(_ key: Key, value: Value) {
        if cache.keys.contains(key) {
            keysOrder.removeAll { $0 == key }
        } else if cache.count >= capacity {
            if let oldestKey = keysOrder.first {
                cache.removeValue(forKey: oldestKey)
                keysOrder.removeFirst()
            }
        }
        cache[key] = CacheNode(key: key, value: value)
        keysOrder.append(key)
    }
    func get(_ key: Key) -> Value? {
        if let node = cache[key] {
            keysOrder.removeAll { $0 == key }
            keysOrder.append(key)
            return node.value
        }
        return nil
    }
}
let lruCache = LRUCache<String, String>(capacity: 2)
lruCache.set("1", value: "A")
lruCache.set("2", value: "B")
lruCache.set("3", value: "C") // "1" is evicted
print(lruCache.get("1") ?? "nil") // nil ("1" was evicted)
print(lruCache.get("2") ?? "nil") // B
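Unlike NSCache, the LRU cache above is not thread-safe: concurrent reads and writes must be serialized by hand. One common pattern is to funnel all access through a private serial dispatch queue, sketched here with a hypothetical `SynchronizedCache` wrapper around a plain dictionary (the same idea applies unchanged to the LRU cache):

```swift
import Foundation
import Dispatch

// A sketch of making a non-thread-safe store safe with a serial queue.
// Every read and write runs on the same queue, so no two can interleave.
final class SynchronizedCache<Key: Hashable, Value> {
    private var storage: [Key: Value] = [:]
    private let queue = DispatchQueue(label: "com.example.cache.sync")

    func set(_ key: Key, value: Value) {
        queue.sync { storage[key] = value }
    }

    func get(_ key: Key) -> Value? {
        queue.sync { storage[key] }
    }
}

let sharedCache = SynchronizedCache<String, String>()
sharedCache.set("user1", value: "John Doe")
```

For read-heavy workloads, a concurrent queue with barrier writes (or an `NSLock`) is a common refinement, at the cost of slightly more code.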

Combine with Disk Caching:

For large data sets, combine in-memory caching (e.g., NSCache) with disk-based caching.

Example:

let cacheDirectory = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask).first!
let fileURL = cacheDirectory.appendingPathComponent("cachedFile")
// Save data to disk (sampleData from the NSCache example above)
try? sampleData.write(to: fileURL)
// Retrieve data from disk
let retrievedData = try? Data(contentsOf: fileURL)

Hybrid Cache (Memory + Disk):

  • Use NSCache for frequently accessed data.
  • Use disk caching for large or less frequently accessed data.
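The two bullets above can be combined into a single type. The following `HybridCache` is an illustrative sketch, not a standard API: reads hit NSCache first, fall back to disk, and repopulate memory on a disk hit so subsequent reads stay fast.

```swift
import Foundation

// A hybrid cache: NSCache for hot data, a directory on disk for persistence.
final class HybridCache {
    private let memoryCache = NSCache<NSString, NSData>()
    private let directory: URL

    init(directory: URL) {
        self.directory = directory
    }

    func set(_ data: Data, forKey key: String) {
        memoryCache.setObject(data as NSData, forKey: key as NSString, cost: data.count)
        // Persist to disk so the entry survives a memory eviction.
        try? data.write(to: directory.appendingPathComponent(key))
    }

    func get(forKey key: String) -> Data? {
        // 1. Fast path: in-memory hit.
        if let cached = memoryCache.object(forKey: key as NSString) {
            return cached as Data
        }
        // 2. Slow path: read from disk, then repopulate memory.
        guard let data = try? Data(contentsOf: directory.appendingPathComponent(key)) else {
            return nil
        }
        memoryCache.setObject(data as NSData, forKey: key as NSString, cost: data.count)
        return data
    }
}

let hybrid = HybridCache(directory: FileManager.default.temporaryDirectory)
hybrid.set(Data("profile".utf8), forKey: "user1")
```

In a real app you would point the directory at `.cachesDirectory` (as in the disk example above), sanitize keys before using them as file names, and add disk-size housekeeping.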

Key Considerations for Caching

  1. Eviction Policy: Decide how to remove old or unused items (e.g., LRU, least frequently used).
  2. Memory Quota: Use NSCache's cost and count limits to prevent excessive memory usage.
  3. Data Size: Cache small objects in memory and larger objects on disk.
  4. Thread-Safety: Ensure thread-safe access for custom caching solutions.
  5. Security: For sensitive data, use encrypted storage for disk caching.

Why NSCache is Recommended

  • Provides automatic memory management, preventing excessive memory usage during low-memory situations.
  • Thread-safe, making it ideal for concurrent environments.
  • Supports customizable eviction policies with cost and count limits.

LRU Cache vs NSCache: Key Differences

  • Eviction Policy: An LRU cache evicts the least recently used item; NSCache evicts items based on memory pressure or its limits.
  • Control: An LRU cache gives full control over eviction; NSCache offers limited control (cost and count limits).
  • Thread Safety: An LRU cache requires manual synchronization; NSCache is thread-safe by design.
  • Memory Management: An LRU cache is managed manually; NSCache is managed automatically by the system.
  • Use Case Complexity: An LRU cache suits custom eviction policies; NSCache suits general-purpose caching.

Conclusion

  • Use LRU Cache: For fine-grained control over eviction policies and custom caching needs.
  • Use NSCache: For quick and efficient caching in iOS apps with system-managed memory handling.

Caching is a powerful tool that, when used correctly, can significantly enhance the performance of your iOS applications. Understanding the trade-offs between NSCache and custom solutions like an LRU cache will help you choose the right approach for your use case.

If you'd like to see hybrid caching tailored to your project, reach out to us and let's discuss a detailed implementation.
