Bert's Blog

Oct 26

Concurrent Caching

I recently needed to write a caching layer with the following characteristics:

  1. If the key=value is missing, block requests for this key until it has been loaded.
  2. If the key=value is stale, allow one request to update it and return the stale value to further requests.

This turns out to be extremely easy using the java.util.concurrent classes, specifically a little gem called AtomicMarkableReference. Here I am using the associated boolean mark to indicate whether a value is live (true) or should be reloaded (false).
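For anyone who hasn't met it, an AtomicMarkableReference pairs a reference with a boolean mark and lets you read and update the two atomically. Here is a quick, self-contained sketch of the operations the cache below relies on (the class name and the String values are just placeholders of mine):

import java.util.concurrent.atomic.AtomicMarkableReference;

public class MarkableReferenceDemo {
  public static void main(String[] args) {
    AtomicMarkableReference<String> ref =
      new AtomicMarkableReference<String>("hello", true);

    // read the value and the mark together: the mark lands in holder[0]
    boolean[] holder = new boolean[1];
    String value = ref.get(holder);   // value = "hello", holder[0] = true

    // atomically flip the mark from true to false, but only if the value
    // and the mark are still what we expect
    ref.compareAndSet(value, value, true, false);

    // change just the mark, keeping the same value
    ref.attemptMark(value, true);

    // replace the value and the mark unconditionally
    ref.set("goodbye", false);
  }
}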

The first job is to ensure that all requested keys are stored:

ConcurrentMap<K, AtomicMarkableReference<V>> map =
  new ConcurrentHashMap<K, AtomicMarkableReference<V>>();

public V get(K key) {
  AtomicMarkableReference<V> reference = map.get(key);

  if(reference == null) {
    // putIfAbsent guarantees only one reference is ever stored per key,
    // even if several threads miss at the same time
    map.putIfAbsent(key, new AtomicMarkableReference<V>(null, false));
    reference = map.get(key);
  }
  // ...

Now that we have a reference, we can check whether another thread has already done the work for us:

  // get(holder) returns the current value and copies the mark into holder[0]
  boolean[] holder = new boolean[1];
  V value = reference.get(holder);
  if(holder[0]) {
    // the mark is true, so the value is live: return it as-is
    return value;
  }
  // ...

If the value is null we can satisfy point 1 by blocking for this key on the reference itself. A second null check inside the synchronized block prevents further threads from loading the same value again once the first one has finished.

  if(value == null) {
    synchronized(reference) {
      // check again inside the lock: another thread may have loaded the
      // value while we were blocked waiting for the monitor
      V v = reference.get(holder);
      if(v != null) {
        return v;
      }
      V newValue = newValue(key);
      reference.set(newValue, true);
      return newValue;
    }
  }
  // ...

If we do have a value, its mark must be false, so we can satisfy point 2. We attempt to mark the same value live again with a compareAndSet: if it succeeds, ours is the only thread that will reload and update the reference; if it fails, another thread has already claimed the reload, so we simply return the stale value.

  // try to flip the mark from false back to true; at most one thread wins
  if(reference.compareAndSet(value, value, false, true)) {
    // we won the race, so we reload and publish the fresh value
    V newValue = newValue(key);
    reference.set(newValue, true);
    return newValue;
  } else {
    // someone else is already reloading: serve the stale value
    return value;
  }
}

Now a key=value can be “expired” by setting the reference’s associated boolean to false.
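For completeness, here is a minimal sketch of what that expiry could look like, assuming a hypothetical expire(key) method living alongside the get method above (the name is mine, not part of the code above):

public void expire(K key) {
  AtomicMarkableReference<V> reference = map.get(key);
  if(reference != null) {
    // keep whatever value is currently there, but drop the mark to false
    // so the next get(key) serves it as stale and triggers a reload
    V value = reference.getReference();
    reference.attemptMark(value, false);
  }
}

If a reload sneaks in between the getReference and the attemptMark, the attemptMark simply fails and the freshly loaded value stays live, which seems like a reasonable outcome.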