Concurrency Basics¶
Concurrency is making progress on multiple tasks within overlapping time periods. Parallelism is executing tasks simultaneously on multiple cores. Key problems: race conditions (shared mutable state), deadlocks (circular wait), starvation (a thread never gets a resource). Solutions: locks, atomic operations, immutable objects, thread-safe collections. Java provides synchronized, volatile, Lock, Atomic*, and the java.util.concurrent utilities.
Concurrency vs Parallelism¶
Concurrency is about managing multiple tasks at once — they can interleave on a single core. Parallelism is about executing tasks simultaneously on multiple cores. Concurrency is a design concern (structure); parallelism is an execution concern (speed). A concurrent program may or may not run in parallel.
Deep Dive: Visual Comparison
| | Concurrency | Parallelism |
|---|---|---|
| Definition | Managing multiple tasks at once | Executing tasks simultaneously |
| Requirement | Single or multiple cores | Multiple cores |
| Analogy | One cook juggling 3 dishes | Three cooks each making 1 dish |
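As a rough illustration (the class and variable names below are made up for this sketch), the same three tasks can run concurrently on a single worker thread or in parallel on a pool of three, hardware permitting:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ConcurrencyVsParallelism {
    public static void main(String[] args) {
        Runnable task = () ->
                System.out.println(Thread.currentThread().getName() + " handled a task");

        // Concurrency: one worker thread interleaves the three tasks over time
        ExecutorService oneCook = Executors.newSingleThreadExecutor();
        for (int i = 0; i < 3; i++) oneCook.submit(task);
        oneCook.shutdown();

        // Parallelism: three worker threads can run the tasks at the same instant
        // (actual simultaneity still depends on available cores)
        ExecutorService threeCooks = Executors.newFixedThreadPool(3);
        for (int i = 0; i < 3; i++) threeCooks.submit(task);
        threeCooks.shutdown();
    }
}
```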
Race Conditions¶
A race condition occurs when the result depends on the unpredictable order of thread execution. Classic example: two threads incrementing a shared counter — both read the same value, increment independently, and write back the same result, losing one update. Fixes: synchronized, AtomicInteger, ReentrantLock.
Deep Dive: Lost Update Example
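A minimal sketch of the lost-update scenario described above (UnsafeCounter is an illustrative name):

```java
// count++ is three steps: read count, add 1, write back. It is NOT atomic.
class UnsafeCounter {
    private int count = 0;

    public void increment() { count++; }  // two threads can read the same value,
                                          // both write back value + 1, and one
                                          // update is silently lost
    public int get() { return count; }
}

// Two threads each calling increment() 10,000 times will often end
// below 20,000 because of these lost updates.
```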
Deep Dive: Fixing Race Conditions
// Fix 1: synchronized — mutual exclusion
public synchronized void increment() { count++; }
// Fix 2: AtomicInteger — lock-free CAS operation (preferred for simple ops)
private final AtomicInteger count = new AtomicInteger(0);
public void increment() { count.incrementAndGet(); }
// Fix 3: ReentrantLock — explicit locking with more control
private final ReentrantLock lock = new ReentrantLock();
public void increment() {
lock.lock();
try { count++; }
finally { lock.unlock(); }
}
Deep Dive: What is CAS (Compare-And-Swap)?
CAS is a CPU-level instruction used by Atomic* classes for lock-free thread safety:
- Read current value
- Compute new value
- Atomically: if current value == expected, write new value; otherwise retry
// Pseudocode for what AtomicInteger.incrementAndGet() does internally:
int expected, newValue;
do {
    expected = value;                          // read the current value
    newValue = expected + 1;                   // compute the new value
} while (!compareAndSwap(expected, newValue)); // retry if another thread changed value first
Pros: No locking overhead, no blocking. Cons: Under high contention, many retries (spinning).
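As a concrete example, the same retry loop can be written by hand with AtomicInteger.compareAndSet, which is the real CAS call hiding behind the compareAndSwap pseudocode above (CasCounter is an illustrative name):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class CasCounter {
    private final AtomicInteger count = new AtomicInteger(0);

    public int increment() {
        while (true) {
            int expected = count.get();      // read current value
            int newValue = expected + 1;     // compute new value
            // Atomically: write newValue only if count still equals expected
            if (count.compareAndSet(expected, newValue)) {
                return newValue;             // CAS succeeded
            }
            // CAS failed: another thread updated count first, so retry (spin)
        }
    }
}
```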
Deadlocks¶
A deadlock occurs when threads are stuck waiting for each other's resources in a circular chain. Four conditions must ALL hold (Coffman): Mutual Exclusion, Hold and Wait, No Preemption, Circular Wait. Prevention: lock ordering (always acquire in the same order), timeouts (tryLock), or avoid nested locks.
Deep Dive: Deadlock Example
Object lockA = new Object();
Object lockB = new Object();

// Thread 1: acquires lockA, then tries to acquire lockB
new Thread(() -> {
    synchronized (lockA) {
        try { Thread.sleep(100); } catch (InterruptedException ignored) { }
        synchronized (lockB) { /* work */ }  // Waiting for lockB (held by Thread 2)
    }
}).start();

// Thread 2: acquires lockB, then tries to acquire lockA
new Thread(() -> {
    synchronized (lockB) {
        try { Thread.sleep(100); } catch (InterruptedException ignored) { }
        synchronized (lockA) { /* work */ }  // Waiting for lockA (held by Thread 1)
    }
}).start();

// DEADLOCK: Thread 1 has lockA, needs lockB;
//           Thread 2 has lockB, needs lockA. Circular wait: neither can proceed.
Deep Dive: Deadlock Prevention Strategies
| Strategy | How | Example |
|---|---|---|
| Lock ordering | Always acquire locks in the same global order | If A < B, always lock A before B |
| Timeout | Use tryLock(timeout) and give up if the lock can't be acquired | lock.tryLock(1, TimeUnit.SECONDS) |
| Avoid nested locks | Minimize lock scope, acquire only one lock at a time | Restructure code to reduce dependencies |
| Lock-free algorithms | Use Atomic* classes and CAS operations | AtomicInteger, ConcurrentHashMap |
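A sketch of the timeout strategy from the table; the class and method names (TimeoutGuard, withBothLocks) are illustrative:

```java
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

public class TimeoutGuard {
    private final ReentrantLock lockA = new ReentrantLock();
    private final ReentrantLock lockB = new ReentrantLock();

    // Returns false instead of deadlocking when both locks can't be acquired in time
    public boolean withBothLocks(Runnable work) throws InterruptedException {
        if (lockA.tryLock(1, TimeUnit.SECONDS)) {
            try {
                if (lockB.tryLock(1, TimeUnit.SECONDS)) {
                    try {
                        work.run();          // both locks held
                        return true;
                    } finally {
                        lockB.unlock();
                    }
                }
            } finally {
                lockA.unlock();
            }
        }
        return false;                        // gave up; caller can back off and retry
    }
}
```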
Producer-Consumer Problem¶
A classic concurrency pattern: producers generate items into a shared buffer, consumers remove and process them. The challenge is coordination — producers must block when the buffer is full, consumers must block when it's empty. In Java, BlockingQueue handles all synchronization internally — no manual wait/notify needed.
Deep Dive: BlockingQueue Solution
BlockingQueue<String> queue = new LinkedBlockingQueue<>(10); // Capacity 10
// Producer — blocks when queue is full
Runnable producer = () -> {
    try {
        while (true) {
            queue.put("item");              // Blocks if queue is full
        }
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt(); // put() is interruptible
    }
};
// Consumer — blocks when queue is empty
Runnable consumer = () -> {
    try {
        while (true) {
            String item = queue.take();     // Blocks if queue is empty
            process(item);                  // application-specific processing
        }
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt(); // take() is interruptible
    }
};
executor.submit(producer);                  // executor: any ExecutorService
executor.submit(consumer);
Queue types:
| Queue | Description |
|---|---|
| LinkedBlockingQueue | Optionally bounded, FIFO |
| ArrayBlockingQueue | Bounded, backed by an array |
| PriorityBlockingQueue | Unbounded, priority ordering |
| SynchronousQueue | No capacity; the producer blocks until a consumer takes |
Thread Safety Strategies¶
Four approaches to thread safety, in order of preference: 1) Immutability — no shared mutable state, no problem. 2) Confinement — keep data thread-local (ThreadLocal). 3) Concurrent data structures — ConcurrentHashMap, CopyOnWriteArrayList. 4) Synchronization — synchronized, Lock, as a last resort. Choose the simplest approach that works.
Deep Dive: Strategy Details
1. Immutability — best approach, no synchronization needed
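For example, a small immutable value class (illustrative) can be shared between threads with no locking at all:

```java
// All fields are final and there are no setters, so instances never change
public final class Point {
    private final int x;
    private final int y;

    public Point(int x, int y) { this.x = x; this.y = y; }

    public int getX() { return x; }
    public int getY() { return y; }

    // "Mutation" returns a new object instead of modifying this one
    public Point translate(int dx, int dy) { return new Point(x + dx, y + dy); }
}
```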
2. Confinement — don't share data between threads
ThreadLocal<SimpleDateFormat> formatter =
ThreadLocal.withInitial(() -> new SimpleDateFormat("yyyy-MM-dd"));
// Each thread gets its own SimpleDateFormat instance
3. Concurrent data structures — let the library handle it
ConcurrentHashMap<String, Integer> map = new ConcurrentHashMap<>();
map.computeIfAbsent("key", k -> expensiveComputation());
CopyOnWriteArrayList<String> list = new CopyOnWriteArrayList<>();
// Good for read-heavy, write-rare scenarios
4. Synchronization — control access to shared mutable state
private final Object lock = new Object();
private int balance;
public void transfer(int amount) {
synchronized (lock) {
balance += amount;
}
}
Priority: Immutability > Confinement > Concurrent collections > Synchronization
Deadlock vs Livelock vs Starvation¶
Deadlock — threads blocked forever waiting for each other (no progress). Livelock — threads keep changing state in response to each other but make no useful progress (like two people in a hallway both stepping aside). Starvation — a thread never gets CPU time because higher-priority threads always run first. Fix starvation with fair locks (new ReentrantLock(true)).
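A small sketch of that starvation fix; FairCounter is an illustrative name:

```java
import java.util.concurrent.locks.ReentrantLock;

public class FairCounter {
    // true = fair mode: waiting threads acquire the lock roughly in arrival (FIFO)
    // order, so no thread waits indefinitely (at the cost of some throughput)
    private final ReentrantLock lock = new ReentrantLock(true);
    private int count;

    public void increment() {
        lock.lock();
        try {
            count++;
        } finally {
            lock.unlock();
        }
    }
}
```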
Common Interview Questions¶
- What is the difference between concurrency and parallelism?
- What is a race condition? How do you prevent it?
- What is a deadlock? What are the four conditions?
- Explain the Producer-Consumer problem.
- What is thread safety? How do you achieve it?
- What is ThreadLocal? When would you use it?
- What is a BlockingQueue?
- What is the difference between synchronized and Lock?
- What is starvation? How is it different from deadlock?
- What is a livelock?
- What is CAS? How do Atomic classes use it?