Todd Hoff at HighScalability.com published another excellent article, entitled Are Cloud-Based Memory Architectures the Next Big Thing? It's chock-full of analysis, data, links, examples, and references. IMHO, it's a must-read piece for developers and architects.
As I noted in the comments to Todd's post, in addition to the many benefits of memory-based architectures that Todd lists, there are also cost benefits in the cloud, which I discussed in Cloud Pricing and Application Architecture.
An excerpt from Todd's piece:
RAM = High Bandwidth and Low Latency
Why are Memory Based Architectures so attractive? Compared to disk, RAM is a high-bandwidth, low-latency storage medium. Depending on who you ask, the bandwidth of RAM is around 5 GB/s, while the bandwidth of disk is about 100 MB/s, so RAM bandwidth is roughly 50 times higher. RAM wins. Modern hard drives have latencies under 13 milliseconds, and when many applications are queued for disk reads, latencies can easily be in the many-second range. Memory latency is in the 5 nanosecond range, millions of times lower. RAM wins again.
RAM is the New Disk
The superiority of RAM is at the heart of the RAM is the New Disk paradigm. As an architecture, it combines the holy quadrinity of computing:
Read the whole thing.
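As a quick sanity check on the figures quoted above, here is a back-of-envelope calculation of the RAM-versus-disk ratios. The numbers (5 GB/s RAM bandwidth, 100 MB/s disk bandwidth, 5 ns versus 13 ms latency) are the ballpark figures from the excerpt; actual hardware varies widely.

```python
# Back-of-envelope ratios from the quoted RAM vs. disk figures.
# These are illustrative round numbers, not benchmarks of any specific hardware.

RAM_BANDWIDTH = 5e9     # bytes/second (~5 GB/s)
DISK_BANDWIDTH = 100e6  # bytes/second (~100 MB/s)
RAM_LATENCY = 5e-9      # seconds (~5 ns)
DISK_LATENCY = 13e-3    # seconds (~13 ms)

bandwidth_ratio = RAM_BANDWIDTH / DISK_BANDWIDTH
latency_ratio = DISK_LATENCY / RAM_LATENCY

print(f"RAM bandwidth advantage: {bandwidth_ratio:.0f}x")   # ~50x
print(f"RAM latency advantage:   {latency_ratio:,.0f}x")    # millions of x
```

The striking part is the asymmetry: bandwidth favors RAM by a factor of tens, but latency favors it by a factor of millions, which is why latency-sensitive workloads gain the most from memory-based architectures.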