August 2016

Server-side caching defined and clarified

Server-side caching uses flash storage installed in the server itself to accelerate application performance. Placing the flash as close to the application as possible minimizes latency, because data served from the local cache avoids the round trip to shared storage.
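
To make the caching idea concrete, here is a minimal sketch of a read cache sitting in front of a slower backing store. It is not taken from the handbook: the SlowStore and ServerSideCache classes and the capacity value are hypothetical stand-ins for array-resident data and server-installed flash.

    from collections import OrderedDict

    class SlowStore:
        """Hypothetical stand-in for a shared storage array (high-latency reads)."""
        def __init__(self, data):
            self._data = data

        def read(self, key):
            # In a real deployment this read would be a network/SAN round trip.
            return self._data[key]

    class ServerSideCache:
        """Minimal LRU read cache, standing in for flash installed in the server."""
        def __init__(self, backing_store, capacity=1024):
            self._store = backing_store
            self._capacity = capacity
            self._cache = OrderedDict()  # key -> block, ordered by recency of use

        def read(self, key):
            if key in self._cache:
                self._cache.move_to_end(key)       # hit: served locally, low latency
                return self._cache[key]
            value = self._store.read(key)          # miss: fetch from the slower array
            self._cache[key] = value
            if len(self._cache) > self._capacity:  # evict the least recently used block
                self._cache.popitem(last=False)
            return value

    # Hypothetical usage: repeated reads of a hot block are served locally after the first miss.
    store = SlowStore({"block-7": b"hot data"})
    cache = ServerSideCache(store, capacity=2)
    cache.read("block-7")  # miss: goes to the array
    cache.read("block-7")  # hit: served from the server-side cache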

Over the past couple of years there has been considerable development in this area, and today there are a number of ways to put flash storage into a server. You can use SATA form factor SSDs that install in place of traditional hard disk drives, or flash storage that connects directly to the PCIe bus. An emerging option is to attach flash storage to the server's memory channel via dual in-line memory module (DIMM) slots. Each approach, of course, comes with its own strengths and weaknesses.
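
As a rough illustration of how these deployment options surface to the operating system, the sketch below walks Linux's /sys/block tree, skips rotational (spinning-disk) devices and guesses the attachment type from common device naming (nvme* for PCIe/NVMe, pmem* for memory-channel NVDIMMs, sd* for SATA/SAS). The script is not from the handbook, and the naming heuristics are assumptions about typical Linux systems.

    import os

    SYS_BLOCK = "/sys/block"  # Linux sysfs layout assumed

    def list_flash_devices():
        """Roughly classify local block devices that report themselves as non-rotational."""
        devices = []
        for name in sorted(os.listdir(SYS_BLOCK)):
            rot_path = os.path.join(SYS_BLOCK, name, "queue", "rotational")
            try:
                with open(rot_path) as f:
                    rotational = f.read().strip() == "1"
            except OSError:
                continue  # some virtual devices lack this attribute
            if rotational:
                continue  # spinning disk, not flash
            if name.startswith("nvme"):
                attach = "PCIe (NVMe)"
            elif name.startswith("pmem"):
                attach = "memory channel (NVDIMM)"
            else:
                attach = "SATA/SAS (assumed from sd*-style naming)"
            devices.append((name, attach))
        return devices

    if __name__ == "__main__":
        for name, attach in list_flash_devices():
            print(f"{name}: non-rotational, {attach}")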

A variety of standards are also emerging for server-side caching, and it is important to understand them when selecting a product.

This Drill Down on server-side flash will compare and contrast the ways you can deploy the technology today to help readers better understand the pros and cons of each approach. It will also explain when server-side caching is a better (or worse) alternative to an all-flash or hybrid-flash storage array. There's so much going on in this space today that it can be hard to stay on top of it all. This Drill Down will help.

About The Author

George Crump

George Crump is a former contributor to TechTarget's storage ...

Table Of Contents

  • Pros and cons of server-side flash
  • Server-side flash caching's evolution
  • SSD caching and tiering aren't the same