Monday, 2 December 2013

Memory Organization

Overview

When the processor needs to read from or write to a location in main memory, it first checks whether a copy of that data is in the cache. If so, the processor immediately reads from or writes to the cache, which is much faster than reading from or writing to main memory.
The data cache is usually organized as a hierarchy of multiple cache levels (L1, L2, and often L3), with each level larger but slower than the one before it.
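To make the hit-or-miss check concrete, here is a minimal sketch in C of a direct-mapped cache lookup. The line count, block size, and the cache_access helper are illustrative assumptions, not a description of any particular processor:

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical direct-mapped cache: 16 lines of 64-byte blocks. */
#define NUM_LINES  16
#define BLOCK_SIZE 64

typedef struct {
    bool     valid;
    uint32_t tag;
} cache_line_t;

static cache_line_t cache[NUM_LINES];

/* Returns true on a hit; on a miss, the line is (re)filled from memory. */
bool cache_access(uint32_t addr)
{
    uint32_t block = addr / BLOCK_SIZE;
    uint32_t index = block % NUM_LINES;   /* which line the address maps to */
    uint32_t tag   = block / NUM_LINES;   /* identifies the block in that line */

    if (cache[index].valid && cache[index].tag == tag)
        return true;                      /* hit: serve from the cache */

    cache[index].valid = true;            /* miss: fetch the block from memory */
    cache[index].tag   = tag;
    return false;
}

int main(void)
{
    uint32_t addrs[] = {0x0040, 0x0044, 0x0840, 0x0040};
    for (int i = 0; i < 4; i++)
        printf("0x%04x -> %s\n", (unsigned)addrs[i],
               cache_access(addrs[i]) ? "hit" : "miss");
    return 0;
}
```

Running it prints miss, hit, miss, miss: the second access falls in the same block as the first, while the third maps to the same line with a different tag and evicts it.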

Cache memory
Cache is a small but high-speed memory used by the central processing unit of a computer to reduce the average time to access memory. The cache stores copies of the data from frequently used main memory locations. Often the main memory will provide a wider data word to the cache than the CPU requires, so that cache lines can be filled more rapidly. As long as most memory accesses hit cached locations, the average latency of memory accesses will be closer to the cache latency than to the latency of main memory.
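That last claim can be made precise with the standard average memory access time formula, AMAT = hit time + miss rate × miss penalty. A short sketch with invented latencies (the cycle counts below are assumptions for illustration, not measurements):

```c
#include <stdio.h>

int main(void)
{
    /* Illustrative numbers: 2-cycle cache hit, 100-cycle main-memory
       access, and 95% of accesses hitting the cache. */
    double hit_time     = 2.0;
    double miss_penalty = 100.0;
    double hit_rate     = 0.95;

    /* AMAT = hit time + miss rate * miss penalty */
    double amat = hit_time + (1.0 - hit_rate) * miss_penalty;
    printf("average access time: %.1f cycles\n", amat);  /* 7.0 cycles */
    return 0;
}
```

With a 95% hit rate, the 100-cycle memory penalty contributes only 5 cycles on average, so the 7-cycle result sits far closer to the 2-cycle cache than to main memory.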

Cache Analogy
You are writing a term paper for your history class at a table in the library.
-As you work, you realize you need a book.
-You stop writing, fetch the book from the shelves, and continue writing.
-You don’t immediately return the book; maybe you’ll need it again.
-Soon you have a few books at your table, and you can work smoothly without needing to fetch more books from the shelves.
-The table is a cache for the rest of the library.
Now you switch to doing your biology homework.
-You need to fetch your biology textbook from the shelf.
-If your table is full, you need to return one of the history books to the shelf to make room for the biology book (the sketch below mimics this eviction step).
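The analogy maps directly onto a replacement policy. Below is a hypothetical sketch in C of a least-recently-used (LRU) "table": when a new book is fetched and every slot is taken, the book that was touched longest ago goes back to the shelf. The slot count, book titles, and fetch helper are all invented for illustration:

```c
#include <stdio.h>
#include <string.h>

#define TABLE_SLOTS 3   /* illustrative: the table holds three books */

static const char *table[TABLE_SLOTS];   /* books currently on the table */
static int last_used[TABLE_SLOTS];       /* when each slot was last touched */
static int clock_tick;

/* Fetch a book: use it if it is already on the table (a hit); otherwise
   take it from the shelf, evicting the least-recently-used book if full. */
void fetch(const char *book)
{
    int free_slot = -1, lru_slot = 0;
    for (int i = 0; i < TABLE_SLOTS; i++) {
        if (table[i] && strcmp(table[i], book) == 0) {
            last_used[i] = ++clock_tick;               /* hit */
            printf("%-20s already on the table\n", book);
            return;
        }
        if (!table[i]) free_slot = i;
        else if (last_used[i] < last_used[lru_slot]) lru_slot = i;
    }
    int slot = (free_slot >= 0) ? free_slot : lru_slot;
    if (free_slot < 0)                                 /* eviction */
        printf("%-20s returned to the shelf\n", table[slot]);
    table[slot] = book;
    last_used[slot] = ++clock_tick;
    printf("%-20s fetched from the shelf\n", book);
}

int main(void)
{
    fetch("History vol. 1");
    fetch("History vol. 2");
    fetch("History vol. 1");   /* hit: no trip to the shelves */
    fetch("History vol. 3");
    fetch("Biology textbook"); /* table full: evicts History vol. 2 */
    return 0;
}
```

The final fetch returns "History vol. 2" to the shelf rather than "History vol. 1", because vol. 1 was used more recently; real caches approximate the same idea in hardware.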
