Explain different cache memory mapping techniques.

Mumbai University > Computer Engineering > Sem4 > Computer Organization and Architecture

Marks: 6M

Year: May 14

1 Answer

MAPPING

  • The transformation of data from main memory to cache memory is referred to as a mapping process.
  • There are three types of mapping procedures:

    1. Associative mapping

    2. Direct mapping

    3. Set-associative mapping


1. Associative Mapping

  • The fastest and most flexible cache organization uses an associative memory.
  • The associative memory stores both the address and content of the memory word.
  • This permits any location in cache to store any word from main memory.
  • A CPU address of 15 bits is placed in the argument register and the associative memory is searched for a matching address.
  • If the address is found, the corresponding 12-bit data word is read and sent to the CPU.
  • If no match occurs, the main memory is accessed for the word. The address-data pair is then transferred to the associative cache memory.
  • If the cache is full, an address-data pair must be displaced to make room for a pair that is needed and not presently in the cache.
  • Replacement algorithms such as FIFO are used for this; a rough sketch of the associative lookup is given below this list.
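
A minimal Python sketch of the fully associative lookup described above, assuming a small cache of 8 lines, 15-bit addresses, 12-bit data words, and FIFO replacement; the class name and sizes are illustrative assumptions, not part of the original answer.

```python
from collections import OrderedDict

class AssociativeCache:
    """Sketch of a fully associative cache storing address-data pairs (FIFO eviction)."""

    def __init__(self, num_lines=8):
        self.num_lines = num_lines
        self.lines = OrderedDict()          # address -> data; insertion order = FIFO order

    def read(self, address, main_memory):
        if address in self.lines:           # associative search over all stored addresses
            return self.lines[address]      # hit: return the cached data word
        data = main_memory[address]         # miss: access main memory for the word
        if len(self.lines) >= self.num_lines:
            self.lines.popitem(last=False)  # cache full: displace the oldest address-data pair
        self.lines[address] = data          # transfer the new address-data pair into the cache
        return data


if __name__ == "__main__":
    memory = {a: a & 0xFFF for a in range(1 << 15)}  # 15-bit addresses, 12-bit words
    cache = AssociativeCache()
    print(cache.read(0x1A2B, memory))                # miss: fetched from main memory
    print(cache.read(0x1A2B, memory))                # hit: served from the cache
```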

2. Direct Mapping

  • Associative memories are expensive compared to random-access memories because of the added logic associated with each cell.
  • Direct mapping uses a random-access memory for the cache.
  • The CPU address of 15 bits is divided into two fields.
  • The nine least significant bits constitute the index field and the remaining six bits form the tag field.
  • The number of bits in the index field is equal to the number of address bits required to access the cache memory.
  • In the general case, there are $2^k$ words in cache memory and $2^n$ words in main memory.
  • The $n$-bit memory address is divided into two fields: $k$ bits for the index field and $n-k$ bits for the tag field.
  • The direct mapping cache organization uses the $n$-bit address to access the main memory and the $k$-bit index to access the cache; a rough sketch of this address split is given below this list.
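
A short Python sketch of the direct-mapped lookup with the field widths quoted above (9-bit index, 6-bit tag from a 15-bit address); the function names and the simple tag/data arrays are illustrative assumptions.

```python
INDEX_BITS = 9                  # k = 9, so the cache holds 2^9 = 512 words
CACHE_SIZE = 1 << INDEX_BITS

tag_store = [None] * CACHE_SIZE   # tag of the word cached at each index
data_store = [None] * CACHE_SIZE  # data word cached at each index


def split_address(address):
    """Split the n-bit address into (tag, index): k index bits, n-k tag bits."""
    index = address & (CACHE_SIZE - 1)   # k least significant bits form the index
    tag = address >> INDEX_BITS          # remaining n-k bits form the tag
    return tag, index


def read(address, main_memory):
    tag, index = split_address(address)
    if tag_store[index] == tag:          # tag match at this index -> hit
        return data_store[index]
    data = main_memory[address]          # miss: access main memory with the full address
    tag_store[index] = tag               # the new word replaces whatever occupied this index
    data_store[index] = data
    return data
```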

3. Set Associative Mapping

  • Set-associative mapping is an improvement over the direct mapping organization in that each word of cache can store two or more words of memory under the same index address.
  • Each data word is stored together with its tag and the number of tag-data items in one word of cache is said to form a set.
  • As an example, consider a set-associative cache organization with a set size of two.
  • Each index address refers to two data words and their associated tags.
  • When the CPU generates a memory request, the index value of the address is used to access the cache.
  • The tag field of the CPU address is then compared with both tags in the cache to determine if a match occurs.
  • The comparison logic is done by an associative search of the tags in the set, similar to an associative memory search; hence the name "set-associative".
  • The hit ratio improves as the set size increases, because more words with the same index but different tags can reside in the cache.
  • When a miss occurs in a set-associative cache and the set is full, it is necessary to replace one of the tag-data items with a new value.
  • The most common replacement algorithms used are random replacement, first-in first-out (FIFO), and least recently used (LRU); a rough sketch of a two-way lookup with LRU replacement is given below this list.
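
A minimal Python sketch of a two-way set-associative lookup with LRU replacement, reusing the 9-bit index assumed in the direct-mapping sketch; the set count, variable names, and structure are illustrative assumptions rather than a prescribed implementation.

```python
INDEX_BITS = 9
NUM_SETS = 1 << INDEX_BITS
WAYS = 2                                       # set size of two, as in the example above

# Each set holds up to WAYS (tag, data) pairs, ordered most- to least-recently used.
sets = [[] for _ in range(NUM_SETS)]


def read(address, main_memory):
    index = address & (NUM_SETS - 1)           # index selects one set
    tag = address >> INDEX_BITS
    ways = sets[index]
    for i, (t, d) in enumerate(ways):          # compare the tag with every tag in the set
        if t == tag:                           # hit in one of the ways
            ways.insert(0, ways.pop(i))        # move to front: now the most recently used
            return d
    data = main_memory[address]                # miss: access main memory for the word
    if len(ways) >= WAYS:
        ways.pop()                             # set full: evict the least recently used item
    ways.insert(0, (tag, data))                # store the new tag-data pair
    return data
```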