What Is Cache Memory Definition?


11 Answers

d ds answered
Cache is a small amount of memory, normally less than 1 MB, that resides on the CPU so that the time needed to access main memory is reduced. Cache memory stores copies of the most frequently used data; when the user requests a certain portion of main memory, the computer checks the cache first and loads the data immediately if it is present there, reducing the access time. If the data is not present in the cache, it is loaded from main memory, which involves extra latency.
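As a rough sketch in Python (the names here, such as read, cache and main_memory, are purely illustrative, not a real API), the check-the-cache-first lookup described above looks something like this:

```python
import time

# Toy model of the lookup order described above: check the small, fast cache
# first, and only fall back to slow main memory on a miss.
SLOW_MEMORY_DELAY = 0.01   # pretend main-memory latency (illustrative only)

main_memory = {addr: addr * 2 for addr in range(1024)}  # fake "RAM" contents
cache = {}                                              # small, fast copies

def read(addr):
    if addr in cache:               # cache hit: no main-memory access needed
        return cache[addr]
    time.sleep(SLOW_MEMORY_DELAY)   # cache miss: pay the main-memory latency
    value = main_memory[addr]
    cache[addr] = value             # keep a copy for future accesses
    return value

read(42)   # first access: miss, slow
read(42)   # second access: hit, fast
```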
Anonymous answered
Good Question!!

Cache is a memory where data that has already been used by the CPU is stored. The cache can supply this data very quickly to the CPU if it is accessed again. Thus, cache is needed to increase the processing speed of the CPU.
Abadit Ali answered
The memory is the main working area of the computer. A computer does all its work in memory: data is transferred from memory to the processor for processing, and the operating system manages this process.

The cache memory is also a random access memory, but it works faster than traditional RAM. The data that the processor is about to process is often already present in the cache, which reduces the time spent searching for it. The cache is very effective because most programs use the same data and instructions again and again.

In some processors the cache memory is built in; some processors don't have cache memory at all. The main difference between the Intel Celeron and the original Intel Pentium parts was the amount of cache memory. Cache memory is also more costly than traditional RAM.

Web browsers also use a cache to store pages from the websites we visit. These stored pages give quick access when we want them again or want to view them offline. Browsers cache images, sounds, URLs and other objects.
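A minimal Python sketch of that idea, keeping fetched pages in a dictionary keyed by URL (fetch_page and page_cache are made-up names for illustration, not any browser's real API):

```python
import urllib.request

# Toy browser-style cache: pages are kept in a dict keyed by URL, so a repeat
# visit (or offline viewing) does not need another network fetch.
page_cache = {}

def fetch_page(url):
    if url in page_cache:
        return page_cache[url]                 # serve the stored copy
    with urllib.request.urlopen(url) as resp:  # otherwise go to the network
        body = resp.read()
    page_cache[url] = body                     # store it for next time
    return body

# The second call is served from the cache without touching the network.
# fetch_page("https://example.com/")
# fetch_page("https://example.com/")
```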
Anonymous answered

CACHE MEMORY

Cache memory is an extremely fast, small memory between the CPU and main memory, whose access time is close to the processing speed of the CPU. It acts as a high-speed buffer between the CPU and main memory and is used to temporarily store very active data as it is transferred during processing.
Anonymous answered
Pronounced cash, a special high-speed storage mechanism. It can be either a reserved section of main memory or an independent high-speed storage device. Two types of caching are commonly used in personal computers: Memory caching and disk caching.

A memory cache, sometimes called a cache store or RAM cache, is a portion of memory made of high-speed static RAM (SRAM) instead of the slower and cheaper dynamic RAM (DRAM) used for main memory. Memory caching is effective because most programs access the same data or instructions over and over. By keeping as much of this information as possible in SRAM, the computer avoids accessing the slower DRAM.
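To make the "same data over and over" point concrete, here is a small Python sketch of a limited-capacity fast store. The least-recently-used eviction policy and the SmallCache name are assumptions of the sketch, not something stated above:

```python
from collections import OrderedDict

# A small "fast store" of limited capacity. Because programs tend to reuse the
# same data over and over, even a tiny cache like this serves most reads.
class SmallCache:
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key in self.items:
            self.items.move_to_end(key)     # mark as most recently used
            return self.items[key]
        return None                         # miss: caller fetches from "DRAM"

    def put(self, key, value):
        self.items[key] = value
        self.items.move_to_end(key)
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict the least recently used
```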

Some memory caches are built into the architecture of microprocessors. The Intel 80486 microprocessor, for example, contains an 8K memory cache, and the Pentium has a 16K cache. Such internal caches are often called Level 1 (L1) caches. Most modern PCs also come with external cache memory, called Level 2 (L2) caches. These caches sit between the CPU and the DRAM. Like L1 caches, L2 caches are composed of SRAM, but they are much larger.

Disk caching works under the same principle as memory caching, but instead of using high-speed SRAM, a disk cache uses conventional main memory. The most recently accessed data from the disk (as well as adjacent sectors) is stored in a memory buffer. When a program needs to access data from the disk, it first checks the disk cache to see if the data is there. Disk caching can dramatically improve the performance of applications, because accessing a byte of data in RAM can be thousands of times faster than accessing a byte on a hard disk.
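A hedged Python sketch of that disk-caching idea, caching the requested sector plus a few adjacent ones in ordinary memory (the sector size, read-ahead count and function names are illustrative assumptions; f stands for an open binary file playing the role of the disk):

```python
SECTOR_SIZE = 512
READAHEAD = 8            # how many adjacent sectors to pull in on a miss

disk_cache = {}          # sector number -> bytes, held in main memory

def read_sector_from_disk(f, sector):
    f.seek(sector * SECTOR_SIZE)
    return f.read(SECTOR_SIZE)

def read_sector(f, sector):
    if sector not in disk_cache:                     # disk-cache miss
        for s in range(sector, sector + READAHEAD):  # also cache neighbours
            disk_cache[s] = read_sector_from_disk(f, s)
    return disk_cache[sector]                        # hit: no disk access
```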

When data is found in the cache, it is called a cache hit, and the effectiveness of a cache is judged by its hit rate. Many cache systems use a technique known as smart caching, in which the system can recognize certain types of frequently used data. The strategies for determining which information should be kept in the cache constitute some of the more interesting problems in computer science.
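Counting hits and misses is enough to measure that hit rate; a simple Python sketch (lookup and load_from_backing_store are illustrative names, not a standard API):

```python
hits = 0
misses = 0
cache = {}

def lookup(key, load_from_backing_store):
    """Return the cached value, loading it and recording a miss if needed."""
    global hits, misses
    if key in cache:
        hits += 1
    else:
        misses += 1
        cache[key] = load_from_backing_store(key)
    return cache[key]

def hit_rate():
    total = hits + misses
    return hits / total if total else 0.0
```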
Kashif Laeeq answered
Cache is a small but fast memory; its size is usually no more than a megabyte or so. Just for explanation, we can say the cache is a supporting memory that does nothing except enhance processing speed by keeping the most frequently used data elements. The two main types of cache are the memory cache and the disk cache. The memory cache is a portion of static RAM (SRAM), which is volatile. The disk cache resides in main memory and holds data from the (non-volatile) disk. The memory cache is very fast compared to the disk cache, because SRAM is much faster than main memory. One more type of cache is the Internet browser cache, which keeps the most frequently accessed temporary Internet files in Internet Explorer; it speeds up browsing. The recommended approach is to clear those temporary files from the browser's history list regularly.
Kashif Laeeq,
Lecturer,
Dept.of Computer Science,
Federal Urdu University, Karachi.
tYe answered

It is a small storage area in your computer.

Cache memory stores copies of the most frequently used data to reduce accesses to the main memory, which can take a while.
ghazal gi answered
It is a small memory which contains a copy of a portion of main memory. It is useful because it supplies data very quickly without copying the whole of RAM. The cache works on a guess.
Its working is simple and efficient: the CPU reads a word from the cache, and if it is not found there, the cache imports a whole BLOCK of the required data from main memory. Think of a book: suppose the processor wants to read page 34, which lies in chapter 3, and chapter 3 contains pages 1-90. When the processor asks the cache for that page, the cache hands it over if it already has it; if not, it fetches the full chapter 3 (the full block) from RAM, sends the requested page to the processor, and keeps the rest of the chapter with itself, because it guesses that the processor will most probably read the next page from the same chapter. That is the whole working of the cache memory; as I said, the cache works on a guess, and that is why it keeps the whole chapter. In this way the CPU works very efficiently. Initially the cache was used only for main memory, but now it is used for many hardware devices, such as CD drives, and it is very helpful for fast data processing.
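In Python, that "fetch the whole chapter" behaviour can be sketched as a cache that pulls in an entire block whenever one word misses (BLOCK_SIZE, read_word and the toy memory contents are illustrative assumptions):

```python
BLOCK_SIZE = 16

main_memory = list(range(4096))   # pretend RAM: the value at address i is i
cache_blocks = {}                 # block number -> list of words

def read_word(addr):
    block = addr // BLOCK_SIZE
    if block not in cache_blocks:                   # miss: fetch the whole block
        start = block * BLOCK_SIZE
        cache_blocks[block] = main_memory[start:start + BLOCK_SIZE]
    return cache_blocks[block][addr % BLOCK_SIZE]   # serve the word from the block

read_word(34)   # miss: loads the whole block containing address 34
read_word(35)   # hit: the neighbouring word is already in the cache
```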
