Data Storage Hierarchy in Operating Systems:

Definition and Explanation:

Main memory is usually located on chips inside the system unit. Two types of memory are random-access memory (RAM) and read-only memory (ROM).
Instructions and data are kept in RAM during execution. RAM is not a permanent storage place for information; it is active only while the computer is on. When the computer is switched off, the contents of RAM are lost.
ROM is memory from which information can only be read. Its contents are written by the vendor and are not lost when the computer is turned off. The size of main memory is measured in megabytes.
Information stored on a disk is not deleted when the computer is turned off. It is moved in and out of RAM as programs need it. There are two kinds of disks: hard disks and floppy disks.
Main memory and floppy disks have less storage capacity than a hard disk. A hard disk can read and write information to and from main memory much faster than a floppy disk can, and main memory itself is much faster to access than a hard disk.
Most programs are stored on disk until they are loaded into memory. The proper management of disk storage is therefore of central importance to a computer system.
Storage in a computer system can be organized in a hierarchy according to speed and cost. The higher levels are expensive but very fast. As we move down the hierarchy, the cost per bit decreases, the access time increases, and the amount of storage at each level increases.
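As a rough illustration of the hierarchy, the short C sketch below lists typical levels from fastest to slowest. The sizes and access times shown are assumed, order-of-magnitude figures for demonstration only; real values vary widely between systems.

    #include <stdio.h>

    /* Illustrative levels of the storage hierarchy, fastest first.
     * All figures are assumed, rough orders of magnitude. */
    struct level {
        const char *name;
        const char *typical_size;  /* grows as we move down the hierarchy */
        const char *access_time;   /* slows as we move down the hierarchy */
    };

    int main(void) {
        struct level hierarchy[] = {
            { "Registers",   "a few hundred bytes", "under 1 ns"     },
            { "Cache",       "kilobytes",           "a few ns"       },
            { "Main memory", "megabytes",           "tens of ns"     },
            { "Hard disk",   "gigabytes",           "milliseconds"   },
            { "Floppy disk", "about 1.44 MB",       "hundreds of ms" },
        };
        for (int i = 0; i < 5; i++)
            printf("%-12s size: %-20s access: %s\n",
                   hierarchy[i].name, hierarchy[i].typical_size,
                   hierarchy[i].access_time);
        return 0;
    }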
Caching is an important principle of computer systems, both in hardware and software. Information is normally kept in some storage system, such as main memory. As it is used, it is copied temporarily into a faster storage system, the cache. When a piece of information is needed, we first check whether it is in the cache; if it is, we use it directly from the cache. If not, we read it from the main storage system and put a copy in the cache so that the next access to it will be faster.
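A minimal sketch of this check-the-cache-first idea is shown below. The names cache_get and slow_read, the direct-mapped table, and its size are all assumptions made for illustration; they are not part of any real operating system interface.

    #include <stdio.h>

    #define CACHE_SLOTS 8

    /* One cache slot: valid flag, the key it holds, and the cached value. */
    struct slot { int valid; int key; int value; };
    static struct slot cache[CACHE_SLOTS];

    /* Stand-in for the slower storage level (e.g. main memory or disk). */
    static int slow_read(int key) { return key * 10; }

    int cache_get(int key) {
        struct slot *s = &cache[key % CACHE_SLOTS]; /* pick a slot        */
        if (s->valid && s->key == key)              /* hit: use the copy  */
            return s->value;
        int value = slow_read(key);                 /* miss: go to the    */
        s->valid = 1;                               /* slower level and   */
        s->key   = key;                             /* keep a copy for    */
        s->value = value;                           /* the next access    */
        return value;
    }

    int main(void) {
        printf("%d\n", cache_get(3));  /* miss: read from slow storage */
        printf("%d\n", cache_get(3));  /* hit: served from the cache   */
        return 0;
    }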
Since a cache has limited size, cache management is an important design problem. Careful selection of the cache size and replacement policy can result in 80 to 90 percent of all accesses being satisfied from the cache, yielding very high performance.
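To see why a high hit ratio pays off, consider a hypothetical machine with a 20 ns cache, a 100 ns main memory, and a 90 percent hit rate; the average access time then stays close to the cache's speed. The numbers below are assumptions chosen only to make the arithmetic concrete.

    #include <stdio.h>

    int main(void) {
        /* Assumed, illustrative timings -- not measurements. */
        double t_cache = 20.0;   /* ns per cache access                     */
        double t_main  = 100.0;  /* ns per main-memory access               */
        double hit     = 0.90;   /* fraction of accesses found in the cache */

        /* effective access time = hit * t_cache + (1 - hit) * t_main */
        double effective = hit * t_cache + (1.0 - hit) * t_main;
        printf("effective access time: %.1f ns\n", effective);  /* prints 28.0 */
        return 0;
    }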
The programmer (or compiler) implements the register allocation and replacement algorithms that decide what information to keep in registers and what to keep in main memory. The movement of information between levels of a storage hierarchy may be either explicit or implicit.
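In C, for instance, a programmer can hint that a variable should live in a register, although modern compilers normally make this allocation decision on their own. The snippet below is only a sketch of that idea.

    #include <stdio.h>

    int main(void) {
        /* 'register' asks the compiler to keep the loop counter in a CPU
         * register rather than in main memory; it is only a hint, and the
         * compiler is free to ignore it. */
        register int i;
        long sum = 0;
        for (i = 0; i < 1000; i++)
            sum += i;
        printf("sum = %ld\n", sum);  /* prints 499500 */
        return 0;
    }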
