MOORE'S LAW
In 1965 Gordon E. Moore (one of the founders of Fairchild Semiconductor, U.S.A.) predicted, based on the data available at that time, that the density of transistors in integrated circuits would double at regular intervals of around two years. Experience from 1965 to date has shown his prediction to be surprisingly accurate. In fact, the number of transistors per integrated circuit chip has approximately doubled every 18 months. This observation of Moore's has come to be called "Moore's Law".
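Moore's prediction can be written as a simple exponential-growth formula: N(t) = N0 x 2^((t - t0)/T), where N0 is the transistor count in a base year t0 and T is the doubling period. The following Python sketch illustrates the compounding; the 1971 starting figure of about 2,300 transistors (roughly that of an early microprocessor) is an assumed illustrative value, not a number taken from this text:

    def transistors(year, base_year=1971, base_count=2300, doubling_years=1.5):
        """Project N(t) = N0 * 2 ** ((t - t0) / T) under Moore's law.

        base_count ~ 2,300 is an assumed illustrative starting figure
        (roughly an early-1970s microprocessor); it is not from the text.
        """
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1980, 1990, 2000):
        print(year, f"{transistors(year):,.0f}")

With an 18-month doubling period the count grows roughly a hundredfold per decade (2^(10/1.5) is about 102), which is what makes the sustained growth described below so striking.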
In a diagram we have given two plots. One gives the number of transistors per chip in Dynamic Random Access Memory (DRAM) along the y-axis and years along the x-axis. Observe that the y-axis uses a logarithmic scale and the x-axis a linear scale. The second plot gives the number of transistors in microprocessor chips. Observe that in 1974 the largest DRAM chip held 16 Kbits, whereas by 1998 it held 256 Mbits: an increase of about 16,000 times in 24 years. The increase in the number of components in microprocessors has been similar. It is indeed remarkable that this growth has been sustained for over 30 years.

By extrapolating Moore's law, it is expected that by the year 2010 DRAMs will hold nearly 100 billion bits, which implies that PCs will have 8 GB of main memory. Current-generation microprocessors are 64-bit processors with clocks in the range of 3 GHz. Beyond this clock speed the amount of heat generated becomes very high, so merely increasing the clock speed is not a viable way to obtain more powerful processors. Chip manufacturers have instead placed multiple processors on a single chip, working in parallel. These are called multicore processors. Currently up to 16 processors, sharing a memory, are integrated on one chip.

Moore's law has other implications. The availability of large memories and fast processors has in turn increased the size and complexity of system and application software. It has been observed that software developers have always consumed the increased hardware capability faster than hardware has grown, and this has kept up the demand for hardware.
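The DRAM figures quoted above can be checked with a little arithmetic. The sketch below uses only the numbers from the preceding paragraph and computes the implied doubling time:

    import math

    bits_1974 = 16 * 1024          # 16 Kbit DRAM chip in 1974
    bits_1998 = 256 * 1024 ** 2    # 256 Mbit DRAM chip in 1998
    years = 1998 - 1974

    growth = bits_1998 / bits_1974      # 16,384x, i.e. roughly 16,000 times
    doublings = math.log2(growth)       # 14 doublings over the period
    doubling_time = years / doublings   # about 1.7 years per doubling

    print(f"growth factor : {growth:,.0f}x")
    print(f"doublings     : {doublings:.0f}")
    print(f"doubling time : {doubling_time:.2f} years")

The implied doubling time of about 1.7 years falls between the 18 months observed for transistor counts and the two years of Moore's original prediction.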
The implication of Moore's law is that in the foreseeable future we will be getting more powerful computers at a reasonable cost. It will be up to our ingenuity to use this increased power of computers effectively. It is clear that a number of applications, such as speech recognition and voice and video user interfaces, which require large amounts of memory and computing power, will be used extensively.
Since semiconductors are necessary for the operation of all digital devices, including computers, cellphones, tablets, and cameras, Moore's Law has practical consequences for every aspect of society, from individual consumers to businesses of every type.
As a result, computers and other gadgets naturally become smaller, faster, and cheaper as transistors and chips become smaller, more efficient, and less expensive. Additionally, the industry's progress toward smaller, quicker, and cheaper CPUs is partly responsible for the consumer push for faster devices with more features.
Conclusion:
By its strictest definition, a doubling of the number of transistors every two years, Moore's Law no longer holds. It is still producing exponential improvements, although the benefits are arriving more slowly. Moore's law may be slowing down, but technological innovation is not. Instead, the proliferation of new application areas (such as big data and artificial intelligence) has accelerated innovation and increased the demand for "exponential" advances in technology. Ultimately, transistor count is not the only factor that pushes for better processors. The average consumer cares about cost and performance, not how many transistors are in a device. If no more transistors can be squeezed into a small space, other technologies will have to be developed to circumvent this roadblock and produce more powerful processors; such is the nature of science and research.