Moore’s Law is a concept first proposed in 1965 by Gordon E. Moore, who later co-founded Intel, the major American chip maker. Simply put, it states that the number of transistors on a microchip grows exponentially, doubling roughly every two years. Since microchips are the powerhouses of the electronics industry, this exponential progression has a huge impact on computer hardware.
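The growth rule can be written compactly: if a chip holds N0 transistors today, then after t years it should hold roughly N(t) = N0 × 2^(t/T), where T is the doubling period of about two years. Here is a minimal sketch of that projection in Python; the starting count and the doubling period are illustrative assumptions, not figures from the text:

```python
# Project transistor counts under Moore's Law: N(t) = N0 * 2**(t / T).
# The starting count and doubling period are illustrative assumptions.

def transistor_count(n0: int, years: float, doubling_period: float = 2.0) -> int:
    """Projected transistor count after `years`, doubling every `doubling_period` years."""
    return round(n0 * 2 ** (years / doubling_period))

if __name__ == "__main__":
    n0 = 2_300  # roughly the transistor count of the Intel 4004 (1971)
    for years in (0, 2, 10, 20, 40):
        print(f"after {years:>2} years: ~{transistor_count(n0, years):,} transistors")
```

Run over forty years, even this toy projection climbs from a few thousand transistors into the billions, which is why the curve is always drawn on a logarithmic scale.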
Moore’s observation was based on his experience in the integrated circuit manufacturing industry. He observed that chip makers were able to double the number of transistors on an individual chip approximately every 18 to 24 months, and that this trend held steady across multiple generations of chips. Within a decade, people were referring to the phenomenon as “Moore’s Law,” a phrase coined by Carver Mead, a professor at the California Institute of Technology.
A glance at a graph tracking microchip production suggests that the law has held in practice, although people argue over how long it can continue; several studies have indicated that the exponential growth rate could stall between 2017 and 2025, as manufacturers run up against the physical limits of miniaturization. Moore’s Law isn’t just about the raw number of transistors on a chip; it also shapes prices for microchips, and for electronics in general as a result.
Using the law, people can predict price points for a wide range of consumer electronics, including computers, digital cameras, and phones. A larger number of transistors increases the power and capability of electronics, which means companies are constantly releasing new and improved versions of their products. This can be frustrating for consumers who buy a top-of-the-line product, only to watch its price fall rapidly within a year or so. Awareness of this trend leads some consumers to reach for midrange electronics rather than aiming for the best.
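Read in reverse, the same doubling rule is what makes those price predictions possible: if capability per dollar doubles every couple of years, the street price of a fixed level of capability roughly halves over the same period. The sketch below works through that depreciation arithmetic; the launch price and the halving period are assumptions for illustration, not market data:

```python
# Rough price projection for a fixed level of capability, assuming its cost
# halves once per doubling period. Launch price and period are illustrative
# assumptions, not market data.

def projected_price(price_today: float, years: float, halving_period: float = 2.0) -> float:
    """Estimated price of today's top-of-the-line capability after `years`."""
    return price_today * 0.5 ** (years / halving_period)

if __name__ == "__main__":
    launch_price = 1_000.00  # hypothetical flagship price in dollars
    for years in (0, 1, 2, 4):
        print(f"after {years} year(s): ~${projected_price(launch_price, years):,.2f}")
```

Under these assumptions, a hypothetical $1,000 flagship is worth roughly $700 after a year and $500 after two, which is the falling curve that midrange buyers are betting on.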
Technology companies sometimes feel intense pressure as a result of Moore’s Law. Although Moore’s original proposal was merely an observation of industry trends, some companies treat it as a literal mandate, pushing to double the capacity of their components on a fixed schedule. Major chip manufacturers, including Intel, tend to release new chips on a two-year cycle, reflecting scientific development, consumer demand, and the pressure of popular expectations about the law. As Gordon Moore himself pointed out in 2005, chip development has to stop somewhere: ultimately, manufacturers will hit limits at the atomic level, unable to make transistors any smaller.