Moore’s Law is named for Intel co-founder Gordon E. Moore, who first described the trend in an article written in 1965. The article observed that the number of components in integrated circuits had doubled each year from the invention of the integrated circuit in 1958 through 1965, and he predicted that the trend would continue “for at least ten years”. Most observers and industry experts now put the doubling time at about 18 months!
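To get a feel for what such a doubling rate implies, a little back-of-the-envelope arithmetic helps. This sketch assumes the commonly quoted 18-month doubling period (a simplification for illustration, not a figure from Moore’s article):

```python
# Back-of-the-envelope sketch of Moore's Law-style compound doubling.
# The 18-month (1.5-year) doubling period is an assumed figure, used
# here only for illustration.

def moore_growth(years, doubling_period_years=1.5):
    """Multiplication factor after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

# One 18-month period doubles the count exactly.
print(moore_growth(1.5))   # 2.0

# Over a single decade, the same rule compounds to roughly a
# hundredfold increase -- which is why the trend reshapes whole
# industries rather than just individual products.
print(round(moore_growth(10)))
```

Whether the doubling period is 12, 18, or 24 months changes the numbers, but not the character of the curve: it is the compounding that matters.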

Much has been written about this trend since then, and much more about how it applies not only to the number of components within integrated circuits, but also to memory capacity, processing speed, the complexity of software, and so on. Few students, scientists, engineers, or workers in the technology industries have escaped its influence on product design cycles, competitive analysis, the future cost of technology, and a host of other business issues.

The question often asked is: when will it end? Or rather, will it end, and if so, when? Predictions have been made that it cannot last forever and that “next year” or “next decade” it will surely come to an end. Others say it may not end abruptly, but will gradually slow, with each doubling taking longer and longer. The truth of the matter is that the doubling is currently taking less time, not more!

Well, enough of that. Anyone interested in this grand assault on nature’s limits need only search the Internet (Google or Bing or the search engine of their choice) to find a wealth of commentary on the subject.

My purpose in bringing this to your attention is to highlight how it has affected the design engineer producing hardware equipment and systems for the market. We can contrast the days of vacuum tube circuits with today’s systems containing gigabytes of data and embedded software that guides the device to levels of productivity unprecedented just a decade ago.

My experience with consumer electronics such as the Compact Disc is a case in point. The first CD players shown to the world before 1982 were roughly the size of a desk and contained multiple printed circuit boards fed by multiple power supplies. The first demonstrations were made with a ton of discrete electronic circuits hidden under the table behind a black curtain, so as not to reveal how complex the playback systems really were. When the design was integrated using the technology available in 1982, the first players were the size of a very large shoe box or hat box, and they contained more than 100 integrated circuits of one type or another. With the passage of time, integrated circuits became able to contain the same functions in fewer and fewer chips; in fact, by 1990 there were integrated circuits capable of doing all the digital processing on a single chip. Today you can buy a simple CD player for less than $10!

When the CD was standardized, it seemed impossible to fit almost 1 GB of data on the small 12 cm disc. At the time, the primary media used on the PC were 8-inch floppy disks or the newer 3.5-inch floppy disks with a capacity of 1.2 MB. Today we can buy a USB flash drive for the PC for less than $10 with a capacity of 4 GB!

The first CD pre-mastering workstations used a high-powered PC of the day and an 8-foot rack of hard drive storage, with a large lead-acid backup battery on the bottom in case of a power failure. They were so expensive that it took a large investment just to prepare to press CDs in volume; a pre-mastering studio would cost between $1 million and $2 million. That job can now be done on a desktop or laptop that costs less than $2,000. All because of Moore’s Law; or should I say, in appreciation of and as a consequence of Moore’s Law.

The early CD players were so complex that we had no way of testing and observing much of what was going on within the logic circuits. Interleaving the data, applying Reed-Solomon coding, error detection and correction, and burst-error protection were dauntingly complex. Designers of CD equipment had to go beyond what mainstream computer designers were doing at the time and find more powerful ways to keep the data good enough for music playback when the surface of the disc became corrupted.
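The interleaving idea at the heart of that scheme can be illustrated with a toy sketch. The CD actually uses the Cross-Interleaved Reed-Solomon Code (CIRC), which is far more elaborate; this simplified block interleaver only shows the core trick, namely that a burst of consecutive bad samples on the disc surface becomes several isolated errors after de-interleaving, which per-word error correction can then repair:

```python
# Toy block interleaver illustrating the principle behind the CD's
# CIRC scheme (the real CIRC is far more elaborate). A scratch causes
# a burst of consecutive bad samples on the disc; interleaving scatters
# that burst so each code word sees at most one error.

def interleave(data, depth):
    """Write row-by-row into a depth-wide grid, read column-by-column."""
    rows = [data[i:i + depth] for i in range(0, len(data), depth)]
    return [row[c] for c in range(depth) for row in rows]

def deinterleave(data, depth):
    """Invert interleave() for data whose length is a multiple of depth."""
    nrows = len(data) // depth
    grid = [data[i:i + nrows] for i in range(0, len(data), nrows)]
    return [grid[c][r] for r in range(nrows) for c in range(depth)]

samples = list(range(16))          # 16 audio samples
on_disc = interleave(samples, 4)   # the order written to the disc

# A scratch wipes out 3 consecutive values on the disc surface...
damaged = on_disc.copy()
for i in range(5, 8):
    damaged[i] = None

recovered = deinterleave(damaged, 4)
# ...but after de-interleaving, the 3 errors land 4 positions apart,
# never two in a row, so each one sits in a different code word and
# can be corrected (or, failing that, concealed by interpolation).
print(recovered)
```

In the real player, the de-interleaved words carry Reed-Solomon parity, so those isolated erasures are reconstructed exactly rather than merely located.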

In time, the same technology that brought about digital audio arrived in the test instruments themselves, which could now sample, store, and compare data from the processing events taking place in CD recording, mastering, and replication equipment. While early CDs were plagued with dropouts due to recording errors, it is now very rare to experience such failures.

These great improvements came in the form of digital oscilloscopes that allowed circuit designers to monitor and observe the digital channels, timing, and processing throughout these complex circuits. Analog-to-digital converters, digital-to-analog converters, jitter, low-level linearity, and noise control were all vastly improved thanks to a better understanding of the many processes that make up the compact disc standards.

Now we have highly automated test systems at the factory level, in repair shops, and among high-end users that keep the technology error-free and robust. The next time you listen to a CD of your favorite music, think of the millions of transistors buried within one or two of the player’s ICs, each handling its individual part: reading the high-speed data that comes off the surface of the disc as a flashing stream of digital bits; flip-flops rearranging the bits of information; stripping the control and formatting information from the headers; and finally putting the digital bits in the correct order to feed a digital-to-analog converter that sends the final audio to the amplifier and speaker system. It really is a miracle, one that goes largely unnoticed.

We could not have accomplished this digital feat without a similar revolution in the test, measurement, and monitoring systems used in the design and production of discs, recording systems, and players. And it was all done in a few years. How many years did it take to “perfect” vinyl audio disc playback systems? From 1877 to 1977: a full century! The compact disc didn’t enter the market until 1982. So analog music recording, which had existed for over 100 years, was replaced by a digital audio system that is now just over 20 years old and is itself threatened with complete obsolescence in the near future by newer digital systems that download music directly, without the use of a packaged medium at all.

What’s Next?
