
Gordon Moore theorized the number of components per integrated circuit for computing would double every two years.

Brad Haire, Executive Editor

November 4, 2022

2 Min Read
Photo: show999 / iStock / Getty Images Plus

Gordon Moore co-founded the Intel Corporation. In 1965, he observed a ‘law’ about the then-current state of computing and its future.

Nearly sixty years ago, Moore, also an accomplished engineer, worked for Fairchild Camera and Instrument. A popular trade magazine at the time asked for his thoughts on the semiconductor industry and where it might go over the next ten years.

Moore wrote an essay called “Cramming More Components onto Integrated Circuits,” which was published in April 1965 by Electronics magazine. According to a well-written Oct. 3, 2022, Bloomberg article, “Moore’s Law Keeps Chip Leaders Ahead of the Pack,” Moore’s Law remains relevant.

Moore adjusted his law over the years, eventually theorizing that the number of components per integrated circuit would double every two years, and he expected the trend to continue for at least another decade. Nearly sixty years later, the law still has good logic and legs under it.

An integrated circuit is a set of electronic circuits on a small flat "chip" of semiconductor material. That material is usually silicon.

According to the Bloomberg article, the number of transistors per chip has grown from about 100 to almost 50 billion, while the size of individual components has shrunk. Because of that trend, computing power has skyrocketed and the cost of computing has plummeted. An iPhone today is more powerful than a room full of circuits was in 1970, and it costs a lot less than that 1970s roomful of computers.
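For a rough sense of how that doubling compounds, here is a quick back-of-the-envelope sketch in Python. The starting point of roughly 100 components in 1965 and the every-two-years doubling come from the article; the rest is simple arithmetic, not a claim about any particular chip.

```python
# Back-of-the-envelope illustration of Moore's Law (not a claim about any specific chip):
# start from roughly 100 components in 1965 and double every two years.
start_year, end_year = 1965, 2022
components_1965 = 100

doublings = (end_year - start_year) // 2       # about 28 doublings over 57 years
projected = components_1965 * 2 ** doublings   # roughly 27 billion components

print(f"{doublings} doublings -> about {projected:,} components per chip")
```

Run as written, that lands in the tens of billions of components, which is the same order of magnitude as the roughly 50 billion transistors per chip the Bloomberg article cites.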


Southeast Farm Press recently published a story about Florida’s drive to become the artificial intelligence hub for agricultural solutions. AI, built on machine learning, is complicated but incredible technology.

When it comes to computer power, I can understand gigabytes. This glorified typewriter my fingers fumble around on now boasts 237 GB of storage, but it’s hard for me to wrap my mind around the power of today’s supercomputers. AI supercomputers are measured in FLOPS, or floating-point operations per second. One gigaflop is one billion of those operations every second, and one petaflop equals 1 million gigaflops; the Florida AI computer in the article runs at 17.2 petaflops. Rereading the previous sentence, my mind just went static like an old TV.
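To make those units a little more concrete, here is a minimal sketch of the conversion. The 17.2-petaflop figure is the one cited in the article; the code simply works out what that means in gigaflops and in raw operations per second.

```python
# Unit arithmetic for the figures quoted in the article (illustrative only):
# 1 gigaflop = 1e9  floating-point operations per second
# 1 petaflop = 1e15 floating-point operations per second = 1,000,000 gigaflops
petaflops = 17.2                       # the Florida AI supercomputer cited in the article
gigaflops = petaflops * 1_000_000      # 17,200,000 gigaflops
ops_per_second = petaflops * 1e15      # 1.72e16 operations every second

print(f"{petaflops} petaflops = {gigaflops:,.0f} gigaflops = {ops_per_second:.2e} ops/sec")
```

In other words, that one machine does about 17 million billion calculations every second, which is why the number is so hard to picture.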

AI helps people solve problems and can improve things, and it will continue to be used that way. That’s a good thing. The power of computers can provide rational solutions. But is rational always the best solution? Yes, as long as the rational solution helps people.

Moore’s Law still holds water today as supercomputing marches on, and so does Moore. He’s 93 and lives in San Francisco.
