The End of Moore’s Law?

Back in 1965, Gordon Moore predicted that the number of transistors on an integrated circuit would double every year. In 1975 he revised that forecast, extending the doubling period to two years and cementing the rule of thumb we know today as Moore’s Law.
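
As a rough sketch of what that doubling rule implies, the compounding can be written as a one-line function. The starting figure used below (roughly 2,300 transistors, the 1971 Intel 4004) is an illustrative assumption rather than a number taken from this article.

# A minimal sketch of Moore’s Law as a compounding rule.
# The starting count (~2,300 transistors, the 1971 Intel 4004) and the
# two-year doubling period are illustrative assumptions.
def transistors(years_elapsed, start=2300, doubling_period=2):
    return start * 2 ** (years_elapsed / doubling_period)

# Ten doublings over twenty years gives roughly a thousand-fold increase.
print(round(transistors(20)))  # 2355200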

This law of ever-accelerating returns has been a benchmark for the technology sector for decades. But after almost fifty years of predicting the exponential growth of processing power, Moore’s Law is finally nearing its end.

Warning signs appeared when Intel, the company that Moore co-founded, started failing to meet its own biennial ‘tick-tock’ strategy. With production lines slowing, analysts reckoned that Intel, along with many other big chipmakers, was running into technological problems.

Manufacturing difficulties, however, are not the only new threat to Moore’s Law.

Since the 1990s, the semiconductor industry has joined forces to produce a shared roadmap for the sector: the International Technology Roadmap for Semiconductors (ITRS). Back when the first report was published, there were nineteen large-scale manufacturers. Today, there are just four: Intel, TSMC, Samsung and GlobalFoundries.

The quartet’s latest blueprint suggests that the industry is approaching a point where economics, rather than physics, will put an end to Moore’s Law for good.

Aside from the scientific and technological effort needed to keep pace with Moore’s Law, companies are finding that each new advance demands ever greater resources.

The world’s computing infrastructure already consumes a significant share of global electricity, and the processors and chips of the future will demand ever more power from a finite supply. According to the latest ITRS, if current trends continue, within two decades computing will require more electricity than the world can possibly produce.

Another potential roadblock for the long-standing law is that market power has shifted away from the semiconductor industry and towards the original equipment manufacturers (OEMs).

With tech giants such as Google and Apple becoming more powerful, it is these customers who are calling the shots when it comes to chip design and processing applications – not the chip manufacturers themselves.

Because of this, manufacturers are becoming increasingly vulnerable to the whims of their customers and are finding it difficult to prioritise important research projects.

One potential solution – raised in the most recent ITRS publication – is the creation of a national initiative.

“A National Computing and Insight Technology Ecosystem initiative will support the development of an aggressive research agenda and a leap forward in new knowledge.”

To read the full report, head here.