The End Of Moore's Law And What Happens Next

Computer technology, computational electronics, and information technology (IT) have radically transformed how end-users, small businesses, and large enterprises operate over the last 50 years. From daily workflows to transportation and communication, computers have changed the way humans work and how societies evolve. Virtually every facet of society has benefited from, or been affected by, computers and the constant advances in computer engineering that have made possible devices such as mobile phones, smartphones, smart devices, tablets, personal computers, and laptops. These devices have become smarter, more powerful, and less resource-hungry, with features that let both end-users and companies do more, with more computing power, memory, and bandwidth, in smaller, more portable packages.

In an age when even mobile phones count as "computers," it helps to start with a definition before looking at the engineering feats behind the development and evolution of computers over the last 50 years:

  • Computer: Any electronic device that stores data and uses a processor to perform operations on that data and produce a defined, consistent result, typically by means of binary code

From Alan Turing (father of the Turing machine), to Claude Shannon (father of the digital age and binary-based circuit design), to Dr. Mark Dean (co-designer of the IBM Personal Computer and leader of the team behind the first 1 GHz processor), to Steve Wozniak and Steve Jobs (creators of the Apple I and co-founders of Apple), to Grace Hopper (pioneering computer programmer), the modern computer has a long history of development and evolution. Computers are, in essence, elaborate calculators, so mathematicians have played a pivotal role in their evolution, which grew out of mathematics, physics, and engineering. Charles Babbage is widely regarded as the inventor of the first mechanical computer, while the foundations of modern computing trace back to antiquity, with the Egyptians, Indians, Polynesians, and Chinese all contributing to early binary-style counting systems. Gottfried Wilhelm Leibniz formalized binary arithmetic, and Claude Shannon, of MIT fame, later showed how it could be realized in switching circuits, creating the basis of the digital revolution. Digital logic rests on a binary system of "true" or "false," known as binary code, which translates to "on" or "off" because the transistors inside semiconductor integrated circuits act as electrical switches and amplifiers for digital (discrete) logic. Computers are built from layers of digital circuits; electrical currents run through these circuits, the transistors switch and amplify the voltages, and computation results. Decimal numbers are translated into binary numbers, and the processor carries out its work on those bits. The more transistors a chip contains, the more computations it can perform, resulting in faster speeds and more robust computational power.
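To make the binary idea concrete, here is a minimal Python sketch (illustrative only; the function and variable names are my own) showing how an ordinary decimal number becomes the string of on/off bits that a processor's transistors physically represent:

```python
# Illustrative sketch: mapping a decimal value onto the on/off bits
# that transistors represent. Names and examples are illustrative only.

def to_binary(n: int) -> str:
    """Return the binary (base-2) representation of a non-negative integer."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))   # each remainder is one "switch": 1 = on, 0 = off
        n //= 2
    return "".join(reversed(bits))

print(to_binary(42))                   # '101010' -- six transistor-like switches
print(to_binary(42) == bin(42)[2:])    # matches Python's built-in conversion: True
```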

Throughout those 50 years of evolution, it was expected that the number of transistors in computer systems would keep increasing exponentially, yielding more powerful and more advanced computers on a steady, yearly basis. But the limits of physics and engineering may now have been reached, as the pace of hardware innovation is slowing down.


What Is Moore’s Law?

Intel has helped to lead and pioneer the field of computer engineering through the manufacture of the integrated circuits and printed circuit boards (PCBs) that make computers work. Leaders of the chip industry long predicted that computer systems would become more advanced with each passing year. Specifically, Gordon Moore, co-founder of Intel, observed in 1965 that because transistors were steadily shrinking toward the nanoscale, allowing integrated circuits to hold ever more of them, roughly twice as many transistors could be fit onto a chip each year. Hence, Moore's Law was born.

After 1975, the estimate was revised to a doubling of transistors every two years. For decades, engineers consistently delivered chips with double the transistor count, enabling a string of more advanced technologies: smart devices, mobile tech, wearables, faster processors, more robust computers, and faster, more efficient data centers and cloud computing. Engineers developed ever finer fabrication processes, shrinking transistors from the millimeter scale down to nanometers. However, the feats of engineering and physics behind that shrinking have been pushed to their limits. While more transistors mean more power and more capacity for advanced tasks, engineers can no longer keep making transistors meaningfully smaller, so chips may have reached their practical limit in transistor density. Hence, industry leaders are asserting that Moore's Law has come to an end, and transistor counts will no longer double on a predictable schedule.
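The arithmetic behind the law is simple compounding. The short sketch below (the starting figure is a round number close to the transistor count of the earliest microprocessors, used purely for illustration) projects how a chip's transistor count grows under a two-year doubling period:

```python
# Illustrative only: compound doubling under Moore's Law.
# The starting count and time spans are illustrative, not real chip data.

def projected_transistors(start_count: int, years: int, doubling_period: float = 2.0) -> int:
    """Project a transistor count forward assuming one doubling per period."""
    return int(start_count * 2 ** (years / doubling_period))

start = 2_300   # roughly the scale of an early-1970s microprocessor
for years in (10, 20, 40, 50):
    print(years, "years:", f"{projected_transistors(start, years):,}")

# After 50 years of two-year doublings the count grows by a factor of 2**25,
# i.e. roughly 33 million times the starting figure.
```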


What Contributions Did It Make?

Moore’s Law is simply an estimate predicting the rapid development of more advanced technologies through the evolution of the transistor. As an estimate, it helped larger enterprises plan strategically for systems that could take advantage of more robust computers. Its effect has changed the way end-users and companies operate: the steady rollout of more powerful computers, gaming devices, cloud and data centers, and workstations reshaped strategic plans for companies and gave consumers new systems and apps that benefit from greater computational power. Whole industries have also grown out of Moore’s Law, including small wearable computers, Internet of Things devices, smart tech, and powerful cloud systems whose circuits contain staggering numbers of transistors, resulting in unprecedented computational abilities. Those abilities, in turn, have helped shape the Big Data analysis, Business Intelligence, and Artificial Intelligence industries for SMEs and larger enterprises alike.


Why Is It Coming To An End?

Moore’s Law, which predicts the development of ever more robust computer systems (with ever more transistors), is coming to an end simply because engineers can no longer build chips with smaller, and therefore more numerous, transistors. To stay efficient as transistor counts climb, chips need new architectures. And while raw power is often treated as the most important aspect of a computer system, energy efficiency and device lifetime matter just as much, requiring more effective use of those huge transistor counts, especially in the large cloud data centers that power much of the web.


What’s Going To Happen Next?

Large chip manufacturers, such as Intel, have already delayed rollouts of smaller transistors and allowed more time to pass between chip generations. In other words, chip makers are slowing their development schedules and rollouts. Industry leaders are also abandoning strategic roadmaps tied to Moore’s Law and to the yearly projections of more robust systems it implied. Computer systems can still be made more powerful, and even with Moore’s Law ending, manufacturers will continue to build physically more powerful computers, just at a slower rate.

Better Algorithms And Software

As with all computational devices, hardware is only part of the story. Even on the most robust systems, it is the apps and algorithms exploiting the underlying hardware that make computers work well. Much as 5G pairs more powerful hardware with more efficient software, the creation of more efficient software and algorithms can mean more effective use of existing hardware. Better software can improve the effective performance of current platforms almost as much as more densely packed integrated circuits can. Thus, streamlining applications with better code, and adopting practices that optimize how hardware is used (e.g., hyper-threading a few years ago), can help produce better computer systems for years to come.
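As a concrete, if simplified, illustration of what "better code on the same hardware" means, the sketch below compares two ways of detecting duplicate entries in a list: the first re-scans the list for every element, while the second remembers what it has seen using a set. The data and function names are hypothetical:

```python
# Illustrative sketch: same task, same hardware, two very different algorithms.
import time

def has_duplicates_naive(items):
    """Quadratic: compares every element against every other element."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_fast(items):
    """Linear: remembers already-seen elements in a set."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

data = list(range(10_000))   # hypothetical workload with no duplicates (worst case)

for fn in (has_duplicates_naive, has_duplicates_fast):
    start = time.perf_counter()
    fn(data)
    print(fn.__name__, f"{time.perf_counter() - start:.3f}s")

# On typical hardware the set-based version is orders of magnitude faster --
# a gain achieved purely in software.
```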

More Specialized Chips

One of the major concerns associated with the end of Moore’s Law is that the continued evolution of advanced technologies, such as advanced Artificial Intelligence (AI), self-driving cars, IoT (Internet of Things) devices, and more robust cloud systems, demands ever more computing resources and power. Without the ability to simply add more transistors, engineers can work around the problem by using more specialized chips (e.g., GPUs) to complement general-purpose CPUs, as happens today in Bitcoin mining and blockchain technology. Other forms of specialized chips can be developed to complement current and future CPUs.
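The Bitcoin-mining example is worth unpacking: mining is essentially running the same SHA-256 hash over and over with a changing counter until the result meets a difficulty target, which is exactly the kind of narrow, repetitive workload that GPUs and purpose-built ASICs execute far faster than a general-purpose CPU. The toy sketch below (trivial difficulty, illustrative names) shows that loop in pure Python:

```python
# Toy proof-of-work loop, illustrative only. Real miners perform this search
# trillions of times per second on GPUs/ASICs; a CPU running Python manages far less.
import hashlib

def mine(block_data: str, difficulty_prefix: str = "0000") -> tuple[int, str]:
    """Find a nonce whose SHA-256 hash of (data + nonce) starts with the prefix."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("example block header")
print(f"nonce={nonce}, hash={digest}")

# The work is identical on every iteration and embarrassingly parallel,
# which is why specialized silicon outperforms a CPU here by orders of magnitude.
```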

Moving Beyond Silicon Chips

Silicon is the main material used to engineer computer hardware, from chips to PCBs. However, as materials scientists and engineers learn more about rare earth elements and other novel materials, new innovations become possible using those materials as substrates for computer chips. Such innovations may open the door to packing in more transistors, or to entirely new and more powerful mechanisms of computing.


Future Innovations

While new materials can help engineers build new and more powerful computer systems, there are a number of other innovations that chip manufacturers are experimenting with, such as stacking transistors vertically in 3D on integrated circuits and using novel rare earth elements in chip manufacturing.


Graphene Processors

Graphene is the new major player in the world of materials science: a very thin, flexible, yet incredibly strong zero-overlap semimetal that is also an excellent conductor. Used in processors, graphene could make computers faster and more powerful, and IBM has already used the material to build a graphene chip it reported as performing 10,000 times better than earlier graphene prototypes.

Memristors

Memristors are long-theorized circuit elements, demonstrated in prototype form in recent years, that could transform future integrated circuits by working alongside resistors, capacitors, and transistors. A memristor regulates electrical flow within a circuit (like a resistance switch) while remembering how much charge has previously flowed through it. In this way it could act as non-volatile, resistive RAM (Random Access Memory), helping to save energy and perhaps one day replacing flash memory.
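A rough way to see the "remembers the charge that flowed through it" behavior is to simulate the simple linear ion-drift memristor model often used to describe early prototypes. In the sketch below the device's resistance drifts between a low and a high value depending on the history of current; all parameter values are textbook-style round numbers chosen for illustration, not measurements:

```python
# Illustrative simulation of the linear ion-drift memristor model.
# Parameter values are round illustrative numbers, not measured data.
import math

R_ON, R_OFF = 100.0, 16_000.0   # low / high resistance states (ohms)
D, MU_V = 1e-8, 1e-14           # device thickness (m), ion mobility (m^2 / (V*s))

w = 0.5 * D                      # state variable: width of the doped region
dt = 1e-4
for step in range(20_000):
    t = step * dt
    v = math.sin(2 * math.pi * t)                  # applied voltage (V), 1 Hz sine
    m = R_ON * (w / D) + R_OFF * (1 - w / D)       # current resistance
    i = v / m                                      # current through the device
    w += MU_V * (R_ON / D) * i * dt                # state drifts with accumulated charge
    w = min(max(w, 0.0), D)                        # keep the state physical
    if step % 5_000 == 0:
        print(f"t={t:.2f}s  resistance={m:,.0f} ohms")

# The resistance at any moment depends on the history of current that has flowed --
# the memory effect that makes memristors attractive as non-volatile RAM.
```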

Quantum Computing

As a less traditional form of computing, quantum computers represent the future of the field. As scientists edge closer to a fully functional quantum computer, its foundation in quantum physics could transform every industry and advanced technology that currently exists, with seemingly endless opportunities attached. Quantum computers are built on a variety of exotic physical substrates (in some designs including rare earth elements) and are aimed at complex problems that classical computers, rooted in traditional digital logic, cannot tractably represent or solve. That is, instead of using discrete logic and bits that are strictly 0 or 1 (based on classical physics and digital circuits), quantum computers use quantum mechanics and a different unit of information, the qubit, which can exist in superpositions of 0 and 1, allowing more advanced calculations via quantum parallelism. Quantum computing has the capacity to challenge everything previously known about computing, and could be used to complete previously "impossible" tasks, such as breaking RSA encryption.
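To make the "qubit versus bit" distinction a little more tangible, the short sketch below simulates a single qubit classically with NumPy: it starts in the definite state |0⟩, applies a Hadamard gate to put it into an equal superposition, and then samples measurements. This is only a classical simulation for illustration; a real quantum computer manipulates physical qubits directly:

```python
# Classical simulation of one qubit, for illustration only.
import numpy as np

ket0 = np.array([1.0, 0.0])                    # |0> : a definite, classical-style bit
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                               # superposition: (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2                     # measurement probabilities
print("amplitudes:", state)                    # [0.7071, 0.7071]
print("P(0), P(1):", probs)                    # [0.5, 0.5]

# Measuring collapses the superposition to a definite 0 or 1.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10, p=probs)
print("ten measurements:", samples)
```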


What Lies In The Future

The future holds many possibilities for the development of more robust computer systems. While Moore's Law, and the squeezing of ever more transistors onto chips, was a trend that allowed for the rapid evolution of computer systems, it was only one way to increase the power of computers. More effective software, along with a number of other innovations, offers largely untapped paths toward even more powerful systems. The imagination of engineers has hardly been exhausted, and inventors are by no means limited by a five-decade-old estimate that has run its course. While no one knows the future, it is reasonable to expect that the computers of five decades from now will be very different from, and much more powerful than, the ones used today. Anything from configuration changes, to better use of threading, to new chips made from new materials can change the landscape of computer engineering, resulting in new, more robust computers with unprecedented power and capabilities.


