What will replace Moore's Law?




















Better software can increase the power of current computing platforms almost as much as packing integrated circuits more densely with transistors.

Thus, streamlining applications with better code and adopting practices that make more efficient use of the underlying hardware can deliver meaningful performance gains. Without the ability to keep adding more transistors to these systems, engineers can work around the limit by turning to more specialized chips, and other forms of specialized chips can be developed to complement current and future CPUs. Silicon remains the main material used to build computer hardware, from chips to printed circuit boards (PCBs).
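
To make that concrete, here is a small sketch (in Python, with made-up sizes) of what streamlining code on the same hardware can look like: the same membership queries answered with a linear scan and with a hash-based set. The function names and numbers are illustrative, not drawn from any particular application.

import time

def count_hits_list(items, queries):
    # O(len(items)) work per query: scans the whole list every time.
    return sum(1 for q in queries if q in items)

def count_hits_set(items, queries):
    # Build a hash set once, then each query costs roughly O(1).
    lookup = set(items)
    return sum(1 for q in queries if q in lookup)

if __name__ == "__main__":
    items = list(range(10_000))
    queries = list(range(0, 20_000, 2))   # half hit, half miss

    t0 = time.perf_counter()
    slow = count_hits_list(items, queries)
    t1 = time.perf_counter()
    fast = count_hits_set(items, queries)
    t2 = time.perf_counter()

    assert slow == fast
    print(f"list scan: {t1 - t0:.3f}s, set lookup: {t2 - t1:.5f}s")

The answer is identical either way; only the amount of work the hardware has to do changes, which is the essence of getting more out of existing chips.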

However, as materials scientists and mechanical engineers learn more about rare-earth elements and other unusual materials, new innovations become possible using these materials as the main substrates for computer chips. Such innovations may open the door to packing in more transistors, or to an entirely new and more powerful mechanism of computing. While new materials can help engineers build more powerful computer systems, chip manufacturers are also experimenting with other approaches, such as stacking transistors upward in 3D on integrated circuits and using novel rare-earth elements in chip manufacturing.

Graphene is the new major player in materials science: a very thin, flexible, yet incredibly strong zero-overlap semimetal that is also an excellent electrical conductor. Used in computer processors, graphene could make computers faster and more powerful, and IBM has already used the material to create a chip reported to be roughly 10,000 times faster than conventional chips.

Memristors are long-hypothesized computer components that could help transform future integrated circuits by working alongside resistors, capacitors, and transistors: they regulate electrical flow within a circuit like a resistance switch while "remembering" how much charge has previously flowed through them.
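
To see what "remembering the charge that has flowed through it" means in practice, here is a minimal numerical sketch of one common textbook memristor model, an HP-style linear ion drift model; the parameter values are illustrative assumptions rather than measurements of a real device.

import math

# Illustrative (assumed) parameters for an HP-style linear ion drift model.
R_ON, R_OFF = 100.0, 16_000.0   # ohms: fully doped vs undoped resistance
D = 10e-9                        # m: device thickness
MU_V = 1e-14                     # m^2 / (V*s): ion mobility
DT = 1e-5                        # s: integration time step

def simulate(voltage, t_end):
    """Integrate the memristor state under a driving voltage v(t)."""
    w = 0.5 * D          # internal state: width of the doped region
    t = 0.0
    history = []
    while t < t_end:
        v = voltage(t)
        m = R_ON * (w / D) + R_OFF * (1 - w / D)   # current memristance
        i = v / m                                   # Ohm's law at this instant
        w += MU_V * (R_ON / D) * i * DT             # state drifts with charge flow
        w = min(max(w, 0.0), D)                     # clamp to physical bounds
        history.append((t, v, i, m))
        t += DT
    return history

# Drive with a 50 Hz sine wave; the device's resistance depends on the charge
# that has already passed through it, which is its "memory".
trace = simulate(lambda t: math.sin(2 * math.pi * 50 * t), t_end=0.04)
print(f"memristance moved from {trace[0][3]:.0f} to {trace[-1][3]:.0f} ohms")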

Quantum computing is another contender, and this is where Neven's Law comes in. It states that quantum computing power is experiencing "doubly exponential growth relative to conventional computing". To put this into perspective: if traditional computers had seen doubly exponential growth under Moore's Law instead of singly exponential growth, we would have had today's laptops and smartphones decades ago. This enormously fast pace should soon lead, Neven hopes, to the so-called quantum advantage, the much-anticipated milestone where a relatively small quantum processor overtakes the most powerful conventional supercomputers.
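
A quick back-of-the-envelope comparison shows how different the two growth curves are; the base of 2 and the generation counts below are illustrative assumptions, not Neven's actual figures.

# Singly exponential growth (Moore's Law style): capability ~ 2**n after n generations.
# Doubly exponential growth (Neven's Law style): capability ~ 2**(2**n).
for n in range(1, 7):
    single = 2 ** n
    double = 2 ** (2 ** n)
    print(f"generation {n}: singly exponential = {single:>4}, "
          f"doubly exponential = {double:.3e}")

After only six generations the singly exponential curve has reached 64, while the doubly exponential curve is already past 10 to the 19th power.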

The reason for this doubly exponential growth is an in-house observation. According to an interview with Neven, Google scientists are getting better at decreasing the error rate of their quantum computer prototypes, which allows them to build more complex and more powerful systems with every iteration.

Neven maintains that this progress itself is exponential, much like Moore's Law. Moore's Law, for its part, was kept alive only through constant engineering ingenuity: new transistor designs were introduced to better corral the electrons, and new lithography methods using extreme ultraviolet radiation were invented when the wavelengths of visible light proved too thick to precisely carve out silicon features of only a few tens of nanometers. But progress grew ever more expensive.

Likewise, the fabs that make the most advanced chips are becoming prohibitively pricey. Not coincidentally, the number of companies with plans to make the next generation of chips has now shrunk to only three, down from eight in 2010 and 25 in 2002. One of those still betting on silicon is Intel, where veteran chip architect Jim Keller leads a team of some 8,000 hardware engineers and chip designers.

But Keller found ample technical opportunities for advances, which means there are many ways to keep doubling the number of devices on a chip, through innovations such as 3D architectures and new transistor designs.

These days Keller sounds optimistic. Still, even if Intel and the other remaining chipmakers can squeeze out a few more generations of even more advanced microchips, the days when you could reliably count on faster, cheaper chips every couple of years are clearly over. Researchers are therefore looking elsewhere: in a recent paper, a team at MIT documents ample room for improving computational performance through better software, algorithms, and specialized chip architectures. One opportunity lies in slimming down so-called software bloat to wring the most out of existing chips.
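
The scale of the gains on offer can be illustrated with a small, deliberately extreme sketch: the same matrix multiplication written as plain Python loops and as a call into an optimized numerical library. The sizes and the use of NumPy here are illustrative choices, not taken from the paper itself.

import time
import numpy as np

def matmul_naive(a, b):
    # Textbook triple loop: correct, but ignores caches, vector units and parallelism.
    n, m, p = len(a), len(b), len(b[0])
    out = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for k in range(m):
            aik = a[i][k]
            for j in range(p):
                out[i][j] += aik * b[k][j]
    return out

n = 200  # kept small so the naive version finishes quickly
a = np.random.rand(n, n)
b = np.random.rand(n, n)

t0 = time.perf_counter()
slow = matmul_naive(a.tolist(), b.tolist())
t1 = time.perf_counter()
fast = a @ b  # dispatches to an optimized BLAS routine underneath
t2 = time.perf_counter()

assert np.allclose(slow, fast)
print(f"pure Python: {t1 - t0:.3f}s, NumPy/BLAS: {t2 - t1:.5f}s")

Both versions compute the same result on the same chip; the difference is purely how well the code exploits the hardware it already has.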

Much existing software also fails to take full advantage of changes in hardware architecture, such as the multiple cores, or processors, found in the chips used today. But as we create more data every day, we also create the need for vast warehouses of computers, known as the cloud, to store and process that data. And the more data we produce, the more computing power we need to analyse it.

Despite the remarkable efforts of research engineers, you can only make transistors so small before you run out of room at the bottom. Pack in too many transistors and run them faster, and the restricted flow of electrons within the chip can generate so much heat that, without significant cooling, the chip will burn itself up.

Chip manufacturers have known about these problems for decades and have been doing their best to work around them. We used to see microprocessors increase their clock speed (the base operating speed of a computer) every year, to make them compute more quickly. But chips eventually reached about as fast a tick as we could manage without them becoming impossible to cool. Since then, manufacturers have had to use multiple cores so processors can do their work in parallel in order to make them work faster: first dual-core, then quad-core, eight-core, and so on.
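
Here is a minimal sketch of what "doing the work in parallel" across cores looks like from ordinary code, using Python's standard library; the workload (summing squares in chunks) is just a placeholder.

import os
import time
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    N = 10_000_000
    workers = os.cpu_count() or 1
    step = N // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else N)
              for i in range(workers)]

    t0 = time.perf_counter()
    serial = sum_of_squares((0, N))
    t1 = time.perf_counter()

    # Each chunk runs in its own process, so the work lands on separate cores.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        parallel = sum(pool.map(sum_of_squares, chunks))
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"1 core: {t1 - t0:.2f}s, {workers} cores: {t2 - t1:.2f}s")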

Instead, the major manufacturers now focus their efforts on specialised chips designed to accelerate specific types of computation. The most common examples are graphics processing units (GPUs), which were originally created to perform many similar calculations in parallel in order to enable the blindingly fast graphics needed for computer games. They have since evolved into general-purpose parallel processors, widely used for data analysis and machine learning.
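
As a rough sketch of how such an accelerator is driven from ordinary code, the snippet below offloads a large matrix multiplication to a GPU via CuPy, a NumPy-compatible library; it assumes CuPy and a CUDA-capable GPU are installed, and falls back to reporting only the CPU timing otherwise.

import time
import numpy as np

n = 4000
a_cpu = np.random.rand(n, n).astype(np.float32)
b_cpu = np.random.rand(n, n).astype(np.float32)

t0 = time.perf_counter()
c_cpu = a_cpu @ b_cpu  # runs on the CPU's cores and vector units
t1 = time.perf_counter()
print(f"CPU matmul: {t1 - t0:.2f}s")

try:
    import cupy as cp  # assumes CuPy and a CUDA-capable GPU are available

    a_gpu = cp.asarray(a_cpu)   # copy the data into GPU memory
    b_gpu = cp.asarray(b_cpu)
    c_gpu = a_gpu @ b_gpu       # thousands of GPU cores work in parallel
    cp.cuda.Stream.null.synchronize()   # warm-up run; wait for the GPU to finish

    t2 = time.perf_counter()
    c_gpu = a_gpu @ b_gpu
    cp.cuda.Stream.null.synchronize()
    t3 = time.perf_counter()

    assert np.allclose(c_cpu, cp.asnumpy(c_gpu), rtol=1e-2)
    print(f"GPU matmul: {t3 - t2:.2f}s")
except ImportError:
    print("CuPy not installed; skipping the GPU comparison.")

The same pattern, moving a large batch of identical arithmetic onto hardware built for that shape of work, underlies most of today's machine-learning accelerators.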


