Multigig, a start-up in Scotts Valley, California, has developed a new approach that changes the way a microprocessor handles time, significantly increasing power efficiency.
RotaryWave™ technology can generate hundreds or even thousands of precise phases within each clock period, allowing electronic circuits to operate robustly with timing intervals as small as a picosecond.
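The timing resolution implied above is just the clock period divided by the number of phases. A quick sketch (the specific frequency and phase-count figures are illustrative assumptions, not Multigig specifications):

```python
# Illustrative arithmetic: the spacing between adjacent phases of a
# multiphase clock is the clock period divided by the phase count.

def phase_interval_s(clock_hz: float, n_phases: int) -> float:
    """Return the spacing between adjacent clock phases, in seconds."""
    period_s = 1.0 / clock_hz
    return period_s / n_phases

# A hypothetical 1 GHz clock split into 1000 phases gives 1 ps resolution.
interval = phase_interval_s(1e9, 1000)
print(f"{interval * 1e12:.1f} ps")  # -> 1.0 ps
```

This is how "thousands of phases" translates into picosecond-scale timing intervals even at ordinary clock frequencies.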
RotaryWave™ clock technology avoids dynamic power dissipation by recycling charge. Very large capacitive loads can therefore be driven directly by the clock without the usual CV²F power dissipation common to all other accepted methods.
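For scale, the conventional dynamic power being avoided is C·V²·f, the cost of fully charging and discharging the clock capacitance every cycle. A minimal sketch with hypothetical numbers (the capacitance, voltage, and frequency below are my own illustrative values, not figures from the article):

```python
# Conventional dynamic power of driving a capacitive load: P = C * V^2 * f.
# The rotary-clock claim is that this term is largely avoided because the
# energy circulates in the clock loop instead of being dumped each cycle.

def dynamic_power_w(c_farads: float, v_volts: float, f_hz: float) -> float:
    """Dissipation from fully charging/discharging C once per cycle."""
    return c_farads * v_volts**2 * f_hz

# Hypothetical example: a 1 nF clock network at 1.0 V switching at 3 GHz.
p = dynamic_power_w(1e-9, 1.0, 3e9)
print(f"{p:.1f} W")  # -> 3.0 W
```

Even these rough numbers show why the clock network is one of the largest power consumers on a chip, and why sidestepping the CV²F term matters.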
RotaryWave™ clocks self-synchronize with adjacent RotaryWave™ clocks to extreme precision through injection locking, so a large grid of RotaryWave™ clocks can be fully synchronized without any dedicated distribution mechanism. The RotaryClock thus becomes both the clock source and the distribution method.
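The self-synchronization behavior can be illustrated with a toy phase-coupling model (this is a generic injection-locking sketch of my own, not Multigig's circuit): two identical oscillators that weakly inject into each other see their phase difference decay toward zero.

```python
import math

# Toy model of injection locking (an assumption for illustration, not the
# actual RotaryWave circuit): two identical oscillators with mutual phase
# coupling of strength k. Their phase difference delta obeys
#   d(delta)/dt = -2 * k * sin(delta)
# which drives delta toward 0, i.e. the oscillators lock in phase.

def lock_phase_difference(delta0: float, k: float = 1.0,
                          dt: float = 0.001, steps: int = 10_000) -> float:
    """Euler-integrate the phase-difference dynamics; return final offset."""
    delta = delta0
    for _ in range(steps):
        delta -= 2.0 * k * math.sin(delta) * dt
    return delta

residual = lock_phase_difference(1.0)  # start one radian apart
print(f"{residual:.6f}")               # the offset shrinks toward 0
```

The same mechanism scaled across a grid is why no separate clock-distribution tree is needed: each loop pulls its neighbors into phase.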
This looks very promising. There’s a lot to be gained if you can increase accuracy while simultaneously reducing power consumption.
OK, but if my AMD tower cools down, what will keep my coffee warm?
Sounds neat if it works … going way over my head though … are they saying they can make a multicore CPU with a really fast clock speed that won’t destroy my power supply/batteries? If so, I want one.
Now let’s pray that this promising idea won’t be sunk because they got too greedy and charged an obscene amount for it. Too many great ideas have gone up in smoke because the average geek genius is good at everything but accounting.
#3, exactly, like Rambus did.
http://en.wikipedia.org/wiki/Rambus
> RDRAM was also two to three times the price of PC133 SDRAM due to
> a combination of high manufacturing costs and high license fees.
This could have significant implications in any technology where clocking and power consumption are important. A number of applications outside of processors could benefit from the ability to clock more efficiently. We will have to wait and see whether it gets traction in the market, but the potential is great.
#4
Yikes!
And I thought that IBM was greedy.
I got it, I think, but my head hurts.