Image processing 1,000 times faster is goal of new $5M contract
eecs.umich.edu

> 1,000 times faster with 10,000 times less power
Do you even need to read their research to call bullshit?
How many projects have ever achieved this?
This is DARPA money.
How many DARPA project mission statements have you read? They all sound like that.
Also, the more dollars and the more professors on the grant, the more hyperbole. Thus, a $5M DARPA contract has to promise more than 10x what a $500K DARPA contract promises.
I believe it's possible. Today's hardware chips are optimized around a weird mix of legacy and business concerns. If you actually take modern fabrication technology and throw it at an intelligently designed, single-purpose parallel architecture, and add in a novel algorithmic approach, I could see numbers like these being possible.
Your average software on modern hardware is barely able to take advantage of even 1% of the machine's full capability. You spend most of your time stalled on memory, waiting on cache misses. Even when the chip is being utilized well, it spends a massive portion of its power budget just distributing the clock signal across its billions of transistors. A parallel, asynchronous chip with dynamic power regulation could offer several orders of magnitude better performance than a stock CPU. For an existing real-world example, just look at how much a GPU outperforms a CPU on the parallel tasks it's built for.
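To make the memory-bound point concrete, here's a rough, machine-dependent sketch in plain NumPy (nothing to do with this group's hardware, and the exact numbers will vary): summing every 8th element of a large array does 8x less arithmetic, but it still drags the same cache lines in from DRAM, so the wall time barely improves.

    import time
    import numpy as np

    # ~512 MB of float64, far larger than any CPU cache
    a = np.random.rand(64_000_000)

    def bench(x, label):
        t0 = time.perf_counter()
        s = x.sum()
        print(f"{label}: {time.perf_counter() - t0:.3f} s (sum={s:.1f})")

    bench(a, "all elements")    # 64M additions, contiguous reads
    bench(a[::8], "every 8th")  # 8x fewer additions, but the 64-byte stride
                                # still touches every cache line, so you're
                                # mostly waiting on DRAM either way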
It also sounds like they are using compressed sensing, randomized (probabilistic) linear algebra, or other rank-reduction approaches. These can yield massive reductions in power and increases in throughput, because they fundamentally reduce the number of bits of state that must be managed and transformed.
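I have no idea which of those techniques (if any) this group actually uses, but here's a toy sketch of the rank-reduction flavor using a plain truncated SVD (not their algorithm): if an image is close to low rank, you can keep a small fraction of the numbers and still reconstruct it well, which means far fewer bits to move around and transform.

    import numpy as np

    rng = np.random.default_rng(0)
    m, n, k = 512, 512, 20

    # Synthetic "image": mostly low-rank structure plus a little noise
    img = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
    img += 0.01 * rng.standard_normal((m, n))

    # Rank-k approximation: store k*(m+n+1) numbers instead of m*n
    U, s, Vt = np.linalg.svd(img, full_matrices=False)
    approx = (U[:, :k] * s[:k]) @ Vt[:k, :]

    kept, total = k * (m + n + 1), m * n
    err = np.linalg.norm(img - approx) / np.linalg.norm(img)
    print(f"kept {kept}/{total} values ({kept/total:.1%}), relative error {err:.4f}")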
TL;DR I don't know a lick of this guy's research but I absolutely believe that current hardware and software have a TON of room for improvement, at least where parallel image processing and machine learning are concerned.
No, this team will never come close to achieving this (1,000x faster with 10,000x less power).
They could accomplish great things, and they could open the door to new techniques that might achieve this decades from now, but that's different.
Also, you said you could see several orders of magnitude of improvement, and even that is less than their claim: 1,000x faster at 10,000x less power works out to roughly 10^7x less energy per operation, i.e., seven orders of magnitude.
Also, you mention GPU vs. CPU, which as far as I'm aware doesn't come close to that standard either.
But seriously, while I'd agree a breakthrough of any magnitude is possible, I also think it's unhealthy to encourage people to make extremely unlikely claims just to get a decent research grant.