Nvidia Tegra3 launch imminent. Intel, you did this to yourself.
Reading about the likely launch of Tegra3 at Mobile World Congress 2011 and watching this video, one cannot help wondering how big a mistake Intel made when it denied Nvidia the Atom hardware interfaces some time ago. In doing so, it practically forced Nvidia to abandon mobile x86 solutions and pour all of its resources into Tegra/ARM development.
Nvidia has recently announced its Project Denver effort, which also shows how seriously the graphics company wants to transform into an all-out computer technology company, shipping not only graphics solutions but mobile, desktop and server processors as well.
As a result, Intel will have to face not only AMD in the desktop/server segment but a big-name ARM technologist as well. (And several smaller ones, like Nufront.)
Tegra3 is not well known yet, but some guesses can be made:
- Quad-core Cortex-A9 with symmetric multiprocessing for general application code execution
- Likely topping out at 1 GHz at least, possibly up to 1.5 GHz, with dynamic frequency scaling and individual core power-off
- GeForce 8- or 9-class graphics core, likely with high-profile 1080p playback and encoding
- Support for Linux and Android
- Possibly produced on a sub-40nm process (GlobalFoundries 28nm, anyone?)
If Nvidia can produce this on the GlobalFoundries 28nm process (or similar), we can be quite certain that the new SoC will still be viable for smartphones and will be an extremely appealing solution for tablets and Motorola Atrix-like phone/netbook/tablet modular devices.
It will make Moorestown Atoms a very, very hard sell for Intel in the mobile phone and tablet space: the computing-power advantage of Moorestown is gone, and Tegra3 will be much more power-efficient (being an all-out ARM solution). Android-centered OEMs will most likely go with ARM anyway, and if a big-name producer like Nvidia offers a powerful solution for their premium products, they will certainly pick that up instead of the Intel gear.
And this is only the mobile space. When Project Denver from Nvidia and players like Nufront start selling ARM-based server SoCs, Intel will have to fight a battle in the datacenter, which has been absolutely home turf so far.
All of this may not have happened at all (or would have happened years later, giving Moorestown a chance) if Intel had not chosen to deny Nvidia the hardware interfaces for building Ion2. Intel traded a very short-term gain in one segment for a huge threat and possible cut-throat competition in every computing segment.
Was it worth it, Intel?