Living Room Gaming On ARM: For Microsoft, Will The Third Time Be The Charm?

Speaking of video game consoles…two weeks ago at E3, Nintendo took the wraps off its next-generation Wii U, shown in prototype hardware-and-software form at the show and due to enter volume production sometime next year (my guess: autumn, in time for the all-important Christmas 2012 shopping season). It’s capable of high-definition video output, which isn’t even remotely a surprise considering the competitive necessity of at least matching its current-generation peers, the Microsoft Xbox 360 and Sony PlayStation 3.

More intriguing is the wireless controller, which conceptually combines the Wii Remote on the left, the Wii Nunchuk on the right, a resistive (not capacitive?) touch-capable LCD in between, and a webcam up top. Although the Wii U Controller looks like a tablet computer, it’s actually a lower-cost ‘dumb’ terminal acting as a destination display for video streamed from the console. And, as the graphic below shows:

the at-controller video presentation can be either a mirror of the information presented on the television or user-specific custom imagery. Regarding the console itself, partner press releases reveal that it’s constructed from IBM-sourced PowerPC CPU technology akin to that in the Watson supercomputer recently showcased on Jeopardy, coupled with ATI Technologies-now-AMD graphics. Again, this combination isn’t remotely a surprise; Nintendo tends to be the most conservative of the three tier-one console developers, in part because it also tends to produce the lowest-priced product offering of the lot.
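
To make that thin-client arrangement concrete, here’s a minimal receive-loop sketch in C. Nintendo hasn’t disclosed the actual console-to-controller link or protocol, so the POSIX UDP socket, port number, and packet handling below are purely hypothetical illustrations of the concept, not a description of the real design:

/* Conceptual sketch only: the controller does little more than receive
 * compressed frame data from the console and hand it to a local decoder
 * and display, keeping the controller-side silicon (and cost) minimal.
 * The port number and packet layout are hypothetical. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>

#define FRAME_PORT 5000          /* hypothetical stream port            */
#define MAX_PACKET 1500          /* one Ethernet-sized datagram         */

int main(void)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) { perror("socket"); return 1; }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof addr);
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port        = htons(FRAME_PORT);
    if (bind(sock, (struct sockaddr *)&addr, sizeof addr) < 0) {
        perror("bind"); return 1;
    }

    uint8_t packet[MAX_PACKET];
    for (;;) {
        /* Receive one slice of a compressed video frame from the console... */
        ssize_t n = recv(sock, packet, sizeof packet, 0);
        if (n <= 0) break;
        /* ...and pass it straight to the hardware decoder / LCD. No game
         * logic, no rendering: the console does the heavy lifting. */
        printf("received %zd-byte frame slice\n", n);
    }
    close(sock);
    return 0;
}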

Simplistically speaking, the Nintendo Wii was a Nintendo GameCube with a next-generation optical drive, integrated Bluetooth and Wi-Fi connectivity, and a motion-cognizant controller. Similarly, given that the Wii U is intended to be backwards-compatible with prior-generation titles, the most straightforward means of implementing this aspiration (versus, say, emulation or virtualization) is to leverage next-generation iterations of the same CPU and GPU families Nintendo used the last time(s) around. Post-E3 scuttlebutt claims that the Wii U is in aggregate 50% more powerful than either the Xbox 360 or PS3, and specifically that it embeds an RV770-class ATI/AMD graphics core, comparable to that in the Radeon HD 4890 (introduced in April 2009).
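
For a sense of why native compatibility is the path of least resistance, consider what the alternative entails. The toy fetch-decode-execute loop below (in C, with an invented four-opcode instruction set; it’s an illustration, not anyone’s actual emulator) shows the per-instruction software overhead a console would pay to emulate its predecessor’s CPU, overhead that simply disappears when legacy code runs natively on a next-generation member of the same PowerPC family:

/* Illustrative only: a toy fetch-decode-execute loop of the sort a software
 * emulator of a previous console's CPU would run for every legacy
 * instruction. The 4-opcode "ISA" below is invented for this example. */
#include <stdint.h>
#include <stdio.h>

enum { OP_HALT, OP_LOAD_IMM, OP_ADD, OP_PRINT };   /* toy opcodes */

static void emulate(const uint8_t *code)
{
    uint32_t regs[4] = {0};
    size_t pc = 0;
    for (;;) {
        uint8_t op = code[pc++];                    /* fetch   */
        switch (op) {                               /* decode  */
        case OP_LOAD_IMM: {                         /* execute */
            uint8_t r = code[pc++];
            regs[r] = code[pc++];
            break;
        }
        case OP_ADD: {
            uint8_t dst = code[pc++], src = code[pc++];
            regs[dst] += regs[src];
            break;
        }
        case OP_PRINT:
            printf("r%u = %u\n", (unsigned)code[pc], (unsigned)regs[code[pc]]);
            pc++;
            break;
        case OP_HALT:
        default:
            return;
        }
    }
}

int main(void)
{
    /* "Legacy binary": r0 = 2, r1 = 3, r0 += r1, print r0 (prints 5). */
    const uint8_t legacy[] = {
        OP_LOAD_IMM, 0, 2,
        OP_LOAD_IMM, 1, 3,
        OP_ADD, 0, 1,
        OP_PRINT, 0,
        OP_HALT
    };
    emulate(legacy);
    return 0;
}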

To the first rumor, I respond ‘it had better be’ (and ironically, as it turns out, Nintendo’s Wii U content showcase clips were actually rendered on competitors’ current-generation systems). Consider that the Xbox 360, also based on ATI/AMD graphics technology, was unveiled in late 2005 and had been in active development for several years prior; the PS3 came out in late 2006 after a similarly lengthy gestation period. And regarding the second rumor, I’m not at all concerned that the Wii U is not based on the latest-and-greatest high-end AMD graphics core.

Consider, for example, that AMD’s own Llano ‘Fusion’ APU, formally unveiled this week at the company’s inaugural Fusion Developer Summit, embeds a Radeon HD 5570-class graphics core but requires roughly half the chip die area to accomplish this objective. As with the Nintendo Wii, the Wii U’s graphics core will be similarly integrated in an SoC with other circuitry such as system interface logic and the embedded frame buffer. For consumer electronics devices, even those that hope to leverage the ‘razors and blades’ model to counterbalance unprofitable initial hardware sales with subsequent highly profitable content transactions, bill-of-materials cost is paramount. And don’t forget: the display target here is a 1920×1080-max resolution television set, not a higher-resolution computer monitor.
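
A quick back-of-envelope calculation (my own arithmetic, assuming a 60 Hz refresh and a 2560×1600 monitor as the comparison point) underscores that last point: a 1080p television asks the GPU to fill only about half as many pixels per frame as a high-end PC display:

/* Back-of-envelope arithmetic: why a 1080p television target relaxes the
 * GPU requirement relative to a high-resolution PC monitor. The 2560x1600
 * comparison resolution and 60 Hz refresh rate are assumptions. */
#include <stdio.h>

int main(void)
{
    const double fps            = 60.0;
    const double tv_pixels      = 1920.0 * 1080.0;   /* 1080p TV            */
    const double monitor_pixels = 2560.0 * 1600.0;   /* e.g. a 30" monitor  */

    printf("1080p TV:      %.1f Mpixels/frame, %.0f Mpixels/s at 60 Hz\n",
           tv_pixels / 1e6, tv_pixels * fps / 1e6);
    printf("2560x1600 LCD: %.1f Mpixels/frame, %.0f Mpixels/s at 60 Hz\n",
           monitor_pixels / 1e6, monitor_pixels * fps / 1e6);
    printf("ratio: %.1fx\n", monitor_pixels / tv_pixels);
    return 0;
}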

My biggest question right now regarding Nintendo’s living room situation is just how ugly the company’s fiscal fortunes will be over the next year-plus. Many of you are likely already familiar with the Osborne Effect, named after the 1983 pre-announcement of second-generation Osborne computer models, whose pending availability caused Osborne 1 sales to sharply tank. Second-generation system delays created a fiscal crevasse that the company was unable to cross, leading to its bankruptcy. On the one hand, I somewhat understand why Nintendo announced the Wii U now, given that the Wii’s market is already saturated and sales are consequently plummeting. Frankly, even with the Wii’s lower $149 price tag, I don’t think there’s sufficient remaining demand (especially given the Wii’s inability to also play optical disc-housed movies, and in spite of its Netflix streaming support) to guarantee it meaningful Christmas 2011 shopping season success.

Speaking of the PS3 and Xbox 360, their as-yet-unannounced next-generation successors are of far greater interest to me. Focusing on Sony first, consider the PS3. It launched with an untested CPU (the Cell) combined with untested optical storage (the Blu-ray disc). And not surprisingly (except, apparently, to Sony executives), the market ramp was slow to materialize, as developers struggled to unlock the difficult-to-program hardware’s potential and as consumers struggled to discern an image quality advantage over more cost-effective upscaled versions of the DVDs they already owned. Unsurprisingly, Sony announced in late May that the PS4 was in development, and that this time around the company would more heavily focus on ‘off-the-shelf’ building blocks. As such, I don’t anticipate that Sony will undertake another architecture switch akin to its prior MIPS-to-PowerPC migration; the PS4 will likely remain PowerPC-based and in fact will probably leverage a lithography-shrunk version of today’s Cell CPU.

Microsoft is more of a wild card. Like Sony, as mentioned in my prior post, it executed a generation-to-generation CPU architecture migration, from Intel x86 (and an Nvidia graphics-inclusive chipset) in the original Xbox to a multi-core PowerPC approach in the Xbox 360. And ordinarily, I’d prognosticate that Microsoft will stick with the Power architecture for its next Xbox series iteration, which unsurprisingly also seems to be well along from a development standpoint. But then there’s the nagging matter of the ARM architecture license agreement that Microsoft signed in July of last year. As I wrote in my early-April EDN cover story:

Conventional licensees implement predesigned cores in their SOC (system-on-chip) designs, a more straightforward path to bringing products to market, which conversely limits each licensee’s ability to differentiate its products from those of competitors. Architecture, or instruction-set, licensees, on the other hand, have more design flexibility but also incur incremental corresponding design challenges. Although they must, as their name implies, retain full ARM instruction-set backward compatibility, they can also build on that suite with proprietary instructions, as well as make other more fundamental circuit alterations and enhancements.
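
To illustrate the distinction (and to be clear, this is a hypothetical sketch of what an architecture license permits in general, not of anything Microsoft has actually built), here’s a C fragment in which the baseline code path uses only standard, portable instructions, while a vendor-specific build flag and invented intrinsic stand in for a proprietary instruction layered on top of the ARM ISA:

/* Hedged illustration of an architecture licensee's options, not of any
 * actual Microsoft design: the core must still run the portable baseline
 * path below, but the licensee could also expose a proprietary instruction
 * (modeled here as a hypothetical intrinsic behind an invented build flag)
 * for workloads it particularly cares about. */
#include <stdint.h>
#include <stdio.h>

static uint32_t saturating_add(uint32_t a, uint32_t b)
{
#if defined(HYPOTHETICAL_CUSTOM_ISA_EXT)
    /* Hypothetical proprietary instruction added on top of the ARM ISA;
     * both the flag and the intrinsic name are invented for this sketch. */
    return __vendor_qadd32(a, b);
#else
    /* Baseline path: plain, ISA-compatible instructions only. */
    uint32_t sum = a + b;
    return (sum < a) ? UINT32_MAX : sum;   /* clamp on overflow */
#endif
}

int main(void)
{
    printf("%u\n", saturating_add(0xFFFFFFF0u, 0x40u));  /* prints 4294967295 */
    return 0;
}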

Maybe, as some postulate, Microsoft shouldered the expense of an architecture license solely as a means of meaningfully influencing now-partner ARM’s architecture plans, for the benefit of Microsoft’s ARM-based Windows Phone and ARM-optional Windows 8 licensees (and therefore the benefit of Microsoft itself). Perhaps Microsoft is taking a page from competitor Apple’s book, developing A4- and A5-reminiscent ARM SoCs that it will employ in Microsoft-branded handsets, tablets and the like. And that ARM-nexus silicon may also be headed for Microsoft’s next-generation gaming-plus set-top box. The next year should be a fun one to watch.