In recent years, Nintendo has been known more for the eccentricity of its hardware than anything else. From the Wii’s motion controls to the Wii U’s dual-screen gameplay and the Switch’s portability, many analyses have focused on the physical design of Nintendo’s hardware while ignoring the systems’ most critical underpinning: their architectural design. While Sony and Microsoft have been locked in a fierce war over console specifications for the past three generations–pitting their machines against each other in an ever-evolving battle for supremacy–Nintendo has focused on the software.
That’s a shame. As the progenitor of the modern games industry and a talented hardware manufacturer, Nintendo deserves a closer look at the technology behind its consoles and what makes them unique. In the final part of this three-part series, we dive deep into Nintendo’s design choices for the unfortunate Wii U and the record-breaking Switch.
Nintendo Wii U: The Worst is Yet To Come
After Wii sales hit their zenith in fiscal year 2009, totaling almost 26 million units, they began declining precipitously. Soon, the need was clear for a system to replace Nintendo’s aging giant. Rumors began emerging that Nintendo was working on its next system, codenamed “Project Cafe,” for release in 2012. Names like “Wii 2” and “Nintendo Stream” began to float around, suggesting that Nintendo was aiming for an innovative system that pushed beyond the Wii’s casual focus.
When it was unveiled at E3 2011, the system’s name was finally announced: Wii U. An amalgamation of the Wii’s focus on casual gamers and a newly refined focus on hardcore gamers, the system received mixed responses from media and industry members alike. It seemed as strange and esoteric as the Wii before it, an attempt by Nintendo to innovate once again.
Behind that push for innovation was a new hardware design that moved Nintendo firmly into the HD era. Built around IBM’s Espresso CPU and AMD’s Latte GPU, the Wii U stood theoretically equal with the PS3 and Xbox 360, capable of outputting most games at between 720p and 1080p. However, the poor performance of the tri-core Espresso (itself an upgraded version of the Wii’s Broadway chip) held the Wii U back from truly surpassing the Xbox 360 and PS3 technically. Digital Foundry predicted that the Wii U would struggle to produce complex 3D titles at 1080p, an assertion that remained mostly true throughout the console’s lifespan.
While it struggled with many third-party ports and was a pain to develop for, the Wii U gave Nintendo the tools it needed to make the transition to HD development. Games like Super Smash Bros. for Wii U, Mario Kart 8, Super Mario 3D World, Bayonetta 2, and The Legend of Zelda: Breath of the Wild showed off Nintendo’s ability to wrest beautifully realized, stylistic graphics from its struggling console.
However, despite Nintendo’s best efforts, the Wii U struggled with sales out of the gate. By the middle of 2015, it was apparent that the system was on its way out when then-president of Nintendo, Satoru Iwata, announced plans for a successor system, the NX, while warning not to expect further details for at least a year. While Iwata sadly wouldn’t live to see the launch of the NX, he had laid the foundation for a strong successor system.
Nintendo Switch: What’s Past is Present
Throughout 2015 and 2016, speculation ran wild throughout Nintendo’s online community. What features would the NX launch with? Who would support it? Could Nintendo survive another generation without competing directly with Sony and Microsoft on power? What would power it?
Slowly, rumors began to emerge that Nintendo’s next system would be a hybrid system powered by Nvidia’s Tegra SoC (system on a chip) and would feature detachable controllers. Once again, Nintendo wouldn’t compete with Microsoft and Sony on raw power, but would focus on steady innovation within the console space. A myriad of fake prototypes, patent releases, and overblown hype, combined with the slow, inexorable demise of the Wii U, made the NX’s announcement seem as if it would never come.
And yet it did. In October of 2016, Nintendo unveiled the Nintendo Switch in a now-famous announcement trailer that demonstrated the Switch’s new hybrid capability and emphasized its capacity for playing anywhere. Soon thereafter, Eurogamer confirmed that the Switch would, indeed, be running on a variant of Nvidia’s Tegra X1, a 2015 chip built on the Maxwell GPU architecture. Doubling the Wii U’s RAM to 4GB and packing 256 CUDA cores alongside a quad-core ARM processor, the Switch was a noticeable, if not incredible, leap in performance over the Wii U.
The most impressive aspect of the Switch’s technical design, however, lay in how it handled switching (pun fully intended) between docked and handheld modes. To preserve battery life while maintaining acceptable performance, the system automatically lowers its GPU and memory clock speeds when undocked. Developers praised the ease of the transition, and of Switch development in general, noting that Nintendo had outdone itself in making the Switch its easiest console yet to develop for.
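To illustrate the idea, here is a minimal sketch of per-mode clock profiles. The structure is hypothetical (it is not Nintendo’s actual API), and the reference figures are the pre-launch numbers reported by Eurogamer in 2016, included purely for illustration.

```python
# Hypothetical sketch of docked/handheld clock profiles.
# Figures are the pre-launch values reported by Eurogamer, not official specs.
from dataclasses import dataclass

@dataclass(frozen=True)
class ClockProfile:
    cpu_mhz: int    # CPU clock (reported as unchanged between modes)
    gpu_mhz: float  # GPU clock (reported to drop sharply when undocked)
    mem_mhz: int    # memory clock (also reported to drop when undocked)

PROFILES = {
    "docked":   ClockProfile(cpu_mhz=1020, gpu_mhz=768.0, mem_mhz=1600),
    "handheld": ClockProfile(cpu_mhz=1020, gpu_mhz=307.2, mem_mhz=1331),
}

def select_profile(docked: bool) -> ClockProfile:
    """Pick the clock profile for the current mode."""
    return PROFILES["docked" if docked else "handheld"]
```

The key design point the sketch captures: only the GPU and memory scale down in handheld mode, so a game’s CPU-side logic runs identically in both modes and developers mainly tune resolution and effects.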
Nearly two and a half years after release, the tenacity of the Switch’s hardware continues to impress. From Doom and Wolfenstein II: The New Colossus to a seemingly impossible port of The Witcher 3, developers have used the system’s easy-to-understand, modern architecture to bring a myriad of series never before seen on a Nintendo console. However, with the next generation of consoles on the horizon and the Switch growing long in the tooth, how will Nintendo keep up as the visual gap widens?
That answer, like many, lies shrouded in mystery. Whatever the case may be, after the disaster that was the Wii U, the future looks bright for Nintendo and its Switch line of handheld/console hybrids.