Back around 2003, when AMD broke the 64-bit barrier, the virtues of 64-bit computing were heralded as everything just short of the second coming. Better-looking programs, more powerful computers, a whole new world. AMD spent a lot of time hyping its AMD64 platform, particularly its impact on gaming; the AMD64 logo showed up on many game boxes as the platform slowly drove out 32-bit hardware and operating systems.
For a brief moment in 2003-05, there was a flurry of game developers jumping on the bandwagon: 64-bit versions of UT2k4, Half-Life 2, Far Cry, and STALKER appeared, amongst a few others. Of them, Far Cry probably received the most attention, since it came first and had a shiny promo video, making good use of its already-stunning exotic island locales:
And that was about the end of that, save for a few johnny-come-lately outliers. Crysis's designers needed 64-bit support to build the game's engine, so out came a 64-bit version of that. And Hellgate: London had a 64-bit version up for its brief pseudo-mumorpuger life, in order to support more memory on its servers. These two programs switched for technical reasons, not to join the 64-bit gaming storm, which more or less died down once people stopped and realized the world wasn't going to convert overnight just to please hardcore gamers, not if it meant they couldn't use their six-year-old deskjet printer or scanner.
In actuality, 64-bit computing hasn't been the powerhouse AMD claimed it would be. Looking at the Far Cry video, it's pretty easy to see that the devs just added some shiny new effects and higher-res textures and called it a day, probably banking on the fact that anyone capable of running 64-bit games had a high-capacity gaming rig to begin with. And as this three-year-old Tom's Hardware survey shows, most of the 64-bit games weren't even optimized for 64-bit OSes: their frame rates were on par with or lower than those of their 32-bit versions, even for the much-vaunted Crysis.
In other words: there's a reason the 64-bit gaming hype has fallen by the wayside. The tech just isn't being utilized to its full potential; too many people were still using 32-bit operating systems and/or legacy hardware, and building for a separate architecture (or pushing exclusivity to the smaller 64-bit gamer niche) wasn't going to fly with marketing. So instead of native 64-bit games, these are all examples of 32-bit games patched, but not optimized, for 64-bit systems. (The exception being Crysis, which was a native 64-bit program optimized for 32-bit systems.)
On the plus side, the technological advance is needed, welcome, and will eventually pan out into major benefits. The biggest advantage of a 64-bit OS is that it can recognize more than 4GB of RAM, something a 32-bit OS can't; at this point in time, 4 gigs costs about $20–35, and many hardcore gamers are investing in 8GB (or more) because the price is so low. (A quick glance at the Steam Hardware Survey shows some 60% of its PC respondents have 4GB of RAM or more, and over 80% of Mac gamers.)
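That 4GB ceiling isn't arbitrary, by the way; it's just arithmetic. A 32-bit pointer can only name 2^32 distinct byte addresses, and 2^32 bytes is exactly 4 GiB. A quick Python sketch, using the standard struct module to also report the pointer width of whatever machine runs it:

```python
import struct

# A 32-bit pointer can address 2**32 distinct bytes: that is the 4 GB ceiling.
ADDRESSABLE_32BIT = 2 ** 32
print(ADDRESSABLE_32BIT // (1024 ** 3), "GiB")  # 4 GiB

# struct.calcsize("P") reports the pointer size of the running interpreter:
# 4 bytes on a 32-bit build, 8 bytes on a 64-bit one.
print(struct.calcsize("P") * 8, "-bit interpreter")
```

No matter how much physical RAM you bolt onto a 32-bit system, a single 32-bit address simply can't point past that 4GB line.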
32-bit gaming is reaching the end of its era; games are advancing fast and hard enough that the 2GB per-process memory cap baked into 32-bit OSes is causing problems. To get around that, Windows programs can be flagged as "Large Address Aware" to use the full 32-bit limit: 4GB on a 64-bit OS, or 3GB on 32-bit Windows with the /3GB boot switch. Most Large Address Aware programs either demanded a lot of system resources (Photoshop CS3, for example) or were games, like STALKER: Shadow of Chernobyl, Company of Heroes, and Enemy Territory: Quake Wars.
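For the curious, that flag is literally a single bit: IMAGE_FILE_LARGE_ADDRESS_AWARE (0x0020) in the Characteristics field of a Windows executable's COFF header, which is why the community "4GB patchers" for games are so tiny. Here's a minimal sketch in Python that reads the bit straight out of the PE headers; the function name is mine, but the offsets follow the PE/COFF format (e_lfanew at 0x3C, Characteristics 18 bytes into the COFF header):

```python
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def is_large_address_aware(path):
    """Return True if the Windows executable at `path` has the LAA flag set."""
    with open(path, "rb") as f:
        dos_header = f.read(0x40)
        if dos_header[:2] != b"MZ":
            raise ValueError("not an MZ/PE executable")
        # e_lfanew: file offset of the PE header, stored at 0x3C in the DOS header
        pe_offset = struct.unpack_from("<I", dos_header, 0x3C)[0]
        f.seek(pe_offset)
        if f.read(4) != b"PE\x00\x00":
            raise ValueError("PE signature not found")
        # COFF header layout: Machine(2) NumberOfSections(2) TimeDateStamp(4)
        # PointerToSymbolTable(4) NumberOfSymbols(4) SizeOfOptionalHeader(2)
        # Characteristics(2) -- so Characteristics sits 18 bytes in
        coff = f.read(20)
        characteristics = struct.unpack_from("<H", coff, 18)[0]
        return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

# Usage (hypothetical path): is_large_address_aware("Skyrim.exe")
```

Setting the flag is the same operation in reverse: OR 0x0020 into that one field and fix nothing else, which is exactly what Microsoft's /LARGEADDRESSAWARE linker option does at build time.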
Given that this is a problem we've known about since 2007, I'm still surprised that many AAA titles and high-end programs launch without being Large Address Aware today. Skyrim, for example, wasn't, so it crawled along using only 2GB of system memory. (Guess which was one of the first major mods out there; an official BethSoft 4GB patch came later.) Before that, people tweaked the .ini files for Oblivion and the Fallouts to accept larger quantities of RAM; Oblivion launched only recognizing 1GB, if memory serves. Then again, 2GB was pretty high-end back then.
In short, PC software is reaching the end of an era. The way things are going, 32-bit architecture is rapidly becoming obsolete, even with the Large Address Aware safety net. A high-end program, be it Adobe Creative Suite or BioShock Infinite, is going to run a hell of a lot smoother if it recognizes more RAM, more so if it's been optimized for it. Games like Skyrim are already pushing the limits of 32-bit technology. 64-bit-compatible operating systems, drivers, and software are becoming commonplace enough that the full-on shift will probably/possibly/hopefully start in the next decade. And then, finally, we'll be able to reap the real rewards of 64-bit gaming: more RAM means shinier visuals, higher frame rates, better performance.