Tech On Tips

Thursday, May 2, 2013

BioShock Infinite Performance, Benchmarked

Developed on a modified version of Unreal Engine 2.5 enhanced with Havok physics, the original BioShock blew us away when it launched back in August 2007. Our performance review at the time concluded that the title had "jaw-dropping visual effects" and that you'd need one of the finest graphics cards of the day if you intended to play at 1920x1200 -- or even 1600x1200, for that matter.

Given our impression of the first entry, we didn't hesitate to take BioShock 2 for a spin a couple of years later. However, as is often the case, the second title was less of a technical showpiece. It also used a modified build of Unreal Engine 2.5 and looked similar to its predecessor, with no major improvements. In turn, the game could be run at max quality at 1920x1200 on a relatively affordable graphics card.



With another three years having passed since BioShock 2 and the dawn of a new console generation on the horizon, BioShock Infinite has taken the opportunity to mix things up. Although it's still a first-person shooter published by 2K Games and contains similar concepts and themes, the third installment doesn't follow the same story, being set decades before the previous entries in a floating city called Columbia.

We won't be delving too deep into the gameplay side of things here, but critics seem to approve of the title's fresh perspective given its metascore of 95/100. Naturally, we're mostly interested in the graphics side of things today, and plenty has changed here, too. For starters, BioShock Infinite uses a DirectX 11-enabled, modified version of Unreal Engine 3, which gives hope of a quality PC port.

Along with DX11 effects, folks playing on PC can look forward to higher resolution textures and a healthy range of customization. Infinite comes with six graphical presets from "very low" to "ultra" that should hopefully cover a broad performance spectrum, not to mention individual control over settings like anti-aliasing, texture detail and filtering, dynamic shadows, post-processing, and so on.

As the cherry on top, the developer has fully embraced widescreen gaming with what it calls "horizontal plus" widescreen support, so the wider you go, the more you'll see of Columbia's gorgeous vistas. In that same vein, it should be noted that there's also multi-monitor support for AMD Eyefinity, Nvidia Surround and Matrox TripleHead2Go. Plenty to see for sure, and we're eager to dig in.

Our test comprises 24 DirectX 11 graphics card configurations from AMD and Nvidia covering a wide range of prices, from the affordable to the ultra-expensive. The latest drivers have been used, and every card has been paired with an Intel Core i7-3960X to remove CPU bottlenecks that could influence high-end GPU scores.

The developer has included a benchmark tool that works very well; we found it to be an accurate representation of the kind of performance you can expect when playing BioShock Infinite.



While the benchmark allows you to test all six quality presets, we decided to benchmark the Ultra preset with diffusion depth of field enabled. This is the maximum quality setting for BioShock Infinite, which we tested at 1680x1050, 1920x1200 and 2560x1600.

Because we tested just a single quality preset and the benchmark tool streamlined the process, we had time to include frame time performance as well. Using Fraps in conjunction with the benchmark tool, we measured in milliseconds the time it takes to render each frame individually. These results will be displayed in our "99th Percentile Frame Time" graphs.
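If you're curious how those numbers are produced, the sketch below shows one way to reduce a Fraps frame-time dump to a 99th percentile figure. It assumes the usual Fraps "frametimes" CSV layout (cumulative render times in milliseconds); the file name is made up, and this isn't necessarily the exact tooling behind our graphs.

```python
# Sketch: derive per-frame render times and a 99th percentile frame time
# from a Fraps "frametimes" CSV. Assumes the usual two-column layout
# ("Frame, Time (ms)" with cumulative timestamps); the file name is hypothetical.
import csv
import math

def frame_times_ms(path):
    """Per-frame render times (ms) = differences of the cumulative timestamps."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))[1:]                 # skip the header row
    stamps = [float(r[1]) for r in rows if len(r) >= 2]
    return [b - a for a, b in zip(stamps, stamps[1:])]

def percentile(values, pct):
    """Nearest-rank percentile; pct is between 0 and 100."""
    ordered = sorted(values)
    rank = max(1, math.ceil(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

times = frame_times_ms("bioshock_ultra_1920x1200_frametimes.csv")
print(f"99th percentile frame time: {percentile(times, 99):.1f} ms")
```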
Test system and graphics cards:
HIS Radeon HD 7970 GHz (3072MB)
HIS Radeon HD 7970 (3072MB)
HIS Radeon HD 7950 Boost (3072MB)
HIS Radeon HD 7950 (3072MB)
HIS Radeon HD 7870 (2048MB)
HIS Radeon HD 7850 (2048MB)
HIS Radeon HD 7770 (1024MB)
HIS Radeon HD 7750 (1024MB)
HIS Radeon HD 6970 (2048MB)
HIS Radeon HD 6870 (1024MB)
HIS Radeon HD 6850 (1024MB)
HIS Radeon HD 6790 (1024MB)
HIS Radeon HD 6770 (1024MB)
HIS Radeon HD 6750 (1024MB)
HIS Radeon HD 5870 (2048MB)
Gigabyte GeForce GTX Titan (6144MB)
Gigabyte GeForce GTX 680 (2048MB)
Gigabyte GeForce GTX 670 (2048MB)
Gainward GeForce GTX 660 Ti (2048MB)
Gigabyte GeForce GTX 660 (2048MB)
Gigabyte GeForce GTX 650 Ti (2048MB)
Gigabyte GeForce GTX 580 (1536MB)
Gigabyte GeForce GTX 560 Ti (1024MB)
Gigabyte GeForce GTX 560 (1024MB)
Gigabyte GeForce GTX 550 Ti (1024MB)
Gigabyte GeForce GTX 480 (1536MB)
Gigabyte GeForce GTX 460 (1024MB)
Intel Core i7-3960X Extreme Edition (3.30GHz)
4 x 4GB G.Skill DDR3-1600 (CAS 8-8-8-20)
Gigabyte G1.Assassin2 (Intel X79)
OCZ ZX Series 1250W
Crucial m4 512GB (SATA 6Gb/s)
Microsoft Windows 7 SP1 64-bit
Nvidia Forceware 314.22
AMD Catalyst 13.3 (Beta 3)

AMD Radeon HD 7790 Review

AMD spent the better part of 2012 releasing an entire line of 28nm GPUs, starting with the Radeon HD 7970 in January and followed by over half a dozen more cards throughout the next 8 months.

Late in the year we wrapped things up with our feature "The Best Graphics Cards: Nvidia vs. AMD Current-Gen Comparison," which saw Nvidia take out the $100 - $150 price bracket with the GeForce GTX 650 Ti, while AMD claimed the $150 - $200 range with the Radeon HD 7850.

As well-thought-out as the Radeon HD 7000 series was, we had rather hoped 2012 would mark both the beginning and the end of the series, much as 2011 did for the previous generation. That was not to be: three months into 2013, we find ourselves reviewing a brand new AMD graphics card that isn't based on a new architecture.

Rather, what we have is the latest member of the Southern Islands family, designed to fill the gap between the Radeon HD 7770 and 7850.

It's not the most exciting product release, and its performance will be a far cry from what we saw from the GeForce GTX Titan last month. That said, the new Radeon HD 7790 is likely to be of more interest than the GTX Titan to many of you for the simple reason that it's affordable and should be pretty good value as well.

The Radeon HD 7790 will be available in volume beginning April 2nd for as little as $150, which prices it smack bang between the 7770 and 7850. Current pricing has the Radeon HD 7770 at around $110-$120, while the 7850 costs between $180 and $200.

Last time we checked the GeForce GTX 650 Ti represented the best value in this bracket, but it looks like AMD is trying to win us over.

The Gigabyte Radeon HD 7790 we tested measured 19cm long, a typical length for a modern mid-range graphics card. Gigabyte's own version of the GTX 650 Ti measures 23cm long, though its actual PCB is considerably shorter at a mere 14.5cm. The new Radeon's GPU core runs at 1GHz, tying the highest frequency of any Radeon card and matching the 7770, 7870 and 7970 GHz Edition.

The HD 7790 is clocked 16% higher than the HD 7850, while its GDDR5 memory is also faster at 1500MHz (6.0GHz DDR). Still, pairing that frequency with a minuscule 128-bit memory bus gives the HD 7790 96GB/s of theoretical bandwidth, which is actually a lot less than the old HD 6790.
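As a quick sanity check on that figure, the theoretical bandwidth is simply the effective data rate multiplied by the bus width in bytes; the back-of-the-envelope calculation below (ours, not AMD's) reproduces the quoted 96GB/s.

```python
# 1500MHz GDDR5 transfers four bits per clock per pin, i.e. 6.0Gbps effective.
effective_gbps = 1.5 * 4                      # GDDR5 quad data rate
bus_width_bits = 128
print(effective_gbps * bus_width_bits / 8)    # -> 96.0 GB/s theoretical bandwidth
```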

Gigabyte has overclocked its 7790 from the stock 1000MHz to a core speed of 1075MHz. However, for the purposes of this review, we clocked the card back to AMD's default specification of 1GHz.

The HD 7790 comes loaded with a 1GB frame buffer, the same as previous-gen mid-range cards. We don't doubt that board partners will release 2GB versions, but because the HD 7790 isn't designed for extreme resolutions, 2GB models aren't likely to provide any performance boost.

The HD 7790's core configuration also differs from the HD 7770's. The new GPU carries 896 SPUs, 56 TAUs and 16 ROPs, versus 640 SPUs, 40 TAUs and 16 ROPs on the HD 7770. That's 40% more SPUs and TAUs, while the ROP count remains the same.

Gigabyte has chosen to cool the "Bonaire XT" GPU using its own custom design, which employs a massive 95mm fan. Under this fan is a relatively small aluminum heatsink measuring 11.5cm long, 9cm wide and 2cm tall at its thickest point. While that might sound like a decent-sized heatsink, by graphics card standards it is actually quite small.

The HD 7790 operates in near silence because even under load it draws only 85 watts, and as little as 3 watts at idle courtesy of AMD's ZeroCore Power technology.

To feed the card enough power, AMD has included a single 6-pin PCI Express power connector -- the same setup you'll find on the HD 7770, 7850 and GTX 650 Ti, as well as numerous other mid-range graphics cards.

Naturally, the HD 7790 supports Crossfire and so there are a pair of connectors for bridging two cards together. The only other connectors are on the I/O panel. The AMD reference version has a dual DL-DVI connector, a single HDMI 1.4a port and two Mini DisplayPort 1.2 sockets. The Gigabyte version is a little different as it employs a pair of DL-DVI connectors, a single HDMI 1.4a port and a standard DisplayPort socket.

Crysis 3 Review



You're sitting behind the wheel of a finely tuned luxury automobile. The upholstery creaks as you make yourself comfortable; it smells like quality in here. You haven't even turned the key and you can feel the car humming, its tightly-coiled energy waiting to be unleashed. This car isn't designed to make you feel romantic or poetic; it's designed to make you feel powerful.

You run your fingers over the dash. Near the edge, just above the glove compartment, a piece of the dashboard flicks up under your fingers. Huh, weird, how did that happen? It must've come unglued or something. You smooth it down and look at it. There, good as new. You twist the key in the ignition.

The car roars to life! It's throaty and strong! Wait, but did you feel it hitch? Nah, couldn't have been. Smell this leather! Cars that smell like this don't hitch. But... yeah... wait. You hear something, just beneath the rumble of the engine. A high-pitched keening sound, like metal wire spinning round an un-greased spool. You put the car into gear, and it chugs. It chugs? Oh yes, there was no mistaking that: That was not supposed to happen.

You're sitting behind the wheel of a finely tuned luxury automobile. But something's wrong.

That's what it's like to play Crysis 3.

Crysis 3, which comes out today on PC, Xbox 360 and PS3, is the third (well, technically fourth) in a series of first-person action games that mix stealthy sneaking with huge explosions, all draped across lush, exquisitely rendered environments. The result has historically been something a bit smarter and more open-ended than, say, Call of Duty or Medal of Honor.

The Crysis series isn't really known for its winning personality. The games don't get by on their stories, or their characters, or their lore. They're not even really all that widely regarded for their gameplay or design. They're known, first and foremost, for their sweet, sweet tech.

The first Crysis was released exclusively on PC in 2007 and almost instantly became the high-water mark to which all PC graphics were compared. It looked like a PC game from the future: eye-watering sunsets splashing across a shimmering ocean, tiny little frogs leaping through a carpet of jungle undergrowth. It was the game that PC gamers could lord over their console-owning brethren. Not only was it unavailable on Xbox 360 or PS3, it was commonly held that those platforms couldn't handle the game if they tried. (The irony here is that Crysis was eventually brought to the 360, albeit as a toned-down port.)

The game's developer, the German studio Crytek, has always seemed a bit less interested in making great games and more interested in using their CryEngine technology to make great-looking games.

That said, I've always had a soft spot for the series. I like both Crysis and Crysis 2 in equal measure, though for somewhat different reasons.

In Crysis games, you play as a man in a suit. Specifically, a "nanosuit" exoskeleton that looks like SCUBA gear combined with one of those frozen human musculatures you'll see on display at Body Worlds. The suit gives a distinct advantage in combat against mere mortals, as it allows players to switch between various powerful modes on the fly. There's a stealth mode that makes you invisible like a certain dreadlocked extra-terrestrial, and an armor mode that lets you suck up bullets. There's a speed mode that lets you run super fast and jump super high. You can breathe underwater, and just in case you didn't feel enough like The Predator already, you can activate a visor that allows you to see heat signatures.

The games, then, are entirely about using your suit's powers to stalk and kill dudes. Sometimes you hunt human dudes, and sometimes you hunt alien dudes. This has traditionally been a good amount of fun, because of one crucial balancing feature of the nanosuit: it runs out of energy rather quickly, and you can't stay invisible or bullet-proof for too long before you'll have to pause and recharge. Past Crysis games have always been at their best when players are set loose in moderately open outdoor or semi-outdoor areas, pitted against a bunch of enemies. It's in these scenarios that the games, particularly Crysis 2, start to feel something like the "thinking man's brainless shooter." You'll creep and strike, creep and strike, hiding, cloaking, attacking, hiding and recharging, before pouncing again.

But every time Crysis games get away from that core routine, things become significantly less enjoyable. The back half of the first game, which was set on a South Pacific island, featured giant flying squid-enemies that were a tenth as fun to fight as the overmatched but numerous North Korean soldiers from the opening chapters. The second game, which took place in an under-attack New York City, featured aliens that were more humanoid and a lot more fun to fight, but still not quite as enjoyable as the PMC soldiers of the opening and closing acts.

Crysis 3, unfortunately, spends most of its time lost in the weeds. There's plenty of hunting, but it's sporadic, and changes made to the formula combine with dodgy AI and odd level design to make the whole thing feel uncomfortable and ungainly.

In Crysis 3, you still wear the suit. Through some plot contrivances that don't really merit a detailed explanation, you are a guy named "Prophet," who is the same guy that everyone thought you were for the bulk of Crysis 2, when you were actually a guy named "Alcatraz," though at the very end of that game you actually somehow became Prophet anyway. (I know, right?) The story goes like this: It's twenty-some years after the events of Crysis 2, and Prophet has been frozen in stasis this whole time, kept under lock and key by a megalomaniacal megacorporation called Cell.

Prophet's old buddy Psycho, who was one of his squadmates in the first game (and was the star of the Warhead spin-off), turns up, older and fatter and conspicuously nanosuit-less, and wakes Prophet up. In the wake of the events of Crysis 2, New York has become a Cell-controlled, bio-domed jungle, loaded with wrecked, overgrown buildings. (It's lovely-looking.) There's wildlife and foliage everywhere. The aliens have been scattered to the wind, and Cell Corporation has gone full-on Lex Luthor -- they're trying to take over the world. Time to show them who's boss.

Sounds fine, right? A decent action-game setup. But right from the start, something seems hinky with Crysis 3. The first level takes place at night aboard a Navy cruiser, where Psycho escorts Prophet to freedom. I found myself surprised that I was spending the opening act doing what I've come to think of as the "First-Person Shooter Follow."

I'd follow Psycho to a door, wait for him to open the door, then go through and shoot some guys. Then I'd follow him some more. This kind of thing is de rigueur in a Call of Duty game, but in Crysis? At the very least, it set off some warning bells.

The whole introductory level took place at night, and I found myself fighting my way through small labs, then through bigger labs, then corridors. Nothing felt open, or empowering, or particularly fun. It certainly didn't feel like Crysis. That went on for the game's entire opening act, before the camera finally opened onto a sprawling, day-lit vista. (A screencap of this moment is a bit farther along in this review.) If you're anything like me, this is the point where you'll think, "Thank god, the actual game is starting."
Only it doesn't start. I had to follow Psycho some more.

After that, I was finally set loose in the urban jungle. Sweet! Oh, no, wait. I wasn't all that loose, actually, because there was a huge missile-launcher in the sky that would blow me up if I became uncloaked out of cover. So I did some tedious linear recon (no combat) for a couple minutes, and then finally, finally, I got to the first open area where there were some soldiers to fight. And... I defeated them handily, because I'd been given a futuristic bow that fires silent, instantly deadly and/or explosive-tipped arrows, and I could use it without uncloaking. (More on the bow later.)
I made mincemeat of those poor goons and then moved on... but not to another outdoor combat sequence! Nope, it was time to follow Psycho again, and then head underground and fight some guys in another dark, interior area. Some aliens turned up about 20 minutes later, and it just became more of a mess from there.
Crysis 3 Review

WHY: Lovely graphics aside, Crysis 3 is a mostly mediocre shooter in which fancy visuals faintly disguise haphazard design and a lack of technical polish.

Developer: Crytek
Platforms: PC (Reviewed), Xbox 360, PS3
Release Date: February 19
Type of game: Tactical first-person sci-fi shooter centered around a mixture of stealth and action.
What I played: Completed the single-player story in around 6-7 hours, replayed several hours' worth of levels on various difficulties. Played a couple hours of multiplayer and a couple hours of the Xbox 360 version. Replayed several chunks of Crysis 2 for comparison.

My Two Favorite Things
- When it's pretty, it's damned pretty. In terms of razor-sharp fidelity and near-photorealistic vistas, this is easily one of the best-looking games you can currently play.
- Multiplayer has a number of distinctive charms, particularly the fact that every player can become invisible.

My Two Least-Favorite Things
- The last chapter is a chore, the final boss is a mess, and the dénouement is laughable.
- Enemy AI just can't keep up with the new, bigger environments, and humans and aliens both behave too erratically to be much fun to fight.

Made-to-Order Back-of-Box Quotes
"I didn't realize my PC could actually physically break a sweat." -Kirk Hamilton, Kotaku.com
"Why would I ever use anything but this bow?" -Kirk Hamilton, Kotaku.com
"This is it: The mediocre game that screenshots will sell." -Kirk Hamilton, Kotaku.com
So that was more or less when I started thinking, hey, there might be something weird under the hood of this supposedly finely-tuned automobile.

Before I dig too much deeper into the design or the writing, let's back up and talk about the tech. That's why a lot of people play Crysis games, after all: They want to make their PC beg for mercy, they want to set their post-FX slider to "low" for the first time since buying that new graphics card. They want to play this game and think, "Yeah, but in three years, when I have a new PC, I'll play this again." Call it aspirational PC gaming. We want to taste the future, even if it gives us indigestion.

I'm running an Intel Core i5 at 2.8GHz with 8GB of RAM and a GeForce GTX 660 Ti graphics card. It may not be the hottest setup money can buy, but it's not too shabby, and it can run Crysis 2 with all the high-res-texture bells and whistles at a consistent 60 frames per second. It can also run pretty much every other PC game I have, from The Witcher 2 to heavily modded Skyrim, without a hitch.

My computer certainly choked on Crysis 3. I played a review build of the game that Crytek had put together last week, and the game's performance was erratic at best, with some combination of medium/low settings giving me a solid 60fps before dipping down to 30 or 25 in certain scenes. Only by dropping every setting to "Low," turning off antialiasing, and running medium-quality textures have I been able to get a consistent 60fps at 1920x1080 resolution. And even then, sometimes it'd drop.

I've been following this NeoGAF thread with interest, as players there have been trying all manner of high-end cards and are reporting similar performance dips. Almost no one seems to be able to get the game to run at maximum settings without taking a significant framerate hit. That said, this stuff is very difficult to nail down -- I installed Nvidia's newest drivers today and didn't really see a noticeable improvement, despite the fact that they're optimized specifically for Crysis 3. I'm still playing with textures on "medium" and all my other settings on "low." Then again, you may not care about framerate as much as I do. Responsiveness is key for me; I'd rather play an ugly game at a steady 60fps than a pretty one at 30. And it's worth reiterating that even on low settings, Crysis 3 looks very nice.

I like the idea of a future-ready PC game. And I don't doubt that in three or four years, people will buy this game on sale just so they can run it maxed out on their new 8GB GPUs or whatever, just like I did with Crysis in 2010. But at the same time, I have to say I find Crysis 3's under-performance to be a liiiittle bit of a bummer. The game isn't just demanding; it feels poorly optimized. The fact that it seems unable to maintain a consistent framerate unless I dial it all the way down, and even then has dips, makes me think it's just not that well-constructed or stable. It's likely that future updates and patches will iron this out and make the game more consistent, but for the time being, it's a real bucking bronco.

On a related note, the Xbox 360 version of Crysis 3 is a big step down from its PC big brother. I played an hour or so of the 360 version just to see how it compares, and the difference is remarkable. It's still plenty okay-looking for a console game, but it doesn't move all that well. It's too busy for the Xbox's native resolution, and the jaggies and low-res textures make everything look muddy. Not only is the game lower resolution and lacking any of the DirectX 11 particle-porn the PC version so regularly smears onto your screen, the Xbox version's framerate is quite sluggish, which makes it less pleasant to play.

All that said, yes: If your interest begins and ends with extremely high-res PC gaming, Crysis 3 will slake your thirst. And a part of me enjoys that Crytek struts out and throws down this crazy game that's less an entertainment product and more a gauntlet, daring PC gamers to throw their machines against it with reckless abandon. The studio has done a marvelous job positioning itself as the purveyor of a product most users can't yet properly run. It's hard not to admire their chutzpah. "This game is so awesome-looking that you can't even play it for another two years," they say. "But you know you're gonna buy it anyway, because you just want to see how you stack up."
In summary: It's totally playable as is, though it'd be nice if the damned thing worked a little bit better. And a further caveat on the graphics: While the game looks amazing in screenshots, it doesn't always look so hot in action, even on PC. Animations, especially facial animations, are stiff and waxy. The motion capture is odd, combat animations can be stilted, and characters regularly leave huge gaps of silence between lines of dialogue.

As an open-ended stealth/combat game, Crysis 3 falls well short of the standard so recently set by Far Cry 3. (For example: See that vista in the image above? You don't actually ever get to explore that in Crysis 3.) And as a transhumanist sci-fi adventure, it doesn't match the melodrama and romance of Halo 4 or the moral credibility of Deus Ex. But while those games' shadows stretch long over Crysis 3, the shadow that most thoroughly covers it, curiously, is that of its predecessor, Crysis 2.

I've always thought of Crysis 2 as an underrated game: it's a meaty, largely well-designed shooter that's polished, atmospheric, and gives players a ton of excellent opportunities to creatively blow shit up. It's also superior to Crysis 3 in almost every way. Crysis 2 feels like an ambitious game made by developers who were unafraid to take their time and get things right. Crysis 3 feels like it was hurried out the door, almost as though Crytek was clearing out old business before re-focusing on free-to-play games.

The differences between the two games are apparent from the very start: Crysis 2 almost immediately set you loose in open-air, outdoor environments filled with soldiers. Crysis 3 makes you follow a guy for an hour or so, putting you either in closed rooms or semi-open, darkened areas filled with enemies on high scaffolding who you can't see but who can see you. The new game is also significantly shorter and less narratively ambitious: Crysis 3 plays out over seven chapters, while Crysis 2 featured nineteen. There are smaller differences, too, like the fact that for some reason Crysis 3 has stripped out Crysis 2's interesting and functional first-person cover mechanic.

To make sure I wasn't imagining things, this past weekend I loaded up Crysis 2 and started dropping the needle on random single-player missions. At every turn, I found a superior game. One minute I'd be fighting aliens in a fraught showdown in the middle of Grand Central Station, the next I'd be helping marines topple a skyscraper in order to block alien mortar fire. Or I'd be holding a room against onrushing soldiers rappelling from the skylights while simultaneously fending off an attack helicopter. Or embarking on a deeply satisfying stealth-assault on an enemy base on Roosevelt Island, a sequence that was so fun that I became engrossed and played it for the better part of an hour before remembering that I had to go back to Crysis 3.

The harder I look, the more Crysis 3's deficiencies pile up. It's a very short game, but not a particularly focused one. I played through the single-player story in around 6-7 hours, give or take, and couldn't believe the story was moving as quickly as it was. There are only three characters in the game other than Prophet, and one of them gets about five minutes of total screen time. It's only daytime for two of the game's seven chapters (and remember, by way of comparison, that Crysis 2 had nineteen chapters). The rest of the game takes place underground, in a haze, or at night.

Only one chapter, a nighttime jaunt through the flooded ruins of Chinatown, comes close to consistently capturing the type of sneaky, hunt-y encounters that were so fun in Crysis 2. It's enjoyable while it lasts, but even then feels short-lived. Before long I was behind the wheel of a tank for a stunted vehicle segment, or in the gunner's seat of an airship for a frustrating turret sequence. The game just never settles into a groove, and as a result feels hurried and off-balance.

Here's another unexpected problem: Prophet's bow is overpowered. It's basically a swiss-army-knife weapon that can double as a rocket launcher and can take down any enemy in the game. And, like I mentioned earlier, it's silent and allows you to fire while invisible. There's no need for stealth melee-kills or even silenced weapons, because you can just whip out your bow and waste anything that moves. Crysis has always relied on a careful balance between the suit's energy timer and the enemy's superior numbers. A powerful new element like the bow throws the scales out of whack.

For an example of that imbalance, picture this scenario: First, I tag the enemies using my visor. Then, cloaked, I creep across the rooftops. I change the draw-weight to make my bow super-powerful, then pick them off one by one. It's not just that the bow is overpowered and lets me attack while invisible; the enemy AI simply doesn't respond to the fact that their friends are dying right before their eyes.

That kind of thing happens a lot. Bugs popped up throughout my playthrough, from the weird AI to numerous graphical and audio issues. Enemies froze in place, a guard I tagged somehow fell upwards into outer space, and I was able to clip right through vent covers.

Yes, these examples are all little things. Some of those bugs will likely be patched out of the game. But we're talking about a game that has been pitched as this amazing-looking godsend, a beacon of incredible future-tech. A sign of things to come. So I can't help but be disappointed that it so consistently lacks technical polish. Despite its screenshot-ready visuals, there are plenty of current-gen games that exhibit far stronger technical execution than Crysis 3, with the added benefit of actually running consistently on modern computers.

Crysis 3's level design often feels overly narrow, but a couple of times it also feels too big. It's a cop-out of me to keep saying that "something feels off," but that's the best way to encapsulate the design of the game -- almost every level just feels a bit off. Disorienting, difficult to navigate, with the open areas feeling too open and the enclosed areas feeling claustrophobic. One later level in particular is very large, but feels too large, and as a result seems somewhat empty. You're given access to a few vehicles, but the level is also dotted with deep pools of water that will swallow those vehicles whole.
Enemy AI seems incapable of coordinating over great distances, and often I'd see an enemy stand still in my sniper-sights, unable to do much of anything except perform an endless loop of ducking into cover, sticking his head out, then ducking back. One late-game side-mission tasked me with rescuing some guys in a tank. I came in expecting to fight off attackers and found them simply waiting for me. They drove off in their tank and invited me to take the gunner's seat. They then proceeded to drive out about fifty yards into the open, and sit there motionless while the enemy blew them apart.

Crysis 3's story and dialogue are as undercooked as the rest of the game. Enemy guards all seem to have gone to the Splinter Cell school of bad enemy dialogue, regularly yelling stuff like "He's hunting us!" and "He's using arrows!" and "You think this is hide-and-seek? Show yourself!" At one point I shot a lone guard with an arrow, only to hear one of his compatriots in another room holler "He's using a bow!"

Someone at Crytek seems to have heard complaints about the past games' relative lack of personality, and the writers have attempted a last-minute emotion-injection. This attempt, while doubtless well-intentioned, was not successful. In contrast to the second game, the protagonist speaks and emotes, but it's never convincing. The script attempts to lay out a meaningful theme about sacrifice that never actually coalesces into anything or connects with the events of the story. The writers appear to be under the impression that the theme will become meaningful through repetition alone. I didn't care about any of the characters in past Crysis games, and this attempt to make me suddenly give a damn about their sacrifices feels like a band-aid on a corpse.

Psycho, the freedom-fighter who accompanies you for most of the story, is a dud of a character. Before I played, I was happy to hear that he'd be featured. Now that I've played it, I find myself asking: Was Psycho ever really anything more than a Cockney accent masquerading as a personality? I guess not.

The overarching story, which concerns a reborn alien leader and a wormhole-invasion straight out of a made-for-TV adaptation of Mass Effect 3, is nonsense even by sci-fi video game standards. What drama there is takes place elsewhere; you just hear it over your radio. The dialogue is a dispiriting collection of clichés that includes such stinkers as "We're all human, Psycho! Nomad, Jester... We all fought. Not the god damn nanosuits!"

At one point, a character cries out, "It was never just about the suit!" I always thought it was about the suit. I sort of liked that. It kept things simple. I think it should've stayed about the suit.
Here's a short list of further disappointments:
- Collectable audio diaries that must be listened to in the pause menu, but not while playing. They never shed any light on where you are, who the speaker was, or what's going on.
- A weird attempt at painting the Cell corporation as a cheerily evil corporate entity that feels inspired by Portal, of all things.
- A poorly designed final boss-fight that ditches all of the game's strengths and pits you against a confusing enemy.
- Waypoints and objectives that feel unclear, leaving you wandering around a large, empty environment for minutes on end looking for a path forward.
- A hacking minigame that feels tacked-on and annoying.
- A lackluster map that's hidden beneath one layer of the menu, and a mini-map that is mostly impenetrable.
- Grenades that are as liable to bounce off a doorframe and land at your feet as they are to land near your target.
- Incredibly vigilant enemies that are able to spot you uncloaked at two hundred yards, even if you're crouched in the shadows.
Multiplayer is a welcome bright spot. Broadly speaking, it's a sort of slick merger of the twitchy iron-sights of Call of Duty and the heavily armored mega-jumping of Halo. In my limited pre-release multiplayer sessions, I was surprised at just how much fun I was having. Multiplayer matches follow the typical templates for these sorts of games -- there's deathmatch, team deathmatch, exfiltration and point-capture. What makes it really pop off is the fact that everyone has a nanosuit that can become invisible or armor-tough. It's impressive just how much goofy fun a multiplayer game can become when everyone has the ability to become invisible for brief periods of time.

Crysis 3's new multiplayer mode is called "Hunter Mode," and I had a good time with it as well. You either play as a cloaked, nanosuit-wearing "hunter" or a lowly Cell guard. If you're a hunter, it's your job to kill all the guards. If you're a guard, it's your job to stay alive for a set amount of time. If you get killed, you respawn as a hunter, so the last surviving guard winds up having to outwit a whole lot of hunters. I was surprised to find that the most tense, enjoyable moments of my multiplayer session with Crysis 3 involved me crouching in a corner, hoping no one found me before the clock ran out.

It was an odd thrill, more like playing hide-and-seek than any more familiar first-person shooter multiplayer mode. That video may seem like the least exciting multiplayer video ever -- it's just a guy crouching by a wall! But it was actually more exciting in a way, because it felt so new. I'm not sure I'd play Hunter Mode for more than an afternoon or two, but it's a neat idea, and it's nice to see more games experimenting with asymmetrical competitive multiplayer.

There are other bright spots: You can still pop a different scope, attachment, or silencer onto your weapon on the fly. The power-jump still has that satisfying "sproinggg!" feeling. There are still moments of badassery, when you'll creep on a guy and take him down, then creep away just before his friend comes around the corner. Oddly, the aliens are now more fun to fight than the humans, but they can indeed be pretty fun to fight. And of course, when Crysis 3 is pretty, it really is quite pretty.

But still, so much of Crysis 3 falls well short of the bar Crytek themselves set with Crysis and Crysis 2. The game's publisher, EA, has assured me that Crysis 3 will be receiving a day-one patch, but I can't imagine it will do too much to change the game from what I played. As I said, it's likely that over the weeks and months to come, Crytek will optimize the PC version to get consistent performance on a wider range of machines. But while those sorts of patches may address some of the more cosmetic bugs I ran into, it seems unlikely that they'll address the game's haphazard level design, poor AI, odd pacing, clumsy script and unbalanced combat.

Despite this laundry list of shortcomings, Crysis 3 still contains flashes of that delightful predatory thrill that makes Crysis games so fun. But they're too infrequent, hidden within a game where fancy tech disguises conservative, uninteresting design. The more I think about and play Crysis 3, the more frustrated I become. Crysis 2 managed to get an admirable number of things right. I would have loved to see the third game build upon that foundation and close the series out with style.

Instead, Crysis 3 is a finely tuned luxury automobile that's not, as it turns out, all that finely tuned. You sit, revving the engine, hoping that weird sound will go away, but it doesn't. It gets louder. You lower the driver's-side window; it gets stuck halfway. You pull down the sun visor; it comes off in your hand.
Perplexed, you turn the visor over and examine the underside, wondering if it's supposed to come off. Maybe this is a feature? You look up, pause, sniff. Sniff again to confirm. Yep. Beneath the rich smell of the upholstery is the smell of something else. Something less pleasant.

And you stare at the wheel for a couple of moments, and you make peace with the fact that despite its lustrous exterior, this really just isn't a nice car after all.

Republished with permission.
Kirk Hamilton is a contributing editor at Kotaku.

Tuesday, April 30, 2013

Crysis 3 Performance Test: Graphics & CPU

Built with CryEngine 2, the original Crysis raised the bar for PC gaming graphics in 2007 with stunningly detailed visuals that crippled even the fastest rigs. Looking back at our first Crysis performance article, which was based on the game's demo, the fastest GPU available at the time (the GeForce 8800 GTX 768MB) struggled to average 30fps when running at 1920x1200 with high quality settings in DirectX 10.

Given how punishing the first game was, we were excited to explore 2011's CryEngine 3-based Crysis 2, but it quickly became apparent that the second installment wouldn't be a repeat performance. Not to say it didn't look better, but relative to Crytek's first title, the sequel didn't really set any new benchmarks. It was just another computer game that made great use of DX9, though DX11 support was eventually patched in.

Fast forward two years and Crytek has given us another opportunity to hammer some hardware with the arrival of Crysis 3 this month. Like the second title, the third installment has been built with CryEngine 3, though that doesn't mean you should expect a lousy set of PC features, as the engine has been updated with improved dynamic cloth and vegetation, better lighting and shadows, and plenty more.
Crysis 3 benchmarks

Plus, PC gamers won't have to wait for graphical extras. Crysis 3 launched with high resolution textures, DX11 support and plenty of customization options that set it apart from the diluted console builds. The result looks incredible, and we get the feeling this will prove to be the game that folks who are heavily invested in multi-GPU setups have been waiting for. Here's hoping we aren't woefully disappointed.

We'll be testing 18 DirectX 11 graphics card configurations from AMD and Nvidia, which is considerably fewer than the 29 we tested for Far Cry 3, because even with the medium quality preset activated there are almost no low-end graphics cards that can play Crysis 3, even at 1680x1050.

The latest drivers will be used, and every card will be paired with an Intel Core i7-3960X to remove the CPU bottlenecks that could influence the high-end GPU scores.

We're using Fraps to measure frame rates during 90 seconds of gameplay from Crysis 3's first level, "Post Human". The test starts as soon as Michael "Psycho" Sykes hands you his backup weapon; we then simply follow the party leader until the time runs out.
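For those wondering how the raw Fraps output turns into the numbers in our graphs, here's a rough, illustrative sketch that reduces a frame-time dump to average and worst-case FPS for a run like this. It assumes Fraps' cumulative "frametimes" CSV format; the file name is hypothetical and this isn't our exact benchmarking script.

```python
# Sketch: average FPS and worst single-frame FPS for a ~90-second Fraps capture.
import csv

def load_frame_times(path):
    """Per-frame render times (ms) from a Fraps frametimes CSV."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))[1:]                 # skip the header row
    stamps = [float(r[1]) for r in rows if len(r) >= 2]
    return [b - a for a, b in zip(stamps, stamps[1:])]

times = load_frame_times("crysis3_posthuman_veryhigh_frametimes.csv")
total_seconds = sum(times) / 1000.0
print(f"captured {total_seconds:.0f} s of gameplay")
print(f"average: {len(times) / total_seconds:.1f} fps")
print(f"worst single frame: {1000.0 / max(times):.1f} fps")
```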
Crysis 3 benchmarks

We'll test Crysis 3 at three common desktop display resolutions: 1680x1050, 1920x1200 and 2560x1600, using DX11 mode. For the very high quality test, we'll set the overall quality in the menu to "very high" while also setting the SMAA level to 1x (low). The high and medium quality tests will also be conducted with SMAA 1x enabled.
Test system and graphics cards:
Gigabyte Radeon HD 7970 GHz Edition (3072MB)
Gigabyte Radeon HD 7970 (3072MB)
Gigabyte Radeon HD 7950 Boost (3072MB)
Gigabyte Radeon HD 7950 (3072MB)
AMD Radeon HD 7870 (2048MB)
AMD Radeon HD 7850 (2048MB)
HIS Radeon HD 7770 (1024MB)
HIS Radeon HD 6970 (2048MB)
Gigabyte GeForce GTX 680 (4096MB)
Gigabyte GeForce GTX 680 (2048MB)
Gigabyte GeForce GTX 670 (2048MB)
Gainward GeForce GTX 660 Ti (2048MB)
Gigabyte GeForce GTX 660 (2048MB)
Gigabyte GeForce GTX 650 Ti (2048MB)
Gigabyte GeForce GTX 580 (1536MB)
Gigabyte GeForce GTX 560 Ti (1024MB)
Nvidia GeForce GTX 480 (1536MB)
Intel Core i7-3960X Extreme Edition (3.30GHz)
4 x 4GB G.Skill DDR3-1600 (CAS 8-8-8-20)
Gigabyte G1.Assassin2 (Intel X79)
OCZ ZX Series 1250W
Crucial m4 512GB (SATA 6Gb/s)
Microsoft Windows 7 SP1 64-bit
Nvidia Forceware 314.07
AMD Catalyst 13.2 (Beta 6)

BioShock Infinite Review



Depending on where in the country you live, you might have the nagging feeling that portions of the United States have broken clean off. Just up and decided to veer into their own orbits, consequences be damned.

And, because any notion that these States aren't United is a really uncomfortable one, a cacophony of voices all jostle to demonize their opposing factions. They say that the other side has selfishly detached themselves from the reality that we Americans are all supposed to share. How dare they?!

BioShock Infinite takes that nagging feeling of disunion and makes players wade through its century-ago antecedent, in a way that lays bare the agonizing personal costs paid to the grinding cycle of history.

Columbia is a chunk of America that has in fact gone to pieces. And the result is horrifying. But beautiful, too.


You haven't been to a place like this before. The fictional floating city where Infinite is set is all clockwork platforms and brass gears, its many sections populated with hucksters, strivers, lovers and schoolchildren. One minute, you're walking past a sheer drop, the next a park swings down into the open space. Sure, they seceded from the Union but it's such a bloom-lit paradise that you almost can't blame them.
BioShock Infinite Review

WHY: BioShock Infinite isn't just a worthy sequel to a much-loved predecessor. It also manages to be about America -- touching on its past, present and possibilities -- in a way that makes it a must.

Developer: Irrational Games
Platforms: PC, PS3, Xbox 360 (version played)
Release Date: March 26
Type of game: First-person shooter that's also a twisted fable about American history.
What I played: Played through all of the campaign in about 10-12 hours.

My Two Favorite Things
- Excellent voicework all around -- anchored by Troy Baker and Courtnee Draper as Booker and Elizabeth -- makes this feel like an operatically violent radio play.
- Columbia is filled with gorgeous architecture and design. Fighting for survival in a place this pretty doesn't make the bullets hurt any less but does inspire awe.

Two Things I Hated
- Despite how its wide-open spaces look, parts of Infinite can feel very much like they're on rails.
- Certain sections are just straight-up killboxes that will grind you up like hamburger.

Made-to-Order Back-of-Box Quotes
"Add Columbia to the list of video game places I wish actually existed, only without all the racism/sexism/spacetime tomfoolery." -Evan Narcisse, Kotaku.com
"Like Portal 2, BioShock Infinite is a sequel that builds on and maybe even surpasses the original game." -Evan Narcisse, Kotaku.com
Then you come along. Well, not you. Booker DeWitt--the former soldier and Pinkerton agent players control -- isn't a cipher meant for you to occupy, like the mute protagonist from BioShock 1. He's his own man with a voice, a checkered past and reasons for staging a one-man invasion. Debt weighs heavy on his soul and the only way he can come clear of it is to fetch a supernaturally powered young woman named Elizabeth. If he gets her to the people who want her, then he might be able to get on with the rest of his life.

If you subscribe to the idea that there is in fact a formula for making a BioShock game, then Infinite will only support your thesis. Mix up sci-fi archetypes, comic-book super-science, ideologically driven conflict and old-school first-person-shooter love with narrative ambition and philosophical discord. The player character's special abilities get wielded through the left hand while weapons get gripped by the right and he must wend his way through an isolated city-state in turmoil. Once you do that, you have--in broad strokes -- the component parts of the games that have been called BioShock.

The powers you wield this time are called Vigors. You can mix and match them so that you can electrify a flock of crows after flinging them at an enemy. Or you can hold the soldiers you're fighting aloft and then set them on fire with telekinetic and pyrokinetic Vigors.

Important moments of choice have been another hallmark of BioShock games. This time out, the importance of choice isn't in where you wind up plot-wise. It's in how you play. The method in which you cobble together the upgrades you find with Elizabeth's combat support and the amazing verticality of the game's battlegrounds will leave you with a unique experience that you can transform as you go. Couple that with the various firearms and Vigors you'll collect and Infinite's play feels like it gives you more tools and a faster pace to use in an expertly crafted playground.

For all that's familiar, Irrational Games' new release does add new seasoning to that BioShock recipe. One of the big changes is in basic locomotion. Columbia's mass transit is a series of snaking pipeworks called Skylines and they provide a thrilling, vertiginous way to get around the city. They feel like a one-man roller coaster that you can shoot at people from. Aside from that, you can pounce on enemies from way on high or rain down gunshots while zipping along. And enemies will do the same to you, so these aren't an easy way out of most battles.

But it's the character of Elizabeth who represents the biggest change to the BioShock formula, which up until now gave you scant companionship on your adventures in Rapture. At first, Elizabeth might remind you in a broad way of the dog from Fable II. That pooch found you loot and helped you get around the world of Albion. You formed a simple but meaningful bond with it.

Elizabeth is far more complex. She's a fully scripted persona who aids you in combat and in scavenging, by finding and supplying health, money and ammo. Most impressively, she can manipulate tears, which are space-time hiccups that let her pull things from alternate reality through to this world. Discount vending stations, machine-gun turrets and grapple points are just a few of the assets she can summon for you. Which tears you have her manifest will affect the strategy options you have during a firefight and this branching opens up the uniqueness of the strategies available.


From an emotional perspective, things change immediately when you meet Elizabeth. She's naïve, but with strong streaks of curiosity and desperation running through her. A skybound city doesn't feel like paradise when it's all you've ever known, and she yearns to experience the world below. Columbia founder Father Comstock is a religious zealot, one who commands a city of totally obedient martyrs. When he tells them not to fight, it's far creepier than when you're battling them. He means to use Elizabeth's abilities to deliver an apocalyptic judgment to the America beneath him. But Comstock must also deal with a proletarian insurgency by the Vox Populi, who want to topple what they see as a corrupt oligarchy.


Elizabeth alternately wants to impress Booker and run away from him. They need each other and she never feels like a stack of AI scripts walking alongside you. When she throws you a health pack in a firefight, her need for you to survive is palpable. She's haunted by a lack of a past while Booker is chased by a history too full of blood. Together, their shared journey moves from wariness to warmth to resolution with real poignancy.

For all the talk of parts, this game is more than just the sum of its pieces. You're playing for story here, and that story is embedded through the entire fabric of Infinite. The more you explore Columbia, the more its made-up citizens and history pull you in. There's a mystery swirling around the clouds that surround the city and it kept me guessing until the very end of the game.

Early on, you get signs that something more than mere isolationism is amiss in Columbia. Those tears in reality's fabric are a tease to the main conceit of the game, with the gambit being nothing less than the re-writing of American history. Columbia's already well down that road as its spiritual revisionism has made demigods of George Washington, Thomas Jefferson and Ben Franklin. But you can't shake the creeping sense that many things are going to happen as a result of your actions. They do and they're all pretty weighty.


Given the fact that there's inevitable conflict waiting to be joined, you might think that a repudiation of American self-aggrandizement is all there is to BioShock Infinite. The uppity sky-dwellers in Columbia need to be taken down a peg, right? But what's more surprising than the rude awakenings is the degree to which Infinite is a celebration of Americana. It's a game squeezed out of Norman Rockwell paintings, set to ragtime music and filled to the brim with jaunty bygone slang. It zips and zings, even while it's beating you down with giant robot president enemies.

And, yes, Irrational creative director Ken Levine and crew are lobbing a slew of scholarly -isms for players to chew through: racism, sexism, anti-intellectualism, 18th century revivalism and the gospel of industrialism as a cradle-to-grave caretaker of the worker. The tribalism that's inextricably part of America's spiritual DNA is a big part of the game's factions and battlefields, too. The Vox Populi--made of common-man laborers--think they have too little while the well-to-do Founders essentially believe that Comstock's vision of America is a better one than the one lived on solid ground.



If you're acquainted with the language of revolution and regime change, then lots of the rhetoric slung across the conflicts in Infinite will ring familiar to you. Opponents from different classes and backgrounds slander each other. Divine/universal logic is on our side. That kind of thing. The difference is that Infinite places players in the fires of tumult and shows them the result of bloody revolts up close. Most of the people you overhear in Infinite are racist, classist, snooty and surly. Yet you feel bad for them as some of the illusions keeping Columbia aloft begin to crumble. It's a hell of a thing to believe in a dream with all your being, for both good and bad reasons.

BioShock Infinite may not be the first game to try to say something about the very nature of the country it was made in -- and the people who make it up -- but it's certainly amongst the best. Some scenes reminded me of how people who looked like me had an unbelievable array of prejudicial forces from public and private institutions set against them. Yet, even as I played through those moments, I was reminded that America is a big experiment. That experiment in letting people chart their own destinies has sometimes made it so brother fights against brother.

It's easy to dismiss those people floating in the fractured mirror Americas that we disagree with. They're wrong; we're right. Who cares why they are the way they are? But BioShock Infinite asks us to consider that very question and gives an answer that mixes hope with bitterness, wonder with despair and allegory with history. The game doesn't offer any advice about how to make everyone get along better but it makes a powerful argument for owning-- and owning up to--all of our collective past.

Gigabyte GeForce GTX Titan Review

Nvidia's Kepler architecture debuted a year ago with the GeForce GTX 680, which has sat fairly comfortably as the market's favorite single-GPU graphics card, forcing AMD to cut prices and launch a special HD 7970 GHz Edition card to bridge the value gap. Despite its beastly competitor, many believed Nvidia had intended to make its 600-series flagship even faster using the GK110 chip, but deliberately held back with the GK104 to save money, since it was competitive enough performance-wise.

That's not to say the GTX 680 disappointed. The 28nm part packs 3,540 million transistors into a smallish 294mm² die and delivers 18.74 GFLOPS per watt along with 192.2GB/s of memory bandwidth, while tripling the GTX 580's CUDA cores and doubling its TAUs -- no small feat, to be sure. Still, we all knew the GK110 existed and were eager to see how Nvidia would bring it to the consumer market -- if it even decided to. Thankfully, that wait is now over.

After wearing the single-GPU performance crown for 12 months, the GTX 680 has been dethroned by the new GTX Titan. Announced on February 21, the Titan carries the GK110 GPU, whose transistor count has more than doubled over the GTX 680's, from 3.5 billion to a staggering 7.1 billion. The part has roughly 25 to 50 percent more resources at its disposal than Nvidia's previous flagship, including 2,688 stream processors (up 75%), 224 texture units (also up 75%) and 48 raster operation units (a healthy 50% more).

If you're curious why the expected performance gain is "only" 25 to 50 percent, it's because the Titan arrives clocked lower than the GTX 680. Given those expectations, it would be reasonable to assume the Titan would be priced at around a 50% premium, which would be roughly $700. But there's nothing fair about the Titan's pricing -- and there doesn't have to be. Nvidia is marketing the card as an extreme solution for extreme gamers with deep pockets, with an MSRP of a whopping $1,000.

That puts the Titan in dual-GPU GTX 690 territory price-wise, or about 120% more than the GTX 680. The Titan won't be a good value in terms of price versus performance, but Nvidia is undoubtedly aware of this, and to some degree we have to respect it as a niche luxury product. With that in mind, let's lift the Titan's hood and see what makes it tick before running it through our usual benchmarks, a gauntlet that now includes frame latency measurements -- more on that in a bit.

The GeForce Titan is a true processing powerhouse. The GK110 chip carries 14 SMX units housing 2,688 CUDA cores, offering up to 4.5 teraflops of peak compute performance.

As noted above, Titan offers a core configuration consisting of 2,688 SPUs, 224 TAUs and 48 ROPs. The card's memory subsystem comprises six 64-bit memory controllers (384-bit in total) with 6GB of GDDR5 memory running at 6008MHz, which works out to a peak bandwidth of 288.4 GB/s -- 50% more than the GTX 680.

Our Titan came equipped with Samsung K4G20325FD-FC03 GDDR5 memory chips rated at 1500MHz -- the same you'll find on the reference GTX 690.

Where Titan falls short of the GTX 680 is core clock speed, which is set at 836MHz versus 1006MHz. Some of that 17% gap is made up by Boost Clock, Nvidia's dynamic frequency feature, which can push Titan as high as 876MHz.
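For the curious, the headline figures above fall straight out of the core configuration. The quick sanity check below is just an illustrative Python sketch, not something from the review itself, and it assumes the usual two floating-point operations per CUDA core per clock from fused multiply-add; with that assumption it reproduces the roughly 4.5 teraflop and 288.4 GB/s numbers from the quoted core count, base clock, bus width and memory speed.

```python
# Back-of-the-envelope check of the Titan figures quoted above (illustrative only).

# Peak single-precision compute: CUDA cores x 2 FLOPs per core per clock (FMA) x core clock.
cuda_cores = 2688
base_clock_ghz = 0.836           # 836MHz base clock
peak_tflops = cuda_cores * 2 * base_clock_ghz / 1000
print(f"Peak compute: {peak_tflops:.2f} TFLOPS")      # ~4.49, i.e. the "up to 4.5" quoted

# Peak memory bandwidth: bus width in bytes x effective data rate.
bus_width_bits = 384             # six 64-bit memory controllers
effective_rate_gbps = 6.008      # 6008MHz effective GDDR5 data rate
bandwidth_gb_s = bus_width_bits / 8 * effective_rate_gbps
print(f"Peak bandwidth: {bandwidth_gb_s:.1f} GB/s")   # ~288.4, 50% up on the 680's 192.2 GB/s
```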

By default the GTX Titan includes two dual-link DVI ports, an HDMI port and a single DisplayPort 1.2 connector. Support for 4K-resolution displays is present, and it's also possible to drive up to four displays.

The History of the Modern Graphics Card, Part 3

With the turn of the century the graphics industry bore witness to further consolidation.

The pro market saw iXMICRO leave graphics entirely, while NEC and Hewlett-Packard both produced their last products, the TE5 and VISUALIZE FX10 series respectively. Evans & Sutherland also bowed out, selling its RealVision line to focus on planetaria and fulldome projection systems.

In the consumer graphics market, ATI announced the acquisition of ArtX Inc. in February 2000, for around $400 million in stock. ArtX was developing the GPU codenamed Project Dolphin (eventually named "Flipper") for the Nintendo GameCube, which added significantly to ATI's bottom line.
ATI GameCube GPU
Also in February, 3dfx announced a 20% workforce cut, then promptly moved to acquire Gigapixel for $186 million and gained the company's tile-based rendering IP.

Meanwhile, S3 and Nvidia settled their outstanding patent suits and signed a seven-year cross-license agreement.

VIA assumed control of S3 around April-May, while S3 itself was just finishing a restructuring that followed its acquisition of Number Nine. As part of S3's restructuring, the company merged with Diamond Multimedia in a stock swap valued at $165 million. Diamond's high-end professional graphics division, FireGL, was spun off under the renamed SONICblue and later sold to ATI in March 2001 for $10 million.

3DLabs acquired Intergraph's Intense3D in April, while the final acts of 3dfx played out towards the end of the year, despite 2000 kicking off with the promise of a better future as the long-awaited Voodoo 5 5500 neared its debut in July. The latter ended up trading blows with the GeForce 256 DDR and won the high-resolution battle.


But where 3dfx was once a byword for raw performance, its strengths around this time lay in its full-screen antialiasing image quality. The Voodoo 5 introduced T-buffer technology as an alternative to hardware transform and lighting, essentially taking a few rendered frames and aggregating them into one image. This produced a slightly blurred picture that, when run in frame sequence, smoothed out the motion of the animation.
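To picture the accumulation idea behind the T-buffer, here is a minimal Python sketch. It is purely illustrative and not 3dfx's actual hardware path, and the t_buffer_blend helper is a made-up name for the example: several slightly offset renders of the same scene are averaged into a single output frame, which is what produces the softened edges and motion smear described above.

```python
import numpy as np

def t_buffer_blend(sub_frames):
    """Average several jittered/offset renders of the same scene into one frame.

    sub_frames: list of HxWx3 float arrays in [0, 1], e.g. renders taken with
    slightly different sample positions (for antialiasing) or at slightly
    different moments in time (for motion blur).
    """
    return np.mean(np.stack(sub_frames), axis=0)

# Toy usage: four noisy "renders" of the same 4x4 image blended into one.
rng = np.random.default_rng(0)
base = rng.random((4, 4, 3))
renders = [np.clip(base + rng.normal(0, 0.05, base.shape), 0, 1) for _ in range(4)]
blended = t_buffer_blend(renders)
print(blended.shape)  # (4, 4, 3)
```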

3dfx's technology became the forerunner of many image quality enhancements seen today, like soft shadows and reflections, motion blur, as well as depth of field blurring.

3dfx's swan song, the Voodoo 4 4500, arrived October 19 after several delays -- unlike the 4200 and 4800 that were never released. The card was originally scheduled for spring as a competitor to Nvidia's TNT2, but ended up going against the company's iconic GeForce 256 DDR instead, as well as the much better performing GeForce 2 GTS and ATI Radeon DDR.

On November 14, 3dfx announced they were belatedly ceasing production and sale of their own-branded graphics cards, something that had been rumoured for some time but largely discounted. Adding fuel to the fire, news got out that upcoming Pentium 4 motherboards would not support the 3.3V AGP signalling required by the Voodoo 5 series.


The death knell sounded a month later for 3dfx when Nvidia purchased its IP portfolio for $70 million plus one million shares of common stock. A few internet wits later noted that the 3dfx design team which had moved to Nvidia eventually got both their revenge and lived up to their potential, by delivering the underperforming NV30 graphics chip powering the FX 5800 cards behind schedule.

Prior to the Voodoo 5's arrival, ATI had announced the Radeon DDR as "the most powerful graphics processor ever designed for desktop PCs." Previews of the card had already gone public on April 25, and only twenty-four hours later Nvidia countered with the announcement of the GeForce 2 GTS (GigaTexel Shader). The latter included Nvidia's version of ATI's Pixel Tapestry Architecture, named the Nvidia Shading Rasterizer, allowing for effects such as specular shading, volumetric explosion, refraction, waves, vertex blending, shadow volumes, bump mapping and elevation mapping to be applied on a per-pixel basis via hardware.

The feature was believed to have made it to the previous NV10 (GeForce 256) chip but it remained disabled due to a hardware fault. The GTS also followed ATI's Charisma Engine in allowing for all transform, clipping and lighting calculations to be supported by the GPU. That said, ATI went a step further with vertex skinning for a more fluid movement of polygons, and keyframe interpolation, where developers designed a starting and finishing mesh for an animation and the Charisma core calculated the intervening meshes.


The ATI Radeon DDR eventually launched for retail in August 2000. Backed by a superior T&L implementation and support for several of the upcoming DirectX 8 features, the Radeon DDR alongside the GeForce 2 GTS ushered in the use of DVI outputs by integrating support for the interface into the chip itself. The DVI output was more often found on OEM cards, however, as the retail variety usually sported VIVO plugs.

One downside to the Radeon DDR was that boards shipped with their core and memory downclocked from the promised 200MHz and 183MHz, respectively. In addition, drivers were once again less than optimal at launch. There were issues with 16-bit color and compatibility problems with VIA chipsets, but this did not stop the card from dominating the competition at resolutions higher than 1024x768x32. A price of $399 for the 64MB version stacked up well versus $349-399 for the 64MB GeForce 2 GTS, which it beat by a margin of 10-20% in benchmarks, and helped ATI maintain its number one position in graphics market share over Nvidia.

Nvidia wasn't doing all that bad for themselves either. The company reported net income of $98.5 million for the fiscal year on record revenue of $735.3 million, driven in large part by its market segmentation strategy, releasing a watered-down MX version of the card in June and a higher clocked Ultra model in August. The latter dethroned the Radeon in terms of performance but it also cost $499. A Pro model arrived in December.

Besides releasing a GeForce 2 card at every price point, from the budget MX to the professional Quadro 2 range, Nvidia also released its first mobile chip in the form of the GeForce2 Go.

As 3dfx was undergoing its death throes in November, Imagination Tech (ex-VideoLogic) and ST Micro attempted to address the high-volume budget market with the PowerVR Series 3 KYRO. Typically ranging in price from $80 to $110 depending on the framebuffer memory, the card represented good value for money when gaming at resolutions of 1024x768 or lower. It would have become more popular had the GeForce2 MX arrived later, or not been so aggressively priced at ~$110.

The KYRO II arrived in April 2001 with a bump in clock speeds over the original, manufactured on a smaller 180nm process by ST Micro. But once again the card faced stiff competition from the GeForce 2 MX. Nvidia rebadged that card as the MX200 and lopped 40% off its price, while adding a higher-clocked MX400 at the same price as the KYRO II.

When PowerVR failed to secure game development impetus for tile based rendering, and ST Micro closed down its graphics business in early 2002, Imagination Technologies moved from desktop graphics to mobile and leveraged that expertise into system on chip graphics. They licenced the Series 5/5XT/6 for use with ARM-based processors in the ultra portable and smartphone markets.

By the time 2001 dawned, the PC graphics market consisted of a discrete card duopoly, with both of them in addition to Intel supplying the vast majority of integrated graphics chipsets.

Meanwhile, Matrox and S3/VIA clung to the margins of traditional markets.

Building on the strides made with the GeForce 2 series, Nvidia unveiled the GeForce 3 on February 27, 2001 priced between $339 and $449. The card became the new king of the hill, but it really only came into its own at the (then) extreme resolution of 1600x1200, preferably with full screen antialiasing applied.


Initial drivers were buggy, especially in some OpenGL titles. What the new GeForce did bring to the table was DirectX 8, multisampling AA, quincunx AA (basically 2xMSAA + post process blur), 8x anisotropic filtering as well as the unrivalled ability to handle 8xAF + trilinear filtering, and a programmable vertex shader which allowed for closer control of polygon mesh motion and a more fluid animation sequence.

There was also LMA (Lightspeed Memory Architecture) support -- basically Nvidia's version of HyperZ -- for culling pixels that would end up hidden behind others on screen (Z occlusion culling) as well as compressing and decompressing data to optimize use of bandwidth (Z compression).
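The Z occlusion culling half of that idea amounts to testing a fragment's depth against the depth buffer before doing any expensive shading. The toy Python sketch below is only illustrative; the fragment format and the shade callback are hypothetical, but it shows how hidden pixels can be rejected without ever being shaded.

```python
import numpy as np

def rasterize_with_early_z(fragments, depth_buffer, shade):
    """Shade only fragments that survive the depth test (smaller z = closer).

    fragments: iterable of (x, y, z, payload) tuples.
    shade: hypothetical callback that does the expensive per-pixel work.
    """
    shaded = skipped = 0
    for x, y, z, payload in fragments:
        if z < depth_buffer[y, x]:      # closer than what's already there
            depth_buffer[y, x] = z
            shade(x, y, payload)
            shaded += 1
        else:                           # hidden behind an earlier pixel: cull it
            skipped += 1
    return shaded, skipped

depth = np.full((4, 4), np.inf)
frags = [(1, 1, 0.5, "near"), (1, 1, 0.9, "far"), (2, 2, 0.3, "near")]
print(rasterize_with_early_z(frags, depth, lambda x, y, p: None))  # (2, 1)
```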

Lastly, Nvidia implemented load-balancing algorithms as part of what they called the Crossbar Memory Controller, which consisted of four independent memory sub-controllers as opposed to the industry standard single controller, allowing incoming memory requests to be routed more effectively.
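One way to picture the crossbar arrangement is as address interleaving across the four sub-controllers, so consecutive memory requests land on different controllers and can be serviced in parallel rather than queueing behind a single one. The sketch below is a hypothetical illustration of that routing idea only; the chunk size and the mapping are assumptions, not Nvidia's actual scheme.

```python
def route_request(address, num_controllers=4, interleave_bytes=256):
    """Pick a memory sub-controller by interleaving the address space.

    Purely illustrative: carves the address space into fixed-size chunks and
    deals them out round-robin across the sub-controllers.
    """
    return (address // interleave_bytes) % num_controllers

# Consecutive 256-byte chunks land on different sub-controllers: 0, 1, 2, 3, 0, ...
print([route_request(a) for a in range(0, 256 * 6, 256)])  # [0, 1, 2, 3, 0, 1]
```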
Nvidia NV2A inside Microsoft's Xbox
Nvidia's product line later added the NV2A, a derivative of the GeForce 3 with GeForce4 attributes that was used in Microsoft's Xbox game console.

At this point, Nvidia controlled 31% of the graphics market to Intel's 26% and ATI's 17%.

As Nvidia complemented the GF3 line-up with underclocked Ti 200 and overclocked Ti 500 models, ATI hurried to ramp up deliveries of the Radeon 8500. The card was built around the R200 GPU using TSMC's 150nm process (the same used by the GeForce 3's NV20). The chip had been announced in August and was eagerly awaited since John Carmack of id Software talked it up saying it would run the new Doom 3 "twice as well" as the GeForce 3.

ATI's official R8500 announcement was no less enthusiastic. But reality kicked in once the card launched in October and was found to perform at the level of the underclocked GF3 Ti 200 in games. Unfinished drivers and a lack of workable Smoothvision antialiasing weighed heavily against the R8500 in its initial round of reviews. By the time the holiday season arrived, a second round of reviews showed that the drivers had matured to a degree, lifting the R8500's performance to sit between the Ti 200 and the standard GF3.


Very competitive pricing and a better all around feature set (2D image quality, video playback, performance under antialiasing) made the card a worthy competitor to the GF3 and Ti 500 nonetheless.

ATI's sales for the year dropped to $1.04 billion as the company recorded a net loss of $54.2 million. The company began granting licenses to board partners to build and market graphics boards, while refocusing their resources on design and chip making.
ATI Xilleon

ATI also debuted the Set-Top-Wonder Xilleon, a development platform based on the Xilleon 220 SoC which provided a full processor, graphics, I/O, video and audio for set-top boxes integrated into digital TV designs.

To complement Xilleon, ATI acquired NxtWave Communications for $20 million in June 2002. The company specialized in digital signal processing and applications for set-top boxes and terrestrial digital solutions.

Keeping up with their product launch cycle, Nvidia released the GeForce 4 in February 2002. Three MX parts, three mobile parts based on the MX models, and two performance Titanium models (Ti 4400 and Ti 4600) made up the initial line-up -- built on TSMC's 150nm process. The GeForce 4 was effectively ready for release two months earlier but the launch was delayed to avoid eating into GeForce 3 sales over the holiday season.

The MX series cards were intended for the budget segment but they were still largely uninspiring, as they were based on the old GeForce 2 architecture. MPEG2 decoding was added, but the cards reverted to the same DirectX 7.0/7.1 support as the earlier GF2 MX line. Pricing at $99-179 reflected the reduced feature set.

The Titanium models on the other hand were excellent performers and in some instances managed a 50+% increase in performance over the GeForce3 Ti 500. The Ti 4600 became the performance champ overnight, easily disposing of the Radeon 8500, while the Ti 4200 at $199 represented the best value for money card.

But then came the Radeon 9700 Pro and promptly consigned every other card to also-ran status.

ATI Radeon 9700 Pro (FIC A97P)
Developed by a team that had originally formed the core of ArtX, the ATI R300 GPU delivered spectacularly and arrived promptly. It was the first to bring DirectX 9.0 support and, by extension, the first architecture to support shader model 2.0, vertex shader 2.0 and pixel shader 2.0. Other notable achievements: it was the second GPU series to support AGP 8x -- SiS's Xabre 80/200/400 line was first -- and the first to use a flip-chip GPU package.


About flip-chip GPU packages: Previous generations of graphics chips and other ICs used wire-bond mounting. With this method, the chip sits on the board with the logic blocks under the metal layers, whose pads are connected by thin wires arranged around the edges of the chip down to solder balls or pins on the underside. Flip-chip does away with the wire component by placing contact points (usually soldered in a ball grid array) directly on the "top" of the chip, which is then inverted, or "flipped", so that the solder points directly contact the substrate or circuit board. The chip then undergoes localised heating (reflow) to melt the solder, which forms the connection with the underlying contact points of the board.

ATI complemented the line-up in October by adding a non-Pro 9700 at $299 for those unable to part with $399 for the top model. Meanwhile, the cut down 9500 Pro ($199) and 9500 ($179) reached down through mainstream market segments, and the FireGL Z1/X1 filled in the $550-950 bracket for professional graphics. The All-In-Wonder 9700 Pro ($449) was also added in December.

ATI's sales are likely to have taken a hit when it was found that many cards could be modded into their more expensive counterparts. Examples of this included the ability to turn a 9500 card into a 9700 using its reference board (with the full complement of memory traces), or a 9800 Pro into its XT counterpart. For the latter, a driver patch was made available to check whether a card would accept the mod, which consisted of soldering in a resistor or using a pencil to tweak the GPU and memory voltage control chip. Hard mods also included upgrading various 9800 models into a FireGL X2, while a patched/Omega driver had the ability to turn a $250 9800 SE 256MB into a $499 9800 Pro 256MB.

In addition to discrete graphics, ATI also introduced desktop integrated graphics and chipsets. These included the A3/IGP 320 meant to be paired with AMD CPUs, RS200/IGP 330 & 340 for Intel chips, as well as the mobile series U1/IGP 320M for AMD platforms and RS200M for Pentium 4-M. All of them were complemented with ATI southbridges, specifically the IXP200/250.

SiS unveiled the Xabre line between the launch of the GeForce4 and the R300. The cards were consistently slower than Nvidia and ATI's offerings at the same price points, and were handicapped by the lack of vertex shader pipelines. This translated into a heavy reliance upon drivers and game developers to get the most out of software emulation, thus keeping SiS in the margins of desktop discrete 3D graphics.

The Xabre line also implemented "Turbo Texturing", where framerates were increased by drastically reducing texture quality, and lacked anisotropic filtering. All this did little to endear reviewers to the cards.

The Xabre line was the last under the SiS banner, as the company spun off its graphics division (renamed XGI) and merged with Trident Graphics a couple of months later in June.

The first of Nvidia's FX series arrived on January 27, 2003 with the infamous "Dustbuster" FX 5800 and the slightly faster (read: less slow) FX 5800 Ultra. When compared to the reigning champ, the ATI Radeon 9700 Pro (and non-Pro), the FX was much louder, delivered inferior anisotropic filtering (AF) quality and antialiasing (AA) performance, and was overall much slower. ATI was so far ahead that a second-tier Radeon 9700 card launched five months earlier comfortably outperformed the Ultra, and it was $100 cheaper ($299 vs $399).

The NV30 chip was supposed to debut in August, around the same time as the Radeon 9700, but ramping problems and high defect rates on TSMC's low-K 130nm process held Nvidia back. Some circles also argued that the company was strapped for engineering resources, with more than a few tied up with the NV2A Xbox console chip, the SoundStorm APU, as well as the motherboard chipsets.

Looking to move things forward, Nvidia undertook a project to have several FX series chips fabricated on IBM's more conventional fluorosilicate glass (FSG) low-K 130nm process.

ATI refreshed its line of cards in March, starting with the 9800 Pro, featuring an R350 GPU that was basically an R300 with some enhancements to its Hyper-Z caching and compression instructions.

The RV350 and RV280 followed in April. The first of these, found inside the Radeon 9600, was built using the same TSMC 130nm low-K process that Nvidia had adopted. Meanwhile, the RV280 powering the Radeon 9200 was little more than a rebadged RV250 from the Radeon 9000 with AGP 8x support added.
ATI Xbox GPU
The same month saw ATI and Nintendo sign a technology agreement that would eventually lead to the Hollywood GPU for the Nintendo Wii console. ATI added a second console coup in August, when Microsoft awarded the Xbox 360 GPU contract to them.

A scant three and a half months after the inglorious debut of the FX 5800, Nvidia took another shot with the NV35 (FX 5900 and FX 5900 Ultra). The new Detonator FX driver greatly improved AA and AF, almost matching ATI's solution in terms of quality. However, the 5900 achieved what the 5800 could not: it knocked ATI's Radeon 9800 Pro from its spot as the fastest card around, although at $499 apiece, few would actually take advantage of this.

As expected, ATI regained bragging rights in September with the release of the 9800 XT. Superior driver support -- mainly with some DX9 games -- also made the XT a better overall card than Nvidia's counterpart, ensuring that ATI ended the year with the performance crown. The 9700 Pro remained the standout mainstream board, while the FX 5700 Ultra at $199 won the sub-$200 price segment.

ATI bounced back with a $35.2 million profit in 2003 after posting a $47.5 million loss in 2002. A good chunk of this came from higher selling prices for the dominant 9800 and 9600 cards. Meanwhile, Nvidia retained 75% of the DirectX 9 value segment market, thanks to the popularity of the FX 5200.
Source DirectX 9.0 Effects Trailer, shown during ATI's presentation of the Radeon 9800 XT and 9600 XT
The newly formed XGI launched the Xabre successor in a staggered release between September and November. Renamed Volari, the card line-up ranged from the $49 V3 to the dual-GPU Duo V8 Ultra. The V3 was virtually a rebrand of Trident's Blade XP4 and a DX 8.1 part, while the rest of the series (V5 and V8) was developed from the previous SiS Xabre and featured DX9.0 support.

For the most part, all of the models underdelivered, with the exception of the entry-level V3, which offered performance equal to the GeForce FX 5200 Ultra and Radeon 9200. The Duo V8 Ultra was priced ~20% higher than the Radeon 9800 Pro 128MB, yet delivered performance on par with or lower than the 9600 XT.


XGI?s Volari line lingered on with the 8300 in late 2005, which was more or less on par with the Radeon X300SE/GeForce 6200 at $49, as well as the Z9/Z11 and XP10. The company was reabsorbed back into SiS in October 2010.

Another company making a comeback into desktop graphics was S3. After the graphics division was sold to VIA for $208 million plus the company's $60 million debt, the restructured venture concentrated primarily on chipset projects.

DeltaChrome desktop cards were announced in January, but in time-honoured S3 fashion, the first S4 and S8 models didn?t start appearing in the retail channel until December. The new cards featured most of the new must-haves of 2003; DirectX 9 support, 16x AF, HD 1080p support, and portrait-mode display support.

Unfortunately, the buying public now generally saw desktop graphics as a two-horse race -- and S3 wasn't one of the two. While S3 was looking to keep competitive, ATI and Nvidia were driving each other to achieve ever-increasing levels of performance and image quality.

The DeltaChrome was succeeded by the GammaChrome in 2005.

Nvidia and ATI continued their staggered launches in 2004. The former launched its first GDDR3 card in March as the FX 5700 Ultra, followed by the GeForce 6 series with the high-end 6800 range. The initial line-up comprised the 6800 ($299), the GT ($399), the Ultra ($499), and an overclocked variant known as the Ultra Extreme ($549) to counter ATI's X800 XT Platinum Edition. The latter was sold by a select band of add-in board partners.

The 6800 Ultra 512MB was added on March 14 2005 and sold for the unbelievable price of $899 -- BFG added an overclocked version for $999. The midrange was well catered for with the 6600 series in September.

Nvidia's feature set for the 6000 series included DirectX 9.0c support, shader model 3.0 (although the cards were never able to fully exploit this), Nvidia's PureVideo decode and playback engine, and SLI support -- the multi-GPU performance multiplier IP that was acquired from 3dfx.

Reintroducing an old feature: SLI
Where the 3dfx implementation had each processing unit responsible for alternate scan lines, Nvidia handled things in a few different ways. The company implemented split frame rendering (SFR), in which each GPU rendered the top or bottom half of the frame, alternate frame rendering (AFR), in which the GPUs rendered whole frames in turn, and in some cases the driver simply disabled SLI when a game didn't support the feature. This last behavior was hit-or-miss early in driver development.
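The difference between the two workload splits is easy to picture in a few lines of Python. This is an illustrative sketch of the frame assignment only, not Nvidia's driver logic, and the function names are made up for the example: AFR deals out whole frames to the GPUs in turn, while SFR hands each GPU a horizontal band of every frame.

```python
def afr_assignment(frame_index, num_gpus=2):
    """Alternate frame rendering: each GPU renders whole frames in turn."""
    return frame_index % num_gpus

def sfr_assignment(frame_height, num_gpus=2):
    """Split frame rendering: each GPU renders a horizontal band of every frame."""
    band = frame_height // num_gpus
    return [(gpu, gpu * band, frame_height if gpu == num_gpus - 1 else (gpu + 1) * band)
            for gpu in range(num_gpus)]

print([afr_assignment(f) for f in range(6)])   # frames alternate: [0, 1, 0, 1, 0, 1]
print(sfr_assignment(1200))                    # GPU 0 gets rows 0-599, GPU 1 gets 600-1199
```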

While the technology was announced in June, it required a motherboard with an nForce4 chipset to enable multi-GPU setups, and these didn't start reaching the retail channel in numbers until late November. Adding fuel to the fire, initial driver releases were sporadic (at best) until well into the following year.

Reviews at the time generally mirrored current performance, showing that two lower tier cards (like the 6600 GT SLI which could be had for $398) generally equalled one enthusiast card at lower resolutions and image quality. At the highest resolutions and with antialiasing applied, however, single card setups still gained the upper hand. SLI and ATI's CrossFire performance was as erratic then as it sometimes is now, running the full gamut from perfect scaling to not working at all.

Nvidia's board partners immediately saw marketing opportunities with the re-invented tech, with Gigabyte offering a dual 6600 GT SLI card (the 3D1), followed by a dual 6600 (3D1-XL) and a dual 6800 GT (3D1-68GT). These cards required not only an nF4 chipset but a Gigabyte-branded motherboard as well.

Of the high-end single-GPU cards, the 6800 Ultra and X800 XT/XT PE were fairly evenly matched, both in price and performance. But they weren't without their issues. The latter arrived in May and suffered supply constraints throughout its entire production life, while Nvidia's flagship 6800 Ultra was extremely late in arriving in August and also suffered supply constraints that varied by region, since only a fraction of board partners offered the card.

The 6800 GT generally bested the X800 Pro at $399, while the 6600 GT cleaned up in the $199 bracket.

Intense competition with Nvidia that year didn't have an adverse effect on ATI's bottom line, as profit peaked at $204.8 million for the year from nearly $2 billion in revenue.

One quirk associated with the well-received 6600 GT was that it initially launched as a PCI Express card, at a time when PCI-E was an Intel-only feature for motherboards designed for Pentium 4 processors. These chips generally lagged in gaming performance behind AMD's offerings, which of course used the AGP data bus.

Nvidia's 7000 series started rolling off the assembly lines well before the 6000 series had completed its model line-up. The 7800 GTX arrived a full five months before the reduced bill of materials (BoM) 6800 GS saw the light of day. The first iteration of the 7800 series was based around the G70 GPU on TSMC's 110nm process, but quickly gave way to the G71-based 7900 series, made on TSMC's 90nm process.

While the naming convention changed from "NV" to "G", the latter were architecturally related to the NV40 series of the GeForce 6000. And while only fractionally larger than the NV40-45 at 334mm², the G70 packed in an extra eighty million transistors (for a total of 302 million), adding a third more vertex pipelines and 50% more pixel pipelines. In most cases, the G70 was superseded within nine months, and in the case of the GS and GTX 512MB, the figure was three and four months respectively.

At the entry level, the 7100 GS continued the use of TurboCache (the ability for the board to use some system memory), which was introduced with the previous generation GeForce 6200 TC.


At the other end of the spectrum, the 7800 GTX 256MB hit retail on June 22 with an MSRP of $599, though its actual street price was higher in many instances. ATI wrested the single-GPU crown back with the X1800 XT, but Nvidia countered with a 512MB version of the 7800 GTX thirty-five days later and promptly regained the title.

Two months later, ATI launched the X1900 XTX, which traded blows with Nvidia's flagship. This particular graphics horsepower race resulted in both cards being priced at $650. One spinoff of the cards moving to a 512MB frame buffer was that gaming at 2560x1600 with 32-bit color and a high level of image quality enabled was now possible via dual-link DVI.
ATI's original CrossFire design required using an external Y cable
ATI announced their multi-card Crossfire technology in May 2005 and made it available in September with the launch of the Xpress 200 Crossfire Edition chipset, and X850 XT Crossfire Master board. Due to a single-link TMDS, resolution and refresh rates were initially limited to 1600x1200 @60Hz, but a dual-link TMDS for 2560x1600 would soon replace it.
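The resolution ceiling follows from raw pixel rates. The rough Python calculation below ignores blanking overhead and assumes the commonly cited ~165MHz single-link TMDS pixel-clock limit (an assumption for illustration, not a figure from the article), but it shows why 1600x1200 at 60Hz fits on one link while 2560x1600 at 60Hz needs dual-link.

```python
def pixel_rate_mhz(width, height, refresh_hz):
    """Raw pixel rate in megapixels per second, ignoring blanking overhead."""
    return width * height * refresh_hz / 1e6

SINGLE_LINK_LIMIT_MHZ = 165  # approximate single-link TMDS pixel-clock ceiling (assumed)

for mode in [(1600, 1200, 60), (2560, 1600, 60)]:
    rate = pixel_rate_mhz(*mode)
    fits = "fits single-link" if rate <= SINGLE_LINK_LIMIT_MHZ else "needs dual-link"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz: {rate:.1f} Mpixel/s -> {fits}")
# 1600x1200@60Hz: 115.2 Mpixel/s -> fits single-link
# 2560x1600@60Hz: 245.8 Mpixel/s -> needs dual-link
```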

Unlike Nvidia's solution of two identical cards communicating via a bridge connector, ATI implemented a master card with a TMDS receiver, which accepted input from a slave card via an external dongle and a Xilinx compositing chip.

Like Nvidia's SLI, CrossFire offered alternate frame rendering (AFR) and split frame rendering (SFR), but also a rendering technique called SuperTiling. The latter offered a performance increase in certain applications, but it did not work with OpenGL or support accelerated geometry processing. Also like SLI, CrossFire faced its share of driver-related troubles.

ATI intended to have their R520-based cards -- their first to incorporate Shader Model 3.0 -- ready by the June-July timeframe, but the late discovery of a bug in the cell library forced a four-month delay.

Initial launches comprised the X1800 XL/XT using the R520 core, the X1300 budget cards using the RV515 with essentially one quarter of the graphics pipelines of the R520, and the X1600 Pro/XT based on the RV530, which was similar to the RV515 but with a higher shader and vertex pipeline-to-TMU and ROP ratio.

Due to the initial delay with the R520, the GPU and its derivatives were replaced a scant three and a half months later by the R580-based X1900 series, which used TSMC's new 80nm process. Continuing the roll-out, half the graphics pipeline resources went into the RV570 (X1650 GT/XT and X1950 GT/Pro), while a shrunk RV530 became the RV535 powering the X1650 Pro as well as the X1300 XT.


ATI's revenue rose to a record $2.2 billion in 2005, the highest in the company's history, aided by shipments of Xenos GPUs for the Xbox 360. Net profit, however, slumped to $16.9 million.

By this stage, any graphics card launch not based on an Nvidia or ATI GPU was received with a certain amount of curiosity, if not enthusiasm. Such was the scene when S3's overhauled graphics line-up debuted in November.

The Chrome S25 and S27 promised good gaming performance based on their high clocks, but delivered a mostly sub-par product. Initial pricing at $99 (S25) and $115 (S27) put the cards in competition against Nvidia?s 6600/6600GT and ATI?s X1300Pro/X1600Pro, but neither S3 card stood up to the competition in any meaningful way, aside from power consumption. That slight advantage evaporated as ATI/AMD and Nvidia addressed the HTPC and entry-level market segment, effectively killing S3?s subsequent Chrome 400 and 500 series.

An added issue for S3 was that the cost of building the cards resulted in razor-thin profits. The company needed high-volume sales in a market dominated by two vendors. HTC would go on to acquire S3 in July 2011 for $300 million, a move originally seen as leverage in HTC's and S3's separate legal disputes with Apple.

Nvidia and ATI continued to hog the press coverage in 2006.

ATI acquired Macrosynergy, a Shanghai-based design and engineering centre with personnel working in California that was previously part of the XGI group. Then in May the company bought BitBoys in a $44 million deal.

Meanwhile, Nvidia's first foray into dual-GPU single-board products came in March, following in the footsteps of ATI, 3dfx and XGI. The 7900 GX2 sandwiched two custom boards essentially carrying a couple of downclocked 7900 GTXs. Asustek didn't wait around for Nvidia's dual-GPU solution, however, and released its own take as the Extreme N7800GT Dual ($900, 2,000 units built), which paired two 7800 GT GPUs instead.

This card kicked off Asus' interest in limited-edition dual-GPU boards, and possibly hardened Nvidia's attitude towards its board partners, as Asustek's product took the spotlight from the reference models at launch.

In the higher volume mainstream market, the 7600 GT and GS both provided solid performance and remarkable longevity, while ATI?s X1950 XTX and Crossfire ruled the top end enthusiast benchmarks for single GPU cards. The X1900 XT and GeForce 7900 GT were fairly evenly matched in the upper mainstream bracket.

ATI's David Orton and AMD's Hector Ruiz officially announce the historic merger
After twenty-one years as an independent company, ATI was bought out by AMD on October 25, 2006 for a total price of $5.4 billion -- split between $1.7 billion from AMD, $2.5 billion borrowed from lending institutions, 57 million AMD shares and 11 million options/restricted stock units valued at $1.2 billion. At the time of the buy-out, around 60-70% of ATI's chipset/IGP revenues were accrued from chipsets for Intel-based motherboards.

With a large part of that Intel IGP chipset business subsequently moving to Nvidia, ATI's market share dropped dramatically. The logic behind the buy was a seemingly quick path to GPU technology, rather than using the $5.4 billion to develop AMD's own IP and add licenced technology where needed. At the time, AMD was aiming at the quick introduction of Torrenza and the associated Fusion projects.

Two weeks after the ATI buy-out, Nvidia ushered in the age of unified shader architectures for PC graphics. ATI's Xenos GPU for the Xbox 360 had already introduced the unified architecture to consoles.

This article is the third installment in a series of four. Next week we'll wrap things up, following the development of Radeon products under AMD's wing, the continued rivalry between GeForce and Radeon GPUs, the transition toward stream processing, and what the present and near future hold for graphics processors.