
Tuesday, April 30, 2013

Crysis 3 Performance Test: Graphics & CPU

Built with CryEngine 2, the original Crysis raised the bar for PC gaming graphics in 2007 with stunningly detailed visuals that crippled even the fastest rigs. Looking back at our first Crysis performance article, which was based on the game's demo, the fastest GPU available at the time (the GeForce 8800 GTX 768MB) struggled to average 30 fps when running at 1920 x 1200 with high quality settings in DirectX 10.

Given how punishing the first game was, we were excited to explore 2011's CryEngine 3-based Crysis 2, but it quickly became apparent that the second installment wouldn't be a repeat performance. Not to say it didn't look good, but relative to Crytek's first title, the sequel didn't really set any new benchmarks. It was just another computer game that made great use of DX9, though DX11 was eventually patched in.

Fast forward two years and Crytek has given us another opportunity to hammer some hardware with the arrival of Crysis 3 this month. Like the second title, the third installment has been built with CryEngine 3, though that doesn't mean you should expect lousy PC features, as the engine has been updated with improved dynamic cloth and vegetation, better lighting and shadows, and plenty more.
Crysis 3 benchmarks

Plus, PC gamers won't have to wait for graphical extras. Crysis 3 launched with high resolution textures, DX11 support and plenty of customization options that set it apart from the diluted console builds. The result looks incredible and we get the feeling this will prove to be the game that folks heavily invested in multi-GPU setups have been waiting for. Here's hoping we aren't woefully disappointed.

We'll be testing 18 DirectX 11 graphics card configurations from AMD and Nvidia, considerably fewer than the 29 we tested for Far Cry 3, because even with the medium quality preset activated there are almost no low-end graphics cards that can play Crysis 3, even at 1680 x 1050.

The latest drivers will be used, and every card will be paired with an Intel Core i7-3960X to remove the CPU bottlenecks that could influence the high-end GPU scores.

We're using Fraps to measure frame rates during 90 seconds of gameplay from Crysis 3's first level, "Post Human". The test starts as soon as Michael "Psycho" Sykes hands you his backup weapon; we then simply follow the party leader until the time runs out.
Crysis 3 benchmarks

We'll test Crysis 3 at three common desktop display resolutions: 1680 x 1050, 1920 x 1200 and 2560 x 1600, using the DX11 mode. For the very high quality test, we'll set the overall quality in the menu to very high while also setting the SMAA level to 1x (low). The high and medium quality tests will also be conducted with SMAA 1x enabled.
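For anyone who wants to crunch their own numbers the same way, here's a minimal sketch of how a Fraps frametime log can be reduced to average and minimum frame rates. It assumes the per-frame CSV that Fraps writes (a header row, then a frame index and a cumulative timestamp in milliseconds); the file name is just a placeholder.

```python
# Minimal sketch: reduce a Fraps frametimes CSV to average and minimum fps.
# Assumes columns "Frame, Time (ms)" with cumulative timestamps; adjust the
# path and column index to match your own capture.
import csv

def fps_stats(frametimes_csv, window_ms=1000.0):
    timestamps = []
    with open(frametimes_csv, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            timestamps.append(float(row[1]))  # cumulative time in milliseconds

    total_frames = len(timestamps) - 1
    total_time_ms = timestamps[-1] - timestamps[0]
    average_fps = total_frames / (total_time_ms / 1000.0)

    # Count frames in one-second windows to get a rough minimum fps figure.
    per_second = []
    window_start, frames_in_window = timestamps[0], 0
    for t in timestamps[1:]:
        frames_in_window += 1
        if t - window_start >= window_ms:
            per_second.append(frames_in_window)
            window_start, frames_in_window = t, 0
    return average_fps, (min(per_second) if per_second else average_fps)

if __name__ == "__main__":
    avg, minimum = fps_stats("crysis3_posthuman_frametimes.csv")  # placeholder file name
    print(f"avg: {avg:.1f} fps, min: {minimum} fps")
```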
Graphics cards tested:
- Gigabyte Radeon HD 7970 GHz Edition (3072MB)
- Gigabyte Radeon HD 7970 (3072MB)
- Gigabyte Radeon HD 7950 Boost (3072MB)
- Gigabyte Radeon HD 7950 (3072MB)
- AMD Radeon HD 7870 (2048MB)
- AMD Radeon HD 7850 (2048MB)
- HIS Radeon HD 7770 (1024MB)
- HIS Radeon HD 69 (2048MB)
- Gigabyte GeForce GTX 680 (4096MB)
- Gigabyte GeForce GTX 680 (2048MB)
- Gigabyte GeForce GTX 670 (2048MB)
- Gainward GeForce GTX 660 Ti (2048MB)
- Gigabyte GeForce GTX 660 (2048MB)
- Gigabyte GeForce GTX 650 Ti (2048MB)
- Gigabyte GeForce GTX 580 (1536MB)
- Gigabyte GeForce GTX 560 Ti (1024MB)
- Nvidia GeForce GTX 480 (1536MB)

Test system:
- Intel Core i7 Extreme Edition 3960X (3.30GHz)
- 4 x 4GB G.Skill DDR3-1600 (CAS 8-8-8-20)
- Gigabyte G1.Assassin2 (Intel X79)
- OCZ ZX Series 1250w
- Crucial m4 512GB (SATA 6Gb/s)
- Microsoft Windows 7 SP1 64-bit
- Nvidia Forceware 314.07
- AMD Catalyst 8.2 (Beta 6)

BioShock Infinite Review



Depending on where in the country you live, you might have the nagging feeling that portions of the United States have broken clean off. Just up and decided to veer into their own orbits, consequences be damned.

And, because any notion that these States aren't United is a really uncomfortable one, a cacophony of voices all jostle to demonize their opposing factions. They say that the other side has selfishly detached themselves from the reality that we Americans are all supposed to share. How dare they?!

BioShock Infinite takes that nagging feeling of disunion and makes players wade through its century-ago antecedent, in a way that lays bare the agonizing personal costs paid to the grinding cycle of history.

Columbia is a chunk of America that has in fact gone to pieces. And the result is horrifying. But beautiful, too.


You haven't been to a place like this before. The fictional floating city where Infinite is set is all clockwork platforms and brass gears, its many sections populated with hucksters, strivers, lovers and schoolchildren. One minute, you're walking past a sheer drop, the next a park swings down into the open space. Sure, they seceded from the Union but it's such a bloom-lit paradise that you almost can't blame them.
BioShock Infinite Review WHY: BioShock Infinite isn't just a worthy sequel to a much-loved predecessor. It also manages to be about America - touching on its past, present and possibilities - in a way that makes it a must.
Developer: Irrational Games
Platforms: PC, PS3, Xbox 360 (version played)
Release Date: March 26
Type of game: First-person shooter that's also a twisted fable about American history

What I played: Played through all of the campaign in about 10-12 hours.

My Two Favorite Things:
- Excellent voicework all around - anchored by Troy Baker and Courtnee Draper as Booker and Elizabeth - makes this feel like an operatically violent radio play.
- Columbia is filled with gorgeous architecture and design. Fighting for survival in a place this pretty doesn't make the bullets hurt any less but does inspire awe.

Two Things I Hated:
- Despite how its wide-open spaces look, parts of Infinite can feel very much like they're on rails.
- Certain sections are just straight-up killboxes that will grind you up like hamburger.

Made-to-Order Back-of-Box Quotes:
- "Add Columbia to the list of video game places I wish actually existed, only without all the racism/sexism/spacetime tomfoolery." - Evan Narcisse, Kotaku.com
- "Like Portal 2, BioShock Infinite is a sequel that builds on and maybe even surpasses the original game." - Evan Narcisse, Kotaku.com
Then you come along. Well, not you. Booker DeWitt--the former soldier and Pinkerton agent players control -- isn't a cipher meant for you to occupy, like the mute protagonist from BioShock 1. He's his own man with a voice, a checkered past and reasons for staging a one-man invasion. Debt weighs heavy on his soul and the only way he can come clear of it is to fetch a supernaturally powered young woman named Elizabeth. If he gets her to the people who want her, then he might be able to get on with the rest of his life.

If you subscribe to the idea that there is in fact a formula for making a BioShock game, then Infinite will only support your thesis. Mix up sci-fi archetypes, comic-book super-science, ideologically driven conflict and old-school first-person-shooter love with narrative ambition and philosophical discord. The player character's special abilities get wielded through the left hand while weapons get gripped by the right and he must wend his way through an isolated city-state in turmoil. Once you do that, you have--in broad strokes -- the component parts of the games that have been called BioShock.

The powers you wield this time are called Vigors. You can mix and match them so that you can electrify a flock of crows after flinging them at an enemy. Or you can hold the soldiers you're fighting aloft and then set them on fire with telekinetic and pyrokinetic Vigors.

Important moments of choice have been another hallmark of BioShock games. This time out, the importance of choice isn't in where you wind up plot-wise. It's in how you play. The method in which you cobble together the upgrades you find with Elizabeth's combat support and the amazing verticality of the game's battlegrounds will leave you with a unique experience that you can transform as you go. Couple that with the various firearms and Vigors you'll collect and Infinite's play feels like it gives you more tools and a faster pace to use in an expertly crafted playground.

For all that's familiar, Irrational Games' new release does add new seasoning to that BioShock recipe. One of the big changes is in basic locomotion. Columbia's mass transit is a series of snaking pipeworks called Skylines and they provide a thrilling, vertiginous way to get around the city. They feel like a one-man roller coaster that you can shoot at people from. Aside from that, you can pounce on enemies from way on high or rain down gunshots while zipping along. And enemies will do the same to you, so these aren't an easy way out of most battles.

But it's the character of Elizabeth who represents the biggest change to the BioShock formula, which up until now gave you scant companionship on your adventures in Rapture. At first, Elizabeth might remind you in a broad way of the dog from Fable II. That pooch found you loot and helped you get around the world of Albion. You formed a simple but meaningful bond with it.

Elizabeth is far more complex. She's a fully scripted persona who aids you in combat and in scavenging by finding and supplying health, money and ammo. Most impressively, she can manipulate tears, which are space-time hiccups that let her pull things from alternate realities through to this world. Discount vending stations, machine-gun turrets and grapple points are just a few of the assets she can summon for you. Which tears you have her manifest will affect the strategic options you have during a firefight, and this branching broadens the range of tactics available.


From an emotional perspective, things change immediately when you meet Elizabeth. She's naïve, but with strong streaks of curiosity and desperation running through her. A skybound city doesn't feel like paradise when it's all you've ever known and she yearns to experience the world below. Columbia founder Father Comstock is a religious zealot, one who commands a city of totally obedient martyrs. When he tells them not to fight, it's far creepier than when you're battling them. He means to use Elizabeth's abilities to deliver an apocalyptic judgment to the America beneath him. But Comstock must also deal with a proletarian insurgency by the Vox Populi, who want to topple what they see as a corrupt oligarchy.


Elizabeth alternately wants to impress Booker and run away from him. They need each other and she never feels like a stack of AI scripts walking alongside you. When she throws you a health pack in a firefight, her need for you to survive is palpable. She's haunted by a lack of a past while Booker is chased by a history too full of blood. Together, their shared journey moves from wariness to warmth to resolution with real poignancy.

For all the talk of parts, this game is more than just the sum of its pieces. You're playing for story here, and that story is embedded through the entire fabric of Infinite. The more you explore Columbia, the more its made-up citizens and history pull you in. There's a mystery swirling around the clouds that surround the city and it kept me guessing until the very end of the game.

Early on, you get signs that something more than mere isolationism is amiss in Columbia. Those tears in reality's fabric are a tease to the main conceit of the game, with the gambit being nothing less than the re-writing of American history. Columbia's already well down that road as its spiritual revisionism has made demigods of George Washington, Thomas Jefferson and Ben Franklin. But you can't shake the creeping sense that many things are going to happen as a result of your actions. They do and they're all pretty weighty.


Given the fact that there's inevitable conflict waiting to be joined, you might think that a repudiation of American self-aggrandizement is all there is to BioShock Infinite. The uppity sky-dwellers in Columbia need to be taken down a peg, right? But what's more surprising than the rude awakenings is the degree to which Infinite is a celebration of Americana. It's a game squeezed out of Norman Rockwell paintings, set to ragtime music and filled to the brim with jaunty bygone slang. It zips and zings, even while it's beating you down with giant robot president enemies.

And, yes, Irrational creative director Ken Levine and crew are lobbing a slew of scholarly -isms for players to chew through: racism, sexism, anti-intellectualism, 18th century revivalism and the gospel of industrialism as a cradle-to-grave caretaker of the worker. The tribalism that's inextricably part of America's spiritual DNA is a big part of the game's factions and battlefields, too. The Vox Populi--made of common-man laborers--think they have too little while the well-to-do Founders essentially believe that Comstock's vision of America is a better one than the one lived on solid ground.



If you're acquainted with the language of revolution and regime change, then lots of the rhetoric slung across the conflicts in Infinite will ring familiar to you. Opponents from different classes and backgrounds slander each other. Divine/universal logic is on our side. That kind of thing. The difference is that Infinite places players in the fires of tumult and shows them the result of bloody revolts up close. Most of the people you overhear in Infinite are racist, classist, snooty and surly. Yet you feel bad for them as some of the illusions keeping Columbia aloft begin to crumble. It's a hell of a thing to believe in a dream with all your being, for both good and bad reasons.

BioShock Infinite may not be the first game to try to say something about the very nature of the country it was made in--and the people who make it up--but it's certainly amongst the best. Some scenes reminded me of how people who looked like me had an unbelievable array of prejudicial forces from public and private institutions set against them. Yet, even as I played through those moments, I was reminded that America is a big experiment. That experiment in letting people chart their own destinies has sometimes made it so brother fights against brother.

It's easy to dismiss those people floating in the fractured mirror Americas that we disagree with. They're wrong; we're right. Who cares why they are the way they are? But BioShock Infinite asks us to consider that very question and gives an answer that mixes hope with bitterness, wonder with despair and allegory with history. The game doesn't offer any advice about how to make everyone get along better but it makes a powerful argument for owning-- and owning up to--all of our collective past.

Gigabyte GeForce GTX Titan Review

Nvidia's Kepler architecture debuted a year ago with the GeForce GTX 680, which has sat rather comfortably as the market's favorite single-GPU graphics card, forcing AMD to cut prices and launch the special HD 7970 GHz Edition card to close the value gap. Despite its beastly competitor, many believed Nvidia had intended to make its 600-series flagship even faster using the GK110 chip, but deliberately held back with the GK104 to save money, since it was competitive enough performance-wise.

That's not to say GTX 680 owners were necessarily disappointed. The 28nm part packs 3,540 million transistors into a smallish 294 mm² die and delivers 18.74 gigaflops per watt with 192.2 GB/s of memory bandwidth, while tripling the GTX 580's CUDA cores and doubling its texture units - no small feat, to be sure. Still, we all knew the GK110 existed and we were eager to see how Nvidia would bring it to the consumer market - if it even decided to. Fortunately, that wait is now over.

After wearing the single-GPU performance crown for 12 months, the GTX 680 has been dethroned by the new GTX Titan. Announced on February 21, Titan carries the GK110 GPU, whose transistor count more than doubles the GTX 680's, from 3.5 billion to a staggering 7.1 billion. The part has considerably more resources at its disposal than Nvidia's previous flagship, including 2,688 stream processors (up 75%), 224 texture units (also up 75%) and 48 raster operation units (a healthy 50% more).

If you're curious why we're "only" estimating a 25 to 50 percent performance gain, it's because Titan arrives clocked lower than the GTX 680. Given those expectations, it would be reasonable to assume Titan would be priced at around a 50% premium, or roughly $700. But there's nothing fair about Titan's pricing - nor does there have to be. Nvidia is marketing the card as an extreme solution for extreme gamers with deep pockets, carrying a mighty $1,000 MSRP.

That puts Titan in dual-GPU GTX 690 territory, or about 120% more than the GTX 680. Titan won't offer good value in terms of price versus performance, but Nvidia is undoubtedly aware of this, and to some degree we have to respect it as a niche luxury product. With that in mind, let's lift Titan's hood and see what makes it tick before running it through our usual gauntlet of benchmarks, which now includes frame latency measurements - more on that in a bit.

The GeForce Titan is a true processing powerhouse. The GK110 chip carries 14 SMX units with 2,688 CUDA cores, offering up to 4.5 teraflops of peak compute performance.

As noted above, Titan offers a core configuration consisting of 2,688 SPUs, 224 TAUs and 48 ROPs. The card's memory subsystem comprises six 64-bit memory controllers (384-bit) with 6GB of GDDR5 memory clocked at 6008MHz, which works out to a peak bandwidth of 288.4 GB/s - 50% more than the GTX 680.

Our Titan came equipped with Samsung K4G20325FD-FC03 GDDR5 memory chips rated at 1500MHz - the same you'll find on the reference GTX 690.

Where Titan falls short of the GTX 680 is its core clock speed, which is set at 836MHz vs. 1006MHz. That 17% difference is partly made up by GPU Boost, Nvidia's dynamic frequency feature, which can push Titan as high as 876MHz.
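As a sanity check on the figures quoted above, the headline numbers fall out of the usual back-of-envelope formulas: FP32 peak is two operations (a fused multiply-add) per core per clock, and memory bandwidth is the bus width in bytes times the effective data rate. A quick sketch using the specs from this page:

```python
# Back-of-envelope check of Titan's headline figures, assuming the usual
# formulas: FP32 peak = 2 ops (FMA) x cores x clock, and bandwidth =
# (bus width / 8) x effective memory clock.
cuda_cores = 2688
base_clock_ghz = 0.836           # 836 MHz base clock
bus_width_bits = 384             # six 64-bit memory controllers
effective_mem_clock_ghz = 6.008  # 6008 MHz effective GDDR5 data rate

peak_tflops = 2 * cuda_cores * base_clock_ghz / 1000.0
bandwidth_gbs = bus_width_bits / 8 * effective_mem_clock_ghz

print(f"peak FP32 compute: ~{peak_tflops:.1f} TFLOPS")      # ~4.5 TFLOPS
print(f"peak memory bandwidth: ~{bandwidth_gbs:.1f} GB/s")  # ~288.4 GB/s
```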

Out of the box, the GTX Titan includes two dual-link DVI ports, an HDMI port and a single DisplayPort 1.2 connector. Support for 4K resolution displays is there, and it's also possible to drive up to four displays at once.

History of the Modern Graphics Processor, Part 3

With the turn of the century the graphics industry bore witness to further consolidation.

The pro market saw iXMICRO leave graphics entirely, while NEC and Hewlett-Packard both produced their last products, the TE5 and VISUALIZE FX10 series respectively. Evans & Sutherland also bowed out, selling its RealVision line to focus on planetaria and fulldome projection systems.

In the consumer graphics market, ATI announced the acquisition of ArtX Inc. in February 2000, for around $400 million in stock. ArtX was developing the GPU codenamed Project Dolphin (eventually named "Flipper") for the Nintendo GameCube, which added significantly to ATI's bottom line.
ATI GameCube GPU
Also in February, 3dfx announced a 20% workforce cut, then promptly moved to acquire Gigapixel for $186 million and gained the company's tile-based rendering IP.

Meanwhile, S3 and Nvidia settled their outstanding patent suits and signed a seven-year cross-license agreement.

VIA assumed control of S3 around April-May, which itself was just finishing a restructuring process from the acquisition of Number Nine. As part of S3's restructuring, the company merged with Diamond Multimedia in a stock swap valued at $165 million. Diamond's high-end professional graphics division, FireGL, was spun off as SONICblue and later sold to ATI in March 2001 for $10 million.

3DLabs acquired Intergraph's Intense3D in April, while the final acts of 3dfx played out towards the end of the year, despite 2000 kicking off with the promise of a better future as the long-awaited Voodoo 5 5500 neared its debut in July. The latter ended up trading blows with the GeForce 256 DDR and won the high-resolution battle.

Where 3dfx was once a byword for raw performance, its strengths around this time lay in its full screen antialiasing image quality.

But where 3dfx was once a byword for raw performance, its strengths around this time lay in its full screen antialiasing image quality. The Voodoo 5 introduced T-buffer technology as an alternative to transformation and lighting, basically taking a few rendered frames and aggregating them into one image. This produced a slightly blurred picture that, when run in frame sequence, smoothed out the motion of the animation.

3dfx?s technology became the forerunner of many image quality enhancements seen today, like soft shadows and reflections, motion blur, as well as depth of field blurring.

3dfx's swan song, the Voodoo 4 4500, arrived October 19 after several delays - unlike the 4200 and 4800 that were never released. The card was originally scheduled for spring as a competitor to Nvidia's TNT2, but ended up going against the company's iconic GeForce 256 DDR instead, as well as the much better performing GeForce 2 GTS and ATI Radeon DDR.

On November 14, 3dfx announced they were belatedly ceasing production and sale of their own-branded graphics cards, something that had been rumoured for some time but largely discounted. Adding fuel to the fire, news got out that upcoming Pentium 4 motherboards would not support the 3.3V AGP signalling required by the Voodoo 5 series.


The death knell sounded a month later for 3dfx when Nvidia purchased its IP portfolio for $70 million plus one million shares of common stock. A few internet wits later noted that the 3dfx design team which had moved to Nvidia eventually got both their revenge and lived up to their potential, by delivering the underperforming NV30 graphics chip powering the FX 5800 cards behind schedule.

Prior to the Voodoo 5's arrival, ATI had announced the Radeon DDR as "the most powerful graphics processor ever designed for desktop PCs." Previews of the card had already gone public on April 25, and only twenty-four hours later Nvidia countered with the announcement of the GeForce 2 GTS (GigaTexel Shader). The latter included Nvidia's version of ATI's Pixel Tapestry Architecture, named the Nvidia Shading Rasterizer, allowing for effects such as specular shading, volumetric explosions, refraction, waves, vertex blending, shadow volumes, bump mapping and elevation mapping to be applied on a per-pixel basis via hardware.

The feature was believed to have made it to the previous NV10 (GeForce 256) chip but it remained disabled due to a hardware fault. The GTS also followed ATI's Charisma Engine in allowing for all transform, clipping and lighting calculations to be supported by the GPU. That said, ATI went a step further with vertex skinning for a more fluid movement of polygons, and keyframe interpolation, where developers designed a starting and finishing mesh for an animation and the Charisma core calculated the intervening meshes.


The ATI Radeon DDR eventually launched for retail in August 2000. Backed by a superior T&L implementation and support for several of the upcoming DirectX 8 features, the Radeon DDR alongside the GeForce 2 GTS ushered in the use of DVI outputs by integrating support for the interface into the chip itself. The DVI output was more often found on OEM cards, however, as the retail variety usually sported VIVO plugs.

One downside to the Radeon DDR was that boards shipped with their core and memory downclocked from the promised 200MHz and 183MHz, respectively. In addition, drivers were once again less than optimal at launch. There were issues with 16-bit color and compatibility problems with VIA chipsets, but this did not stop the card from dominating the competition at resolutions higher than 1024x768x32. A price of $399 for the 64MB version stacked up well versus $349-399 for the 64MB GeForce 2 GTS, which it beat by a margin of 10-20% in benchmarks, and helped ATI maintain its number one position in graphics market share over Nvidia.

Nvidia wasn't doing all that bad for themselves either. The company reported net income of $98.5 million for the fiscal year on record revenue of $735.3 million, driven in large part by its market segmentation strategy, releasing a watered-down MX version of the card in June and a higher clocked Ultra model in August. The latter dethroned the Radeon in terms of performance but it also cost $499. A Pro model arrived in December.

Besides releasing a GeForce 2 card at every price point, from the budget MX to the professional Quadro 2 range, Nvidia also released its first mobile chip in the form of the GeForce2 Go.

As 3dfx was undergoing its death throes in November, Imagination Tech (ex-VideoLogic) and ST Micro attempted to address the high volume budget market with the PowerVR series 3 KYRO. Typically ranging in price from $80 to $110 depending on the memory framebuffer, the card represented good value for the money in gaming at resolutions of 1024x768 or lower. It would have become more popular, had the GeForce2 MX arrived later, or not so aggressively priced at ~$110.

The KYRO II arrived in April 2001 with a bump in clock speeds compared to the original and manufactured on a smaller 180nm process by ST Micro. But once again the card faced stiff competition from the GeForce 2 MX. Nvidia rebadged the card as the MX200 and lopped 40% off its price, while adding a higher clocked MX400 card at the same price as the Kyro II.

When PowerVR failed to secure game development impetus for tile based rendering, and ST Micro closed down its graphics business in early 2002, Imagination Technologies moved from desktop graphics to mobile and leveraged that expertise into system on chip graphics. They licenced the Series 5/5XT/6 for use with ARM-based processors in the ultra portable and smartphone markets.

By the time 2001 dawned, the PC graphics market consisted of a discrete card duopoly, with both of them in addition to Intel supplying the vast majority of integrated graphics chipsets.

Meanwhile, Matrox and S3/VIA clung to the margins of traditional markets.

Building on the strides made with the GeForce 2 series, Nvidia unveiled the GeForce 3 on February 27, 2001 priced between $339 and $449. The card became the new king of the hill, but it really only came into its own at the (then) extreme resolution of 1600x1200, preferably with full screen antialiasing applied.


Initial drivers were buggy, especially in some OpenGL titles. What the new GeForce did bring to the table was DirectX 8 support, multisampling AA, quincunx AA (basically 2xMSAA + a post-process blur), 8x anisotropic filtering as well as the unrivalled ability to handle 8xAF + trilinear filtering, and a programmable vertex shader which allowed for closer control of polygon mesh motion and a more fluid animation sequence.

There was also LMA (Lightspeed Memory Architecture) support -- basically Nvidia's version of HyperZ -- for culling pixels that would end up hidden behind others on screen (Z occlusion culling) as well as compressing and decompressing data to optimize use of bandwidth (Z compression).

Lastly, Nvidia implemented load-balancing algorithms as part of what they called the Crossbar Memory Controller, which consisted of four independent memory sub-controllers as opposed to the industry standard single controller, allowing incoming memory requests to be routed more effectively.
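To make the routing idea concrete, here's a hypothetical sketch of address-interleaved dispatch across four independent sub-controllers. This is an illustration of the concept only, not Nvidia's actual hardware design; the stripe size and the queue model are assumptions chosen for the example.

```python
# Hypothetical illustration (not Nvidia's actual design) of the idea behind a
# crossbar-style memory controller: requests are interleaved across several
# independent sub-controllers instead of queuing behind a single one.
from collections import deque

NUM_SUBCONTROLLERS = 4
STRIPE_BYTES = 256  # assumed interleave granularity for the example

queues = [deque() for _ in range(NUM_SUBCONTROLLERS)]

def route_request(address, size):
    """Send a memory request to the sub-controller that owns its address stripe."""
    target = (address // STRIPE_BYTES) % NUM_SUBCONTROLLERS
    queues[target].append((address, size))
    return target

# Consecutive requests spread across all four queues rather than piling up on one.
for addr in range(0, 8 * STRIPE_BYTES, STRIPE_BYTES):
    route_request(addr, 128)

print([len(q) for q in queues])  # -> [2, 2, 2, 2]
```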
Nvidia NV2A
Nvidia NV2A inside Microsoft's Xbox
Nvidia's product line later added the NV2A, a derivative of the GeForce 3 with GeForce4 attributes that was used in Microsoft's Xbox game console.

At this point, Nvidia controlled 31% of the graphics market to Intel's 26% and ATI's 17%.

As Nvidia complemented the GF3 line-up with underclocked Ti 200 and overclocked Ti 500 models, ATI hurried to ramp up deliveries of the Radeon 8500. The card was built around the R200 GPU using TSMC's 150nm process (the same used by GeForce 3's NV20). The chip had been announced in August and was eagerly awaited since John Carmack of id Software talked it up, saying it would run the new Doom 3 "twice as well" as the GeForce 3.

ATI's official R8500 announcement was no less enthusiastic. But reality kicked in once the card launched in October and was found to perform at the level of the underclocked GF3 Ti 200 in games. Unfinished drivers and a lack of workable Smoothvision antialiasing weighed heavily against the R8500 in its initial round of reviews. By the time the holiday season arrived, a second round of reviews showed that the drivers had matured to a degree and raised the R8500's performance to in-between the Ti 200 and the standard GF3.

Texture units per pixel pipeline

Very competitive pricing and a better all around feature set (2D image quality, video playback, performance under antialiasing) made the card a worthy competitor to the GF3 and Ti 500 nonetheless.

ATI's sales for the year dropped to $1.04 billion as the company recorded a net loss of $54.2 million. The company began granting licenses to board partners to build and market graphics boards, while refocusing their resources on design and chip making.
ATI Xilleon

ATI also debuted the Set-Top-Wonder Xilleon, a development platform based on the Xilleon 220 SoC which provided a full processor, graphics, I/O, video and audio for set-top boxes integrated into digital TV designs.

To complement Xilleon, ATI acquired NxtWave Communications for $20 million in June 2002. The company specialized in digital signal processing and applications for set-top boxes and terrestrial digital solutions.

Keeping up with their product launch cycle, Nvidia released the GeForce 4 in February 2002. Three MX parts, three mobile parts based on the MX models, and two performance Titanium models (Ti 4400 and Ti 4600) made up the initial line-up -- built on TSMC's 150nm process. The GeForce 4 was effectively ready for release two months earlier, but the launch was delayed to avoid eating into GeForce 3 sales over the holiday season.

The MX series cards were intended for the budget segment but they were still largely uninspiring as they were based on the old GeForce 2 architecture. MPEG2 decoding was added, but the cards reverted to the same DirectX 7.0/7.1 support as the earlier GF2 MX line. Pricing at $99-179 reflected the reduced feature set.

The Titanium models on the other hand were excellent performers and in some instances managed a 50+% increase in performance over the GeForce3 Ti 500. The Ti 4600 became the performance champ overnight, easily disposing of the Radeon 8500, while the Ti 4200 at $199 represented the best value for money card.

But then came the Radeon 9700 Pro and promptly consigned every other card to also-ran status.

ATI Radeon 9700 Pro (FIC A97P)
Developed by a team that had originally formed the core of ArtX, the ATI R300 GPU delivered spectacularly and arrived very promptly. It was the first to bring DirectX 9.0 support and, by extension, the first architecture to support shader model 2.0, vertex shader 2.0, and pixel shader 2.0. Other notable achievements: it was the second GPU series to support AGP 8x -- SiS's Xabre 80/200/400 line was first -- and the first to implement a flip-chip GPU package.


About flip-chip GPU packages: Previous generations of graphics chips and other ICs used wire-bonding mounting. With this method, the chip sits on the board with the logic blocks sitting under the metal layers, whose pads are connected by thin wires arranged around the edges of the chip down to solder balls or pins on the underside. Flip-chip does away with the wire component through contact points (usually soldered in a ball grid array) directly on the "top" of the chip, which is then inverted, or "flipped", so that the solder points directly contact the substrate or circuit board. The chip then undergoes localised heating (reflow) to melt the solder, which then forms the connection with the underlying contact points of the board.

ATI complemented the line-up in October by adding a non-Pro 9700 at $299 for those unable to part with $399 for the top model. Meanwhile, the cut down 9500 Pro ($199) and 9500 ($179) reached down through mainstream market segments, and the FireGL Z1/X1 filled in the $550-950 bracket for professional graphics. The All-In-Wonder 9700 Pro ($449) was also added in December.

ATI's sales are likely to have taken a hit when it was found that many cards could be modded to their more expensive counterparts. Examples of this included the ability to turn a 9500 card into a 9700 using its reference board (with the full complement of memory traces), or a 9800 Pro to its XT counterpart. For the latter, a driver patch was made available to check if it would accept the mod, which consisted of soldering in a resistor or using a pencil to tweak the GPU and memory voltage control chip. Hard mods also included upgrading various 9800 models into a FireGL X2, while a patched/Omega driver had the ability to turn a $250 9800 SE 256MB into a $499 9800 Pro 256MB.

In addition to discrete graphics, ATI also introduced desktop integrated graphics and chipsets. These included the A3/IGP 320 meant to be paired with AMD CPUs, the RS200/IGP 330 & 340 for Intel chips, as well as the mobile series U1/IGP 320M for AMD platforms and RS200M for the Pentium 4-M. All of them were complemented with ATI southbridges, specifically the IXP200/250.

SiS unveiled the Xabre line between the launches of the GeForce 4 and the R300. The cards were consistently slower than Nvidia and ATI's offerings at the same price points, and were handicapped by the lack of vertex shader pipelines. This translated into a heavy reliance upon drivers and game developers to get the most out of software emulation, thus keeping SiS in the margins of desktop discrete 3D graphics.

The Xabre line also implemented "Turbo Texturing", where framerates were increased by drastically reducing texture quality, and lacked anisotropic filtering. All this did little to endear the cards to reviewers.

The Xabre line was the last under the SiS banner, as the company spun off its graphics division (renamed XGI) and merged with Trident Graphics a couple of months later in June.

The first of Nvidia's FX series arrived on January 27, 2003 with the infamous "Dustbuster" FX 5800 and the slightly faster (read: less slow) FX 5800 Ultra. When compared to the reigning champ, the ATI Radeon 9700 Pro (and non-Pro), the FX was much louder, it delivered inferior anisotropic filtering (AF) quality and antialiasing (AA) performance, and it was slower overall. ATI was so far ahead that a second-tier Radeon 9700 card launched five months earlier comfortably outperformed the Ultra, and it was $100 cheaper ($299 vs $399).
The 3dfx design team which had moved to Nvidia got both their revenge and lived up to their potential, by delivering the underperforming NV30 graphics chip behind schedule.

The NV30 chip was supposed to debut in August, around the same time as the Radeon 9700, but ramping problems and high defect rates on TSMC's low-K 130nm process held Nvidia back. Some circles also argued that the company was strapped for engineering resources, with more than a few tied up with the NV2A Xbox console chip, the SoundStorm APU, as well as motherboard chipsets.

Looking to move things forward, Nvidia undertook a project to have several FX series chips fabricated on IBM's more conventional fluorosilicate glass (FSG) low-K 130nm process.

ATI refreshed its line of cards in March, starting with the 9800 Pro, featuring a R350 GPU that was basically an R300 with some enhancements to the Hyper-Z caching and compression instruction.

The RV350 and RV280 followed in April. The first of these, found inside the Radeon 9600, was built using the same TSMC 130nm low-K process that Nvidia had adopted. Meanwhile, the RV280 powering the Radeon 9200 was little more than a rebadged RV250 from the Radeon 9000 with AGP 8x support.
ATI Xbox GPU
The same month saw ATI and Nintendo sign a technology agreement that would eventually lead to the Hollywood GPU for the Nintendo Wii console. ATI added a second console coup in August, when Microsoft awarded the Xbox 360 GPU contract to them.

A scant three and a half months after the inglorious debut of the FX 5800, Nvidia took another shot with the NV35 (FX 5900 and FX 5900 Ultra). The new Detonator FX driver greatly improved AA and AF, almost matching ATI's solution in terms of quality. However the 5900 achieved what the 5800 could not. It knocked ATI's Radeon 9800 Pro from its spot as the fastest card around, although at $499 apiece, few would actually take advantage of this.

As expected, ATI regained bragging rights in September with the release of the 9800 XT. Superior driver support - mainly with some DX9 games - also made the XT a better overall card than Nvidia's counterpart, ensuring that ATI ended the year with the performance crown. The 9700 Pro remained the standout mainstream board, while the FX 5700 Ultra at $199 won the sub-$200 price segment.

ATI bounced back with a $35.2 million profit in 2003 after posting a $47.5 million loss in 2002. A good chunk of this came from higher selling prices for the dominant 9800 and 9600 cards. Meanwhile, Nvidia retained 75% of the DirectX 9 value segment market, thanks to the popularity of the FX 5200.
Source DirectX 9.0 Effects Trailer, shown during ATI's presentation of the Radeon 9800 XT and 9600 XT
The newly formed XGI launched the Xabre successor in a staggered release between September and November. Renamed Volari, the card line-up ranged from the $49 V3 to the dual-GPU Duo V8 Ultra. The V3 was virtually a rebrand of Trident's Blade XP4 and a DX 8.1 part, while the rest of the series (V5 and V8) was developed from the previous SiS Xabre and featured DX9.0 support.

For the most part, all of the models underdelivered, with the exception of the entry-level V3, which offered performance equal to the GeForce FX 5200 Ultra and Radeon 9200. The Duo V8 Ultra was priced ~20% higher than the Radeon 9800 Pro 128MB, yet delivered performance on par with or lower than the 9600 XT.

Another company making a comeback into desktop graphics was S3. Unfortunately, the buying public now generally saw desktop graphics as a two horse race - and S3 wasn't one of the two.

XGI?s Volari line lingered on with the 8300 in late 2005, which was more or less on par with the Radeon X300SE/GeForce 6200 at $49, as well as the Z9/Z11 and XP10. The company was reabsorbed back into SiS in October 2010.

Another company making a comeback into desktop graphics was S3. After the graphics division was sold to VIA for $208 million plus the company's $60 million debt, the restructured venture concentrated primarily on chipset projects.

DeltaChrome desktop cards were announced in January, but in time-honoured S3 fashion, the first S4 and S8 models didn?t start appearing in the retail channel until December. The new cards featured most of the new must-haves of 2003; DirectX 9 support, 16x AF, HD 1080p support, and portrait-mode display support.

Unfortunately, the buying public now generally saw desktop graphics as a two horse race - and S3 wasn't one of the two. While S3 was looking to keep competitive, ATI and Nvidia were driving each other to achieve ever-increasing levels of performance and image quality.

The DeltaChrome was succeeded by the GammaChrome in 2005.

Nvidia and ATI continued their staggered launches in 2004. The former launched its first GDDR3 card in March as the FX 5700 Ultra, followed by the GeForce 6 series with the high-end 6800 range. The initial line-up comprised the 6800 ($299), the GT ($399), the Ultra ($499), and an overclocked variant known as the Ultra Extreme ($549) to counter ATI's X800 XT Platinum Edition. The latter was sold by a select band of add-in board partners.

The 6800 Ultra 512MB was added on March 14 2005 and sold for the unbelievable price of $899 -- BFG added an overclocked version for $999. The midrange was well catered for with the 6600 series in September.

Nvidia's feature set for the 6000 series included DirectX 9.0c support, shader model 3.0 (although the cards were never able to fully exploit this), Nvidia's PureVideo decode and playback engine, and SLI support -- the multi-GPU performance multiplier IP that was acquired from 3dfx.

Reintroducing an old feature: SLI
Where the 3dfx implementation made each processing unit responsible for alternate line scans, Nvidia handled things in a few different ways. The company implemented split frame rendering (SFR), in which each GPU rendered the top or bottom half of the frame, and alternate frame rendering (AFR), in which the GPUs rendered frames in turn; in some cases the driver simply disabled SLI depending on whether the game supported the feature. This last behavior was hit-or-miss early in driver development.
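As a rough illustration of the difference between the two schemes (a toy model for clarity, not actual driver code):

```python
# Simplified illustration of the two SLI work-splitting schemes described above.
# This is a toy model: "gpus" is just a list of names, not real devices.
def assign_afr(frame_index, gpus):
    """Alternate frame rendering: whole frames alternate between GPUs."""
    return {gpus[frame_index % len(gpus)]: "full frame"}

def assign_sfr(gpus):
    """Split frame rendering: each GPU takes a horizontal slice of one frame."""
    slice_height = 1.0 / len(gpus)
    return {gpu: f"rows {i * slice_height:.0%}-{(i + 1) * slice_height:.0%}"
            for i, gpu in enumerate(gpus)}

gpus = ["GPU0", "GPU1"]
print(assign_afr(0, gpus))  # {'GPU0': 'full frame'}
print(assign_afr(1, gpus))  # {'GPU1': 'full frame'}
print(assign_sfr(gpus))     # {'GPU0': 'rows 0%-50%', 'GPU1': 'rows 50%-100%'}
```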

While the technology was announced in June, it required a motherboard with an nForce4 chipset to enable multi-GPU setups, and these didn't start reaching the retail channel in numbers until late November. Adding fuel to the fire, initial driver releases were sporadic (at best) until into the following year.
While Nvidia's SLI was announced in June 2004, the required nForce4 motherboards didn't hit the retail channel in numbers until November, and initial driver releases were sporadic until into the following year.

Reviews at the time generally mirrored current performance, showing that two lower tier cards (like the 6600 GT SLI, which could be had for $398) generally equalled one enthusiast card at lower resolutions and image quality. At the highest resolutions and with antialiasing applied, however, single card setups still gained the upper hand. SLI and ATI's CrossFire performance was as erratic then as it sometimes is now, running the full gamut from perfect scaling to not working at all.

Nvidia's board partners immediately saw marketing opportunities with the re-invented tech, with Gigabyte offering a dual 6600 GT SLI card (the 3D1), followed by a dual 6600 (3D1-XL), and a dual 6800 GT (3D1-68GT). These cards required not only an nF4 chipset but a Gigabyte-branded motherboard as well.

Of the high-end single GPU cards, the 6800 Ultra and X800 XT/XT PE were fairly evenly matched, both in price and performance. But they weren't without their issues. The latter arrived in May and suffered supply constraints throughout its entire production life, while Nvidia's flagship 6800 Ultra was extremely late arriving in August and suffered supply constraints too depending on distribution area, since the card was only made available by a percentage of board partners.

The 6800 GT generally bested the X800 Pro at $399, while the 6600 GT cleaned up in the $199 bracket.

Intense competition with Nvidia that year didn't have an adverse effect on ATI's bottom line, as profit peaked at $204.8 million for the year from nearly $2 billion in revenue.

One quirk associated with the well-received 6600 GT was that it initially launched as a PCI Express card, at a time when PCI-E was an Intel-only feature for motherboards designed for Pentium 4 processors. These chips generally lagged in gaming performance behind AMD's offerings, which of course used the AGP data bus.

Nvidia's 7000 series started rolling off the assembly lines well before the 6000 series had completed its model line-up. The 7800 GTX arrived a full five months before the reduced bill of materials (BoM) 6800 GS saw the light of day. The first iteration of the 7800 series was based around the G70 GPU on TSMC's 110nm process, but quickly gave way to the G71-based 7900 series, made on TSMC's 90nm process.

While the naming convention changed from "NV" to "G", the latter were architecturally related to the NV40 series of the GeForce 6000. And while only fractionally larger than the NV40-45 at 334mm², the G70 packed in an extra eighty million transistors (for a total of 302 million), adding a third more vertex pipelines and 50% more pixel pipelines. In most cases, the G70 was superseded within nine months; in the case of the GS and GTX 512MB, the figure was 3 and 4 months respectively.

At the entry level, the 7100 GS continued the use of TurboCache (the ability for the board to use some system memory), which was introduced with the previous generation GeForce 6200 TC.


At the other end of the spectrum, the 7800 GTX 256MB hit retail on June 22 with an MSRP of $599, though its actual street price was higher in many instances. ATI wrested the single-GPU crown back with the X1800 XT, but Nvidia countered with a 512MB version of the 7800 GTX thirty-five days later and promptly regained the title.

Two months later, ATI launched the X1900 XTX, which traded blows with Nvidia's flagship. This particular graphics horsepower race resulted in both cards being priced at $650. One spinoff of the cards moving to a 512MB frame buffer was that gaming at 2560x1600 with 32-bit color and a high level of image quality enabled was now possible via dual link DVI.
ATI's original CrossFire design required using an external Y cable
ATI announced their multi-card Crossfire technology in May 2005 and made it available in September with the launch of the Xpress 200 Crossfire Edition chipset, and X850 XT Crossfire Master board. Due to a single-link TMDS, resolution and refresh rates were initially limited to 1600x1200 @60Hz, but a dual-link TMDS for 2560x1600 would soon replace it.

Unlike Nvidia's solution of two identical cards communicating via a bridge connector, ATI implemented a master card with a TMDS receiver, which accepted input from a slave card via an external dongle and a Xilinx compositing chip.

Like Nvidia's SLI, CrossFire offered alternate frame rendering (AFR) and split frame rendering (SFR), but also a rendering technique called SuperTiling. The latter offered a performance increase in certain applications, but it did not work with OpenGL or support accelerated geometry processing. Also like SLI, CrossFire faced its share of driver-related troubles.

ATI intended to have their R520-based cards - their first to incorporate Shader Model 3.0 - ready by the June-July timeframe, but the late discovery of a bug in the cell library forced a four month delay.

Initial launches comprised the X1800 XL/XT using the R520 core, the X1300 budget cards using the RV515 with essentially one quarter of the graphics pipelines of the R520, and the X1600 Pro/XT based on the RV530, which was similar to the RV515 but with a higher shader and vertex pipeline-to-TMU and ROP ratio.

Due to the initial delay with the R520, the GPU and its derivatives were replaced a scant three and a half months later by the R580-based X1900 series, which used TSMC's new 80nm process. Continuing with the roll out, half the graphics pipeline resources went into the RV570 (X1650 GT/XT and X1950 GT/Pro), while a shrunk RV530 became the RV535 powering the X1650 Pro as well as the X1300 XT.


ATI's revenue rose to a record $2.2 billion in 2005, the highest in the company's history, aided by shipments of Xenos GPUs for the Xbox 360. Net profit, however, slumped to $16.9 million.

By this stage, any graphics card launch not based on an Nvidia or ATI GPU was received with a certain amount of curiosity, if not enthusiasm. Such was the scene when S3?s overhauled graphics line-up debuted in November.

The Chrome S25 and S27 promised good gaming performance based on their high clocks, but delivered a mostly sub-par product. Initial pricing at $99 (S25) and $115 (S27) put the cards in competition against Nvidia's 6600/6600 GT and ATI's X1300 Pro/X1600 Pro, but neither S3 card stood up to the competition in any meaningful way, aside from power consumption. That slight advantage evaporated as ATI/AMD and Nvidia addressed the HTPC and entry-level market segment, effectively killing S3's subsequent Chrome 400 and 500 series.

An added issue for S3 was that the cost of building the cards resulted in razor thin profits, and the company needed high volume sales in a market dominated by two vendors. HTC went on to acquire S3 in July 2012 for $300 million, a move originally seen as leverage in HTC's and S3's separate legal disputes with Apple.

Nvidia and ATI continued to hog the press coverage in 2006.

ATI acquired Macrosynergy, a Shanghai based design and engineering centre with personnel working in California and previously part of the XGI group. Then in May the company bought BitBoys in a $44 million deal.

Meanwhile, Nvidia's first foray into dual-GPU single board products came in March, following in the footsteps of ATI, 3dfx, and XGI. The 7900 GX2 sandwiched two custom boards essentially carrying a couple of downclocked 7900 GTXs. Asustek didn't wait around for Nvidia's dual-GPU solution, however, and released its own take as the Extreme N7800GT Dual ($900, 2,000 units built), which paired two 7800 GT GPUs instead.

This card started Asus' interest in limited edition dual-GPU boards, and possibly hardened Nvidia's attitude towards board partners, as Asustek products took the spotlight from their reference models at launch.

In the higher volume mainstream market, the 7600 GT and GS both provided solid performance and remarkable longevity, while ATI's X1950 XTX and Crossfire ruled the top end enthusiast benchmarks for single GPU cards. The X1900 XT and GeForce 7900 GT were fairly evenly matched in the upper mainstream bracket.

ATI's David Orton and AMD's Hector Ruiz officially announce the historic merger
After twenty-one years as an independent company, ATI was bought out by AMD on October 25, 2006 for a total price of $5.4 billion - split between $1.7 billion from AMD, $2.5 billion borrowed from lending institutions, and 57 million AMD shares plus 11 million options/restricted stock units valued at $1.2 billion. At the time of the buyout, around 60-70% of ATI's chipset/IGP revenue came from a partnership supplying chipsets for Intel-based motherboards.
Two weeks after the ATI buy-out, Nvidia ushered in the age of unified shader architectures for PC graphics.

With a large part of that Intel IGP chipset business subsequently moving to Nvidia, ATI's market share dropped dramatically. The logic behind the buy was a seemingly quick path to GPU technology, rather than using the $5.4 billion to develop AMD's own IP and add licenced technology where needed. At the time, AMD was aiming at the quick introduction of Torrenza and the associated Fusion projects.

Two weeks after the ATI buy-out, Nvidia ushered in the age of unified shader architectures for PC graphics. ATI?s Xenos GPU for the Xbox 360 had already introduced the unified architecture to consoles.

This article is the third installment in a series of four. Next week we'll wrap things up, following the development of Radeon products under AMD's wing, the continued rivalry between GeForce and Radeon GPUs, the transition toward stream processing, and what the present and near future hold for graphics processors.






SimCity Review



To many fans of the original city building simulation series, the idea of an online multiplayer game that required even solo players to be connected to the internet at all times seemed like a recipe for disaster. Maxis' latest creation is easily the most compelling SimCity I've played since the 1989 original.
It's also a disaster.

The weekend before the game's March 5 launch, I had a chance to experience SimCity the way everyone is supposed to be experiencing it right now. The handful of press participating barely put a dent in the special servers EA set up for the event. The game played (for the most part) flawlessly, giving early reviewers an exquisite taste of the collaborative multiplayer that defines the release. I saw what the developers no doubt wanted every player to see post-launch - a new SimCity capable of bringing together people from across the planet to strive towards a common goal. It was glorious.

I collected that early experience in an article titled "SimCity Won (and Broke) My Heart in Just Three Days." I had no idea how apropos that headline would become.


That first, teasing taste was followed by a nightmare for everyone involved. There were problems downloading the game. Problems connecting to servers. Problems getting together with friends to play during the brief moments when everything seemed to be working perfectly. While EA and Maxis work aggressively on a solution to these issues, player frustration and outrage continues to build.
SimCity Review WHY: One of the most compelling entries in the esteemed city building simulation series - though SimCity's substantial connectivity problems aren't exactly giving players a choice in the matter.
Developer: Maxis
Platforms: PC
Release Date: March 5
Type of game: City Building Simulation.

What I played: Built, maintained and destroyed multiple cities during the press early start event. Once the game launched I attempted to collaborate with other members of the Kotaku staff on our own private region, but only two of us (myself included) managed to successfully play long enough to build anything of lasting value.

My Two Favorite Things:
- Laying down regions and watching them grow organically and change dynamically based on the objects I place around them.
- Working together with other players for the good of an entire region adds meaning and purpose to my virtual cities. This is what a social game should be.

My Two Least-Favorite Things:
- There's never enough space for my ambition in these tiny plots of land, and claiming multiples in one region doesn't scratch my megalopolis itch.
- I don't mind a game that requires an always-on internet connection, as long as it returns the favor.

Made-to-Order Back-of-Box Quotes:
- "Hey guys. HEY GUYS. GUYS. Look at my city. No no, look at it now." -Mike Fahey, Kotaku.com
- "Unable to connect to the Made-to-Order Back-of-Box Quotes server. Please try again." -Mike Fahey
- "It says I'm online? But I'm at Wal-Mart, and the PC the game is loaded on is unplugged." -Mike Fahey
I am not filled with outrage; only disappointment, fueled by the knowledge that somewhere beyond these technical issues there's an outstanding game waiting to be played.

The original SimCity is one of the greatest computer games of all time. When now-legendary game designer Will Wright realized that using the map editor he'd created for the game Raid on Bungling Bay was more entertaining than the game itself, he gave that editor to the world, creating an entirely new genre in the process. The creative freedom SimCity allowed was intoxicating. I couldn't tell you how long I played when I first launched the game - the days ran together. I would fall asleep in my computer chair, wake up and continue playing.
Over the years, freedom and I have had a falling out. Giving me a sandbox to play in with little supervision is a surefire way to ensure I wander away from the sandbox, possibly into busy traffic. So much of my time is not my own these days that I need a more directed experience. I require more than my own devices.

This brave new multiplayer SimCity grants me the focus I need to once again lose myself in the minutiae of running a virtual town. The success of my creation is intricately tied to the prosperity of other players'. They depend on me to foster a community of wealthy citizens that will flock to their shops to spend their simoleons. I depend on them to provide sewage treatment and medical services so that the wealthy citizens drawn to my tourist mecca don't die of cholera.
The SimCity
series has always been a balancing act, with players struggling to maintain the right ratio of residential to industrial to commercial, all the while ensuring that enough funds are invested in services to make sure the whole thing doesn't go up in flames. It's just now there are multiple performers in every region, taking turns walking the tightrope while the others hold the safety net (or drop it, as the case may be).
The multiplayer aspect also allows for excellent opportunities to show off your city-planning skills. The creative gamer thrives in the new SimCity
, thanks in no small part to the addition of curved and free-form road placement and the ability of residential, commercial and industrial zones to conform to these wild lines. These color-coded areas are painted more than placed, fresh buildings sprouting like architectural flowers that blow in the breeze of every little change the player makes. The GlassBox engine is a remarkable machine, transforming a technical process into something organic and beautiful. It's a joy to watch its work unfold, both from the sky above and at street level.
Players more interested in straight lines and statistics will find plenty to love in SimCity
as well. The game is filled with color-coded maps that communicate a wealth of complex information in the most efficient way possible. The interface, aside from the odd obtuse bits, is amazingly intuitive without feeling dumbed-down. Micro-management is an option, but not a necessity. It's one of the game's greatest strengths -- catering to multiple play styles while remaining completely accessible (I'm talking mechanics, not connection) to all.
Of course there are downsides. I wish the individual city plots were larger or expandable, giving my city room to stretch out, perhaps link up with other players' creations. I wish I understood how trade depots work, one of a few obtuse mechanics in an otherwise intuitive game.

And I wish I could play consistently. That would be nice.

Team Kotaku
had big plans for the SimCity launch. We set up a private region so we could further explore the symbiotic relationship between cities. I staked out my claim, a circular piece of land I decided to dedicate to tourism and travel. Stephen Totilo grabbed a plot, his city feeding mine with waste and sewage disposal. Between the two of us we managed to unlock two Great Works -- the Arcology and the International Airport -- massive undertakings built in special spots on the regional map, requiring cities to work together to harvest the resources necessary for their completion.
None of the others made it into the game.


Chris Person was able to claim two plots, but both bugged out before he could lay a single road. He can't access them, and we can't delete them. Jason Schreier hasn't been able to connect. Neither has Kirk Hamilton, who received my invitation to join the region yesterday ? two days after I sent it. Our grand plan will never be realized.

I understand the frustration and anger that players are feeling. Over the past three days I've slept maybe seven hours total, waking from short naps taken while waiting for server queues, maintenance downtime, server disconnects and the like. Each of those seven hours was spent in my computer chair, fearing I might miss an opportunity if I wandered off to the bedroom. I feel like I did when I played the original SimCity
, only now I'm much older and a lot less happy.

SimCity
's launch is more than just a disaster -- it's a tragedy, because somewhere beyond the rage, pain and technical issues there's an amazing game that I'm dying to play.
We'll revisit that "Not Yet" verdict once EA gets the servers to a point where not playing is no longer mandatory.

SimCity Performance, Benchmarked

Going down memory lane, I can remember two computer games being responsible for getting me so interested in PCs. The original Command & Conquer was the first, around 1995. The game ran on the venerable MS-DOS, and I spent quite a bit of time playing it at the ripe old age of 9 on our pokey HP-powered i486.

Shortly after that I discovered SimCity 2000. The first SimCity title, released back in 1989, was before my time, so I never played or even laid eyes on the original. At the time, SimCity 2000 was incredible: extremely detailed, and it offered what seemed like endless hours of gameplay. Some five years later SimCity 3000 was released (1999), and again much of my childhood was spent playing it.

For reasons that I cannot recall, I never got into SimCity 4 (2003). I know I played it, but for some reason it just didn't speak to me like the previous two titles. Then along came SimCity Societies, and at that point I thought my days of enjoying the SimCity games were over, and for the better part of a decade they were.

But when Maxis announced last year that a sixth installment in the SimCity franchise was coming, the hair on the back of my neck stood on end. From the announcement, it looked to be a dramatic overhaul of the previous titles, featuring full 3D graphics, online multiplayer, a new engine, and a host of new features and gameplay changes.

A year of waiting later, like so many others I pre-ordered the game and sat waiting for it to become available for download. Unfortunately, like everyone else, once the game became available and I finally managed to download it, I wasn't actually able to play.

As you've probably heard over the past couple of weeks, the game requires an internet connection to play, meaning there is no offline mode. That in itself is extremely annoying, but it's much worse when the servers you are meant to play on cannot cope with demand and shut you out.

Like thousands of other outraged fans, it took me several days of trying to get in. Since we planned to test SimCity, I really needed access so I could work out how we were going to benchmark the game. Thankfully, by Sunday things had improved, and over the next three days I set about building our test environment.

Normally when we benchmark a first-person shooter, finding a good portion of the game to test with is simply a matter of playing through it until we find a section that is particularly demanding. This generally requires an hour or two of gameplay before we can test in full. It's a similar process when we test real-time strategy games such as StarCraft II, for example. In that instance we chose to play a 4v4 game, record the replay and use that for benchmarking.

But with SimCity things were considerably more complex and time consuming. Because the game's progress is stored on EA servers, it's not possible to just download and use someone else's saved game of a massive city. While it is possible to load up the leaderboard within SimCity, see who has the biggest city, and check it out, we couldn't use that for testing either, since it's a live city being played, thus forever changing and hardly a controlled-enough test environment.

There are a few pre-built cities, such as the one used in the tutorial, "Summer Shoals," but with a population of less than 4,000 it doesn't exactly make for the most demanding test environment. Therefore we created a city with a population of half a million sims, with three more cities just like it on the map.

When testing StarCraft II some readers were upset that we tested using a large 8-player map, claiming that they only play 1v1 and therefore get better performance. That is fine, but we wanted to show what it took to play the game in its most demanding state so that you'd never run into performance issues.

Getting back to SimCity, it's a slightly different situation, as all the city plots are the same size. Some maps have more plots than others, but they are all 2x2 kilometers (comparable to SimCity 4's medium size). For testing we loaded one of our custom-created cities (the same one each time) and increased the game speed to maximum, as this is how I always play anyway. Once that was done, we started a 60-second test using Fraps and in that time zoomed in and out multiple times while scrolling around the city.

As usual we tested at three different resolutions: 1680x1050, 1920x1200 and 2560x1600. The game was tested using two quality configurations, which we are calling maximum and medium. Normally we would test three different quality settings, but there was virtually no difference between 'max' and 'high' so we scrapped the latter.
HIS Radeon HD 7970 GHz (3072MB)
HIS Radeon HD 7970 (3072MB)
HIS Radeon HD 7950 Boost (3072MB)
HIS Radeon HD 7950 (3072MB)
HIS Radeon HD 7870 (2048MB)
HIS Radeon HD 7850 (2048MB)
HIS Radeon HD 7770 (1024MB)
HIS Radeon HD 7750 (1024MB)
HIS Radeon HD 6970 (2048MB)
HIS Radeon HD 6870 (1024MB)
HIS Radeon HD 6850 (1024MB)
HIS Radeon HD 6790 (1024MB)
HIS Radeon HD 6770 (1024MB)
HIS Radeon HD 6750 (1024MB)
HIS Radeon HD 5870 (2048MB)
Gigabyte GeForce GTX Titan (6144MB)
Gigabyte GeForce GTX 680 (2048MB)
Gigabyte GeForce GTX 670 (2048MB)
Gainward GeForce GTX 660 Ti (2048MB)
Gigabyte GeForce GTX 660 (2048MB)
Gigabyte GeForce GTX 650 Ti (2048MB)
Gigabyte GeForce GTX 580 (1536MB)
Gigabyte GeForce GTX 560 Ti (1024MB)
Gigabyte GeForce GTX 560 (1024MB)
Gigabyte GeForce GTX 550 Ti (1024MB)
Gigabyte GeForce GTX 480 (1536MB)
Gigabyte GeForce GTX 460 (1024MB)
Intel Core i7-3960X Extreme Edition (3.30GHz)
4 x 4GB G.Skill DDR3-1600 (CAS 8-8-8-20)
Gigabyte G1.Assassin2 (Intel X79)
OCZ ZX Series 1250w
Crucial m4 512GB (SATA 6Gb/s)
Microsoft Windows 7 SP1 64-bit
Nvidia Forceware 314.14
AMD Catalyst 13.2 (Beta 7)
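If you'd like to crunch your own numbers the same way, the arithmetic is simple enough to script. Below is a minimal sketch in Python that turns a Fraps frametimes log into an average FPS figure and the slowest single frame; the file name and the two-column layout (frame index, cumulative milliseconds) are assumptions about a typical Fraps export rather than anything specific to our setup, so adjust the parsing to match your own log.

# Minimal sketch: compute average and worst-case FPS from a Fraps
# "frametimes" log. Assumes a CSV with a header row and two columns:
# frame index and cumulative time in milliseconds (typical Fraps layout;
# adjust the parsing if your log differs).
import csv

def load_frame_times(path):
    """Return a list of cumulative frame timestamps in milliseconds."""
    times = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                      # skip the header row
        for row in reader:
            if len(row) >= 2:
                times.append(float(row[1]))
    return times

def fps_stats(times):
    """Average FPS over the run and the slowest single frame expressed as FPS."""
    if len(times) < 2:
        raise ValueError("need at least two frames")
    duration_s = (times[-1] - times[0]) / 1000.0
    avg_fps = (len(times) - 1) / duration_s
    # Per-frame deltas give the instantaneous frame rate.
    deltas = [b - a for a, b in zip(times, times[1:])]
    min_fps = 1000.0 / max(deltas)
    return avg_fps, min_fps

if __name__ == "__main__":
    avg, worst = fps_stats(load_frame_times("frametimes.csv"))
    print(f"Average FPS: {avg:.1f}  |  Slowest frame as FPS: {worst:.1f}")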

Pebble Smartwatch Review



The Pebble smartwatch for iPhone and Android is a prime example of the bizarre times in which we live. In an era of people increasingly telling time by looking at the corner of their computer's screen or their smartphone lock screens, Pebble wants to make the wristwatch everyone's preferred timepiece. But instead of pitting the watch and smartphone as competitors for one function, Pebble tries to go beyond the default use case for either device and do much more.

Sony, Microsoft and Motorola, among others, have also tried to turn the watch into more than just a timepiece, but even their millions of dollars in R&D and years of manufacturing experience couldn't quite get the category right. Yet Pebble, a humble Kickstarter project embraced by nearly 70,000 backers, is the most buzzed-about smartwatch yet. Have those early believers been vindicated by the release of a phenomenal product, or are they just another cautionary tale of what happens when reality doesn't meet the hype?

Pebble retails for $150, and that cost doesn't buy good looks. The watch features a plain rectangular plastic face connected to an equally plain -- though, to be fair, quite comfortable -- band made of smooth rubber.

Multiple colors are available for the watch portion, but the glossy plastic frame and exposed charging connector are about as stylish as a watch found in a sporting goods store. The nondescript design is forgivable considering that no one is buying Pebble to make a fashion statement. While it's not the prettiest timepiece on the market, Pebble is simple enough not to clash with most attire or look cheap.

A 1.26-inch black-and-white e-paper display with 144 x 168 resolution shows the time and compatible notifications from an iOS or Android device. A backlight makes the screen more legible in the dark, and it can also be triggered automatically by gently flicking one's wrist.

A menu button and a charge connector are on the left side, and navigation buttons for scrolling and selecting are on the right. Pebble is light as a feather, so pushing the buttons with too much force will cause the device to slant at an uncomfortable angle, but the light feel is otherwise a blessing.

Key specs to know about the Pebble include:
Scratchproof, shatter-resistant, and 5 ATM water-resistant
Vibrating motor for notifications and alarms
Accelerometer sensor for gesture detection
Bluetooth 2.1+EDR and Bluetooth 4.0
Replaceable 22mm watch bands
Pebble's superficial shortcomings are a non-issue for most. The real value of a smartwatch is its "smarts" offered through software. While craftsmanship and fashion play a much larger role when selecting a standard wristwatch, Pebble attracted scores of customers by promoting the idea that their watch could do more than simply tell time. The Bluetooth 4.0-capable watch can link with a smartphone to download customizable clocks, relay notifications from a smartphone, and even initiate controls on an Apple iPhone or Google Android device.

The type of smartphone someone owns greatly affects just how smart Pebble can appear. An iPhone is currently able to receive notifications for incoming calls, calendar appointments, text messages, and iMessages, but not all apps are able to send notifications. The watch proves to be a nice timesaver because it shows Caller ID information and can accept or decline calls, though it doesn't automatically turn on speaker mode, so you'll still have to pick up the phone. I also loved Pebble's music control feature that's able to start, pause, and select next or previous tracks in the Music app. There are no volume controls, but the pause and next track buttons also work in other apps like Pandora and Slacker.

The Android version of Pebble is noticeably more powerful. In addition to the standard reminders and music controls previously mentioned, the Pebble Android app supports notifications for the Email, Gmail, Calendar, Google Voice, WhatsApp and Facebook apps. Even more control is available by using a third-party app that can route notifications from any app to the Pebble, something not available on an iPhone unless you are on a jailbroken device.

One uncomfortable aspect is that Pebble requires Gmail login information to access Gmail notifications, something that many might be hesitant to hand over, and for good reason in light of concerns over account security. Another shortcoming is that the app only displays the most recent notification. There's no way to scroll through alerts, so anything older than the latest message or tweet is gone, and you could conceivably miss an important notification.

Apps were also supposed to be a part of the Pebble package, but the watch sadly does not yet have those features. The original plan for Pebble included added functionality for apps, the most notable being run stats displayed on-screen via RunKeeper. Despite an SDK having been available since last spring, and an early demonstration of the feature, RunKeeper and other third-party apps are sadly absent from Pebble. The only things downloadable so far are additional clock faces, which are nice but not enough to make anyone forget what Pebble is supposed to be capable of accomplishing.

As I alluded to earlier, Pebble is not the first device that attempts to extend the power of connectivity from the pocket to the wrist. The difference is that other solutions have used power-sucking LCD screens that require frequent charging, which is a non-starter for most people. Pebble avoids that issue entirely because the e-paper display is long lasting, and Bluetooth 4.0 uses less energy when paired with a capable smartphone. Pebble has an advertised seven days of battery life, but that can be lessened if Bluetooth is always on and someone constantly accesses the backlight.

I managed to use the Pebble frequently, with both Bluetooth and the backlight always enabled, for more than four days before the battery indicator began warning that it might be time to charge the device. Pebble lasted for at least another six hours even with the indicator on. The only real concern is the smartphone itself, which will drain noticeably faster depending on the device and usage habits. Pebble will continue to function as a watch when disconnected, but notifications will obviously cease once Bluetooth is disabled.

Pebble is a bold project that clearly ran out of the gate faster than anyone expected. The small startup that developed the watch has had to learn and adapt on the fly because of unexpected demands, and that has led to the device falling short of its potential.

The good news is that there's still a lot of upside and room to grow. Apps will eventually help fulfill Pebble's promise, and the device is already a good product. Over the past week, I've enjoyed being able to skip songs without having to reach for my phone, and I've cut down on the number of calls missed when I was in another room. I've even found a new way to locate my frequently misplaced phone by remotely starting a song and following the noise.

That won't be enough for most people to spend $150, so Pebble is currently a fancy toy with a chance at being more. Pebble is not yet the elegant solution that its Kickstarter project inspired, but it's a functional timepiece well on its way to becoming something special.

Tomb Raider Performance Test: Graphics & CPU

Although this year's Tomb Raider reboot made our latest list of most anticipated PC games, I must admit that it was one of the games I was least looking forward to from a performance perspective. Previous titles in the franchise have received mixed to positive reviews, but gameplay aside, their visuals weren't exactly mind-blowing so we've never bothered doing a performance review on one -- until now, anyway.

As with the last few entries, Crystal Dynamics developed the new Tomb Raider using the Crystal Engine -- albeit a heavily modified version. Being a multiplatform release, we were naturally worried about the game being geared toward consoles with PC being an afterthought, which has become increasingly common (Dead Space 3 comes to mind as a recent example) and generally results in lackluster graphics.

Those concerns were at least partially alleviated when we learned that the PC port was being worked on by Nixxes Software BV, the same folks who handled the PC versions of Hitman: Absolution and Deus Ex: Human Revolution, both of which were great examples of what we expect from decent ports in terms of graphical quality and customization. Hitman in particular really stressed our higher-end hardware.
Tomb Raider benchmarks

We were also relieved to learn that Tomb Raider supports DirectX 11, which brings access to rendering technologies such as depth of field, high definition ambient occlusion, hardware tessellation, super-sample anti-aliasing and contact-hardening shadows. Additionally, compared to the diluted console versions, the PC build offers better textures as well as AMD's TressFX real-time hair physics system.

The result should be a spectacular looking game that pushes the limits of today's enthusiast hardware -- key word being "should," of course -- so let's move on and see what the Tomb Raider reboot is made of.

We'll be testing 27 DirectX 11 graphics card configurations from AMD and Nvidia covering a wide range of prices from the affordable to the ultra-expensive. The latest drivers will be used, and every card will be paired with an Intel Core i7-3960X to remove CPU bottlenecks that could influence high-end GPU scores.

We're using Fraps to measure frame rates during 90 seconds of gameplay footage from Tomb Raider's first level, at the checkpoint called "Stun." The test begins with Lara running to escape from a cave system.
Tomb Raider benchmarks

Our Fraps test ends just before Lara exits the cave, which is ironically where the built-in benchmark begins. We decided to test a custom section of the game rather than the stock benchmark because this is how we will test Tomb Raider in the future when reviewing new graphics cards. Using Fraps also allows us to record frame latency performance, though we didn't include that data in this particular article.

Frame timings weren't included for two reasons: it's not easy to display all that data when testing 27 different GPUs, and we feel Nvidia needs more time to improve their drivers. We'll include frame time performance for Tomb Raider in our next GPU review.
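For anyone curious how those frame latency figures are usually derived, the sketch below summarises per-frame render times from the same kind of Fraps frametimes log (cumulative milliseconds, one row per frame) into 50th/95th/99th percentile values. It's only an illustration of the general approach, not our exact processing, and the synthetic sample data at the bottom is made up.

# Rough sketch: summarise frame latency (per-frame render times) from a
# list of cumulative millisecond timestamps, one per frame. The
# 99th-percentile figure is what most frame-latency write-ups report.
def frame_time_percentiles(cumulative_ms, percentiles=(50, 95, 99)):
    deltas = sorted(b - a for a, b in zip(cumulative_ms, cumulative_ms[1:]))
    results = {}
    for p in percentiles:
        # Nearest-rank percentile: simple and good enough for benchmark summaries.
        rank = max(0, min(len(deltas) - 1, round(p / 100 * len(deltas)) - 1))
        results[p] = deltas[rank]
    return results

if __name__ == "__main__":
    # Synthetic example: roughly 60 fps with an occasional 40 ms hitch.
    sample, t = [], 0.0
    for i in range(1000):
        t += 40.0 if i % 100 == 99 else 16.7
        sample.append(t)
    for p, ms in frame_time_percentiles(sample).items():
        print(f"{p}th percentile frame time: {ms:.1f} ms")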

We'll test Tomb Raider at three common desktop display resolutions: 1680x1050, 1920x1200 and 2560x1600, using DX11. We are also testing the three top quality presets: Ultimate, Ultra and High. No changes were made to the presets.
HIS Radeon HD 7970 GHz (3072MB)
HIS Radeon HD 7970 (3072MB)
HIS Radeon HD 7950 Boost (3072MB)
HIS Radeon HD 7950 (3072MB)
HIS Radeon HD 7870 (2048MB)
HIS Radeon HD 7850 (2048MB)
HIS Radeon HD 7770 (1024MB)
HIS Radeon HD 7750 (1024MB)
HIS Radeon HD 6970 (2048MB)
HIS Radeon HD 6870 (1024MB)
HIS Radeon HD 6850 (1024MB)
HIS Radeon HD 6790 (1024MB)
HIS Radeon HD 6770 (1024MB)
HIS Radeon HD 6750 (1024MB)
HIS Radeon HD 5870 (2048MB)
Gigabyte GeForce GTX Titan (6144MB)
Gigabyte GeForce GTX 680 (2048MB)
Gigabyte GeForce GTX 670 (2048MB)
Gainward GeForce GTX 660 Ti (2048MB)
Gigabyte GeForce GTX 660 (2048MB)
Gigabyte GeForce GTX 650 Ti (2048MB)
Gigabyte GeForce GTX 580 (1536MB)
Gigabyte GeForce GTX 560 Ti (1024MB)
Gigabyte GeForce GTX 560 (1024MB)
Gigabyte GeForce GTX 550 Ti (1024MB)
Gigabyte GeForce GTX 480 (1536MB)
Gigabyte GeForce GTX 460 (1024MB)
Intel Core i7-3960X Extreme Edition (3.30GHz)
4 x 4GB G.Skill DDR3-1600 (CAS 8-8-8-20)
Gigabyte G1.Assassin2 (Intel X79)
OCZ ZX Series 1250w
Crucial m4 512GB (SATA 6Gb/s)
Microsoft Windows 7 SP1 64-bit
Nvidia Forceware 314.14
AMD Catalyst 13.2 (Beta 7)

The History of the Modern Graphics Processor

The evolution of the modern graphics processor begins with the introduction of the first 3D add-in cards in 1995, followed by the widespread adoption of 32-bit operating systems and affordable personal computers.

The graphics industry that existed before that largely consisted of a more prosaic 2D, non-PC architecture, with graphics boards better known by their chips' alphanumeric naming conventions and their huge price tags. PC graphics for 3D gaming and virtualization eventually coalesced from sources as diverse as arcade and console gaming; military, robotics and space simulators; and medical imaging.

The early days of 3D consumer graphics were a Wild West of competing ideas, from how to implement the hardware to the use of different rendering techniques and their application and data interfaces, not to mention the persistent naming hyperbole. The early graphics systems featured a fixed function pipeline (FFP) and a very rigid processing path, with almost as many graphics APIs as there were 3D chip makers.

While 3D graphics turned a fairly dull PC industry into a light and magic show, they owe their existence to generations of innovative endeavour. Over the next few weeks (this is the first installment in a series of four articles) we'll be taking an extensive look at the history of the GPU, going from the early days of 3D consumer graphics, to the 3Dfx Voodoo game-changer, the industry's consolidation at the turn of the century, and today's modern GPGPU.
The road to true 3D graphics started with early display controllers, known as video shifters and video address generators. They acted as a pass-through between the main processor and the display. The incoming data stream was converted into a serial bitmapped video output: luminance, color, and the vertical and horizontal composite sync that kept each line of pixels in place during display generation and synchronized successive scanlines, along with the blanking interval (the time between ending one scan line and starting the next).
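To make the sort of bookkeeping those video shifters and address generators performed a little more concrete, here's a small illustrative calculation of scanline, blanking and refresh timings for a made-up display mode. The numbers are hypothetical and only roughly NTSC-like; they aren't the specs of any particular chip mentioned here.

# Illustrative sketch: derive scanline and blanking timings for a
# hypothetical display mode -- the kind of arithmetic a video address
# generator effectively performs in hardware. The figures are made up
# for illustration, not taken from any specific controller.
def line_timings(pixel_clock_hz, active_px, total_px, active_lines, total_lines):
    line_time_us = total_px / pixel_clock_hz * 1e6            # one full scanline
    h_blank_us = (total_px - active_px) / pixel_clock_hz * 1e6
    frame_time_ms = line_time_us * total_lines / 1000.0
    v_blank_ms = line_time_us * (total_lines - active_lines) / 1000.0
    refresh_hz = 1000.0 / frame_time_ms
    return line_time_us, h_blank_us, v_blank_ms, refresh_hz

if __name__ == "__main__":
    # Hypothetical mode: 6.4 MHz pixel clock, 320 of 400 pixels active per
    # line, 240 of 262 lines active per frame (roughly NTSC-like timing).
    line_us, h_blank_us, v_blank_ms, hz = line_timings(6.4e6, 320, 400, 240, 262)
    print(f"Scanline: {line_us:.1f} us ({h_blank_us:.1f} us horizontal blanking)")
    print(f"Vertical blanking: {v_blank_ms:.2f} ms, refresh ~{hz:.1f} Hz")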

A flurry of designs arrived in the latter half of the 1970s, laying the foundation for 3D graphics as we know them.
Atari 2600, released in September 1977
RCA's "Pixie" video chip (CDP1861) in 1976, for instance, was capable of outputting an NTSC-compatible video signal at 64x128 resolution, or 64x32 for the ill-fated RCA Studio II console.

The video chip was quickly followed a year later by the Television Interface Adapter (TIA) 1A, which was integrated into the Atari 2600 for generating the screen display, sound effects, and reading input controllers. Development of the TIA was led by Jay Miner, who also led the design of the custom chips for the Commodore Amiga computer later on.

In 1978, Motorola unveiled the MC6845 video address generator. This became the basis for the IBM PC's Monochrome Display Adapter and Color Graphics Adapter (MDA/CGA) cards of 1981, and provided the same functionality for the Apple II. Motorola added the MC6847 video display generator later the same year, which made its way into a number of first-generation personal computers, including the Tandy TRS-80.

IBM PC's Monochrome Display Adapter

A similar solution from Commodore's MOS Tech subsidiary, the VIC, provided graphics output for 1980-83 vintage Commodore home computers.

In November the following year, LSI's ANTIC (Alphanumeric Television Interface Controller) and CTIA/GTIA co-processor (Color or Graphics Television Interface Adaptor) debuted in the Atari 400. ANTIC processed 2D display instructions using direct memory access (DMA). Like most video co-processors, it could generate playfield graphics (background, title screens, scoring display), while the CTIA generated colors and moveable objects. Yamaha and Texas Instruments supplied similar ICs to a variety of early home computer vendors.

The next steps in the graphics evolution were primarily in the professional fields.

Intel used their 82720 graphics chip as the basis for the $1000 iSBX 275 Video Graphics Controller Multimode Board. It was capable of displaying eight-color data at a resolution of 256x256 (or monochrome at 512x512). Its 32KB of display memory was sufficient to draw lines, arcs, circles, rectangles and character bitmaps. The chip also had provision for zooming, screen partitioning and scrolling.
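As a rough illustration of what "drawing lines" meant for a board like this, the classic integer-only line algorithm below (Bresenham's) is the kind of primitive such controllers rasterised into display memory. It's a well-known software equivalent, not Intel's actual implementation.

# Illustrative sketch: Bresenham's line algorithm, which plots a line
# using integer arithmetic only -- the sort of 2D primitive an early
# graphics controller rendered into its display memory.
def bresenham_line(x0, y0, x1, y1):
    """Return the list of pixel coordinates on the line from (x0, y0) to (x1, y1)."""
    points = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return points

if __name__ == "__main__":
    print(bresenham_line(0, 0, 5, 3))   # -> pixels approximating the line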

SGI quickly followed up with their IRIS Graphics for workstations -- a GR1.x graphics board with provision for separate add-in (daughter) boards for color options, geometry, Z-buffer and Overlay/Underlay.

Industrial and military 3D virtualization was relatively well developed at the time. IBM, General Electric and Martin Marietta (who were to buy GE's aerospace division in 1992), along with a slew of military contractors, technology institutes and NASA, ran various projects that required the technology for military and space simulations. The Navy also developed a flight simulator using 3D virtualization from MIT's Whirlwind computer in 1951.

Besides defence contractors there were companies that straddled military markets with professional graphics.

Evans & Sutherland -- who would go on to provide professional graphics card series such as the Freedom and REALimage -- also provided graphics for the CT5 flight simulator, a $20 million package driven by a DEC PDP-11 mainframe. Ivan Sutherland, the company's co-founder, developed a computer program in 1961 called Sketchpad, which allowed geometric shapes to be drawn and displayed on a CRT in real time using a light pen.

This was the progenitor of the modern Graphical User Interface (GUI).

In the less esoteric field of personal computing, Chips and Technologies' 82C43x series of EGA (Enhanced Graphics Adapter) chips provided much-needed competition to IBM's adapters, and could be found installed in many PC/AT clones around 1985. The year was noteworthy for the Commodore Amiga as well, which shipped with the OCS chipset. The chipset comprised three main component chips -- Agnus, Denise, and Paula -- which allowed a certain amount of graphics and audio calculation to be non-CPU dependent.

In August of 1985, three Hong Kong immigrants, Kwok Yuan Ho, Lee Lau and Benny Lau, formed Array Technology Inc in Canada. By the end of the year, the name had changed to ATI Technologies Inc.

ATI got their first product out the following year, the OEM Color Emulation Card. It was used for outputting monochrome green, amber or white phosphor text against a black background to a TTL monitor via a 9-pin DE-9 connector. The card came equipped with a minimum of 16KB of memory and was responsible for a large percentage of ATI's CAD$10 million in sales in the company's first year of operation. This was largely done through a contract that supplied around 7000 chips a week to Commodore Computers.

The advent of color monitors and the lack of a standard among the array of competitors ultimately led to the formation of the Video Electronics Standards Association (VESA), of which ATI was a founding member, along with NEC and six other graphics adapter manufacturers.

In 1987 ATI added the Graphics Solution Plus series to its product line for OEMs, which used IBM's PC/XT ISA 8-bit bus for Intel 8086/8088-based IBM PCs. The chip supported MDA, CGA and EGA graphics modes via DIP switches. It was basically a clone of the Plantronics Colorplus board, but with room for 64KB of memory. Paradise Systems' PEGA1, 1a, and 2a (256KB), released in 1987, were Plantronics clones as well.
ATI EGA Wonder 800: 16-color VGA emulation, 800x600 support
The EGA Wonder series 1 to 4 arrived in March for $399, featuring 256KB of DRAM as well as compatibility with CGA, EGA and MDA emulation at up to 640x350 and 16 colors. Extended EGA was available for the series 2, 3 and 4.

Filling out the high end was the EGA Wonder 800 with 16-color VGA emulation and 800x600 resolution support, and the VGA Improved Performance (VIP) card, which was basically an EGA Wonder with a digital-to-analog converter (DAC) added to provide limited VGA compatibility. The latter cost $449 plus $99 for the Compaq expansion module.

ATI was far from being alone riding the wave of consumer appetite for personal computing.

Many new companies and products arrived that year. Among them were Trident, SiS, Tamerack, Realtek, Oak Technology, LSI's G-2 Inc., Hualon, Cornerstone Imaging and Winbond -- all formed in 1986-87. Meanwhile, companies such as AMD, Western Digital/Paradise Systems, Intergraph, Cirrus Logic, Texas Instruments, Gemini and Genoa would produce their first graphics products during this timeframe.

ATI's Wonder series continued to gain prodigious updates over the next few years.

In 1988, the Small Wonder Graphics Solution with game controller port and composite out options became available (for CGA and MDA emulation), as well as the EGA Wonder 480 and 800+ with Extended EGA and 16-bit VGA support, and also the VGA Wonder and Wonder 16 with added VGA and SVGA support.

A Wonder 16 equipped with 256KB of memory retailed for $499, while a 512KB variant cost $699.

An updated VGA Wonder/Wonder 16 series arrived in 1989, including the reduced-cost VGA Edge 16 (Wonder 1024 series). New features included a bus mouse port and support for the VESA Feature Connector. This was a gold-fingered connector similar to a shortened data bus slot connector, and it linked via a ribbon cable to another video controller to bypass a congested data bus.

The Wonder series updates continued to move apace in 1991. The Wonder XL card added VESA 32K color compatibility and a Sierra RAMDAC, which boosted the maximum display resolution to 640x480 @ 72Hz or 800x600 @ 60Hz. Prices ranged from $249 (256KB) through $349 (512KB) to $399 for the 1MB RAM option. A reduced-cost version called the VGA Charger, based on the previous year's Basic-16, was also made available.
ATI Graphics Ultra ISA (Mach8 + VGA)
ATI added a variation of the Wonder XL that incorporated a Creative Sound Blaster 1.5 chip on an extended PCB. Known as the VGA Stereo-F/X, it was capable of simulating stereo from Sound Blaster mono files at something approximating FM radio quality.

The Mach series launched with the Mach8 in May of that year. It sold as either a chip or a board that allowed, via an application interface (AI), the offloading of limited 2D drawing operations such as line draw, color fill and bitmap combination (BitBLT).
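To give a sense of the work the Mach8 was taking off the CPU, here's a toy software BitBLT: copying a rectangle of pixels from a source bitmap into a frame buffer. The nested-list pixel representation is purely illustrative and has nothing to do with ATI's actual interface.

# Toy software BitBLT: copy a source rectangle into a destination frame
# buffer, pixel by pixel. This is the sort of routine a 2D accelerator
# could take over from the CPU; the data layout here is only illustrative.
def bitblt(dst, src, dst_x, dst_y, src_x, src_y, width, height):
    for row in range(height):
        for col in range(width):
            dst[dst_y + row][dst_x + col] = src[src_y + row][src_x + col]

if __name__ == "__main__":
    screen = [[0] * 16 for _ in range(8)]       # blank 16x8 "frame buffer"
    sprite = [[1, 2], [3, 4]]                   # tiny 2x2 source bitmap
    bitblt(screen, sprite, dst_x=5, dst_y=3, src_x=0, src_y=0, width=2, height=2)
    print(screen[3][5], screen[4][6])           # -> 1 4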

Graphics boards such as the ATI VGAWonder GT combined the Mach8 with the graphics core (28800-2) of the VGA Wonder+, so a single card could handle both accelerated drawing and standard VGA duties. The Wonder and Mach8 pushed ATI through the CAD$100 million sales milestone for the year, largely on the back of Windows 3.0's adoption and the increased 2D workloads it brought with it.

S3 Graphics was formed in early 1989 and produced its first 2D accelerator chip and a graphics card eighteen months later, the S3 911 (or 86C911). Key specs for the latter included 1MB of VRAM and 16-bit color support.

The S3 911 was superseded by the 924 that same year -- it was basically a revised 911 with 24-bit color -- and updated again the following year with the 928, which added 32-bit color, and the 801 and 805 accelerators. The 801 used an ISA interface, while the 805 used VLB. Between the 911's introduction and the advent of the 3D accelerator, the market was flooded with 2D GUI designs based on S3's original -- notably from Tseng Labs, Cirrus Logic, Trident, IIT, ATI's Mach32 and Matrox's MAGIC RGB.

In January 1992, Silicon Graphics Inc (SGI) released OpenGL 1.0, a multi-platform vendor agnostic application programming interface (API) for both 2D and 3D graphics.

OpenGL evolved from SGI's proprietary API, called IRIS GL (Integrated Raster Imaging System Graphical Library). It was an initiative to strip the non-graphics functionality out of IRIS GL and allow the API to run on non-SGI systems, as rival vendors were starting to loom on the horizon with their own proprietary APIs.

Initially, OpenGL was aimed at the professional UNIX based markets, but with developer-friendly support for extension implementation it was quickly adopted for 3D gaming.

Microsoft was developing a rival API of their own called Direct3D and didn't exactly break a sweat making sure OpenGL ran as well as it could under the new Windows operating systems.

Things came to a head a few years later when John Carmack of id Software, whose previously released Doom had revolutionised PC gaming, ported Quake to use OpenGL on Windows and openly criticised Direct3D.

Fast forward: GLQuake released in 1997 versus the original Quake

Microsoft's intransigence increased as they denied licensing of OpenGL's Mini-Client Driver (MCD) on Windows 95, which would have allowed vendors to choose which features would have access to hardware acceleration. SGI replied by developing the Installable Client Driver (ICD), which not only provided the same ability, but did so even better since MCD covered rasterisation only while ICD added lighting and transform functionality (T&L).

During the rise of OpenGL, which initially gained traction in the workstation arena, Microsoft was busy eyeing the emerging gaming market with designs on their own proprietary API. They acquired RenderMorphics in February 1995, whose Reality Lab API was gaining traction with developers and became the core for Direct3D.

At about the same time, 3dfx's Brian Hook was writing the Glide API that was to become the dominant API for gaming. This was in part due to Microsoft's involvement with the Talisman project (a tile-based rendering ecosystem), which diluted the resources intended for DirectX.

As D3D became widely available on the back of Windows adoption, proprietary APIs such as S3d (S3), Matrox Simple Interface, Creative Graphics Library, C Interface (ATI), SGL (PowerVR), NVLIB (Nvidia), RRedline (Rendition) and Glide, began to lose favor with developers.

It didn't help matters that some of these proprietary APIs were allied with board manufacturers under increasing pressure to add to a rapidly expanding feature list. This included higher screen resolutions, increased color depth (from 16-bit to 24 and then 32), and image quality enhancements such as anti-aliasing. All of these features called for increased bandwidth, graphics efficiency and faster product cycles.

The year 1993 ushered in a flurry of new graphics competitors, most notably Nvidia, founded in January of that year by Jen-Hsun Huang, Curtis Priem and Chris Malachowsky. Huang was previously the Director of Coreware at LSI, while Priem and Malachowsky both came from Sun Microsystems, where they had developed the SunSPARC-based GX graphics architecture.

Fellow newcomers Dynamic Pictures, ARK Logic, and Rendition joined Nvidia shortly thereafter.

Market volatility had already forced a number of graphics companies to withdraw from the business, or to be absorbed by competitors. Amongst them were Tamerack, Gemini Technology, Genoa Systems, Hualon, Headland Technology (bought by SPEA), Acer, Motorola and Acumos (bought by Cirrus Logic).

One company that was moving from strength to strength, however, was ATI.

In a forerunner of the All-In-Wonder series, late November saw the announcement of ATI's 68890 PC TV decoder chip, which debuted inside the Video-It! card. The chip was able to capture video at 320x240 @ 15 fps, or 160x120 @ 30 fps, as well as compress/decompress it in real time thanks to the onboard Intel i750PD VCP (Video Compression Processor). It was also able to communicate with the graphics board via the data bus, negating the need for dongles or ports and ribbon cables.

The Video-It! retailed for $399, while a lesser-featured model named Video-Basic completed the line-up.

Five months later, in March, ATI belatedly introduced a 64-bit accelerator: the Mach64.

The financial year had not been kind to ATI, with a CAD$2.7 million loss as it slipped in the marketplace amid strong competition. Rival boards included the S3 Vision 968, which was picked up by many board vendors, and the Trio64, which picked up OEM contracts from Dell (Dimension XPS), Compaq (Presario 7170/7180), AT&T (Globalyst), HP (Vectra VE 4), and DEC (Venturis/Celebris).

S3 Vision 968: S3's first motion video accelerator

Released in 1995, the Mach64 notched a number of notable firsts. It became the first graphics adapter to be available for PC and Mac computers in the form of the Xclaim ($450 and $650 depending on onboard memory), and, along with S3's Trio, offered full-motion video playback acceleration.

The Mach64 also ushered in ATI's first pro graphics cards, the 3D Pro Turbo and 3D Pro Turbo+PC2TV, priced at a cool $599 for the 2MB option and $899 for the 4MB.

ATI Mach64 VT with support for TV tuner

The following month saw a technology start-up called 3DLabs rise onto the scene, born when DuPont's Pixel graphics division was bought out from its parent company, along with the GLINT 300SX processor capable of OpenGL rendering, fragment processing and rasterisation. Due to their high price, the company's cards were initially aimed at the professional market. The Fujitsu Sapphire2SX 4MB retailed for $1600-$2000, while an 8MB ELSA GLoria 8 was $2600-$2850. The 300SX, however, was intended for the gaming market.

The Gaming GLINT 300SX of 1995 featured a much-reduced 2MB of memory. It used 1MB for textures and the Z-buffer and the other 1MB for the frame buffer, but came with an option to increase the VRAM for Direct3D compatibility for another $50 over the $349 base price. The card failed to make headway in an already crowded marketplace, but 3DLabs was already working on a successor in the Permedia series.

S3 seemed to be everywhere at that time. The high-end OEM market was dominated by the company's Trio64 chipsets, which integrated a DAC, a graphics controller, and a clock synthesiser into a single chip. They also utilized a unified frame buffer and supported hardware video overlay (a dedicated portion of graphics memory for rendering video as the application requires). The Trio64 and its 32-bit memory bus sibling, the Trio32, were available as OEM units and standalone cards from vendors such as Diamond, ELSA, Sparkle, STB, Orchid, Hercules and Number Nine. Diamond Multimedia's prices ranged from $169 for a ViRGE-based card to $569 for a Trio64+-based Diamond Stealth64 Video with 4MB of VRAM.

The mainstream end of the market also included offerings from Trident, a long-time OEM supplier of no-frills 2D graphics adapters that had recently added the 9680 chip to its line-up. The chip boasted most of the features of the Trio64, and the boards were generally priced around the $170-200 mark. They offered acceptable 3D performance in that bracket, with good video playback capability.

Other newcomers in the mainstream market included Weitek's Power Player 9130 and Alliance Semiconductor's ProMotion 6410 (usually seen as the Alaris Matinee or FIS's OptiViewPro). Both offered excellent scaling with CPU speed, while the latter combined the strong scaling engine with antiblocking circuitry to obtain smooth video playback, which was much better than in previous chips such as the ATI Mach64, Matrox MGA 2064W and S3 Vision968.

Nvidia 1995

Nvidia launched their first graphics chip, the NV1, in May; it was the first commercial graphics processor capable of 3D rendering, video acceleration, and integrated GUI acceleration.

They partnered with ST Microelectronics to produce the chip on its 500nm process, and the latter also promoted its own STG2000 version of the chip. Although it was not a huge success, it did represent the first financial return for the company. Unfortunately for Nvidia, just as the first vendor boards started shipping (notably the Diamond Edge 3D) in September, Microsoft finalized and released DirectX 1.0.

The D3D graphics API was built around rendering triangular polygons, whereas the NV1 used quadratic texture mapping. Limited D3D compatibility was added via a driver that wrapped triangles as quadratic surfaces, but a lack of games tailored for the NV1 doomed the card as a jack of all trades, master of none.
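The mismatch is easy to picture: Direct3D wanted triangles, so quad-native geometry had to be split (or, going the other way, triangles wrapped up as quadratic patches for the NV1). A minimal sketch of the triangle side of that conversion, purely for illustration:

# Minimal sketch: split a quadrilateral into the two triangles a
# triangle-only API such as Direct3D expects. Vertices are (x, y, z)
# tuples given in winding order; purely illustrative.
def quad_to_triangles(v0, v1, v2, v3):
    """Return two triangles covering the quad v0-v1-v2-v3."""
    return (v0, v1, v2), (v0, v2, v3)

if __name__ == "__main__":
    quad = ((0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0))
    for tri in quad_to_triangles(*quad):
        print(tri)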

Most of the games were ported from the Sega Saturn. A 4MB NV1 with integrated Saturn ports (two per expansion bracket, connected to the card via ribbon cable) retailed for around $450 in September 1995.

Microsoft's late changes and launch of the DirectX SDK left board manufacturers unable to directly access hardware for digital video playback. This meant that virtually all discrete graphics cards had functionality issues in Windows 95. Drivers under Win 3.1 from a variety of companies were generally faultless by contrast.
ATI Rage 3D
ATI announced their first 3D accelerator chip, the 3D Rage (also known as the Mach 64 GT), in November 1995.

The first public demonstration of it came at the E3 video game conference held in Los Angeles in May the following year. The card itself became available a month later. The 3D Rage merged the 2D core of the Mach64 with 3D capability.

Late revisions to the DirectX specification meant that the 3D Rage had compatibility problems with many games that used the API -- mainly the lack of depth buffering. With an on-board 2MB EDO RAM frame buffer, 3D modes were limited to 640x480x16-bit or 400x300x32-bit. Attempting 32-bit color at 640x480 generally resulted in onscreen color corruption, and 2D resolution peaked at 1280x1024. If gaming performance was mediocre, the full-screen MPEG playback ability at least went some way toward balancing the feature set.
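Those limits fall straight out of the 2MB frame buffer. The back-of-the-envelope check below assumes double-buffered color plus a 16-bit Z-buffer -- the Rage's real memory layout may well have differed -- but it shows why 640x480 at 32-bit color simply didn't fit.

# Back-of-the-envelope memory budget for a 2MB frame buffer, assuming
# double-buffered color plus a 16-bit Z-buffer (the actual Rage layout
# may have differed -- this is only to show why the resolution/color
# limits fall where they do).
def framebuffer_bytes(width, height, color_bits, z_bits=16, buffers=2):
    color = width * height * (color_bits // 8) * buffers
    zbuf = width * height * (z_bits // 8)
    return color + zbuf

if __name__ == "__main__":
    budget = 2 * 1024 * 1024
    for w, h, bpp in [(640, 480, 16), (400, 300, 32), (640, 480, 32)]:
        need = framebuffer_bytes(w, h, bpp)
        verdict = "fits" if need <= budget else "does NOT fit"
        print(f"{w}x{h} @ {bpp}-bit: {need / 1024:.0f} KB -> {verdict} in 2 MB")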

ATI reworked the chip, and in September the Rage II launched. It rectified the D3D issues of the first chip in addition to adding MPEG2 playback support. Initial cards, however, still shipped with 2MB of memory, hampering performance and causing issues with perspective/geometry transforms. As the series was expanded to include the Rage II+DVD and 3D Xpression+, memory capacity options grew to 8MB.

While ATI was first to market with a 3D graphics solution, it didn't take too long for other competitors with differing ideas of 3D implementation to arrive on the scene. Namely, 3dfx, Rendition, and VideoLogic.


Screamer 2, released in 1996, running on Windows 95 with 3dfx Voodoo 1 graphics

In the race to release new products into the marketplace, 3Dfx Interactive won over Rendition and VideoLogic. The performance race, however, was over before it had started, with the 3Dfx Voodoo Graphics effectively annihilating all competition.

This article is the first installment in a series of four. If you enjoyed it, make sure to join us next week as we take a stroll down memory lane to the heyday of 3Dfx, Rendition, Matrox and a young company called Nvidia.