UKGamer
Talking software, ATi style
The software side of graphics cards is often overlooked. A couple of years ago we interviewed Ben Bar Haim, Vice President of Software at ATI. This time we take the spotlight and shine it into Andrew Dodd's face.

Could you please introduce yourself to our readers?
My name is Andrew Dodd and my official title is Software Products Manager. My role is mainly divided into two parts.

The first part is internal to ATI, where I develop (along with Terry Makedon) the CATALYST strategy, write requirements for features, work with developers etc. The second part of my role is external facing - creating presentations and other materials to present to customers and press… and interviews.
As graphics chips become ever more programmable, there must surely come a day when the engineers deliver a core to you with nothing more than a list of hardware capabilities and the corresponding calls and leave it to the driver development team to find relevant and creative ways to work with them. Does increasing programmability mean increasing workload for the CATALYST team?
It's more of a collaborative effort between many teams than a hard division between hardware and software. But you're right, as graphics processors become more and more programmable they do require significantly more complex software. The other important part of the software effort is to expose many of the very cool hardware features to 3rd party developers so that they can take full advantage of them.
We believe that if you have to open Photoshop to see a problem with image quality then it's not a problem worth getting worked up over. Do you think the gaming community is shooting itself in the foot by being so against specific code optimizations?
I think application specific optimizations are a good thing as long as they do not affect the experience in any way, or reduce the image quality. If you have to zoom in 500% in Photoshop to see 5 pixels that may look a little different - you're bordering on the ridiculous side of things, and have way too much time on your hands ;-)
3DVelocity recently questioned ATi about the option of a smart driver IQ setting whereby lulls in the action, typically places where a player has a chance to survey the scenery, could see the driver intelligently turning up the image quality settings, while frantic scenes, often the most demanding on framerate, could see the driver dialling out those settings again to maintain fluid play. Is this a practical option and are ATi considering implementing it?
If you're referring to application detection, you honestly wouldn't see a difference in image quality in the game. So we would not ever turn them off, as there is no benefit to the user. If you are referring to turning on settings like anti-aliasing and anisotropic filtering, that would be a very cool feature, but unfortunately it's not possible, as there would be no way to make a smooth transition between the two profiles (applications have to reload textures, etc.)
How much additional time and effort, above that put into the normal driver development regime, is/has been put into making sure ATi products "fit" into Vista's DRM? Would I get an honest answer if we asked whether you're in favour of the whole DRM thing?
There has been a lot of effort that has gone into supporting the entire DRM strategy, but users shouldn't think of it as taking resources away from working on more "useful" features. DRM is something that the whole industry is moving towards and something we need to support. And no, you wouldn't get an honest answer from me ;-)
There must be months when very little gets changed in the fundamental driver architecture either because of absence/sickness or because the matters being worked on are taking longer than planned. How restrictive is the monthly driver schedule and do you ever get tempted to just rename a file and ship them out unaltered with a new revision number?
Although we are not changing the architecture of the driver every release, believe me, there are a lot of other things happening - bug fixes, performance enhancements and new features all get added just about every release. I definitely don't see a monthly posting cycle as a bad thing for our users - it ensures our customers always have the latest and greatest from ATI.
What do you make of the various "third party" driver flavours currently available? Surely it leaves you between a rock and a hard place in that endorsing them suggests they are able to do things with your code that your own team can't, while at the same time slamming them looks like sour grapes.
None of the 3rd party tweakers are doing things with our source code; they're mostly just modifying registry values. But we have no problem with them doing this and we find some of their feedback very valuable. It is great that they take such an interest in what we are doing and work with us to try and find new ways to improve things.
There's no doubting how far ATi's drivers have come. You must be thrilled at being a major part of the "great ATi revival"?
It's been a great experience. Terry and I have been working on software since the very first CATALYST release, and we are proud to say that ATI software is the best there is in the graphics industry.
To what extent is your team’s work proactive and to what extent reactive? How much of a say do you get over the features you'd like to see implemented into future generation parts or are you entirely driven by what the engineers deliver to you?
Most of the time we work with our engineering teams and decide together what features would be useful for our customers.
We were once promised numerous plug-ins for Catalyst Control Center. What ever happened to them?
There are some plug-ins out there, but not as many as we would like. We're going to continue working on the SDK to make it much easier to create plug-ins. So hopefully this time next year, we'll see a lot more out there.
Unified drivers are great, but how difficult is it becoming to make the most from current GPUs without completely forsaking legacy products? Are owners of current generation cards sacrificing anything in order to satisfy the needs of older hardware?
Having a unified driver is something we definitely strive for, but as time passes, very old ASICs always get de-emphasized. So yes, some of our ASICs that are 5 or more years old will get left behind, but we never make changes for new ASICs that negatively affect older ones.
What has happened to Truform? Is it officially dead and will 3Dc go the same way?
Truform no longer exists as a CATALYST feature - at the time it was created it was a useful feature, because most models had low triangle counts, hence turning on Truform made models look much better. However, with today's applications all models have a very high triangle count, and Truform is no longer necessary.

The main benefit of 3Dc is that it allows for very high compression of textures.

3Dc is here to stay for quite some time and, in fact, we even improved on it for the RADEON X1000 Series of products.
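As a rough illustration of why 3Dc compresses normal maps so well (this arithmetic is our own back-of-envelope sketch, not from the interview): the format stores two channels per texel, and each 4x4 block of a channel holds two 8-bit endpoints plus sixteen 3-bit interpolation indices.

```python
# Back-of-envelope 3Dc compression arithmetic (illustrative sketch).
# Per channel, a 4x4 block stores two 8-bit endpoints plus
# sixteen 3-bit indices: 2 + (16 * 3) / 8 = 8 bytes.
BLOCK_TEXELS = 4 * 4
bytes_per_channel_block = 2 + (16 * 3) // 8       # 8 bytes
compressed_block = 2 * bytes_per_channel_block    # two channels -> 16 bytes

compressed_per_texel = compressed_block / BLOCK_TEXELS  # 1 byte per texel

# A normal map stored uncompressed as RGBA8 costs 4 bytes per texel
# (the Z component can instead be reconstructed in the pixel shader).
uncompressed_per_texel = 4

ratio = uncompressed_per_texel / compressed_per_texel
print(f"3Dc: {compressed_per_texel:.0f} B/texel vs RGBA8 "
      f"{uncompressed_per_texel} B/texel -> {ratio:.0f}:1")
```

So a two-channel normal map compresses to a quarter of its uncompressed RGBA8 size, which is the headline saving the format was designed around.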
With Microsoft pretty much dominating the graphics scene with their ubiquitous DirectX suite, how hard has it become to implement new and exciting (or indeed exclusive) features while at the same time keeping in step with the current DirectX build?
As far as DirectX, we work very closely with Microsoft to ensure their API contains the latest and greatest features. As such, ATI is constantly contributing new features to every gamer in the world, regardless of their choice of hardware. Standards are hugely important to us and we have been very proud to work as closely as we do with organizations like Microsoft to deliver tomorrow's platforms.
As LCD monitors come down in price (and in response times), so I presume the focus on super-high resolutions will shift. I mean most flat screens don't support resolutions above 1280x1024. Are you banking on the hardcore gamer sticking with their trusty CRT or will you be offering "bonus enhancements" at more moderate resolutions for LCD users?
There are some features we're working on for Notebook panels, but nothing right now for desktop LCDs. But that is an interesting idea, and something we'll put some thought into.
People are keen to see Crossfire support under Linux. Is this planned and if so do you have a rough timescale?
This will come eventually, but our main focus for Linux right now is Workstation.
We would like to thank Andrew for taking time out to answer our questions.