Old 09-03-02, 09:31 AM   #1
Uttar
Registered User
 
 
Join Date: Aug 2002
Posts: 1,354
ATI R300 & nVidia NV30 - Different visions

Hello everyone,

In this post, I'd like to compare nVidia's and ATI's marketing strategies and visions of the market. In my opinion, the two have FULLY different views of the market.

ATI is currently betting that holding the performance crown will help sell their Radeon 9000.

nVidia is now going to launch the NV28 and the NV30 - mid-range and high-end respectively. (I won't try to speculate on the NV18, as too little is known.)

nVidia's opinion seems to be that the NV28, with nearly 40% more transistors than the GF4 Ti4600, will be truly sufficient for ANY DX8 game, and that more would be overkill. That isn't really incorrect. Who needs 6x AA and 16-tap aniso?

Their opinion seems to be that DX8 should no longer be looked at - they're going to release the FASTEST DX8-only solution the world will ever see, the NV28, and that'll be truly sufficient for 95% of upcoming DX8 games.

And they get a HUGE advantage from that: they don't need to program DX9 features in too. IMO, if you stripped the DX9 parts out of the R300, it would most likely drop to around 90 million transistors - *really* not much more than the NV28 (the GF4 Ti4600 is about 63 million transistors, so nearly 40% more puts the NV28 near 88 million - and the NV28 still has the problem of a 128-bit memory bus).


nVidia's strategy with Cg is easy to understand: let the developers rapidly upgrade from DX8 to DX9, let them program great effects that only work on the NV30, and stuff like that. In other words - give developers the freedom to create things that make nVidia's cards sell.
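To give an idea of what that developer freedom looks like in practice, here's a minimal Cg vertex program - just an illustrative sketch in the style of nVidia's Cg examples, with names I made up, not code from any actual game:

[code]
// Minimal Cg vertex program (illustrative sketch, not from a real game).
// It transforms a vertex to clip space and passes the color through.

struct appdata {
    float4 position : POSITION;   // object-space vertex position
    float4 color    : COLOR0;     // per-vertex diffuse color
};

struct vfconn {
    float4 hpos : POSITION;       // clip-space position for the rasterizer
    float4 col0 : COLOR0;         // color handed on to the pixel stage
};

vfconn main(appdata IN, uniform float4x4 modelViewProj)
{
    vfconn OUT;
    OUT.hpos = mul(modelViewProj, IN.position);  // transform to clip space
    OUT.col0 = IN.color;                         // pass color through
    return OUT;
}
[/code]

The attraction is obvious: that's plain C-style code, and the same source can be compiled for a DX8-class profile today and a DX9-class one tomorrow - exactly the upgrade path nVidia wants developers on.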

So, nVidia most likely optimized the NV30 for DX9 - and it most likely won't be any better than the R300 in DX8.

However, once DX9 games arrive, the picture will most certainly change - and that's why nVidia is betting on Cg.


ATI, on the other hand, has the vision of a market where DX9 games will take so long to arrive that great DX9 performance today is wasted effort - better to have great DX8 performance and make everyone assume the DX9 performance will be just as dominant.

---
Summary
---

nVidia's strategy is very simple - high DX8 performance with a DX8 card and high DX9 performance with a DX9 card. They don't think anyone cares whether their DX9 card is that much faster in DX8 than their DX8 card.

ATI, on the other hand, bets on the fact that DX9 games will be nearly non-existent for months. And it isn't RenderMonkey that's gonna change that *grins* I'm sorry, but I prefer Cg with VC++ color coding.

---
Conclusion
---

Most people think nVidia wants the DX9 launch schedule delayed. I agree. But those same people think ATI really doesn't like that.
I disagree. ATI probably likes it: if their performance is compared to the NV30 in DX9, they don't stand a chance. So they'd prefer the early previews to be done in DX8 - it's better for them.

A late DX9 launch is thus, IMO, accepted and even preferred by nVidia *and* ATI.

Was long, nay?

Uttar
Old 09-03-02, 12:26 PM   #2
macro6
Guest
 
Posts: n/a

I thought the NV28 was a GeForce4 Ti + AGP 8x? I read somewhere that it will have some features taken out, like the Radeon 9000?

I swear, I read that somewhere....
Old 09-03-02, 01:08 PM   #3
sancheuz
Registered User
 
Join Date: Jul 2002
Posts: 186

Yeah, it won't be called GeForce4 Ti though; they will probably name it something else. Does anybody know what it will be called? The NV28, that is?
__________________
huh?
Old 09-03-02, 01:35 PM   #4
Megatron
Powered by 6800GT
 
 
Join Date: Jul 2002
Location: Massachusetts
Posts: 239
Re: ATI R300 & nVidia NV30 - Different visions

Quote:
Originally posted by Uttar
if their performance is compared to the NV30 in DX9, they don't stand a chance.
And what would you be basing this statement on?
__________________
Athlon64 3200+
1Gb PC3200
BFG 6800GT
Windows XP
Old 09-03-02, 02:37 PM   #5
jbirney
Registered User
 
 
Join Date: Jul 2002
Posts: 1,430

Good points but I see it differently:

Quote:
ATI is currently betting that holding the performance crown will help sell their Radeon 9000.
Well, that's not going to hurt them, but I don't see that as the major selling point. The major selling point is that it's the first true "budget" card that features DX8.1. It also has a bunch of features, and OEMs like features to advertise.


Quote:
Their opinion seems to be that DX8 should no longer be looked at - they're going to release the FASTEST DX8-only solution the world will ever see, the NV28, and that'll be truly sufficient for 95% of upcoming DX8 games.
There was some talk that the NV28 was the integrated graphics part of the nForce2, with a lower-cost GF3 integrated into it. That could explain the extra transistors over the GF4. I have no idea. I did find out today that one of the new parts is only a GF4 MX with AGP 8x. This is a huge mistake IMHO - we need more DX8 cards out there, not more DX7 parts!



Quote:
nVidia's strategy with Cg is easy to understand: let the developers rapidly upgrade from DX8 to DX9, let them program great effects that only work on the NV30, and stuff like that. In other words - give developers the freedom to create things that make nVidia's cards sell.
Guys, pixel/vertex shaders are not that hard to write by hand. It's not the easiest thing to do, but there are things that are much worse. Don't take me wrong, I like what Cg can do. It will help, and it will shave a small amount of time off development cycles. But until the mass of people out there have hardware that can use these new features, it won't mean jack. Again, look back at the original TNT: how long before games were 32-bit? Look at the TnL unit of the GeForce: how long before games that used TnL were the norm? Look at the vertex and pixel shaders made possible by the GF3 - 16 months later we still only have a handful of games that use these features. The reason is very simple. Developers will always write their game to the base hardware spec. It doesn't matter what the high-end cards have, as very few if any of those features will be used. It's simple logic.
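For anyone who hasn't seen one, this is the sort of thing he means by writing a shader by hand - a complete DX8 pixel shader in ps.1.1 assembly that just modulates a texture by the vertex color (an illustrative sketch, not taken from any game):

[code]
ps.1.1              ; pixel shader version 1.1 (GF3/GF4-class hardware)
tex t0              ; sample the texture bound to stage 0
mul r0, t0, v0      ; output = texel * interpolated diffuse color
[/code]

Three lines for the classic "texture times diffuse" effect - tedious at scale, sure, but hardly rocket science.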

Quote:
So, nVidia most likely optimized the NV30 for DX9 - and it most likely won't be any better than the R300 in DX8.
I can see it the other way. I think for the first set of DX8 titles, the extra TMU of the NV30 will give it a bit more muscle. But as more and more games use a fully DX8 pipeline, that extra TMU won't do much - so maybe the picture will shift? Who knows.

Quote:
ATI, on the other hand, has the vision of a market where DX9 games will take so long to arrive that great DX9 performance today is wasted effort - better to have great DX8 performance and make everyone assume the DX9 performance will be just as dominant.
The same is true for the NV30, as it's probably only a few months behind the R300.

Quote:
nVidia's strategy is very simple - high DX8 performance with a DX8 card and high DX9 performance with a DX9 card. They don't think anyone cares whether their DX9 card is that much faster in DX8 than their DX8 card.

ATI, on the other hand, bets on the fact that DX9 games will be nearly non-existent for months. And it isn't RenderMonkey that's gonna change that *grins* I'm sorry, but I prefer Cg with VC++ color coding.
Actually, RenderMonkey is an add-on for the popular design tools. It outputs code, so you don't have to write a single line. Funny thing is, in most cases you can use RenderMonkey to output Cg code, which can then be compiled. Imagine that.
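For comparison, the same texture-modulate effect written as a Cg fragment program - roughly the kind of source a tool could output and the Cg compiler could then turn into the ps.1.1 assembly above (again just my own sketch, with made-up names):

[code]
// Cg fragment program: sample a decal texture and modulate it by the
// interpolated vertex color (illustrative sketch, names made up).

struct fragin {
    float4 color    : COLOR0;     // interpolated diffuse color
    float2 texcoord : TEXCOORD0;  // texture coordinates for the decal
};

float4 main(fragin IN, uniform sampler2D decal) : COLOR
{
    return tex2D(decal, IN.texcoord) * IN.color;
}
[/code]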
Old 09-05-02, 09:28 PM   #6
StealthHawk
Guest
 
Posts: n/a

Quote:
Originally posted by jbirney
There was some talk that the NV28 was the integrated graphics part of the nForce2, with a lower-cost GF3 integrated into it. That could explain the extra transistors over the GF4. I have no idea. I did find out today that one of the new parts is only a GF4 MX with AGP 8x. This is a huge mistake IMHO - we need more DX8 cards out there, not more DX7 parts!
The way I heard it, the NV18 was the nForce2 part that was quoted with the higher transistor count. We still know absolutely nothing about the NV28 - unless some new information popped up while I was gone.
Old 09-06-02, 11:19 AM   #7
jbirney
Registered User
 
 
Join Date: Jul 2002
Posts: 1,430

Oh, sorry - my bad.