VOGONS


Reply 60 of 103, by leileilol

User metadata
Rank l33t++
DrAnthony wrote on 2024-04-20, 20:36:

You can see the software fallback on the Kyro II kept up with the GeForce 2 without any sort of exotic CPU.

I'm certain Aquanox was more fillrate-hungry than vertex-bound. PowerVR avoids overdraw and will only show a performance edge in poorly optimized games that are a little full of themselves (such as Aquanox). There's also no hardware depth buffer on the Kyro, so that's less reading to do (and some effects, like lens flares/coronas, will go missing in a few games if they rely on a pixel read).
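As a toy illustration of the overdraw point (a conceptual sketch only, not how the Kyro hardware is actually implemented): a tile-based deferred renderer resolves visibility per tile in on-chip memory first, and only textures/shades the single surviving fragment per pixel, so shading cost doesn't scale with depth complexity.

// Toy C++ sketch of tile-based deferred shading; names and tile size are
// illustrative, not PowerVR's real pipeline.
#include <array>
#include <cstdio>
#include <limits>
#include <utility>
#include <vector>

struct Fragment { float depth; int triangleId; };

constexpr int kTileSize = 32;                         // one 32x32 pixel tile
using Tile = std::array<Fragment, kTileSize * kTileSize>;

int main() {
    Tile tile;
    tile.fill({std::numeric_limits<float>::max(), -1});

    // Three overlapping triangles covering the whole tile = worst-case overdraw.
    const std::vector<std::pair<int, float>> triangles = {{0, 0.9f}, {1, 0.5f}, {2, 0.2f}};

    // Pass 1: hidden-surface removal entirely in on-chip tile memory,
    // with no external depth-buffer reads or writes.
    for (const auto& [id, depth] : triangles)
        for (auto& px : tile)
            if (depth < px.depth) px = {depth, id};

    // Pass 2: texture/shade only the visible fragment of each pixel.
    int shaded = 0;
    for (const auto& px : tile)
        if (px.triangleId >= 0) ++shaded;

    std::printf("pixels shaded: %d (a brute-force renderer would shade %zu)\n",
                shaded, triangles.size() * tile.size());
}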

Also, 2001 results would long predate their EnTNL driver.

apsosig.png
long live PCem

Reply 61 of 103, by eddman

User metadata
Rank Member
DrAnthony wrote on 2024-04-20, 20:36:

Sorry to drag this back up, but I had to rack my old brain a bit to remember where I saw this benchmark; it really clearly shows why hardware T&L was a nothingburger.

https://www.anandtech.com/show/742/11

You can see the software fallback on the Kyro II kept up with the GeForce 2 without any sort of exotic CPU. It would be interesting to see where the original GeForce would fall in this benchmark as well.

Now, that isn't to say it's completely useless; in real-world conditions it would definitely be useful to free up CPU cycles for other tasks, but it was more of a floor raiser than something that pushed up the ceiling. I'm sure 3DFX was aware of this and was likely looking more towards a programmable future; they just couldn't stay alive long enough to get there.

Edit: Thought some more on this and realized that the gigahertz Athlon T-bird he used in this test was released the same year (possibly within a few months) as the GeForce 2. It was high end, but it wasn't the top Athlon that year, and the T&L engine on the GPU couldn't outpace it here. It would be super interesting to see what it would take CPU-wise to keep up with the GeForce 3, since it did provide a pretty significant leap.

Comparing hardware and software T&L using a single benchmark, at one resolution, and on two different cards would skew the results, given the differences in their microarchitectures. The proper approach would be to use only a GeForce 2 and simply disable and enable T&L across a multitude of benchmarking software.
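As a concrete illustration of that kind of A/B test (a rough sketch, not any specific benchmark's source): a DirectX 7 application picks hardware or software T&L at device-creation time, so the very same GeForce 2 can be measured in both modes. CreateTestDevice is a hypothetical helper; setup of the DirectDraw/Direct3D objects and error handling are omitted.

// Sketch: choosing hardware vs software T&L on the same card in Direct3D 7.
#include <d3d.h>

IDirect3DDevice7* CreateTestDevice(IDirect3D7* d3d,
                                   IDirectDrawSurface7* renderTarget,
                                   bool useHardwareTnL)
{
    // IID_IDirect3DTnLHalDevice: transform & lighting runs on the GPU.
    // IID_IDirect3DHALDevice:    hardware rasterization, T&L done on the CPU.
    REFCLSID deviceType = useHardwareTnL ? IID_IDirect3DTnLHalDevice
                                         : IID_IDirect3DHALDevice;
    IDirect3DDevice7* device = nullptr;
    d3d->CreateDevice(deviceType, renderTarget, &device);
    return device; // nullptr if creation failed
}

Running the same scene through both device types on one card removes the microarchitecture variable entirely.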

Reply 62 of 103, by Hoping

User metadata
Rank Oldbie

I think the 3dfx issue is almost like a disease many of us suffer from: we like a brand (AMD or Intel, AMD or Nvidia, or 3dfx), and once we choose a path we always look for the advantages of our choice and ignore the flaws, even when they are obvious.
I think that's what happens with 3dfx; a lot of hardcore fans see those graphics cards as some kind of holy grail, and I totally and absolutely respect that.
To be honest, I think we all know, whether we want to see it or not, that the only really interesting 3dfx cards are the Voodoo 1, Banshee and Voodoo 2, the ones that made history; the Voodoo 3 was no longer the same, and the later ones had no real relevance in the market. I remember seeing them on the shelves collecting dust, sometimes with boxes discolored by sunlight, because no one wanted to buy them. The same thing happened with Kyro cards: I remember a store that had shelves full of boxes of Kyro 4000 cards; hopefully the boxes were empty...
I suffer from this disease with AMD, but I try hard to see the flaws so as not to blind myself.
Back then a friend gave me his Voodoo 3 3000, which I still have, because I had bought a GeForce 2 MX and the difference was very noticeable, and it hadn't even been six months since he bought the Voodoo 3. I remember his anger was enormous; if I hadn't kept the Voodoo 3, he would have thrown it out the window from the fifth floor. 😀
Things of youth.

Reply 63 of 103, by DrAnthony

User metadata
Rank Newbie
eddman wrote on 2024-04-21, 07:06:
DrAnthony wrote on 2024-04-20, 20:36:

Sorry to drag this back up, but I had to rack my old brain a bit to remember where I saw this benchmark; it really clearly shows why hardware T&L was a nothingburger.

https://www.anandtech.com/show/742/11

You can see the software fallback on the Kyro II kept up with the GeForce 2 without any sort of exotic CPU. It would be interesting to see where the original GeForce would fall in this benchmark as well.

Now, that isn't to say it's completely useless; in real-world conditions it would definitely be useful to free up CPU cycles for other tasks, but it was more of a floor raiser than something that pushed up the ceiling. I'm sure 3DFX was aware of this and was likely looking more towards a programmable future; they just couldn't stay alive long enough to get there.

Edit: Thought some more on this and realized that the gigahertz Athlon T-bird he used in this test was released the same year (possibly within a few months) as the GeForce 2. It was high end, but it wasn't the top Athlon that year, and the T&L engine on the GPU couldn't outpace it here. It would be super interesting to see what it would take CPU-wise to keep up with the GeForce 3, since it did provide a pretty significant leap.

Comparing hardware and software T&L using a single benchmark, at one resolution, and on two different cards would skew the results, given the differences in their microarchitectures. The proper approach would be to use only a GeForce 2 and simply disable and enable T&L across a multitude of benchmarking software.

Of course. There were in-depth looks published at the time that I remember showing similar results, but many have been lost to time. I also don't happen to have a full suite of period-appropriate hardware (Kyro II, GeForce 2, GeForce 3 and whatnot), but if you do, you're certainly welcome to run whatever benchmarks you feel are a valid set and share your findings.

Reply 64 of 103, by eddman

User metadata
Rank Member

A full suite is not needed; just a single card with fixed-function HW T&L and a period CPU. For real-use-case results the test software should be games (with an option to switch between SW and HW T&L) so that the CPU is processing more than just T&L calculations. I only had a TNT2 and then a GF4 Ti.

Reply 65 of 103, by DrAnthony

User metadata
Rank Newbie
eddman wrote on 2024-04-21, 17:03:

A full suite is not needed; just a single card with fixed-function HW T&L and a period CPU. For real-use-case results the test software should be games (with an option to switch between SW and HW T&L) so that the CPU is processing more than just T&L calculations. I only had a TNT2 and then a GF4 Ti.

Reasonable enough, but I'm just not the person you should be targeting on this. If better data comes up, I'd love to see it.

Reply 66 of 103, by eddman

User metadata
Rank Member

I never targeted you. I just pointed out the type of testing that is needed to draw a proper conclusion.

Reply 67 of 103, by Hoping

User metadata
Rank Oldbie

Well, for example, in this thread (Netburst: Aiming for the Stars), which begins in the era when the Voodoo 3, 4 and 5 were sold, alongside the Pentium 4 on Socket 423 and 478, no one has used a 3dfx card.
There are GeForce 2/3/4, FX, 6 and more modern cards, as well as Radeon cards from ATI, although most are from Nvidia.
I used a FireGL.
Maybe it just so happens that none of the members who participated have any 3dfx cards, or maybe it was something else, who knows... After all, 3DMark 2001 I think uses DX7, that is, it uses HT&L. Please correct me if I'm wrong.
And making a comparison by disabling HT&L on the cards that support it does not make much sense, because it would not show reality.
It would not be an objective test. It would be a worthless, subjective test.
But if it is done out of curiosity or for fun, it would be interesting.
Like the curious attempts to run Doom 3 on 3dfx cards; those experiments are always interesting.

Reply 68 of 103, by eddman

User metadata
Rank Member
Hoping wrote on 2024-04-21, 19:49:

And making a comparison by disabling HT&L on the cards that support it does not make much sense, because it would not show reality.
It would not be an objective test. It would be a worthless, subjective test.

The aim is to figure out the effectiveness of HW T&L alone. You cannot achieve that by comparing different cards that have different specs. Using the same card and testing in both HW and SW T&L modes is the proper objective way of doing it.

You could test multiple cards, but you still need to test them in both modes.

Reply 69 of 103, by Hoping

User metadata
Rank Oldbie
eddman wrote on 2024-04-21, 20:06:

The aim is to figure out the effectiveness of HW T&L alone. You cannot achieve that by comparing different cards that have different specs. Using the same card and testing in both HW and SW T&L modes is the proper objective way of doing it.

You could test multiple cards, but you still need to test them in both modes.

Well, it sounds like an interesting experiment to find out what is gained and lost with and without HT&L.
Many meaningless experiments are done around here, just for the love of these things. 😉
Often they don't have much real use; they're done simply because it's interesting to do them...

Reply 70 of 103, by DrAnthony

User metadata
Rank Newbie
Hoping wrote on 2024-04-21, 20:44:
eddman wrote on 2024-04-21, 20:06:

The aim is to figure out the effectiveness of HW T&L alone. You cannot achieve that by comparing different cards that have different specs. Using the same card and testing in both HW and SW T&L modes is the proper objective way of doing it.

You could test multiple cards, but you still need to test them in both modes.

Well, it sounds like an interesting experiment to find out what is gained and lost with and without HT&L.
Many meaningless experiments are done around here, just for the love of these things. 😉
Often they don't have much real use; they're done simply because it's interesting to do them...

Well, that, and it's hard to say how much effort NV put into the software fallback on cards with hardware T&L. The comparison with the PowerVR card was useful in that regard, as they at least marketed it as a feature and may have put in some decent optimizations.

Reply 71 of 103, by Hoping

User metadata
Rank Oldbie
DrAnthony wrote on 2024-04-22, 12:26:

Well, that, and it's hard to say how much effort NV put into the software fallback on cards with hardware T&L. The comparison with the PowerVR card was useful in that regard, as they at least marketed it as a feature and may have put in some decent optimizations.

Maybe a starting point could be 3DMark99; if I'm not mistaken it uses DX6, so there would be no need to disable HT&L.
I have a Prophet 4000 (Kyro 1) and never paid much attention to it; I don't remember which computer it is in. The Voodoo 3 is in some computer with a K6-2, and I don't remember which one either. 😉
I think that if I had to choose a card for DX6 between the Voodoo 3 and the Kyro, I would choose the Kyro.
My problem is that I am not a fan of benchmarks; I simply look for whatever meets my needs, and I'd rather try the game I want to play, looking for the best experience for my tastes, than run benchmarks.
Anyway, someone who has a Voodoo 4 or 5 should look at this thread, though that seems unlikely to me.
And it seems that the reason the Voodoo 4 and 5 didn't have HT&L was simply that 3dfx was living in its own world. 😀

Reply 72 of 103, by Bruno128

User metadata
Rank Member
Hoping wrote on 2024-04-22, 14:03:

Anyway, someone who has a Voodoo 4 or 5 should look at this thread

3DMark99 on a V4 in a period-correct system. You are right, it's a DX6 test, so no T&L.

The attachment 3dm99-3698pts-v4-4500-2xAA-p3-866eb.jpg is no longer available
The attachment 3dm99-6404pts-v4-4500-noAA-p3-866eb.jpg is no longer available

SBEMU compatibility reports list | Navigation thread


Now playing:
Gold Rush: My VLB 486 (now with SC-55)
Baldur's Gate: Bridging compatibility gap in this year 2000 build
Arcanum: Acrylic 2003 build (January 2024)

Reply 73 of 103, by Hoping

User metadata
Rank Oldbie

Well, in this thread (3dmark99 MegaThread) there are many 3DMark99 results, and it seems that the GeForce 2 would be above the Voodoo 4 under the same conditions. That isn't strange either, since on paper, at least, it doubles the Voodoo 4 4500 in fill rate and memory bandwidth.
So even though the GeForce 2 GTS can't turn its on-paper capabilities into reality, I think brute force wins in the end.
If we trust the data on TechPowerUp, the GeForce 2 GTS was launched a few months before the Voodoo 4.
Edit: I'm not a fan of Nvidia hardware, but the reality is what it is.

Reply 74 of 103, by swaaye

User metadata
Rank l33t++

T&L-style tech wasn't super useful until Direct3D 8 anyway. It was very well marketed, though, and the tech review sites ate it up.

The other thing is that some games still favored Glide. Voodoo5 runs Unreal Engine 1.x games with great performance, quality and efficiency. A GeForce 2 does not.

And also, GeForce S3TC/DXT1 was low quality until GeForce 4 MX. GeForce 1-2 in particular have some of the lowest image quality of their contemporaries.

Last edited by swaaye on 2024-04-22, 15:53. Edited 8 times in total.

Reply 75 of 103, by Bruno128

User metadata
Rank Member
Hoping wrote on 2024-04-22, 15:18:

GeForce 2 would be above the Voodoo 4 under the same conditions

True. It is a superior product, and even the GF2 MX beats the V4 by a small margin, let alone the DDR versions.

SBEMU compatibility reports list | Navigation thread


Now playing:
Gold Rush: My VLB 486 (now with SC-55)
Baldur's Gate: Bridging compatibility gap in this year 2000 build
Arcanum: Acrylic 2003 build (January 2024)

Reply 76 of 103, by eddman

User metadata
Rank Member

3DMark and other synthetic benchmarks are useless. Only games should be used for testing, and they must have both SW and HW T&L options.

Reply 77 of 103, by Dothan Burger

User metadata
Rank Member

Doesn't Midtown Madness 2 let you toggle T&L?

Reply 78 of 103, by The Serpent Rider

User metadata
Rank l33t++
swaaye wrote on 2024-04-22, 15:41:

Voodoo5 runs Unreal Engine 1.x games with great performance, quality and efficiency. A GeForce 2 does not.

Mostly irrelevant with OpenGL API.

And also, GeForce S3TC/DXT1 was low quality until GeForce 4 MX.

Also irrelevant, because all affected cards have a workaround for that.
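For the curious: the usual form of that workaround was to upload the problem textures as DXT3/DXT5 instead of DXT1, since DXT1 was the specific format those chips decoded poorly. A rough OpenGL sketch of the idea, assuming the ARB_texture_compression and EXT_texture_compression_s3tc extensions are exposed (not any particular game's code):

// Sketch: sidestep the GeForce 1/2 DXT1 banding by asking the driver to
// compress as DXT3 instead. Input here is plain RGBA8 pixel data.
#include <GL/gl.h>
#include <GL/glext.h>

void UploadCompressedTexture(const unsigned char* rgba, int width, int height,
                             bool avoidDxt1Banding)
{
    // DXT1 uses half the memory of DXT3, but on the affected GeForces DXT1
    // was the format that produced visible banding; DXT3 did not show it.
    const GLenum internalFormat = avoidDxt1Banding
        ? GL_COMPRESSED_RGBA_S3TC_DXT3_EXT
        : GL_COMPRESSED_RGB_S3TC_DXT1_EXT;

    glTexImage2D(GL_TEXTURE_2D, 0, internalFormat, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
}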

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 79 of 103, by leileilol

User metadata
Rank l33t++
eddman wrote on 2024-04-22, 15:48:

3DMark and other synthetic benchmarks are useless. Only games should be used for testing, and they must have both SW and HW T&L options.

Unless those games get screwed by driver regressions, like CVAs not being detected in Quake 3, leading to nVidia's big lead over Radeons.
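(For context, CVAs are the old GL_EXT_compiled_vertex_array extension, which Quake 3 uses via r_ext_compiled_vertex_array: lock the vertex arrays once so the driver can transform and cache them a single time across multiple passes. A rough sketch of the idea follows; it's illustrative, not id's actual code, and extension/function-pointer setup is elided.)

// Sketch: multi-pass drawing with compiled vertex arrays.
#include <GL/gl.h>
#include <GL/glext.h>

// Fetched at runtime via wglGetProcAddress/glXGetProcAddress (not shown).
static PFNGLLOCKARRAYSEXTPROC   pglLockArraysEXT   = nullptr;
static PFNGLUNLOCKARRAYSEXTPROC pglUnlockArraysEXT = nullptr;

void DrawMultipass(const float* xyz, const unsigned short* indices,
                   int numVerts, int numIndices, int numPasses)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, xyz);

    if (pglLockArraysEXT)
        pglLockArraysEXT(0, numVerts);   // promise: vertex data stays unchanged

    for (int pass = 0; pass < numPasses; ++pass) {
        // ...bind texture / set blend state for this lightmap or shader pass...
        glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_SHORT, indices);
    }

    if (pglUnlockArraysEXT)
        pglUnlockArraysEXT();

    glDisableClientState(GL_VERTEX_ARRAY);
}

If the game stops detecting the extension, every pass goes back through full per-vertex work, which is exactly the kind of regression that skews a game-based benchmark.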

The Serpent Rider wrote on 2024-04-22, 21:10:

Mostly irrelevant with OpenGL API.

OpenGLDrv wasn't officially feature complete. Also, the OpenGLDrv shipping with Deus Ex had some severe graphical glitches with the GeForces, like *everyone* getting shades. In 2000 a GeForce had to use the D3DDrv and deal with the detail overdraw. Dohnal's and Kentie's drivers didn't exist for years.

apsosig.png
long live PCem