VOGONS


Reply 140 of 230, by myne

User metadata
Rank Oldbie

About on par with a geforce2/3
https://www.tomshardware.com/reviews/geforce3 … nce,311-16.html

I built:
Convert old ASUS ASC boardviews to KICAD PCB!
Re: A comprehensive guide to install and play MechWarrior 2 on new versions on Windows.
Dos+Windows 3.11 auto-install iso template (for vmware)
Script to backup Win9x\ME drivers from a working install
Re: The thing no one asked for: KICAD 440bx reference schematic

Reply 141 of 230, by Hoping

User metadata
Rank Oldbie

The GPU of Atom processors with XP support seems quite capable for games up to about 2002 at best, although compatibility is sometimes trial and error. I don't consider any Atom capable of running all DX8/9 games correctly, let alone with the integrated GPU. With the integrated GPU I would limit it to DX7 and below; OpenGL support seems to be pretty good in general.
Any AMD APU will outperform an Atom with its own GPU.
Adding a dedicated GPU to an Atom is not very useful either, because of the Atom's low IPC and because the PCIe slots on Atom motherboards are commonly limited to x1.
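For a rough sense of why the x1 limit hurts, some back-of-envelope numbers (mine, assuming PCIe 1.x signaling, which is typical for those boards):

# Rough per-direction bandwidth of a PCIe 1.x link: ~250 MB/s per lane
# after 8b/10b encoding overhead (PCIe 2.0 doubles this per lane).
PER_LANE_MB_S = 250
for lanes in (1, 4, 16):
    print(f"x{lanes}: ~{PER_LANE_MB_S * lanes} MB/s")
# x1: ~250 MB/s, x4: ~1000 MB/s, x16: ~4000 MB/s -- an x1 slot gives a
# desktop card 1/16th of the host bandwidth it was designed around.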
In this thread, tests were done with Atom processors:
Gaming on my Intel Atom

Reply 142 of 230, by gerwin

User metadata
Rank l33t
DaveDDS wrote on 2024-08-14, 11:02:

Attached a screen-grab of the final result - funny, I ran it with the options it showed when it started (didn't change anything) but it says "The benchmark was not run using default settings."

The default 3DMark2001 resolution is 1024x768, but AFAIK that netbook has a 1024x600 screen. (There is an optional tool that allows a scrolled/panning desktop 768 pixels in height.)
So you get a slightly higher framerate and final score compared to what you would have gotten at the default resolution.
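To put numbers on it (my arithmetic, assuming a purely fill-rate-bound load):

# Pixel counts at the default vs. the netbook resolution.
default_px = 1024 * 768   # 786432 pixels
netbook_px = 1024 * 600   # 614400 pixels
print(f"{netbook_px / default_px:.0%} of the default pixel load")  # 78%
# Fill-rate-bound scenes can therefore run up to ~1.28x (768/600) faster
# at 1024x600, inflating the framerate and the final score.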

--> ISA Soundcard Overview // Doom MBF 2.04 // SetMul

Reply 143 of 230, by BitWrangler

User metadata
Rank l33t++

The ITX Intel Johnstown Atom N270 board has a single PCI slot. I have an HD 2400 PCI, an FX6200 PCI and a 9400GT that I'm gonna try in it sometime. The chipset graphics in that one I'd only consider good for DX7 stuff really; maybe sims and strategy stuff are okay at the newer end.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 144 of 230, by DaveDDS

User metadata
Rank Member

A little better score on the ZBOX - this was with a 1024x768 display.

Keep in mind that this was running on Win7-32 instead of XP.

I don't think I saw an FPS below 90 with many up to 200-300 (some even 400+)

Dave ::: https://dunfield.themindfactory.com ::: "Daves Old Computers"->Personal


Reply 145 of 230, by Hoping

User metadata
Rank Oldbie

The Celeron 2957U is Haswell architecture, and although it may sound ridiculous, you cannot compare an Atom with a Celeron: Atoms are designed for the lowest possible power consumption, while Celerons use the same architecture as their bigger brothers, just heavily cut down. If I'm not mistaken, of course.
I tested an Atom D525 with a dedicated GPU a while ago, and it was very obvious that the processor, or the motherboard overall, was not up to par with the GPU. Even with a low-end GPU, an HD 5450, GPU usage was always very low.
So, for me, not an option for DX8/9.

Reply 146 of 230, by oldhighgerman

User metadata
Rank Member

The recent Celerons, and perhaps to some extent the earliest ones (which were a pared-down Pentium II or some such), are pretty capable for light use, for what they are. I have an Asus laptop, something I toss in my backpack, that was $100 on a Black Friday sale two years ago, I guess. Two cores, no hyper-threading. I also posted a thread about an appliance I found on eBay the other day, pretty cheap, that has an earlier Celeron, but four cores. Generally you won't see the benefit of the additional cores, but that ain't too shabby.

The Atom is strictly an embedded chip as far as I'm concerned. My old Asus Eee PC 900A was an alright, albeit tiny, little notebook. Atom based, 32-bit. Mine was absolutely beat to snot, but still worked. I may have tossed it, as no one wanted anything to do with it.

My Atomic Pi's, wherever they are, also have Atoms. For what it is, it's fine. I think the damned things even run Windows 10. It was made to control a robot, so performance was not an issue. But Atoms and Celerons are wide apart in terms of capability.

When I needed a cheap unit to finish a course I was taking 10 years ago, I got an Asus X205T, Atom based. It was fine at the time and did everything I needed it to do. Again, a C-note on a Thanksgiving sale. I gave it to someone who quickly installed some Linux on it; six months later he told me it was a POC. Atoms, and most low-end AMD chips, not that there are many of them around in 2024 (new), are to be avoided.

Reply 147 of 230, by BitWrangler

User metadata
Rank l33t++

The generalisations would work if Intel were anywhere near consistent with their branding. The situation of the past decade or so is that sometimes Celeron- and Pentium-branded models are on an Atom architecture, sometimes they are on a *bridge/*lake Core i architecture. In the mobile field there are a few Celerons that are pretty much the same speed as the 2-core U-series i5s, which I guess means that if you build down to a price or down to a power target, it comes out about the same. The early Atom cores seemed to go clock-for-clock with the early Dothan/Core 1 cores on integer stuff, but maybe had a trimmed-back FPU. Then by the four-core units vs the Core 2, they'd dropped back to about 50%, meaning a 4-core Atom at 2.4GHz was about the same as a Core 2 Duo at 2.4GHz as long as the app was well threaded. Otherwise, SOL. Celerons have been on-and-off terrible and pretty good, crippled vs crippled in ways that don't turn out to matter much.
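To make the 50% figure concrete, a crude aggregate-throughput model (my numbers, for illustration only):

# Core count times per-core speed, normalised so one Core 2 core = 1.0.
def aggregate(cores, per_core):
    return cores * per_core

print(aggregate(2, 1.0))  # Core 2 Duo at 2.4GHz  -> 2.0
print(aggregate(4, 0.5))  # 4-core Atom at 2.4GHz -> 2.0, IF well threaded
# Single-threaded, the Atom still only delivers 0.5 of a Core 2 core --
# the "otherwise, SOL" case.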

Anyway, yeah, you don't get good single-core power on the many-core units, as would be ideal for XP, and clocks didn't really ramp much. So the earlier singles and duals around 2GHz might be P4 3GHz-ish, but then single-core perf kinda stays there for ages while they drop power and add cores.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 148 of 230, by Hoping

User metadata
Rank Oldbie

The Pentium M's, both Banias and Dothan, were far superior to the P4's.
I had no idea about Intel's name-switching back and forth between Celeron, Pentium and Atom.
I would say the first generations of the Atom were at a similar level to a Pentium III. The N270 was used with XP or Linux and was not very fast in XP; I have an Acer Aspire One ZG5 that uses an N270 with 1.5GB RAM and XP, and it is only relatively fluid from my point of view. The N450 is a little better and feels smoother in XP, although that may be influenced by the fact that I tested it with 2GB RAM. The D525 in XP does not improve much; the frequency is only 1.8GHz and, as already said, multicore does not offer a great benefit in XP without single-core performance.
They are fun curiosities for passing the time, but their usefulness is limited to software much, much older than those processors, and they certainly can't compete with the computers mentioned above.

Reply 149 of 230, by oldhighgerman

User metadata
Rank Member

Well, this much I'll say for Celerons (as opposed to low-end AMD crappola): if you need a cheap CPU to get up and running, Intel by far has what you need. I mean $22, in a good economy anyway. For a while, late-model Celerons were going for $50-60 US on the street.

I have a 7-year-old laptop with an AMD A4-7210. In 2024 it's still the fastest laptop I own 🤣. It has a mediocre or a bit better 17.3" screen, and for what it is, it's fine. But I have to believe an equivalent Pentium chip would have made it better (it wasn't offered; only Core series in the Intel versions). It was just under $300 to my door. No complaints, how could I have any? But every other AMD (low-end) CPU I encountered was crappola. And you'll pay twice as much as for a Celeron.

I'm not a fanboy, I'm just practical. That's why, in this present space-time continuum, the all-around best bang for the buck is Intel. I'm done paying big bucketoos for computers; learned my lesson long ago. The fastest chip I have (other than Xeons) is an i5-11400F. Now it would be nice to cram a fast i9 into some crate. Eventually.

Reply 150 of 230, by Ozzuneoj

User metadata
Rank l33t

This is slightly OT at this point, but because there was some discussion earlier in the thread about which GPUs are good for tiny XP computers, I felt this would be relevant.

https://www.realworldtech.com/tile-based-rast … on-nvidia-gpus/

Apparently this wasn't something that Nvidia advertised, and I don't believe the tech/graphics community has talked about it much, but the Maxwell architecture was confirmed, via an investigation by Real World Tech back in 2016, to use tile-based rasterization. This would explain why Nvidia's GPUs from that point on saw such large improvements in efficiency (both in power and memory bandwidth) compared to previous generations or AMD's competing architectures.

I believe that would make Maxwell the first functional implementation of this method of rendering on the desktop since STMicro's Kyro2 back in 2001. Tile-based rendering allowed the Kyro2 to compete with cards as fast as the GeForce 2 GTS and Ultra in some situations, despite having half the memory bandwidth (or less). Of course, other things held it back most of the time, but it was what allowed it to compete at all. So seeing similar benefits on Maxwell-based cards should be expected.
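For anyone curious what "tile-based" means in practice, here is a minimal sketch of the binning idea (the general concept only, not Nvidia's or PowerVR's actual implementation):

# Sort triangles into screen tiles first; each tile is then rasterized
# entirely in on-chip memory, so framebuffer traffic to DRAM happens once
# per tile instead of triangle by triangle.
TILE = 16  # hypothetical tile size in pixels

def bin_triangles(triangles):
    """triangles: list of ((x0, y0), (x1, y1), (x2, y2)) in screen space."""
    bins = {}  # (tile_x, tile_y) -> triangles whose bounding box touches it
    for tri in triangles:
        xs = [p[0] for p in tri]
        ys = [p[1] for p in tri]
        # Conservative test: every tile overlapped by the bounding box.
        for ty in range(int(min(ys)) // TILE, int(max(ys)) // TILE + 1):
            for tx in range(int(min(xs)) // TILE, int(max(xs)) // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins

print(bin_triangles([((0, 0), (40, 5), (5, 40))]))  # touches tiles (0..2, 0..2)

That saved framebuffer traffic is exactly what a memory-starved card needs, which fits both the Kyro2 story and Maxwell's efficiency jump.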

Now for some blitting from the back buffer.

Reply 151 of 230, by The Serpent Rider

User metadata
Rank l33t++

Got the cards. So far the K1200 consumes the same 22W as the K620? Huh. Not sure I can trust GPU-Z on this one, although it does indeed have a lower clock and voltage than the K620. Anyway, it's about 33% faster in Heaven (1175 score) and F.E.A.R. (96 average vs 126 average).

Now, the W4100 is about the same speed as the K620 when overclocked to 820MHz (the unofficial MSI Afterburner limit) and without touching the memory clock: tied in F.E.A.R. at an average of 96 fps, while its Heaven score of 711 is the lower of the two, due to tessellation. Keep in mind that GCN 1.0 easily scales beyond 1GHz, but in this particular case that would require BIOS modding. The card runs at 1.050V and has a similar number of active transistors, so it probably consumes about 30W realistically.

Last edited by The Serpent Rider on 2024-08-17, 21:02. Edited 1 time in total.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 152 of 230, by Hoping

User metadata
Rank Oldbie
The Serpent Rider wrote on 2024-08-17, 19:58:

Got the cards. So far K1200 consumes the same 22W as K620? Huh. Not sure I can trust GPU-Z on this one, although it does indeed have a lower clock and voltage than K620. Anyway, it's almost twice as fast in Heaven (1175 score), but the F.E.A.R. result is less impressive (96 average vs 126 average).

Looking forward to hearing what you find out about the performance of those two cards in XP.

Reply 153 of 230, by The Serpent Rider

User metadata
Rank l33t++

Well, the W4100 is out of the question for now. It doesn't officially have Windows XP drivers, and I don't know which string in the inf file needs to be edited, as with the W4300. The R7 250E should have official support, though.
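For anyone who wants to try anyway, the usual inf mod is just adding the card's hardware ID under an existing device section. A hypothetical sketch (the filename, section name and DEV value are all placeholders, since I don't know the W4100's real string; the real hardware ID is shown in Device Manager under "Hardware Ids"):

# Insert a new model line below a device section of the XP driver's inf.
SECTION = "[ATI.Mfg.NTx86]"  # placeholder section header
NEW = '"AMD FirePro W4100" = ati2mtag_Section, PCI\\VEN_1002&DEV_XXXX\n'

with open("CX_placeholder.inf") as f:
    lines = f.readlines()
out = []
for line in lines:
    out.append(line)
    if line.strip() == SECTION:  # add right below the section header
        out.append(NEW)
with open("CX_placeholder.inf", "w") as f:
    f.writelines(out)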

Overall I'm not very satisfied with the K1200, because the power limit on this card can only be lowered to 85%, which the card NEVER reaches under normal circumstances anyway. And the Maxwell BIOS is very locked down. The card reaches 78°C, which would be very bad in a thin client.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 154 of 230, by Ozzuneoj

User metadata
Rank l33t
The Serpent Rider wrote on 2024-08-17, 22:37:

Well, W4100 is out of the question for now. It doesn't have Windows XP drivers officially and I don't know which string in inf file needs to be edited, like with W4300. R7 250E should have official support though.

Overall I'm not very satisfied with K1200, because power limit on this card can be lowered only down to 85%, which this card NEVER reaches under normal circumstances anyway. And Maxwell bios is very locked. The card reaches 78 C, which would be very bad in a thin client.

I wonder if the GTX 745 would allow more tweaking? It is a 4GB K620, without the Quadro branding.

Regardless, I can't imagine the GCN 1.0 cards consuming less power or putting out less heat for the same performance level when crammed into a tiny box. And even these should all be miles better than a low profile GTS 450 with regard to heat and power.

Why not just cram them into your Thin Clients and test them out? 😁

It is possible that TCs of this size just don't have the cooling capacity to handle another 20-30 watts of heat from a dedicated GPU. Gaming loads may already push the system close to the limits of what it was designed for, even before adding another card to the mix.

EDIT: Also, have you tried Maxwell Bios Tweaker? It should work even on these Quadros. You may be able to just flat out adjust the power limit directly that way to make a lower wattage card that drops right into a system without any fear of it running too hot.
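If the tweaker route works, the flash itself is typically just (from memory, so double-check the switches for your nvflash build, and keep a backup):

nvflash --save backup.rom    (dump the card's current BIOS first)
nvflash -6 modified.rom      (flash the edited BIOS; -6 overrides the ID-mismatch prompt)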

Now for some blitting from the back buffer.

Reply 155 of 230, by Hoping

User metadata
Oldbie

AMD had nothing equivalent to Nvidia's options in this case, i.e. performance, low profile and XP drivers, and I have many more AMD graphics cards than Nvidia ones.
But here I see no choice on AMD's side; the latest low-profile AMD graphics cards with XP support are a far cry in performance from Nvidia's options. And worse, the few that are available are very expensive for their performance, unless I missed a model. Unless someone manages to modify the drivers for the FirePro cards, but even then the price/performance ratio would be worse, I think.

Reply 156 of 230, by The Serpent Rider

User metadata
Rank l33t++

I haven't tried modified Nvflash yet. Since GTX 745 is a desktop card, it won't have any locks.

Hoping wrote on 2024-08-18, 08:56:

the latest low profile AMD graphics cards with XP support

Like I said, R7 250E has official support. Only FirePro analog does not have it.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 157 of 230, by oldhighgerman

User metadata
Member

My AMD Geode dev system. I thought it was buried in the garage; found it in the hallway closet.
Pretty cute, no? NFS.

The attachment IMG_20240818_111258420.jpg is no longer available

Reply 158 of 230, by Ozzuneoj

User metadata
Rank l33t
The Serpent Rider wrote on 2024-08-18, 09:33:

I haven't tried modified Nvflash yet. Since GTX 745 is a desktop card, it won't have any locks.

It seems like using Maxwell Bios Tweaker + nvflash should solve the issue of voltage control, since others have used them to massively overclock K620 and K1200 cards. From what I have seen, there are guides to bypass any issues that prevent flashing a modified BIOS on GeForce or Quadro cards.

The Serpent Rider wrote on 2024-08-18, 09:33:
Hoping wrote on 2024-08-18, 08:56:

the latest low profile AMD graphics cards with XP support

Like I said, R7 250E has official support. Only FirePro analog does not have it.

As mentioned before, the R7 250E is just an HD 7750. I can't find any reports online of there being any difference between them.

Now for some blitting from the back buffer.

Reply 159 of 230, by Hoping

User metadata
Oldbie

The R7 250, on paper, seems to be quite a bit worse than the K620, and let's not even mention the 750 Ti, which would make even more of a difference. Around here the low-profile R7 250 is not very common from what I've seen, and its price tends to be higher than the K620's, so it's not a logical choice.
And from what I've read in some threads on this forum, the AMD GCN cards don't seem to have very suitable drivers for XP.
Drivers have been a problem for AMD for years; it seems they inherited ATI's problem: everything is fine as long as you don't ask too much in areas for which they were not intended. For me the clearest current example is the RX problems with DX11. I have an RX 6700 XT, and I've learned that if a game uses DX11 it's better to use another computer with another graphics card, for example an RX 580 or even an RX 470, and Windows 7.
The GCN architecture was, I think, very good at the beginning, but it didn't develop well over the years. The HD 7970 had high power consumption but showed very good performance at the time; all the GCN2 and GCN3 cards, though, were pretty weak in power consumption/performance. GCN4 improved, I think, but I don't believe those have XP support.
That's why there don't seem to be any interesting low-profile options: high power consumption needs a big heatsink and a more complex VRM, simple as that.