VOGONS


SFF CRT "DOS/9x/XP" sleeper time machine

Reply 20 of 38, by mgtroyas


Okay, after spending too many days reinstalling everything and then troubleshooting, I've found the cause of the palette corruption in DirectDraw / early DirectX games on this Windows 7 build... not a Windows update, not a DirectX update... it was the program Fences! I'd bet it redraws itself when these games switch resolution on startup, and corrupts the palette afterwards until you reboot. Now I can finally go to sleep.

Reply 21 of 38, by mgtroyas


I finally received the Radeon R7 250 (2GB GDDR3).

Installing the drivers is a little tricky, as Phil explained in his YouTube video. You must install the 14-4-xp32-64-dd-ccc-pack2, but the driver install will fail. You must then install the drivers manually, selecting the right model.

I also had problems when connecting my CRT to the DVI-I port of the R7 250. All resolutions were capped at 60Hz, no matter what I tried. I thought it was a BIOS limitation, but fortunately it was something different, and fixable: the card failed to read the EDID from my monitor and played it safe. So I had to disable automatic detection in Catalyst Control Center and manually set 1280x1024 as the max resolution and 120Hz as the max refresh rate. Then it honored my CRT drivers and worked as expected. Just remember to tick "don't show not supported modes" in the advanced Display settings, or you'll boot to a blank CRT with an "out of range" error message. Well, it'll probably happen to you anyway, as it did to me, no idea why, so you'd better have a DisplayPort cable or adapter at hand to connect a modern monitor and change the settings again.

Performance-wise, the R7 250 is unfortunately similar to the HD 7770 I already had (but I needed the DVI port). Going to the extreme, it struggles with Crysis at 1152x864 and 4x FSAA, while the GTX 750 Ti was very playable at those settings. The GTX, being more recent, has many more shaders, and with its GDDR5 memory it can move games from 2012 and later with ease. OTOH, as we know, Nvidia drivers broke compatibility with DirectX 6; that's why I'm testing the AMD options, otherwise that beast would be a perfect GPU for a Windows XP SFF build...

I went to the Catalyst Control Center, AMD Overdrive section, and pushed the GPU core, RAM and power limit to the max. That gave a great performance increase (15-20%?). In Crysis I disabled FSAA and now it's playable, and much snappier if I lower Shaders from High to Medium. Overlord also runs fantastically; on the HD 7770 it struggled. I don't know if it's only the overclocking or if this card is somewhat faster in some games.

The good news is that DirectX 6 compatibility is perfect; all the problematic games (Colin McRae Rally, Dark Forces II, AVP, etc.) now run without any problem. That was the goal... and it worked!

Nvidia drivers let you create custom resolutions with ease, but that's not the case with CCC. The trick of editing the "DALNonStandardModesBCD1" value in the registry worked, though. I added 640x400@70Hz for DOSBox gaming, plus 640x480@120 and 800x600@100, although those were perhaps not necessary.
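For anyone scripting this instead of hand-editing regedit, here's a minimal Python sketch of the 8-bytes-per-mode, BCD-digits layout that community documentation commonly reports for "DALNonStandardModesBCD1". The layout is an assumption from that lore, not anything AMD documents, so compare the output against an existing entry on your own system first:

    # Hedged sketch: pack one display mode into the 8-byte BCD layout
    # commonly reported for DALNonStandardModesBCD1 (width, height,
    # a zero/bpp pair, refresh; two decimal digits per byte).
    # This layout is an assumption, not official AMD documentation.
    def bcd(n, nbytes=2):
        digits = str(n).rjust(nbytes * 2, "0")
        return bytes(int(digits[i]) << 4 | int(digits[i + 1])
                     for i in range(0, len(digits), 2))

    def mode_entry(width, height, refresh):
        return bcd(width) + bcd(height) + b"\x00\x00" + bcd(refresh)

    # 640x400@70 -> "06 40 04 00 00 00 00 70"
    print(" ".join("%02x" % b for b in mode_entry(640, 400, 70)))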

The card is almost silent at idle; it runs the fan at 25%, not 35% like the HD 7770 did. Still, it starts ramping up too soon, making it noisy under not-so-intense loads. The solution was the same for both the HD 7770 and the R7 250:
- Dump the BIOS using the GPU-Z feature.
- Edit the BIOS with the VBE 7.0 tool. Set 20% for <50ºC instead of 25% for <40ºC; the fan is now much quieter, and temperatures while playing shouldn't be a real problem.
- Flash the BIOS back with the MS-DOS version of AMDVBFlash, using a USB floppy drive and a bootable floppy.

Now my XP build is 99.9999% compatible with Windows games from Windows 95 to pre-Steam Windows 7, yay! And I can play anything from the MS-DOS era via DOSBox-ECE. That makes about 30 years of gaming available on a single build, SFF-sized, silent, with low power consumption, on a CRT... just fabulous!

Reply 22 of 38, by mgtroyas


I've been fighting to get my Radeon HD 7750 SFF working with my VGA monitor, as it's the fastest SFF Radeon with XP support. The problem is it only has mini-DisplayPort interfaces (there's another model, but it's also digital-only, with two DVI-D ports), so I must use an HDMI/DP to VGA adapter. In this case the custom resolutions method is not valid, so I was stuck at a 60Hz refresh rate for all resolutions, a limitation of the EDID/DDC table of the converter.

But... I have found a solution: there's a tool for FirePro cards (the professional versions of the Radeons) called AMD Custom Timing Tool that allows creating custom resolutions and blindly bypasses/overrides the EDID/DDC. The problem is it only works with FirePro cards, and I have a Radeon. But a user on this thread hacked it to work on any Radeon card; it can be downloaded here.

Since my last post I also found a 19" Philips 109B6 CRT, capable of 1600x1200@75Hz, and I found all my VGA converters fell short: they're designed only for 1920x1080@60Hz. So I researched, and there's one VGA converter, the StarTech DP2VGAHD20, made for 4K@60Hz, that allows higher refresh rates. I got it, and it works in combination with the previous tool! For the adventurers: just reset the Radeon driver configuration using this old DDU version (still compatible with XP and the Radeon HD 7750), and use this timing calculator to enter the "Detailed Timing" for the modes the tool won't accept in "Basic Timing" mode. The key, after much researching, I found out is:

"Total" = "H Total" or "V Total"
"Display" = "H Active" or "V Active"
"Sync Start" = "H Active"+"H Front Porch" or "V Active"+"V Front Porch"
"Sync Width" = "H Sync" or "V Sync"

Here's my SFF Windows XP desktop at a whopping 2048x1536 resolution!

[photos: the SFF build and the Windows XP desktop on the CRT]

Last edited by mgtroyas on 2024-03-16, 21:43. Edited 1 time in total.

Reply 23 of 38, by bZbZbZ


Wow, nice find! Thanks for the tip regarding the Startech DP2VGAHD20, I'm sure that will be handy for people who come across this thread in the future.

What refresh rate does your monitor hit at 2048x1560? Both of my 19" CRTs run 1600x1200 @ 75Hz and 1280x1024 @ 85Hz. I actually prefer the latter, as 85Hz feels noticeably smoother to me than 75Hz.

Reply 24 of 38, by mgtroyas


Thanks, yep, I spent a lot of time researching, so I wanted to share it; there are probably more use cases where it'll be helpful.

At 2048x1536 I can only get 60Hz. At 1920x1440, 64Hz is the max, as stated in the user manual. It's simple once you understand it: the monitor's maximum horizontal frequency is the hard limit. For my Philips that's 96kHz, and using the calculator I linked, 1920x1440@64 corresponds to 95.655kHz.
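The arithmetic is just refresh rate times total scanlines. A quick sketch (the Vtotal of 1494 is an assumed CVT-style blanking figure, not read from the monitor's EDID):

    # Horizontal frequency = vertical refresh x total scanlines
    # (active lines plus vertical blanking).
    def hfreq_khz(v_total_lines, refresh_hz):
        return v_total_lines * refresh_hz / 1000.0

    print(hfreq_khz(1494, 64))  # ~95.6 kHz, right at the Philips' 96 kHz limit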

I also noticed 85Hz modes are softer than 75Hz ones. I suppose that's better for fullscreen gaming, as it's free analog antialiasing, but worse for desktop use; text appears blurrier. I still have to compare them calmly.

Anyway, playing a game like Quake 4 at 1600x1200 with that kind of analog softening blew my mind; it's the most beautiful image quality I've ever experienced, looking just "natural" instead of "pixelated". In fact, you can find lots of videos on YouTube of people playing modern games on high-end CRTs like the Sony GDM-FW900 and praising it, and I couldn't agree more.

Reply 25 of 38, by mgtroyas


Well, I've dug deeper into the rabbit hole and made more interesting findings. One thing the AMD Custom Timing Tool doesn't allow is defining a resolution below 640x480. I like defining a 640x400@70Hz mode because it lets 320x200 DOS games run under DOSBox exactly as on the original hardware, with an advantage:

  • Using either 640x480 or 640x400 you can disable aspect ratio correction in DOSBox and stretch the image to fill the screen using the CRT monitor adjustments, just as we did back in the day (see the config sketch after this list). DOSBox, not finding a 640x400 resolution available on your system, will choose 640x480 as the closest one and use it.
  • But if you have both, you can set 70Hz as the forced refresh rate for 640x400 in RefreshLock, as this use case is the only one using that mode on modern systems, and set 640x480 to your preferred refresh rate (like 120Hz) for any other program. If you only have a 640x480@70Hz mode, you must switch it manually in RefreshLock each time you use DOSBox and restore it afterward. And in games that use both modes you're screwed.
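As a reference for the first option, a minimal dosbox.conf sketch for classic DOSBox 0.74 (key names from its stock config; adjust fullresolution to whichever mode your system actually exposes):

    [sdl]
    fullscreen=true
    fullresolution=640x400
    output=surface

    [render]
    # Disable aspect correction and stretch on the CRT itself instead.
    aspect=false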

So, how do you define a mode the Custom Timing Tool doesn't like? Well, I've discovered that this tool is exactly what AMD later integrated into their Crimson drivers to let us define custom resolutions on later versions of Windows. I monitored the tool with Process Monitor from the Sysinternals package for Windows XP while defining a custom resolution, and found out it writes to this registry path:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{F245CA71-FFDA-40CB-8EF8-69825ECC14A4}\0000\DAL2_DATA__2_0\Common\EDID_8526_F966

Where the video card identifier between "{}" is random, and the EDID numbers depend on your monitor/adapter. Inside, it creates a binary value called "ModeTimingOverrides_DP_Conn12563", where the last numbers also seem to be random... and this value is the same one the Crimson drivers use to save the custom resolutions you create on later versions of Windows. That's why the fields you fill in when defining a custom resolution are exactly the same in both interfaces.
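Since both the {GUID} and the value-name suffix vary per machine, a small Python (winreg) sketch can locate the value instead of hunting by hand. The key layout is as observed in Process Monitor above; treat it as an observation, not a documented AMD interface:

    # Walk HKLM\SYSTEM\...\Video\{GUID}\0000\DAL2_DATA__2_0\Common\EDID_*
    # and report every ModeTimingOverrides_* binary value found.
    import winreg

    BASE = r"SYSTEM\CurrentControlSet\Control\Video"

    def subkeys(key):
        i = 0
        while True:
            try:
                yield winreg.EnumKey(key, i)
            except OSError:
                return
            i += 1

    def find_timing_overrides():
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as video:
            for guid in subkeys(video):
                common = BASE + "\\" + guid + r"\0000\DAL2_DATA__2_0\Common"
                try:
                    ck = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, common)
                except OSError:
                    continue  # adapter entry without DAL2 data
                with ck:
                    for edid in subkeys(ck):
                        with winreg.OpenKey(ck, edid) as ek:
                            i = 0
                            while True:
                                try:
                                    name, data, _ = winreg.EnumValue(ek, i)
                                except OSError:
                                    break
                                if name.startswith("ModeTimingOverrides"):
                                    print(common + "\\" + edid, name, len(data))
                                i += 1

    find_timing_overrides()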

And luckily for us, some user on a forum worked out the format this data is stored in; it's hexadecimal, but little-endian:

[image: community diagram of the ModeTimingOverrides binary field layout]

So how can you define a 640x400@70Hz mode? You first define a 640x480@70Hz mode using the Custom Timing Tool, then edit the value with the registry editor. I found the bytes "E0 01", which read in reverse order ("01 E0") are 480 in hexadecimal; since I wanted 400, which is 190 in hex, I changed those "E0 01" bytes to "90 01". Rebooted the PC and... the mode is now available and shows in the Custom Timing Tool as 640x400@70Hz! Of course, the same method works for changing other parameters, like the refresh rate.
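The same byte surgery in Python, for the record. This mirrors the manual edit above: it locates the little-endian 16-bit value by searching for it, exactly as you would by eye in regedit (so it assumes 480 appears only once in the blob; check the hex dump first):

    # Replace the little-endian 16-bit vertical size 480 with 400.
    import struct

    def patch_vertical(blob, old=480, new=400):
        needle = struct.pack("<H", old)    # 480 -> b"\xe0\x01"
        repl = struct.pack("<H", new)      # 400 -> b"\x90\x01"
        pos = blob.find(needle)
        if pos < 0:
            raise ValueError("old value not found in blob")
        return blob[:pos] + repl + blob[pos + 2:]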

And so, for Windows XP on AMD video cards, we finally know the obscure and undocumented registry value ("DALNonStandardModesBCD1") that lets us define custom video modes on analog (VGA, DVI-I) interfaces, and the even more obscure and undocumented registry value that does the same on digital (DVI-D, DisplayPort, HDMI) interfaces.

Last edited by mgtroyas on 2024-10-14, 21:26. Edited 1 time in total.

Reply 26 of 38, by mgtroyas


Hi, I only wanted to note that the Radeon HD 7750 SFF (at least my PowerColor) accepts overclocking, and the performance increase is pretty noticeable. You can do it directly from the "AMD Overdrive" section in Catalyst Control Center (I use version 14.4, the latest available under XP):
- The GPU core can be upped from the stock 800MHz to 850. 900 locks my computer, but this parameter seems to matter little (perhaps it's already powerful enough for XP/Vista-era games).
- The power limit is much more interesting. I can raise it by 20% and the card still doesn't go much above 70ºC under high load, and the performance increase is pretty noticeable. The 7750's TDP is 55W and my card is configured at 45W, so no surprise it can cope. I haven't tested beyond that (CCC doesn't allow it without BIOS modification), but I don't feel the need to cook the card. Also, this setting seems to get "forgotten" by CCC from time to time, so you must go back into the settings and reapply it.
- The VRAM, being GDDR5, can be upped from the stock 1125MHz to 1250 (the default speed of the GTX 750) and 1375MHz (the default of the GTX 750 Ti; BIOS modding needed). It accepts higher clocks; 1500MHz boots but hangs pretty soon in-game. The speed increase is very noticeable at higher resolutions: it lets me play at a rock-solid 75Hz at 1600x1200 in NFS Most Wanted!

To mod the BIOS you can use the VBE 7.0 tool; to extract the BIOS from the card you can use GPU-Z; and to apply it after modifying, boot to DOS from a floppy and use the ATIFlash 4.11 tool.


Personally, I've changed the following:
- Lowered the default fan speed to 30% so it's completely quiet at idle (other models will probably need a different value; just play with it under CCC and listen to the difference).
- If the fan is too noisy at high temperatures (not my case), modify the fan ramps to kick in at higher temps.
- Set the default VRAM speed to 1250MHz (pretty safe) and the maximum to 1500 (to allow overclocking above 1250).
- Set the default GPU core clock to 850MHz (pretty safe).
- Set the power limit to 55W by default (so I don't have to reapply it each time it reverts by itself, and it allows playing with higher values in CCC).

With all these changes I'm not missing the GTX 750/750 Ti's higher performance, and I keep AMD's compatibility with older DirectX games (no visual artifacting like on recent Nvidia cards).

In a few weeks I'll receive a second HD 7750 and will try to enable CrossFire between them, just for fun: I've never done it, and there are possible blockers, like the Optiplex motherboard only wiring 4 PCIe lanes to the second x16 slot, the SFF power supply being pretty limited, and the SFF cards themselves lacking a CrossFire connector for a bridge cable. If it works at all, I'm curious what kind of performance gain (if any) it will offer.

Reply 27 of 38, by mgtroyas


I bought a VisionTek HD 7750 SFF, but it doesn't allow CrossFire, or even Overdrive; perhaps a BIOS problem. Anyway, two cards in this case would overheat in about 5 minutes. Perhaps I'll try again in the future.

What I also bought (this is becoming a bottomless pit of money) is a Club3D HD 7750 SFF with not only HDMI and DVI-D but... VGA output! I can now connect my 19" CRT to the video card without the DP2VGA adapter. The adapter worked very well but had two minor issues:
- Resolution changes take about 4-5 seconds instead of 1-2, noticeable in games that change resolution frequently (e.g. Prehistorik 2, Jazz Jackrabbit).
- From time to time there seems to be a sync problem between the adapter and this CRT, causing a 1-second blank image, although lately it only happened about once a day.
Image quality is otherwise the same, so the monitor is the "bottleneck"; both the adapter and the card's RAMDAC are good quality. That said, with the adapter(s) all these months I always experienced blurriness at 1280x960@85Hz and 1600x1200@75Hz, modes close to the monitor's maximum horizontal frequency, whereas with the direct VGA output the image is as crisp as in other modes. The RAMDAC is much better at high bandwidth... and so is the monitor; I'm really impressed with this Philips.

The Club3D is the 1GB GDDR5 model; the ones I previously bought were 2GB GDDR5 models. I've seen that even for late DX9 games like Dirt 3 and NFS Shift, 1GB of VRAM is just enough, so it shouldn't be a problem; and as a bonus, some games like Midtown Madness 1/2 and Motocross Madness 1/2 didn't like the 2GB of VRAM. Now I can run them in 3D-accelerated mode!

TBH, in its current state I cannot think of any game that has a compatibility problem with this build. That is a big satisfaction, and I recommend it to anyone who wants a practical/compact/cheap/silent one-build-for-all-games approach, but it's also a bit sad, as it seems like the end of the road for this hacking adventure... we'll see.

Reply 28 of 38, by mgtroyas


I just bought another 19" CRT, a Samsung SyncMaster 957MB, in sleek grey matching my Optiplex, for 10€ plus shipping. I was tempted by a 21" Nokia that reached 1600x1200@85Hz, but it was too big for my desk and sold for 120€.

[photos of the Samsung SyncMaster 957MB]

The SyncMaster has some advantages over my current Philips 109B6:
- It has BNC inputs, so I can use a short BNC-to-VGA cable for minimum distortion in high-bandwidth modes.
- It doesn't have a captive VGA cable, so I can use a shorter one and minimize distortion that way too; surprisingly, a 20cm VGA cord gives nearly as good a result as the shielded BNC cable.
- Better geometry; the image is linear almost to the edges, where it has some compression (a curved CRT tube behind flat glass will always have some distortion).
- Smoother image, and not only because of the shorter cable. At 1600x1200@75Hz it's bearable for desktop use. And 2048x1536@60Hz is gorgeous for fullscreen FPS gaming.
- More detailed blacks. The Philips crushed blacks, and if you cranked up the brightness to restore them, the whites were blown out.
- More advanced menu controls, like focus and linearity.
- It has a USB port; if connected to the PC, you can use software called MouScreen 2.0 to set up the monitor instead of the built-in menus and buttons.
- In my case, it's in near-mint condition; the Philips has some cracks on the case and a small scratch on the tube.

The cons:
- It doesn't accept 640x400@70Hz; it complains about too low a horizontal frequency. I've switched to 640x400@140Hz, and smooth scrolling is not too bad in DOSBox, with the plus that it's less straining on the eyes.
- It produces a high-frequency whine in some modes with some images, depending on the number of bright pixels displayed. I'm very sensitive to this and found it not too noticeable, so it's probably a minor issue for most people.
- It has a duller image. The MagicBright button has three positions; the second is just slightly too bright for my taste, and it resets every time you turn off the monitor. I've found a fix: the image is clearly undersaturated by default, so I increased the saturation in ATI Catalyst Control Center to 115% and now it has a crisper image, near the one I enjoyed on my Philips.

After some tweaking I'm very pleased with this monitor. The VGA/BNC connection makes it in effect a dual-input CRT. And it happens that my Optiplex can use the integrated Intel HD and the dedicated ATI Radeon video cards simultaneously if a VGA monitor is connected to each one, so I connected both to the CRT; effectively I have a "virtual" dual-monitor setup and can switch inputs from the monitor menu.

The Intel HD happens to be very compatible with old games that show artifacts or don't run at all on modern Radeons and GeForces, so for the few games that didn't run even on the Radeon (the GeForce is much more problematic) I can quickly make the Intel HD the primary monitor in Windows, launch the game fullscreen on it, and switch to it from the monitor menu, raising the compatibility of this setup to 99.999%.

The other use case is connecting my modern PC, via the DP2VGAHD adapter and a 2m cable, to the VGA input; image quality is acceptable, much better than when I was using a cheap VGA KVM.

Reply 29 of 38, by Joseph_Joestar

mgtroyas wrote on 2025-01-16, 09:01:

- It has a duller image. The MagicBright button has three positions; the second is just slightly too bright for my taste, and it resets every time you turn off the monitor. I've found a fix: the image is clearly undersaturated by default, so I increased the saturation in ATI Catalyst Control Center to 115% and now it has a crisper image, near the one I enjoyed on my Philips.

I think there may be an OSD option for adjusting RGB color intensity. I have that on my 795MB and it's set to 50 by default for all three colors. You can increase it slightly for better saturation.

By the way, I have a driver installation CD for Samsung monitors from that time, though I'm not sure if it supports that exact model. If you want, I can upload the disc image.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 30 of 38, by mgtroyas


Thank you, yes, I'm already aware of the custom color intensity settings. This 957MB resets to defaults with all three at 100%; I found it too greenish, as with my Philips, so I settled on 96 red, 86 green, 100 blue. I also set contrast to 100 (I don't see brights crushing like on the Philips) and brightness to 14, calibrated using QuickGamma. I know the brightness seems too low, but increasing it only washes out the colors; it doesn't increase the overall intensity of the image. Funnily enough, MagicBright clearly does, so it must have internal access to some tube parameters the normal OSD doesn't. In the second MagicBright position (Internet), with contrast 100 and brightness 0, the image is gorgeous, but a bit too bright for my taste. I wish it could be user-configurable and made the default on power-on.

Thank you for the CD offer; it's always interesting to archive these artifacts for future use. Luckily, as of today, googling around I've already found the manual and the official drivers (and slightly tweaked the max horizontal frequency by 1kHz to allow some higher refresh rates, like a fantastic 1280x960@95Hz), and I found a similar CD on archive.org with the MouScreen 2.0 tool, which worked on mine without problems. But these downloads could disappear at any moment.

Reply 31 of 38, by The Serpent Rider

mgtroyas wrote on 2024-09-09, 07:44:

- Power limit is much more interesting. I can upper it 20% and the card still doesn't go above 70ºC too much on high load, and performance increase is pretty noticeable. The 7750 TDP is 55W and my card is configured at 45W so no surprise it can cope. I haven't tested beyond (CCC doesn't allow without BIOS modification) but don't feel the need to cook the card. Also, this setting seems to be "forgotten" by CCC from time to time, so you must enter the settings and reapply it.

GCN 1.0's power limit was pretty simple: it does not regulate voltage, only core clocks, to fit into the target wattage. So simple core overclocking does not change overall heat dissipation much, because the relationship is linear, unlike core + voltage overclocking.
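To illustrate that point, a toy model of the described PowerTune behavior (an illustration of the linear claim, not AMD documentation): at fixed voltage, power draw scales roughly linearly with core clock, so the limiter only has to scale the clock down until the estimated draw fits the cap.

    # Toy model: linear clock-only throttling under a power cap.
    def effective_clock(set_clock_mhz, est_draw_w, limit_w):
        if est_draw_w <= limit_w:
            return set_clock_mhz
        return set_clock_mhz * (limit_w / est_draw_w)  # scale down linearly

    print(effective_clock(850, 52, 45))  # ~735 MHz under a 45 W cap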

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 32 of 38, by mgtroyas

The Serpent Rider wrote on 2025-01-16, 11:51:

GCN 1.0 cards power limit was pretty simple: it does not regulate voltage, only core clocks to fit into target wattage. So simple core overclocking does not change overall heat dissipation much, because it's linear, unlike core + voltage overclocking.

Interesting, but I don't get it: I can set the core clock, RAM clock, and power limit all independently in Catalyst Control Center or by editing the BIOS, so they shouldn't be correlated, should they?

Edit: ahh, OK, I just noticed the PowerTune tab, so the card drops to lower clock settings when it reaches the power consumption threshold. That's interesting! Thank you very much.

Hmm, so that makes that power control inefficient, and it'll downclock too easily, I assume.

Reply 33 of 38, by H3nrik V!

mgtroyas wrote on 2025-01-16, 09:01:

I just bought another 19" CRT, a Samsung SyncMaster 957MB, in sleek grey matching my Optiplex, for 10€ plus shipping. […]

10€? Holy smoke, that's a steal! Congrats!

If it's dual it's kind of cool ... 😎

--- GA586DX --- P2B-DS --- BP6 ---

Please use the "quote" option if asking questions to what I write - it will really up the chances of me noticing 😀

Reply 34 of 38, by mgtroyas


Yeah, here in Spain many local offerings are still at pre-"retro-investor" craze prices, which is why I'm getting some spares before that ends. But yeah, I had paid 25€ for the 19" Philips, which already seemed pretty cheap to me, but 10€ for the Samsung... clearly the owner considered it just bulky trash.

Reply 35 of 38, by H3nrik V!

mgtroyas wrote on 2025-01-17, 08:42:

Yeah, here in Spain many local offerings are still at pre-"retro-investor" craze prices, which is why I'm getting some spares before that ends. But yeah, I had paid 25€ for the 19" Philips, which already seemed pretty cheap to me, but 10€ for the Samsung... clearly the owner considered it just bulky trash.

Don't tell the sellers that, please 🤣 The last CRT I saw in Denmark was a 19" with a Trinitron tube, for the equivalent of around 140€. It sold within a few days.

If it's dual it's kind of cool ... 😎

--- GA586DX --- P2B-DS --- BP6 ---

Please use the "quote" option if asking questions to what I write - it will really up the chances of me noticing 😀

Reply 36 of 38, by mgtroyas


Oh, high-pedigree parts are also expensive here; anything with the word Sony or Trinitron sells at the prices you mention, adjusted to Spanish wage levels. But I love sailing against the current and getting value out of things that are similar in quality (or better) but not mainstream. Not paying a premium also makes experimentation easier.

Reply 37 of 38, by mgtroyas


I just wanted to share and document a new discovery (for me) that fixes some games that weren't rendering smoothly or at the correct speed on my build: RivaTuner Statistics Server (RTSS from now on), part of RivaTuner and later of MSI Afterburner, and still in use today. I started using it for the in-game OSD showing refresh rate, GPU clocks and other statistics, but it also has "Framerate Limit" and "Scanline Sync" features that work extremely well, as I'll try to explain.

- Scanline Sync is "G-Sync/FreeSync in software" (so perfect for any CRT): a way of getting vertical-sync smoothness, ideally without tearing, but without the additional lag and other problems real vsync can cause. It tries to refresh the image on screen during the period when the scanlines are not being drawn on the monitor. (The negative-offset arithmetic from the quote below is sketched right after these two points.)
User RealNC explained it perfectly:


For scanline sync to work correctly, the game needs to run vsync OFF. That means either just plain old vsync OFF or fast sync. (Fast sync will work because it runs games with vsync OFF.) OK, so now as to how to use it, you need to enter the scanline RTSS should try to sync the FPS limiter against. Using "0" is special and means "disable scanline sync." We'll call it "s-sync" from now on because I don't want to have to type the word "scanline" over and over again :p

The most useful way to use s-sync is to use a negative value. RTSS interprets this as an offset relative to the last scanline in your current monitor resolution. The goal of s-sync is to give you a way of syncing the FPS limiter against a scanline that is not visible. At 1080p, you have 1080 visible scanlines, plus a couple dozen more that are invisible. If you sync against an invisible one, that means tearing will be invisible. If the total amount of scanlines is 1130 for example, if you enter "-50", then you are trying to sync against scanline 1080, which is the last visible one. This is not perfect however, and in reality RTSS will be only able to sync against a scanline greater than the one you specified. So you need to experiment. -20, -30, -40, and so on, until you see the tearline disappear off the bottom of the screen.

The flush option makes s-sync more accurate. Without flushing, syncing can be very erratic and jittery. The tearing will probably not be located around the target scanline, but will jump up and down seemingly at random. Flush will make it more accurate. "1" is only available in DX 10, DX 11 and OpenGL games. It's very fast but can be less accurate. "2" also works in DX 9, DX 12 and Vulkan games, but is slower and requires a stronger GPU to work without stutter. If you use flush=1 in a DX9/12 or Vulkan game, it's treated as flush=2. Yes, you can do "1/2 s-sync", by clicking the "scanline sync" button until it says "scanline sync x/2". This will cap your FPS to half your refresh rate. The "x*2" setting will do the opposite, it will cap to double your refresh rate.

If your GPU is not fast enough, or the game is too heavy and you can't get the tearing to be hidden, you can instead use fast sync in the nvidia panel, and sync against scanline "1". This will have more input lag (because you're syncing against the first scanline instead of the last one), but this will allow fast sync to hide the tearline. If you try to sync against a higher scanline, input lag goes down, but the possibility of fast sync stutter will increase.

- Frame Limit, on the other hand, does exactly what it says: it limits the maximum rendering rate of a game, independently of the refresh rate of the configured video mode or the capabilities of the video card. It's crucial for games that have some of their internal calculations tied to the rendering rate.
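A tiny sketch of RealNC's negative-offset arithmetic, using his 1080p example (a Vtotal of 1130 is his illustrative figure, not a fixed constant):

    # A negative s-sync value is an offset from the last (total) scanline.
    def target_scanline(v_total, s_sync):
        return v_total + s_sync if s_sync < 0 else s_sync

    print(target_scanline(1130, -50))  # -> 1080, the last visible line at 1080p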

Some examples that benefit from these two features:

  • Carmageddon (Windows version): the in-game timer runs faster than real time at high refresh rates, so it runs out artificially soon. Pedestrian running speed is also tied to it, so the game becomes more difficult. It probably also fixes the known crashes on some particular levels, as I'm starting to think they're related to the collision physics also being tied to framerate: not noticeable in gameplay, but I suspect the calculations happen more frequently, and in some borderline cases, like collisions with small objects (both by the player and the AI drivers), some overflow happens and the game crashes, seemingly "randomly". These problems don't happen with Carmageddon's software renderer because the framerate is usually low, but with the Windows (DirectX) version, and probably also the 3dfx version (native or via a Glide wrapper), higher framerates are feasible. Limiting the framerate to 60 seems to make the game run perfectly and, combined with Scanline Sync, smooth as butter. A personal white whale of mine.
  • Quake 3 Arena: as stated here

    COM_MAXFPS is the maximum graphical framerate permitted. You can not use any MaxFPS value. The only valid values are those which are equal to (1000/x) where x is an integer. So for example your 125fps comes from (1000/8 = 125). If you try and set MaxFPS to 120, you will still get 125fps. This is because any invalid setting is rounded up to the next (1000/x).
    Some valid Values for MaxFPS:
    1000 / 07 = ~142
    1000 / 08 = 125
    1000 / 09 = ~111
    1000 / 10 = 100

    So in my case I wanted to play at 1600x1200@75Hz, but gameplay wasn't smooth. Even with vsync there was a periodic hiccup, and the cause is that the game tries to run at 77 FPS (1000/13), 2 above the refresh rate. In fact, with a custom video mode of 1600x1200@77Hz (which happens to be the maximum allowed by my Samsung CRT) the game was strangely smooth. Scanline Sync allows this at 75Hz or any other refresh rate we wish (see the sketch after this list).
  • Dungeon Keeper 2: this game has some minor visual effect speeds tied to framerate, like the text scrolling on the top status bar and the torch fire animation. Also probably the scrolling speed. And it's famous for being unstable on modern systems, with random crashes like Carmageddon... setting a 60 FPS Frame Limit solves all of them.
  • Need for Speed Underground 2 (and probably others in the series): this game's rendering rate tanks if you enable vsync. On my build it consistently reaches 100 FPS at 1600x1200@75Hz without vsync, but drops from 75 to 37 FPS now and then when vsync is enabled. Thankfully, Scanline Sync again allows no-vsync performance with vsync smoothness. This is also a personal achievement, as this game runs perfectly with vsync on with my GTX 750, but I had to swap it for a Radeon HD 7750 to get early DirectX compatibility...
  • DOSBox: we are lucky that DOSBox Staging recently added VRR (G-Sync/FreeSync) support, but... what if you're using a CRT monitor to get the real experience? Well, then you can define custom resolutions like 640x400@70Hz and enable vsync; standard DOSBox will then have smooth scrolling in games like Crystal Caves (intro scrolling), Prehistorik 2 (map scrolling), etc. But what if your CRT monitor (like my SyncMaster!), or video card, or OS (probably every Windows since XP) for some reason doesn't like that custom resolution? Well, then you can define 640x400@140Hz, but then scrolling won't be 100% perfect. But... then using Scanline Sync with a value of "-20" (as -30 was too high for such a small vertical resolution) will get you that smooth scrolling, without even needing to enable vsync!
  • I know there are many more games with problems (like too-fast in-game timers) tied to framerate, and I think this method will probably fix most of them. I'll keep testing on my build and will add other occurrences to the list.
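For reference, here's the 1000/x rule quoted above in Python form: Quake 3's frame timer works in whole milliseconds, so any com_maxfps value is effectively rounded to an attainable 1000/x rate (a restatement of the quoted rule, not engine source):

    # Quake 3 rounds com_maxfps to a rate of the form 1000/x (x integer).
    def effective_maxfps(requested):
        frame_ms = 1000 // requested or 1   # whole milliseconds per frame
        return 1000 / frame_ms

    for fps in (60, 75, 100, 120, 125):
        print(fps, "->", round(effective_maxfps(fps), 1))
    # 60 -> 62.5, 75 -> 76.9, 100 -> 100.0, 120 -> 125.0, 125 -> 125.0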

And as an alternative, I want to point out that the other way to limit framerate in early DirectX games on XP is the fantastic cnc-ddraw wrapper, which works great on XP and is still being enhanced today, but it has some implications that Scanline Sync doesn't have.

Lastly, I can confirm RTSS 7.3.3 works great on Windows XP, but the final 7.3.6 version was compiled with Visual Studio 2022 and as such is no longer compatible with Windows XP. RTSS 7.3.3 is bundled with MSI Afterburner 4.6.5, the last version working under Windows XP for the same reason, and can be extracted from the setup EXE using, for instance, 7-Zip. RTSS 6.x and older also work on XP, but they lack the OSD customization options, so 7.3.3 is the most feature-complete choice.

Reply 38 of 38, by Joseph_Joestar


Many games have issues when running above 60 FPS. If you want to play a game at high refresh rates, it's best to check its entry on the PC Gaming Wiki beforehand.

I remember running into weird problems with Star Wars KOTOR while playing it at 120 FPS on my CRT monitor. Turns out it was one of those games. Limiting the frame rate via RTSS is often needed to get such games to run correctly.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi