So is double scan just another way of saying progressive native resolution? Or is this a way of making lower resolution games use the full resolution without interlacing? I'm not familiar with how old PC monitors work in regards to resolution.
Not the same thing. Progressive scan means no interlacing: the entire frame is drawn every refresh instead of odd and even rows alternating every other frame. Consoles can do that too, e.g. plenty of PS2 and GameCube games let you toggle between interlaced and progressive output. But they still connected to a TV, with everything that entails.
A VGA monitor's double scan has nothing to do with framerate. It means that when a 320x240 game is displayed in a 640x480 VGA mode, every line is drawn twice, so there are no vertical gaps between pixel rows and each pixel is doubled. Same for 200-line games stretched to 400 lines tall. So when people add emulated scanlines that round off pixel edges to these games, they look nothing like they used to. We always had visible pixels, and yes, they were ugly.
So: 480i is interlaced, 480p is progressive, and double scan is basically what happens to 240p DOS games on 480p monitors; they get double scanned instead of shown at native resolution, so you get two scanlines per source line. What it means in practice is that there are no visible scanline gaps on the screen and it "looks like an LCD".
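If the "two lines per pixel" part is unclear, here's a minimal sketch in Python (purely illustrative, not how any real VGA hardware or game works) of what line doubling amounts to: every source row is emitted twice, so a 240-line image fills 480 output lines with no dark gaps between rows.

```python
# Minimal illustration of "double scan" / line doubling: each source scanline
# is simply drawn twice, so a 240-line frame fills a 480-line mode with no
# empty lines between pixel rows. A sketch only, not real VGA code.
def double_scan(frame):
    """frame: list of rows (each row is a list of pixel values)."""
    out = []
    for row in frame:
        out.append(list(row))  # the source line...
        out.append(list(row))  # ...drawn a second time right below it
    return out

tiny = [[1, 2, 3],
        [4, 5, 6]]             # a 2-line "frame"
print(double_scan(tiny))       # 4 lines out: every source line appears twice
```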
Also, old sprites made from digitized photos (claymation in Doom, Mortal Kombat, Road Rash, Area 51) or from screenshots of 3D renders (Diablo, Fallout, Baldur's Gate, Resident Evil backgrounds) aren't pixel art, and there's zero reason to pretend the conversion artifacts in them are some super complex design meant to take advantage of old displays. That's utter bull; everyone had a different screen. Games were just designed around hardware limitations, as they still are, it's just less obvious now (we still get things like texture pop-in and draw distance).
And nobody thought in individual pixels back then; see actual dev interviews. They knew it wouldn't look the same on every screen, everything was eyeballed. Deliberate design around old screens' defects is largely a modern myth. Dithering to simulate shadows and half-tones predates CRTs, it was used in bloody medieval engravings! On some consoles, the PS1 for example, dithering isn't even intended, it's a hardware side effect, like polygon wobble.
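Since dithering came up: here's a tiny Python sketch of ordered (Bayer) dithering, the half-tone trick mentioned above, just to show it's plain thresholding math and doesn't need any particular display to "work". The 2x2 matrix and normalization are the textbook ones, not taken from any specific game or console.

```python
# Ordered (Bayer) dithering in miniature: approximate a gray level with a
# pattern of on/off pixels by comparing each pixel against a repeating
# threshold matrix. Textbook technique, not tied to any display type.
BAYER_2X2 = [[0, 2],
             [3, 1]]  # classic 2x2 Bayer index matrix

def dither(gray):
    """gray: rows of brightness values in 0..1; returns rows of 0/1 pixels."""
    out = []
    for y, row in enumerate(gray):
        out_row = []
        for x, v in enumerate(row):
            threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) / 4  # 0.125 .. 0.875
            out_row.append(1 if v >= threshold else 0)
        out.append(out_row)
    return out

# A flat 50% gray becomes a checkerboard: half the pixels on, half off.
print(dither([[0.5, 0.5],
              [0.5, 0.5]]))    # -> [[1, 0], [0, 1]]
```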
*** *** ***
Side-track about PC gaming:
Personal computers always had multiple display types and games tried to run on as many of them as possible, so when VGA became the standard and superseded CGA, EGA and whatever else, you would have the majority of games AND monitors running through it (it's that blue connector with the two screws on the sides), and it supported tons of modes and resolutions, including higher ones like SVGA, XGA and even HD.
My 1080p 24-inch widescreen monitor still has a VGA port (in addition to HDMI), and so did my 14-inch curved CRT that only went up to 800x600, and my larger 19-inch flat CRT that supported 1600x1200. Run a PC game on any of those and it runs natively; it already looks how it was intended to look. There's no need to simulate a different display. And none of them show visible scanlines, or the composite/S-Video/SCART input artifacts that shaders often emulate.
E.g. a PC port of Sonic would not blend its waterfalls through the composite output that computers don't use (VGA, DVI, HDMI and DisplayPort are all superior), but it will most likely have built-in smoothing shaders you can toggle, because why the hell not? It's all part of PC gaming (the Sonic CD version on Steam is great!).
To add to it, even before flat liquid crystal displays became mainstream, flat plasma and early LCD screens already existed on computers... ever played Heroes of Might and Magic on a laptop that weighed more kilograms than the FPS it displayed? We did hotseat multiplayer like that at a friend's. Good luck viewing that from any angle that isn't staring directly into the screen. But at home I had a CRT monitor running the same game, which wasn't designed specifically for either of those screens; it was designed to run on everything possible. Most such games had multiple resolutions to choose from. The argument OP was told, that "games had pixel art designed specifically with CRTs in mind", quickly falls apart on IBM-compatible personal computers.
*** *** ***
You can use whatever you want, of course, but when I see people saying there's only one "good and correct" way to display 2D graphics, and then watch them add... CRT shaders... to an actual CRT monitor that doesn't show those effects on its own, I wince. Insert that overused SOTN picture of Dracula's portrait, made in an emulator with modern shaders, captioned "this is how games used to look back in the day", for extra pain.
The latest trend is scanline fetishists unironically telling people that to display some old games "correctly" they need an 8K OLED HDR display. I am not even joking, they're that deluded. Seeing a comment from someone who dreams of a 16K monitor to play Prince of Persia (a 200-line game) "as it was intended" broke me. That game was played on blurry 480p monitors and ancient ghosty plasmas that are older than most of the commenters!
Sorry for the rant, I've just burned out on every emulator message board and YouTube comment section having those cork-sniffing scanline connoisseurs saying "the art needs scanlines to look correct and good" right after claiming that Donkey Kong Country is hand-drawn pixel art (it's not). Here's the short version:
tl;dr: for games on PC, CRT shaders add tons of things that were never there. Anyone claiming it's the "correct" way to play them is LARPing. If you like them, sure, that's the point of computers, tweaking things how you want. But scanline purists aren't even being authentic: they're trying to make everything look like an NTSC Sony Trinitron, which looks nothing like what the majority of TVs looked like, let alone any handheld or PC display... or a PAL TV with a shadow mask (which yes, absolutely can run 30 and 60 FPS NTSC games).
Even for consoles it's questionable. Later PS1s, for example, had a BUILT-IN LCD, yet you still have people claiming that a console with a physical liquid crystal display embedded into it "needs CRT effects". No, just no. It doesn't.