There's definitely an argument to be made that this is actually superior. Developers expected their games to be seen on consumer CRTs of questionable quality through something as poor as an RF connection, so a lot of the art was designed around that. The imperfections of a CRT were often used in creative ways.
*Me, who grew up with decent CRTs that I still own, because my parents knew that investing in something decent meant it would last longer and save money in the long run.*
The heck are you talking about?
But yeah, you're not wrong. Just look at Dracula's eyes in Castlevania: Symphony of the Night. A clear signal gives you a red dot; an RF signal gives you an actual eye.
But I grew up with decent monitors and a decent television, as well as a Windows 95 computer and later a Windows 98 one, both of which could output 1024x768 with ease (mind you, at that resolution you couldn't read anything on the desktop because the text was too small), so my nostalgia has me wanting to push that graphical quality to the absolute limit. (Also, for the record, I pretty much grew up just above the poverty line. My parents just knew how to save their money and spend it wisely.)
As such, I very much dislike this whole "nearest-neighbour-only" thing that a lot of low-poly indie games insist on forcing upon us. Give me bilinear filtering! Some of us grew up with a computer, not a PS1! I want indistinguishable smears, not chunky blocks!
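For anyone who hasn't poked at this stuff: the whole blocks-versus-smears difference usually comes down to a single texture sampler setting, which is partly why it's so annoying when a game doesn't expose it as an option. A minimal sketch, assuming a plain OpenGL renderer with a 2D texture already bound (the wrapper function names here are just illustrative):

```c
#include <GL/gl.h>

/* Nearest-neighbour: each screen pixel takes the single closest texel,
   which gives the hard, chunky-block look. */
void use_nearest_filtering(void)
{
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
}

/* Bilinear: each screen pixel blends the four nearest texels,
   smearing edges instead of stair-stepping them. */
void use_bilinear_filtering(void)
{
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}
```

Other APIs and engines have the exact same toggle under a different name, so offering both really shouldn't be a big ask.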
But that's just me.
Although when it comes down to really low-fidelity games, like those on the Atari 2600, I make sure to put on as many image-ruining filters as possible.
There's nothing to look at already, so I figure that by adding crap like scanlines and ghosting, I'll be giving myself something to look at. :D
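In case anyone wants to ruin their own image the same way: a basic scanline filter is about as simple as post-processing gets, just darkening every other row of the framebuffer (real CRT shaders pile on phosphor masks, bloom, curvature and so on). A rough sketch in plain C, assuming a 32-bit RGBA buffer; the function and parameter names are made up for illustration:

```c
#include <stdint.h>
#include <stddef.h>

/* Darken every odd row of an RGBA8888 framebuffer to fake scanlines.
   'darken' is a 0..1 multiplier, e.g. 0.6 for a fairly strong effect. */
void apply_scanlines(uint8_t *pixels, size_t width, size_t height, float darken)
{
    for (size_t y = 1; y < height; y += 2) {        /* odd rows only */
        uint8_t *row = pixels + y * width * 4;
        for (size_t x = 0; x < width; ++x) {
            row[x * 4 + 0] = (uint8_t)(row[x * 4 + 0] * darken); /* R */
            row[x * 4 + 1] = (uint8_t)(row[x * 4 + 1] * darken); /* G */
            row[x * 4 + 2] = (uint8_t)(row[x * 4 + 2] * darken); /* B */
            /* alpha (index 3) left untouched */
        }
    }
}
```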