audioguru said:
Nigel Goodwin said:
The relatively small number of wires in the aperture grille of a Trinitron TV CRT restricts the resolution just as a shadow mask does.
Most definitions of Trinitron on the web mention increased sharpness and brightness over a shadow-mask CRT.
But you are also correct: one site says that the highest-resolution (and very expensive) CRTs use a shadow mask.
As you say (and I agree), the Trinitron system tends to give slightly higher brightness than a shadow mask, but at the cost of mechanical stability - try tapping the front of a Trinitron CRT and the picture goes crazy as the thin wires rattle! This also causes the nasty degaussing noise when you turn a Sony TV on!
But both types of CRT are manufactured for their specific use, and TV tubes (of either type) have far lower resolution, and far higher brightness, than monitor CRTs.
Most broadcasters don't use RGB-quality VCRs or cameras.
Broadcasters over here shoot and record everything in high definition, and the difference is clearly visible.
Unfortunately it's recorded in MPEG, so throws much of the quality away :lol:
How come - err, why do movies from the '70s and '80s look so blurry (I think they are film, not old videotape) when The Wizard of Oz, made in colour in 1939, looks so sharp? Maybe the producer invested lots of $$$$$ (sorry, my keyboard doesn't have a font for quid). :lol:
The movies involved probably come off videotape, or a small-format (cheaper!) cine film - 'proper' films use 35mm film, which is where the stills-camera format came from.
Some of the classic British TV series, like 'The Avengers' and 'The Prisoner', still look stunning today, simply because they were made on film; more modern material made on videotape looks rubbish in comparison.
Incidentally, a number of years ago I spent a day at Yorkshire TV Studios, where they filmed 'The Darling Buds Of May' - not a programme I have ever watched, but it was made on film, simply to give better quality!