Re: Blurry picture

From: Hársfalvi Levente <>
Date: Thu, 17 Nov 2011 11:54:29 +0100
Message-ID: <>

On 2011-11-17 00:24, Rhialto wrote:
> Somewhat unrelated, but interesting nonetheless: for several Doctor Who
> episodes from the 1970's that were made in colour, no colour video was
> archived. At best, there are black and white telerecordings: basically a
> film camera pointed at (and (hopefully in phase) synchronised with) a
> black and white monitor.
> Now it turns out that the resolution was so good, that the colour signal
> is visible in the luminance. And some bright people have worked on
> restoring the colour information from that. Unfortunately, the
> information I could find about it is a few years old and it seems no
> more recent developments happened. See .

That's very neat indeed! (I've just read the article. One would think an
attempt like that would need a very high resolution film scanner in the
first place, in order to oversample the image so that adjacent rows and
the geometry could be restored with as little hassle as possible...
It seems they also had to fight the additional obstacle of not having
such a digitizer, which must have made things much more difficult).

Thinking that over, there's something else that affects the picture
quality of the VIC-II, or, better said, of the C64 and probably the C128 too.

(I'm only speaking about PAL machines here; I don't (yet) have an NTSC
model, so I've never seen one in action).

The composite color signal was originally designed so that luma and
chroma affect each other as little as possible, even though the spectra
of the two components overlap. (Better said: chroma is embedded into the
spectrum of luma, with the color subcarrier frequency purposely selected
so that, in the end, the real "overlap" of particular spectral lines is
minimal). In theory, this ensures that even though luma and chroma are
transmitted mixed together in the same 5-6MHz band, they can be
separated with no loss of bandwidth in either component (especially
luma, which is practically image sharpness) and with minimal artifacts
and crosstalk.
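As a quick numeric sketch of that subcarrier choice (my own illustration, not from the original post; the 1135/4 + 1/625 line-rate multiple is the standard PAL-B/G figure):

```python
# Nominal PAL-B/G numbers. Luma energy clusters at integer multiples of
# the line frequency; the fractional line-rate multiple parks the chroma
# subcarrier between those clusters.
f_h = 15625.0                      # PAL line frequency, Hz
f_sc = (1135 / 4 + 1 / 625) * f_h  # colour subcarrier, Hz

print(f"subcarrier = {f_sc:.2f} Hz")   # ~4433618.75 Hz (4.43361875 MHz)

# Fractional offset of the subcarrier from the nearest luma harmonic:
offset = (f_sc % f_h) / f_h
print(f"offset from luma line: {offset:.4f} of the line spacing")
```

The subcarrier lands roughly three quarters of the way between adjacent luma harmonics, which is what lets a comb-like separator pull the two components apart.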

(In practice, the chroma signal results in a constantly "interlacing"
pattern on top of luma, which, with the arrival of better picture tubes,
became a more serious problem, so manufacturers started inventing
techniques to get rid of it, i.e. to filter chroma remnants out of luma
before the signal is displayed. On the downside, this usually results in
a loss of bandwidth for luma, i.e. a blurrier luma image. At least the
TV processors I know of, TDA836* and the like, specify an output (RGB)
bandwidth of only 3.5MHz for PAL composite input, and even less, 2.8MHz,
for NTSC).

Even when fully conforming to the original specification, there are
special cases where the formula, err, fails seriously. The typical
example is the man in the suit :-). Some suits are made of fabric with
sharp, regular, frequently alternating patterns. Recording and
broadcasting those defeats the whole idea of artifact-free luma/chroma
separation, and the result is massive color moire and artifacts over
those surfaces.

Our computers are more special: first, conformance to the original spec
is generally a "don't care" bit as long as the construction produces
colors, and second, the pattern "transmitted" by the machine is regular
(because it's basically a matrix of pixels, replayed at a constant
sampling frequency). That alone would seriously call for keeping luma
and chroma apart in the first place, if composite video is to be
generated - i.e. keeping the top of luma's band below the subcarrier
frequency minus the bandwidth of the color signal, in order to avoid
artifacts.
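Back-of-envelope, where that luma roll-off would have to sit (my own illustration; the ~1.3 MHz chroma half-bandwidth is the usual PAL figure, assumed here):

```python
# Assumed: chroma occupies roughly +/-1.3 MHz around the subcarrier.
f_sc = 4.43361875        # nominal PAL colour subcarrier, MHz
chroma_half_bw = 1.3     # MHz, lower sideband of the chroma signal
luma_top = f_sc - chroma_half_bw
print(f"luma should roll off below ~{luma_top:.2f} MHz")  # ~3.13 MHz
```

That lines up with the ~3.5MHz RGB output bandwidth the TDA836* figure above suggests for PAL.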

The C64 design has some aspects that violate those rules. First, luma is
AFAIK not filtered before mixing with chroma. Second, the pixel clock of
the C64 is quite high, and is also special.

People who use C64s with decent CRT displays and either an RF or
composite connection have long noticed the color artifacts that
generally appear around sharp luma edges. This is a consequence of the
above - more specifically, of luma spilling over into the color signal's
band at sharp luma transitions (which are then decoded as color
transitions by the display).

Also, if someone creates a pattern of black and white stripes on the
screen, one pixel wide each, he'll see a color gradient on top of the
stripes on composite displays... The explanation: the pixel clock of the
PAL C64 is 16/9 of the color subcarrier frequency. A series of
alternating black and white pixels is therefore a square wave whose
fundamental frequency is half that, in other words 8/9 of the color
subcarrier frequency - very close to, but not equal to, the subcarrier.
This signal will definitely be caught by the chroma separator in the
display and get displayed as color. As the signal constantly shifts in
phase relative to the PAL burst (since it is "slower" than PAL's nominal
subcarrier frequency), the result is a gradient of constantly changing
color. From the ratio, we can also conclude that the gradient repeats
every 16 pixels.
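The arithmetic above can be checked directly (nominal PAL values; my own sketch, everything follows from the 16/9 ratio):

```python
# Nominal PAL subcarrier and the PAL C64's 16/9 pixel clock ratio.
f_sc = 4433618.75             # PAL colour subcarrier, Hz
f_pixel = 16 / 9 * f_sc       # PAL C64 pixel clock, ~7.88 MHz
f_stripe = f_pixel / 2        # 1-pixel B/W stripes: fundamental at
                              # half the pixel clock = 8/9 * f_sc
f_beat = f_sc - f_stripe      # how fast the phase drifts vs. the burst
period = f_pixel / f_beat     # beat period expressed in pixels

print(f"pixel clock: {f_pixel / 1e6:.4f} MHz")
print(f"stripe fundamental: {f_stripe / 1e6:.4f} MHz")
print(f"gradient repeats every {period:.1f} pixels")  # 16.0
```

The beat frequency works out to exactly f_sc/9, so one full cycle of the colour drift spans (16/9 f_sc) / (f_sc/9) = 16 pixels, matching the claim.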

...Now, imagine what that'd look like if the PAL reference phase were
unknown, or constantly shifting... :-DDD

Best regards,


       Message was sent through the cbm-hackers mailing list
Received on 2011-11-17 11:00:03

Archive generated by hypermail 2.2.0.