Re: Open hardware AV to digital conversion

From: Ingo Korb <ml_at_akana.de>
Date: Mon, 05 Jan 2015 21:32:25 +0100
Message-ID: <u4ms5dl46.fsf@dragon.akana.de>
silverdr@wfmh.org.pl writes:

> It's not a problem of this kind. It's a problem that more and more
> devices plainly fail syncing to the original signal and producing /any/
> usable output.

I know. I have at least one device that doesn't like the C64's signal
because it gets detected as SECAM.

>> Signal incorrectly treated as 576i and deinterlaced (GBS-8220):
>> http://snowcat.de/atari/desktop-gbs.gif
>
> One of the reasons why I want to give /correct/ 576i on the output
> rather than the 288p, which (too) many devices don't understand.

If you do that, you can be sure to run into deinterlacing issues like
the one in that GIF image - so don't. Instead, convert the signal to
480p/576p, which should be acceptable to almost any display currently
on the market.
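
To make it concrete, line-doubling really is as simple as it sounds -
something along these lines (just a minimal C sketch; the buffer
layout, the names and the 8-bit-per-pixel assumption are made up for
illustration, a real converter obviously works on whatever format the
digitizer delivers):

    /* Write every line of a 288p source frame twice into a 576p
     * output frame. Packed 8-bit pixels, WIDTH pixels per line -
     * purely an assumption for this example. */
    #include <stdint.h>
    #include <string.h>

    #define WIDTH     720
    #define SRC_LINES 288

    void line_double(const uint8_t *src, uint8_t *dst)
    {
        for (int y = 0; y < SRC_LINES; y++) {
            const uint8_t *line = src + (size_t)y * WIDTH;
            memcpy(dst + (size_t)(2 * y)     * WIDTH, line, WIDTH);
            memcpy(dst + (size_t)(2 * y + 1) * WIDTH, line, WIDTH);
        }
    }

No field matching, no motion detection - each progressive input frame
maps to exactly one progressive output frame, so there is nothing that
could produce combing artifacts.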

By the way, 480i/576i over HDMI can be problematic too. For example, I
have one combination of scaler and TV here where a 480i HDMI signal from
the scaler is displayed incorrectly on the TV (the two fields are offset
by a dozen lines).

> Good. There are devices that understand this type of signal.

Yes - although unfortunately the XRGB Mini is a bit of a "boutique"
device with a rather high price tag. Its advantage, however, is that it
was specifically designed to handle "weird" video signals from old
computers/consoles/arcade boards and thus treats them properly.

> I have a few broadcast studio devices that understand this type of
> signal too but more of them don't.

Actually I would expect exactly that because I don't think that the
engineers of these devices ever expected them to be fed with such
non-standards-compliant signals.

> Some just show the virtual finger and say there is no (proper)
> signal. The same about current consumer displays. It is becoming not
> even hit-and-miss. It is becoming more and more miss-and-miss.

Some devices are weirder than others - my Dell U2410 monitor accepts
240p, 480i and 576i over VGA, but not 288p. Others fail only in certain
modes: a DVDO Edge Green (video scaler) accepts 240p fine as long as
you don't turn on Game Mode, which in theory should only switch to a
faster deinterlacing algorithm - one that wouldn't even be used for a
progressive signal anyway.

> The project is supposed to secure the ability to see something on modern
> displays. First either through converters/upscalers or through the
> typical (composite/s-video) inputs.

I really don't see any problem with line-doubling the incoming signal to
completely avoid any deinterlacing artifacts in this case; "modern
display" implies HD support these days, which also covers
480p/576p. You will probably want to output component video anyway if
you insist on an analog output, because TVs with analog RGB support are
not that common in the US (except for VGA, but that is on its way out
and usually does not support 480i/576i anyway).

> Second (once the first is done and the signal is already also
> available in the digital domain) through providing a digital output
> for modern displays.

You should consider using an HDMI output as soon as possible - in my
experience the HDMI input on a TV or scaler seems to care a lot more
about "sane" signal timings than an analog input, probably because for
HDMI you have to get the blanking signal right too.
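
As a rough idea of what "getting the blanking right" means in practice,
this is the nominal 576p50 timing as defined for HDMI/CEA-861 - numbers
quoted from memory, so verify them against the spec before building
anything around them:

    /* Nominal 720x576p @ 50 Hz timing (CEA-861), from memory -
     * please double-check. 27 MHz pixel clock, 864 total pixels
     * per line, 625 total lines per frame. */
    struct video_timing {
        int pixclk_khz;
        int h_active, h_front, h_sync, h_back;  /* 720+12+64+68 = 864 */
        int v_active, v_front, v_sync, v_back;  /* 576+ 5+ 5+39 = 625 */
    };

    static const struct video_timing timing_576p50 = {
        27000,
        720, 12, 64, 68,
        576,  5,  5, 39,
    };

An analog input will usually tolerate some slop in those blanking
intervals; an HDMI sink is far more likely to simply reject the signal.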

> Again - I exactly don't want the errors, which I know too well from too
> many devices not working properly with the VIC's original output.

The error demonstrated in the two GIF images cannot be fixed by forcing
the video signal into an interlaced timing - instead you will introduce
these errors on devices that can properly handle the original signal.

> What I want is to make it norm compliant while enhancing/filtering on
> the way.

Which reminds me: How do you propose to shift the signal timing enough
that you can add[1] half a line to change the timing from 240p to 480i?
A fully asynchronous output that is not synced to the input signal
would of course be completely unacceptable - that would introduce even
worse video errors than the unnecessary deinterlacing that I'm arguing
against.

[1] Or remove? I think the signal was missing the half-line at the
boundary between the two fields, but the problem is the same either way.
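
Just to put numbers on that half line (PAL case, using the nominal
15625 Hz line rate for simplicity - the VIC's actual line frequency is
slightly off, but that doesn't change the argument):

    /* Back-of-the-envelope comparison of the VIC's 312-line
     * progressive frame against a proper 312.5-line 576i field. */
    #include <stdio.h>

    int main(void)
    {
        const double line_freq   = 15625.0; /* Hz, nominal PAL line rate */
        const double src_lines   = 312.0;   /* VIC progressive frame     */
        const double field_lines = 312.5;   /* one 576i field            */

        printf("source frame rate:  %.3f Hz\n", line_freq / src_lines);
        printf("576i field rate:    %.3f Hz\n", line_freq / field_lines);
        printf("mismatch per field: %.1f lines\n", field_lines - src_lines);
        return 0;
    }

So the source is half a line short of a proper field every field, and
that half line has to be inserted (or dropped) somewhere if the output
is to stay locked to the input.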

> I had very similar discussion with Gerrit - he also didn't like the idea
> but I believe that this is because of lack of mutual understanding.

I'm pretty sure that I understand what you are planning; the part that I
don't get is why you want to do it in a way that will, with 100%
certainty, lead to a bad result, when you could instead build something
that improves the situation even for people whose devices currently
produce a working-but-not-great picture.

-ik

       Message was sent through the cbm-hackers mailing list