Re: Open hardware AV to digital conversion

From: silverdr_at_wfmh.org.pl
Date: Thu, 08 Jan 2015 18:11:54 +0100
Message-ID: <54AEBA5A.5050404@wfmh.org.pl>
On 2015-01-05 21:32, Ingo Korb wrote:

>> One of the reasons why I want to give /correct/ 576i on the output
>> rather than the 288p, which (too) many devices don't understand.
>
> If you do that, you can be sure to get issues due to deinterlacing like
> the example in that GIF image - so don't and instead convert the signal
> to 480p/576p which should be acceptable to almost any display that is
> currently on the market.
>
> By the way: 480i/576i over HDMI can be problematic too, for example I
> have one combination of scaler and TV here where a 480i HDMI signal from
> the scaler is incorrectly displayed on the TV (the two fields are offset
> by a dozen lines).
>
>> Good. There are devices that understand this type of signal.
>
> Yes - although unfortunately the XRGB Mini is a bit of a "boutique"
> device with a rather high price tag. Its advantage however is that it
> was specifically designed to handle "weird" video signals from old
> computers/consoles/arcade boards and thus treats them properly.

Which is very much what we need, but its price tag is one of the main 
reasons I don't expect every one of us (or the population of casual 
"retro gamers") to have one lying around. Even people known for 
spending inordinate amounts of money on this hobby (like the 
undersigned) will think twice before parting with $400+. And there is 
also a problem that many good devices of this category suffer from: 
perceptible lag. In the worst case it causes the video to drift out of 
sync with the audio.


>> One of the reasons why I want to give /correct/ 576i on the output
>> rather than the 288p, which (too) many devices don't understand.
>
> If you do that, you can be sure to get issues due to deinterlacing like
> the example in that GIF image - so don't and instead convert the signal
> to 480p/576p which should be acceptable to almost any display that is
> currently on the market.
>
> [...]
>
> I really don't see any problem with line-doubling the incoming signal to
> completely avoid any deinterlacing artifacts in this case, "modern
> display" implies HD support these days which also covers
> 480p/576p. You will probably want to output component video anyway

That's the problem. In the first stage I want the 64 to remain a 64, 
meaning it can still be connected to everything it could before, 
/plus/ devices it couldn't, over the same pins/cables. This means the 
thing has to go inside, preferably as a "riser" for the VIC, delivering 
a proper composite/s-video signal on the same output lines.

> if you insist on an analog output

I do, because I want it to remain 100% compatible with what was there 
before.

> because TVs with analog RGB support are not that common in the US

and I don't have RGB at this stage anyway. I have (sampled) 
luma/chroma, and if I scan-double this, I bet (I haven't tried) that 
just as many devices - or even more - will reject 576p on the 
composite/s-video lines as reject the non-interlaced 288p.

> (except for VGA, but that is on its way out
> and usually does not support 480i/576i anyway).

VGA is definitely passé; I don't even consider it anymore. The plan is 
composite/s-video for compatibility and, later on, HDMI (which stands a 
chance of still being around for a few years).


> You should consider using an HDMI output as soon as possible

I agree.

>> Again - I exactly don't want the errors, which I know too well from too
>> many devices not working properly with the VIC's original output.
>
> The error demonstrated in the two GIF images cannot be fixed by forcing
> the video signal into an interlaced timing - instead you will introduce
> these errors on devices that can properly handle the original signal.

No - once I have the proper signal 
(afiltered -> digitised -> dfiltered -> interlaced), cutting the last 
element off this chain is a matter of toggling one GPIO bit. As I 
wrote, I want full compatibility, on the same socket/pins as the 
original. So if you have a device that works fine - there you go, no 
change. If you have one that doesn't - flip a bit and you get a signal 
that no sane device has any excuse to reject.
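To illustrate the idea, here is a toy model of the chain described above (afiltered -> digitised -> dfiltered -> interlaced); the stage functions are placeholders I made up, the point being only that the final stage is gated by a single flag, mirroring the one GPIO bit:

```python
# Toy model of the processing chain; every stage function here is a
# hypothetical placeholder - only the gating of the last stage matters.

def analog_filter(lines):    return lines                    # placeholder
def digitize(lines):         return [int(v) for v in lines]  # placeholder
def digital_filter(lines):   return lines                    # placeholder

def interlace(lines):
    """Split one progressive frame into odd then even lines (toy version)."""
    return lines[0::2] + lines[1::2]

def process(lines, interlace_bit):
    out = digital_filter(digitize(analog_filter(lines)))
    # one bit decides between 576i output and the original 288p chain
    return interlace(out) if interlace_bit else out

assert process([1, 2, 3, 4], False) == [1, 2, 3, 4]  # bit off: passthrough
assert process([1, 2, 3, 4], True) == [1, 3, 2, 4]   # bit on: fields split
```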

>> What I want is to make it norm compliant while enhancing/filtering on
>> the way.
>
> Which reminds me: How do you propose to shift the signal timing enough
> that you can add[1] half a line to change the timing from 240p to 480i?

The same way the "regular" devices do. There's plenty of time in 
VBLANK/the porches where there is no video. I count the fields, and for 
odd fields I delay each line (outside the VBLANK area) by however much 
the processing requires; for the lines constituting even fields I add a 
properly timed half-line period on top of that. At the flip of a bit I 
skip the extra delay for the even fields.
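A minimal sketch of that per-field timing, assuming nominal PAL numbers (64 µs line period, so a 32 µs half-line) and a made-up fixed pipeline delay:

```python
# Sketch of the field-delay scheme described above. PAL numbers are
# nominal; PROCESSING_DELAY_US is a hypothetical fixed pipeline latency.

LINE_PERIOD_US = 64.0             # nominal PAL line period
HALF_LINE_US = LINE_PERIOD_US / 2 # the extra half-line that makes 576i
PROCESSING_DELAY_US = 10.0        # assumed processing latency (made up)

def line_delay_us(field_is_even: bool, interlace_bit: bool) -> float:
    """Delay applied to each line outside VBLANK, in microseconds."""
    delay = PROCESSING_DELAY_US
    if interlace_bit and field_is_even:
        delay += HALF_LINE_US     # offset even fields by half a line
    return delay

# Bit off: both fields get the same delay -> effectively 288p again.
assert line_delay_us(True, False) == line_delay_us(False, False)
# Bit on: even fields lag odd fields by exactly half a line.
assert line_delay_us(True, True) - line_delay_us(False, True) == 32.0
```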

> A fully asynchronous output that is not synched to the input signal
> would of course be completely unacceptable

Full ACK.

> - that would introduce even
> worse video errors than the unnecessary deinterlacing that I'm arguing
> against.

Right. No, that's not an option. I also want it to be line-based 
because I want to introduce /minimal/ lag - lag is what plagues the 
experience. In the worst case an upscaler working on a full-frame 
buffer introduces its own lag, then the processed signal goes to a 
display device that does its own processing, and in the end there is, 
for example, a perceptible lack of sync between video and audio. Not to 
mention joystick/game responsiveness, to name another good example.
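The rough arithmetic behind that argument, again with nominal PAL numbers (the per-stage buffering figures are assumptions, not measurements of any particular device):

```python
# Back-of-the-envelope lag comparison: a line-based pipeline holds one
# scanline; a frame-based upscaler holds at least one full frame, and
# the display may buffer another frame on top. Figures are assumptions.

LINE_US = 64.0    # nominal PAL line period
FRAME_MS = 20.0   # one PAL frame at 50 Hz

line_based_lag_ms = LINE_US / 1000.0           # ~0.064 ms
frame_based_lag_ms = FRAME_MS                  # >= 20 ms in the upscaler
worst_case_lag_ms = frame_based_lag_ms + FRAME_MS  # upscaler + TV buffer

assert line_based_lag_ms < 0.1   # well below anything perceptible
assert worst_case_lag_ms >= 40.0 # two buffered frames: clearly noticeable
```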

>> I had very similar discussion with Gerrit - he also didn't like the idea
>> but I believe that this is because of lack of mutual understanding.
>
> I'm pretty sure that I understand what you are planning, the part that I
> don't get is why you want to do it in a way that will with 100%
> certainty lead to a bad result when you could instead build something
> that will improve the situation even for people whose devices produce
> a working-but-not-great picture.

I think it's because I can't output 576p over the composite/s-video 
pins of the 64, can I? And I can't expect devices to swallow that kind 
of signal. If we skip VGA, I'd have to go purely digital, which would 
mean no "cheap" internal variant to which an HDMI module could be added 
later on - and that would mean either dropping the analogue output or 
building an external device. Such devices are already available (for 
$400+).

-- 
SD!

       Message was sent through the cbm-hackers mailing list
Received on 2015-01-08 18:00:03

Archive generated by hypermail 2.2.0.