Re: ROM Dump of Amiga Keyboard controller

From: silverdr_at_wfmh.org.pl
Date: Fri, 18 Jul 2014 21:11:20 +0200
Message-ID: <etPan.53c97158.66334873.1855@szaman.lan>
On 2014-07-18 at 20:41:57, Gerrit Heitsch (gerrit@laosinh.s.bawue.de) wrote:

> > Gerrit, could you be so kind as to return for a moment to the clock phase subject and elaborate
> > a little on what it was that we were worried about?
>  
> Well, the thing is, the 6502 has no 1 Byte / 1 Cycle opcode. Even 1 byte
> opcodes take at least 2 cycles.
>  
> Now, the problem is, when you start feeding bytes to the 6502 via Port
> C, you have to find a way to sync your supplied commands to the
> execution in the 6502.
>  
> You can't do that with a string of $EA (NOP) since that takes 2 cycles
> and you can run into the following:
>  
> EA EA EA EA EA EA EA A9 00
>  
> ex -- ex -- ex -- ex -- ex
>  
> ex = executed
> -- = Byte loaded as Operand or dummy cycle.
>  
> So you never reach the A9 and load the Accu with #$00.
>  
> To remedy that you need a command sequence that makes sure a certain
> Byte WILL be executed as a command, from then on you know the internal
> state of the CPU and it becomes easy. That's what I supplied with the
> BIT command. Since with a long enough sequence of NOPs, there are only 2
> alternatives, it will be either the first or the second:
>  
> EA EA EA EA 24 EA EA A9 00
>  
> ex -- ex -- ex -- -- ex --
>  
> -- ex -- ex -- ex -- ex --
>  
>  
> Either way, A9 will be executed as a command and the accu will be
> loaded with #$00.

Yes, this part was clear. Still, thanks for such a well-presented explanation once more. If you don't mind, I would like to use parts of it? It was the part about the clock phase that was not so clear.
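
As a quick cross-check of the alignment argument above, here is a minimal sketch (plain Python, not real hardware behaviour, and the helper names are just made up for illustration). It only counts how many byte slots each fed instruction consumes (2 for NOP, 3 for BIT zero page, 2 for LDA #) and checks, for both possible starting phases, whether the A9 byte ends up being fetched as an opcode:

# Byte slots (cycles) consumed per opcode we feed in
# (assumption: only these three appear in the stream).
CYCLES = {0xEA: 2,   # NOP
          0x24: 3,   # BIT zeropage
          0xA9: 2}   # LDA #imm

def executed_opcodes(stream, start):
    """Offsets in the fed byte stream that get fetched as opcodes,
    assuming execution is currently aligned to offset `start` (0 or 1)."""
    hits = []
    i = start
    while i < len(stream):
        hits.append(i)
        i += CYCLES.get(stream[i], 2)   # opcode byte + operand/dummy slots
    return hits

nop_only = [0xEA] * 7 + [0xA9, 0x00]
with_bit = [0xEA] * 4 + [0x24] + [0xEA] * 2 + [0xA9, 0x00]

for name, stream in (("NOP only", nop_only), ("with BIT", with_bit)):
    for phase in (0, 1):
        hits = executed_opcodes(stream, phase)
        lda = any(stream[i] == 0xA9 for i in hits)
        print(name, "phase", phase, "-> A9 executed as opcode:", lda)

For the NOP-only stream, one of the two phases never reaches the A9, while with the $24 inserted both phases converge on it, which is exactly the pair of alternatives shown above.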

> The other thing I remember was that the 6500/1 does divide the
> externally supplied clock by two. But the longer I think about it, the
> less I see that as a problem: as long as you feed the bytes at the
> right speed (meaning half the clock speed), the clock phase is of no
> consequence, as long as the data read by the CPU is stable the moment
> it's sampled.

The way I understood it is that this internal clock, which is the one actually dictating the CPU's actions (fetching opcodes/data), might be oddly phased relative to the supplied one. This could theoretically lead to a situation where we /think/ we supply the data to the port with the correct timing, but in reality it is improperly timed, too late for example?

I remember there were worries about it, but eventually it seemed to be... irrelevant? Or we (Jim) were lucky and hit it correctly. Also because the phase difference can't amount to a very large fraction of a cycle?
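
For my own peace of mind I made a toy model of that worry (again plain Python, not taken from any 6500/1 datasheet timing figure; it ignores analogue setup/hold and only models the divide-by-two sampling). The CPU is assumed to latch the port on every second external clock cycle, but we don't know whether on the even or the odd ones. If each supplied byte is held stable for two external cycles, both phases latch the same byte sequence; if it were held for only one, the "wrong" phase would see a completely different stream:

def port_levels(byte_stream, hold_cycles):
    """Level seen on the port during each external clock cycle,
    holding each supplied byte for `hold_cycles` cycles."""
    return [b for b in byte_stream for _ in range(hold_cycles)]

def sampled(levels, phase):
    """Bytes latched if the CPU samples every 2nd external cycle,
    starting at cycle `phase` (0 or 1)."""
    return levels[phase::2]

stream = [0xEA, 0xEA, 0x24, 0xEA, 0xEA, 0xA9, 0x00]

for hold in (2, 1):
    lv = port_levels(stream, hold)
    print("hold", hold, "cycle(s):",
          "phase 0 ->", [hex(b) for b in sampled(lv, 0)],
          "| phase 1 ->", [hex(b) for b in sampled(lv, 1)])

Which seems to support the conclusion above: as long as we feed at half the clock speed and the data is stable around the (unknown) sampling edge, the phase itself shouldn't matter.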

--  
SD!
