How did Parallel ATA/IDE Independent Device Timing work electrically?


Back in the days of Parallel ATA (a.k.a. IDE) it was possible to connect two drives with a single cable. The ATA standard defines a number of modes with different speeds which the drives and the controller use to communicate over the ATA cable. Controllers could support a feature called "independent device timing" to allow each drive to communicate at its fastest speed, instead of both drives being slowed down to the speed of the slowest drive. How did that work at the electrical and protocol level?



Naively, I would expect the slower drive to get confused by the higher-clock-speed data transfers directed at the faster drive: if it didn't lock up outright, it would at least be unable to tell when the fast drive's transfer had finished, and hence when to expect the next transaction directed at itself. But apparently that didn't happen, so how was independent device timing implemented?
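
To make concrete what I mean, here is a rough sketch in C of the difference between shared and independent timing, as I understand the feature. It assumes a controller that can program a separate timing register per device; the mode names and approximate cycle times are from the ATA mode tables, the rest is invented for illustration.

    #include <stdio.h>

    /* Approximate PATA transfer-mode cycle times in nanoseconds
       (longer cycle = slower transfer). */
    enum { PIO0 = 600, PIO2 = 240, PIO4 = 120, UDMA2 = 60, UDMA5 = 20 };

    int main(void)
    {
        int master = UDMA5;  /* fast drive's best mode */
        int slave  = PIO2;   /* slow drive on the same cable */

        /* Without independent device timing the controller must use one
           timing for the whole cable: the slower of the two. */
        int shared = master > slave ? master : slave;

        /* With independent device timing it reprograms its timing each
           time it switches between the two devices. */
        printf("shared timing: both drives at %d ns cycles\n", shared);
        printf("independent:   master %d ns, slave %d ns\n", master, slave);
        return 0;
    }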



Edit: to clarify, I'm talking about the ATA interface transmission speed, not the drive's own read/write speed.


hardware ide

asked 6 hours ago by JanKanis

  • Are you sure this is a thing at all? Do you have a source for your claim about slowing down, clocks, or whatever?
    – Raffzahn
    6 hours ago






  • The main reference on the internet for independent device timing seems to be pcguide.com/ref/hdd/if/ide/confTiming-c.html. Higher UDMA modes use shorter cycle times (higher frequencies), so somehow devices that only support a lower UDMA mode are not confused by higher-mode UDMA transfers to the other drive on the cable. If you feed any random digital device data at a speed it was not designed for, it is not going to work, but somehow it does work for IDE drives.
    – JanKanis
    6 hours ago










  • Could it be that you're missing the fact that IDE is not a drive interface, but a subset of the AT ISA bus? There is no direct interaction with a drive, just simple I/O operations on the ISA bus. UDMA is just a speed-up of the memory cycle used over this interface (after it was detached from the original bus interface by later development). These are two different issues.
    – Raffzahn
    6 hours ago










  • I'm aware of that. It's the speed of the bus I'm talking about. The different UDMA modes define different speeds of the bus, and somehow the controller is able to use a higher speed with a device supporting it, without breaking another attached device that doesn't support that speed.
    – JanKanis
    6 hours ago






  • There are several ways this could work, but I can't find any definitive answer as to which, if any, is correct. Maybe the control messages are sent at the slower speed and only the DMA transfer runs at the faster speed, so the slower device knows how long to ignore the bus. Maybe the standard requires devices to tolerate higher-speed signals they can't interpret. Or maybe it's something else entirely.
    – JanKanis
    6 hours ago
1 Answer

I think I found the main part of the answer by searching through a standards document.



IDE connectors have some dedicated and re-purposed signal lines specifically for DMA. Some of these stay asserted for as long as a DMA transfer is in progress, allowing the other device to see when the DMA has finished and thus when the host may send new commands. [1] I saw somewhere that DMA cannot be used at all if the other device on the cable does not support at least some DMA mode (so that it understands the meaning of the control signals), but I cannot find that reference again.
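
As a sketch of that rule from the non-selected device's point of view: the host holds DMACK- asserted for the duration of a burst, and the other device simply decodes nothing while it is asserted. (DMACK- is the real ATA acknowledge line; the polling loop and helpers below are assumptions, purely for illustration.)

    #include <stdbool.h>
    #include <stdio.h>

    /* Hypothetical stand-ins for pin sampling and register decoding. */
    static bool dmack_asserted = true;   /* pretend a burst is running */
    static bool read_dmack(void) { return dmack_asserted; }
    static void decode_register_cycle(void) { puts("decoding cycle"); }

    /* While DMACK- is asserted, a DMA burst to the other device is in
       progress; its strobes may be far outside this device's supported
       timing, so they are not decoded at all. */
    static void idle_device_poll(void)
    {
        if (read_dmack())
            return;                      /* bus busy with the other device */
        decode_register_cycle();         /* safe to listen again */
    }

    int main(void)
    {
        idle_device_poll();              /* ignored: burst in progress */
        dmack_asserted = false;          /* host releases DMACK- */
        idle_device_poll();              /* now decodes the cycle */
        return 0;
    }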



For sending commands there are similar signal lines that indicate whether a command is in progress or done. When writing a command, dedicated signal lines indicate the target device and register. Writing a command is limited to the speed of the slowest device; writing data can be done at the target device's speed, as long as the device-selection and register signal lines don't change. [2]



[1]: The ATA standard ("Information Technology - AT Attachment with Packet Interface – 7, Volume 2"), of which I found a draft online; section 8: Parallel interface signal assignments and descriptions, 8.1, 8.2 (pp. 61-63).



[2]: idem, section 12.2: transfer timing. Table "Register transfer to/from device", note 4 (p. 148):




Mode shall be selected no higher than the highest mode supported by the slowest device.




Table "PIO data transfer to/from device" note 4 (p150):




Mode may be selected at the highest mode for the device if CS(1:0) and DA(2:0) do not change between read or write cycles or selected at the highest mode supported by the slowest device if CS(1:0) or DA(2:0) do change between read or write cycles.
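
Taken together, the two notes amount to a simple host-side selection rule, sketched below. The struct and function names are made up; only the min/target-mode logic comes from the notes above.

    #include <stdio.h>

    typedef struct {
        int reg_mode;   /* highest register-transfer mode supported */
        int pio_mode;   /* highest PIO data mode supported */
    } ata_device;

    static int min_int(int a, int b) { return a < b ? a : b; }

    /* Register transfers: never faster than the slowest device on the
       cable, since both devices must be able to track them. */
    static int register_transfer_mode(ata_device d0, ata_device d1)
    {
        return min_int(d0.reg_mode, d1.reg_mode);
    }

    /* PIO data transfers: the selected device's own best mode is allowed
       as long as CS(1:0) and DA(2:0) stay constant, i.e. the host keeps
       hitting the same register of the same device. */
    static int pio_data_mode(ata_device target, ata_device other,
                             int cs_da_stable)
    {
        return cs_da_stable ? target.pio_mode
                            : min_int(target.pio_mode, other.pio_mode);
    }

    int main(void)
    {
        ata_device fast = { 4, 4 };  /* e.g. supports up to PIO mode 4 */
        ata_device slow = { 2, 2 };  /* e.g. supports up to PIO mode 2 */
        printf("register cycles: mode %d\n",
               register_transfer_mode(fast, slow));
        printf("data to fast drive: mode %d\n",
               pio_data_mode(fast, slow, 1));
        return 0;
    }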







answered 3 hours ago by JanKanis