Abstract
The emergence of satellite communication links offers the data communication designer increased throughput, but at the same time forces the designer to deal with greatly increased signal transit time, which may eliminate any potential throughput gain. Using a mathematical model, this paper compares the batch throughput efficiency of two IBM line control methods, Binary Synchronous Communications (BSC) and Synchronous Data Link Control (SDLC), along with some hypothetical extensions to SDLC. Although the model treats links of any description, the focus here is on links with long signal transit time and low bit error rate, such as satellite links. Classic analog terrestrial links are included for comparison.

The model consists of an information source with an infinite supply of data transmitting to an information sink that transmits no data of its own but acknowledges received frames as defined in the various protocols. Throughput efficiency is defined as the ratio of the time spent by the data source transmitting original information bytes (i.e., bytes in the SDLC or BSC Information Field, excluding bytes retransmitted because of errors) to the time spent transmitting all bytes (i.e., information, control, and error recovery) plus the time taken by any gaps in continuous transmission.

The comparison applies to steady-state batch transmission, so the link is assumed to be point-to-point. Both full duplex and half duplex protocols are considered for SDLC, but only half duplex protocols are available in BSC. The full duplex cases include SDLC Normal Response Mode (NRM) and a hypothetical extension to SDLC, Asynchronous Response Mode (ARM). The model shows no difference in throughput efficiency between half duplex NRM and ARM, so only one result is presented for half duplex SDLC. Both SDLC and BSC are data link control architectures; for details concerning a particular implementation, the reader should consult the product description.
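The efficiency ratio defined above can be sketched as a small function. This is a minimal illustration, not the paper's model: the parameter names are invented here, and retransmitted bytes and idle gaps are simply supplied as inputs rather than derived from a protocol.

```python
def throughput_efficiency(info_bytes, overhead_bytes, retransmitted_bytes,
                          line_rate_bps, idle_time_s):
    """Ratio of time spent sending original information bytes to total time.

    Total time covers all bytes sent (information, control, and error
    recovery) plus any gaps in continuous transmission (e.g., waiting
    out the round trip delay on a half duplex link).
    """
    byte_time = 8.0 / line_rate_bps                      # seconds per byte
    useful = info_bytes * byte_time                      # original information only
    total = (info_bytes + overhead_bytes + retransmitted_bytes) * byte_time \
            + idle_time_s
    return useful / total
```

With a long round trip delay, the idle term dominates the denominator, which is why a satellite link's raw-speed advantage can be erased.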
More detail on SDLC and BSC is available in References 1-4. Two transmission links are considered. The terrestrial analog link consists of traditional analog facilities, while the satellite link consists solely of a satellite connection between the end points (i.e., the end points are close enough to the satellite earth stations that negligible transit time and link errors are introduced by the terminal-to-earth-station links). Each link is characterized by its bit error rate (BER) and round trip delay (RTD). Bit error rate is the long-term average number of bit errors per bit transmitted, while round trip delay is the amount of time between transmission of the last bit of a given SDLC frame or BSC block and receipt at the source of the first bit of its response. RTD includes the time (in seconds) needed to generate and transmit a line-control response at the receiving end, as well as any delay in equipment between the end points. Results are obtained using a simple bit-independent error model. The next section describes the applicable equations, and the following section discusses the results.
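Under a bit-independent error model, the probability that a frame arrives error-free, and hence the expected number of transmissions per frame, follow directly from the BER. The sketch below uses illustrative names and assumes each errored frame is retransmitted in full:

```python
def frame_success_prob(frame_bits, ber):
    """Probability a frame of frame_bits arrives error-free when each
    bit errs independently with probability ber."""
    return (1.0 - ber) ** frame_bits

def expected_transmissions(frame_bits, ber):
    """Mean number of sends per frame (geometric distribution) when every
    errored frame is fully retransmitted."""
    return 1.0 / frame_success_prob(frame_bits, ber)
```

For example, a 262-byte frame (2096 bits) at a BER of 1e-5 needs only slightly more than one transmission on average, consistent with the low-error links the paper focuses on.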