Abstract
The discrete memoryless multicast network (DM-MN) is considered in this paper. We analyze the noisy network coding (NNC) and distributed decode-forward (DDF) lower bounds for the DM-MN and show that both NNC and DDF ignore the channel output observed at the transmitter. Motivated by this observation, we propose new coding schemes that improve on NNC and DDF by exploiting the transmitter's observation and applying hybrid relaying strategies. We first study the special case where the transmitter's observation consists of rate-limited feedback signals, and propose a scheme that strictly improves on NNC when the feedback rates are sufficiently large. For the relay channel with perfect relay-transmitter feedback, our achievable rate reduces to Gabbai and Bross's rate, which is strictly larger than the NNC and DDF rates and all known lower bounds proposed for the setup without feedback. In our scheme, both relays and receivers compress their received signals as in NNC, and the relays decode independent "common" and "private" parts of the source message. The resulting compression indices are sent to the transmitter over the feedback links; from these indices the transmitter reconstructs the relays' and receivers' inputs and can thus cooperate with them. We then extend the idea to the DM-MN without feedback. In this case, although the transmitter observes a channel output, both NNC and DDF simply ignore it, whereas our new scheme has the transmitter use its channel output to decode a subset of the relays' and receivers' compression indices, thereby establishing a degree of cooperation between the transmitter and the relays and receivers. An enhanced relay channel is introduced to show that our scheme strictly outperforms NNC and DDF.
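For context only (this is the classical benchmark the schemes above compete against, not the paper's new rate), the standard NNC lower bound for the three-node discrete memoryless relay channel p(y, y_r | x, x_r) can be written as

% Classical NNC lower bound for the single-relay channel (benchmark, not the proposed rate)
\begin{equation*}
  R \;<\; \max_{p(x)\,p(x_r)\,p(\hat{y}_r \mid y_r, x_r)}
  \min\Bigl\{\, I\bigl(X; \hat{Y}_r, Y \mid X_r\bigr),\;
  I(X, X_r; Y) - I\bigl(Y_r; \hat{Y}_r \mid X, X_r, Y\bigr) \Bigr\},

\end{equation*}

where Ŷ_r is the relay's compressed observation. The proposed schemes aim to exceed bounds of this form by letting the transmitter exploit its own channel output (or feedback) rather than discarding it.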