Abstract

While the recurrent neural network (RNN) architecture has been the go-to model for transient modeling, the temporal convolutional network (TCN) has recently been garnering more attention because it has a longer memory than recurrent architectures of the same capacity. In this paper, we propose using the TCN for transient simulation of high-speed channels. The adaptive successive halving algorithm (ASH-HPO) is used to perform automated hyperparameter optimization for the TCN. It has two components, progressive sampling and successive halving: it iteratively expands the size of the training dataset and eliminates a certain percentage of poorly performing models. The progressive sampling component is modified to preserve the original ordering of the time-series data and prevent information leakage. The successive halving component is modified so that each model must be evaluated on at least two different validation datasets before it is eliminated. The robustness of the proposed method is demonstrated on four high-speed channel examples, and the TCN is compared against existing convolutional neural network long short-term memory (CNN-LSTM) and dilated causal convolution (DCC) models. The TCN consistently outperforms the other models in all four tasks in terms of training speed, amount of training data required to converge, and accuracy.
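To illustrate how progressive sampling and successive halving might fit together for ordered time-series data, the following Python sketch is provided. The function name ash_hpo, the fit/score candidate interface, and the specific elimination rule (a candidate is dropped only when it falls into the worst half on both held-out windows) are illustrative assumptions based on one reading of the description above, not the paper's implementation.

def ash_hpo(candidates, series, n_rounds=4, init_frac=0.2, keep_frac=0.5):
    """Sketch of ASH-style search: progressive sampling + successive
    halving for time-series models. Candidate objects are assumed to
    expose fit(train) and score(val) -> error (lower is better)."""
    survivors = list(candidates)
    n = len(series)
    for r in range(n_rounds):
        # Progressive sampling: each round trains on a longer prefix of
        # the series, never shuffling, so the chronological order is
        # preserved and no future samples leak into training.
        frac = min(0.8, init_frac * (2 ** r))
        cut = int(n * frac)
        train = series[:cut]
        # Two disjoint validation windows taken strictly after the
        # training prefix.
        mid = cut + (n - cut) // 2
        val_a, val_b = series[cut:mid], series[mid:]

        scores = []
        for m in survivors:
            m.fit(train)  # (re)train on the larger prefix
            scores.append((m.score(val_a), m.score(val_b), m))

        # Rank candidates on each validation window separately.
        rank_a = {id(m): i for i, (_, _, m)
                  in enumerate(sorted(scores, key=lambda t: t[0]))}
        rank_b = {id(m): i for i, (_, _, m)
                  in enumerate(sorted(scores, key=lambda t: t[1]))}

        # Successive halving with a two-validation-set safeguard: a
        # candidate is eliminated only if it falls into the worst half
        # on *both* windows (an assumed reading of the abstract).
        keep = max(1, int(len(survivors) * keep_frac))
        survivors = [m for _, _, m in scores
                     if rank_a[id(m)] < keep or rank_b[id(m)] < keep]
        if len(survivors) <= 1:
            break
    return survivors

Keeping both validation windows strictly later in time than the training prefix is what preserves the original sequencing and prevents information leakage in this sketch.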
