Study on Prediction Model of Effluent Total Nitrogen Based on Bidirectional Recurrent Neural Network


Similar Papers
  • Conference Article
  • Cited by: 1
  • 10.1109/icpr.2004.482
Improvement of bidirectional recurrent neural network for learning long-term dependencies
  • Aug 23, 2004
  • Jinmiao Chen + 1 more

Bidirectional recurrent neural network (BRNN) is a non-causal generalization of recurrent neural networks (RNNs). Due to the problem of vanishing gradients, BRNN cannot learn long-term dependencies efficiently with gradient descent. To tackle the long-term dependency problem, we propose segmented-memory recurrent neural network (SM-RNN) and develop a bidirectional segmented-memory recurrent neural network (BSMRNN). We test the performance of BSMRNN on the problem of information latching. Our experimental results show that BSMRNN outperforms BRNN on long-term dependency problems.

  • Book Chapter
  • Cited by: 10
  • 10.1007/978-3-540-28648-6_79
Capturing Long-Term Dependencies for Protein Secondary Structure Prediction
  • Jan 1, 2004
  • Jinmiao Chen + 1 more

Bidirectional recurrent neural network (BRNN) is a noncausal system that captures both upstream and downstream information for protein secondary structure prediction. Due to the problem of vanishing gradients, the BRNN cannot learn remote information efficiently. To mitigate this problem, we propose the segmented-memory recurrent neural network (SMRNN) and obtain a bidirectional segmented-memory recurrent neural network (BSMRNN) by replacing the standard RNNs in the BRNN with SMRNNs. Our experiment with BSMRNN for protein secondary structure prediction on the RS126 set indicates improvement in prediction accuracy.

  • Research Article
  • Cited by: 115
  • 10.1002/er.7360
Stacked bidirectional LSTM RNN to evaluate the remaining useful life of supercapacitor
  • Oct 14, 2021
  • International Journal of Energy Research
  • Chunli Liu + 4 more

To predict the remaining useful life of supercapacitors, a data-based model is established using a stacked bidirectional long short-term memory recurrent neural network. On the basis of the traditional long short-term memory recurrent neural network, a reverse recurrent layer is added so that the prediction at time t can also draw on subsequent values in the input sequence. Stacking the network ensures sufficient model capacity. Simulation results show that the network performs best when the number of hidden layers is 2, with a predicted RMSE of 0.0275 and MAE of 0.0241. The simulation also compares ordinary and bidirectional recurrent neural networks, as well as bidirectional recurrent neural networks with different recurrent units. As a future improvement, this project will add a swarm intelligence algorithm to optimize the initial weights of the neural network and reduce the initial prediction error.
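
The stacked bidirectional recurrence described above (a forward pass plus a reverse pass over the sequence, with the concatenated hidden states of one layer feeding the next) can be sketched in numpy. A plain tanh cell stands in for the LSTM, and all dimensions and weights below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def birnn_layer(x, Wf, Wb, Uf, Ub):
    """One bidirectional tanh-RNN layer: runs a forward-in-time and a
    reverse-in-time recurrence over the sequence and concatenates the
    two hidden-state sequences."""
    T, _ = x.shape
    H = Uf.shape[0]
    hf = np.zeros((T, H))
    hb = np.zeros((T, H))
    h = np.zeros(H)
    for t in range(T):              # forward recurrence (past context)
        h = np.tanh(x[t] @ Wf + h @ Uf)
        hf[t] = h
    h = np.zeros(H)
    for t in reversed(range(T)):    # reverse recurrence (future context)
        h = np.tanh(x[t] @ Wb + h @ Ub)
        hb[t] = h
    return np.concatenate([hf, hb], axis=1)  # shape (T, 2H)

rng = np.random.default_rng(0)
T, D, H = 8, 4, 5
x = rng.normal(size=(T, D))
# Layer 1 takes the D-dim input; layer 2 takes layer 1's 2H-dim output.
layer1 = birnn_layer(x, rng.normal(size=(D, H)), rng.normal(size=(D, H)),
                     rng.normal(size=(H, H)), rng.normal(size=(H, H)))
layer2 = birnn_layer(layer1, rng.normal(size=(2*H, H)), rng.normal(size=(2*H, H)),
                     rng.normal(size=(H, H)), rng.normal(size=(H, H)))
```

A real stack would use gated LSTM cells and trained weights; the point here is only the forward/reverse concatenation and the layer stacking.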

  • Research Article
  • Cited by: 34
  • 10.1007/s00500-005-0489-5
Bidirectional segmented-memory recurrent neural network for protein secondary structure prediction
  • May 18, 2005
  • Soft Computing
  • J Chen + 1 more

The formation of protein secondary structure especially the regions of β-sheets involves long-range interactions between amino acids. We propose a novel recurrent neural network architecture called segmented-memory recurrent neural network (SMRNN) and present experimental results showing that SMRNN outperforms conventional recurrent neural networks on long-term dependency problems. In order to capture long-term dependencies in protein sequences for secondary structure prediction, we develop a predictor based on bidirectional segmented-memory recurrent neural network (BSMRNN), which is a noncausal generalization of SMRNN. In comparison with the existing predictor based on bidirectional recurrent neural network (BRNN), the BSMRNN predictor can improve prediction performance especially the recognition accuracy of β-sheets.

  • Conference Article
  • Cited by: 21
  • 10.1109/healthcom.2017.8210840
Transfer bi-directional LSTM RNN for named entity recognition in Chinese electronic medical records
  • Oct 1, 2017
  • Xishuang Dong + 5 more

In this paper, a transfer bi-directional recurrent neural network (RNN) is proposed for named entity recognition (NER) in Chinese electronic medical records (EMRs), which aims to automatically extract medical knowledge such as phrases recording diseases and treatments. We propose a two-step procedure: the first step is to train a shallow bi-directional RNN in the general domain, and the second step is to transfer knowledge from the general domain to train a deeper bi-directional RNN for recognizing medical concepts in Chinese EMRs. Specifically, this is achieved by initializing the shallow parts of the deeper network in the second step with parameter weights from the bi-directional RNN trained in the first step. The deeper network is then re-trained on the Chinese EMRs. Experimental results show that the transferred knowledge significantly improves NER performance.
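
The two-step transfer procedure amounts to copying trained weights into the lower layers of a deeper model before re-training. The sketch below uses hypothetical layer names and random matrices as stand-ins for the trained networks; nothing here reflects the paper's actual code:

```python
import numpy as np

rng = np.random.default_rng(1)

def init_params(n_layers, H=4):
    """Randomly initialised per-layer weight matrices (a stand-in for a
    bi-directional RNN's trainable parameters)."""
    return {f"rnn_layer_{i}": rng.normal(size=(H, H)) for i in range(n_layers)}

# Step 1: a shallow bi-directional RNN "trained" on the general domain.
shallow = init_params(n_layers=1)

# Step 2: a deeper network for the medical domain; its lower layer is
# initialised from the shallow model, the extra layers start fresh.
deep = init_params(n_layers=3)
for name, weights in shallow.items():
    deep[name] = weights.copy()   # transfer the shared shallow layers
# ...then `deep` would be re-trained on the Chinese EMR corpus.
```

The design point is that only the shallow, general-domain layers are transferred; the added depth is left to learn domain-specific structure during re-training.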

  • Research Article
  • Cited by: 12
  • 10.3389/fbioe.2020.00808
Riboflow: Using Deep Learning to Classify Riboswitches With ∼99% Accuracy.
  • Jul 14, 2020
  • Frontiers in Bioengineering and Biotechnology
  • Keshav Aditya R Premkumar + 2 more

Riboswitches are cis-regulatory genetic elements that use an aptamer to control gene expression. Specificity to cognate ligand and diversity of such ligands have expanded the functional repertoire of riboswitches to mediate mounting apt responses to sudden metabolic demands and signal changes in environmental conditions. Given their critical role in microbial life, riboswitch characterisation remains a challenging computational problem. Here we have addressed the issue with advanced deep learning frameworks, namely convolutional neural networks (CNN), and bidirectional recurrent neural networks (RNN) with Long Short-Term Memory (LSTM). Using a comprehensive dataset of 32 ligand classes and a stratified train-validate-test approach, we demonstrated the accurate performance of both the deep learning models (CNN and RNN) relative to conventional hyperparameter-optimized machine learning classifiers on all key performance metrics, including the ROC curve analysis. In particular, the bidirectional LSTM RNN emerged as the best-performing learning method for identifying the ligand-specificity of riboswitches with an accuracy >0.99 and macro-averaged F-score of 0.96. An additional attraction is that the deep learning models do not require prior feature engineering. A dynamic update functionality is built into the models to factor for the constant discovery of new riboswitches, and extend the predictive modeling to new classes. Our work would enable the design of genetic circuits with custom-tuned riboswitch aptamers that would effect precise translational control in synthetic biology. The associated software is available as an open-source Python package and standalone resource for use in genome annotation, synthetic biology, and biotechnology workflows.

  • Research Article
  • Cited by: 98
  • 10.1364/oe.27.019650
End-to-end optimized transmission over dispersive intensity-modulated channels using bidirectional recurrent neural networks.
  • Jun 28, 2019
  • Optics Express
  • Boris Karanov + 3 more

We propose an autoencoding sequence-based transceiver for communication over dispersive channels with intensity modulation and direct detection (IM/DD), designed as a bidirectional deep recurrent neural network (BRNN). The receiver uses a sliding window technique to allow for efficient data stream estimation. We find that this sliding window BRNN (SBRNN), based on end-to-end deep learning of the communication system, achieves a significant bit-error-rate reduction at all examined distances in comparison to previous block-based autoencoders implemented as feed-forward neural networks (FFNNs), leading to an increase of the transmission distance. We also compare the end-to-end SBRNN with a state-of-the-art IM/DD solution based on two level pulse amplitude modulation with an FFNN receiver, simultaneously processing multiple received symbols and approximating nonlinear Volterra equalization. Our results show that the SBRNN outperforms such systems at both 42 and 84 Gb/s, while training fewer parameters. Our novel SBRNN design aims at tailoring the end-to-end deep learning-based systems for communication over nonlinear channels with memory, such as the optical IM/DD fiber channel.
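
The sliding-window idea above (run the network over every window of the received stream and combine the overlapping per-position outputs) can be sketched as follows. The averaging rule and the toy identity "predictor" are illustrative assumptions, not the paper's receiver:

```python
import numpy as np

def sliding_window_estimate(stream, window, predict):
    """Apply `predict` to every length-`window` slice of the stream and
    average the overlapping per-position outputs."""
    n = len(stream)
    acc = np.zeros(n)   # accumulated predictions per position
    cnt = np.zeros(n)   # how many windows covered each position
    for s in range(n - window + 1):
        acc[s:s + window] += predict(stream[s:s + window])
        cnt[s:s + window] += 1
    return acc / cnt

# Toy "receiver": an identity predictor over a short symbol stream.
stream = np.array([1.0, -1.0, 1.0, 1.0, -1.0, -1.0])
out = sliding_window_estimate(stream, window=3, predict=lambda w: w)
```

With the identity predictor every overlapping estimate agrees, so the averaged output equals the input; a trained BRNN in place of `predict` would instead yield denoised symbol estimates.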

  • Book Chapter
  • Cited by: 115
  • 10.1007/978-3-319-46681-1_42
Bi-directional LSTM Recurrent Neural Network for Chinese Word Segmentation
  • Jan 1, 2016
  • Yushi Yao + 1 more

Recurrent neural networks (RNNs) have been broadly applied to natural language processing (NLP) problems. This kind of neural network is designed for modeling sequential data and has proven quite effective in sequential tagging tasks. In this paper, we propose to use a bi-directional RNN with long short-term memory (LSTM) units for Chinese word segmentation, which is a crucial task for modeling Chinese sentences and articles. Classical methods focus on designing and combining hand-crafted features from context, whereas the bi-directional LSTM network (BLSTM) does not need any prior knowledge or pre-designing, and excels at creating hierarchical feature representations of contextual information from both directions. Experimental results show that our approach achieves state-of-the-art word segmentation performance on both traditional and simplified Chinese datasets.

  • Conference Article
  • Cited by: 1
  • 10.1109/iccsec.2017.8446758
Bi-Directional LSTM Recurrent Neural Network for Lumbar Vertebrae Identification in X-Ray Images
  • Dec 1, 2017
  • Yang Li + 2 more

Due to their capability of providing the patient's pose online, mobile C-arm X-ray images play a key role in image-guided minimally invasive spine surgery. However, automatic lumbar vertebrae identification is still a challenging task because of the inherent limitations of the mobile C-arm. To solve these problems, a novel automatic lumbar vertebrae identification method is proposed, based on a bidirectional long short-term memory (LSTM) recurrent neural network (RNN). First, to address the problem of lumbar vertebrae texture overlapping in X-ray images, the curvature features of the 3D lumbar vertebrae model, which are common to the 2D X-ray images, are taken as the input of the model. Second, to simulate the multi-view imaging of the intraoperative C-arm, the bi-directional recurrent neural network is exploited to learn the correlation of lumbar curvature features at different imaging angles. Finally, to avoid vanishing gradients and error blow-up, LSTM neurons are applied to replace the nodes of the bi-directional RNN. Experimental results show that our method identifies lumbar vertebrae more accurately than two other methods.

  • Research Article
  • Cited by: 85
  • 10.1016/j.isatra.2020.07.011
Bidirectional deep recurrent neural networks for process fault classification
  • Jul 13, 2020
  • ISA Transactions
  • Gavneet Singh Chadha + 3 more

  • Conference Article
  • Cited by: 74
  • 10.1109/icassp.2016.7472841
Bi-directional recurrent neural network with ranking loss for spoken language understanding
  • Mar 1, 2016
  • Ngoc Thang Vu + 3 more

This paper presents our latest investigation of recurrent neural networks for the slot filling task of spoken language understanding. We implement a bi-directional Elman-type recurrent neural network which takes the information not only from the past but also from the future context to predict the semantic label of the target word. Furthermore, we propose to use ranking loss function to train the model. This improves the performance over the cross entropy loss function. On the ATIS benchmark data set, we achieve a new state-of-the-art result of 95.56% F1-score without using any additional knowledge or data sources.
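
A margin-based ranking loss of the kind contrasted with cross entropy above can be sketched as follows. This follows one common formulation (push the gold-label score above a positive margin and the best competing score below a negative margin); the margin values and scaling factor are illustrative, not the paper's settings:

```python
import numpy as np

def ranking_loss(scores, gold, gamma=2.0, m_pos=2.5, m_neg=0.5):
    """Margin-based ranking loss over per-class scores: penalise a low
    gold-label score and a high score for the strongest competitor."""
    s_pos = scores[gold]
    s_neg = max(s for i, s in enumerate(scores) if i != gold)
    return (np.log1p(np.exp(gamma * (m_pos - s_pos)))       # gold too low
            + np.log1p(np.exp(gamma * (m_neg + s_neg))))    # rival too high

# A confident correct prediction incurs a much smaller loss
# than a confident wrong one.
loss_good = ranking_loss(np.array([5.0, -3.0, -2.0]), gold=0)
loss_bad  = ranking_loss(np.array([-2.0, 5.0, -3.0]), gold=0)
```

Unlike cross entropy, this objective only involves the gold label and the single strongest competitor, which is the property the paper credits for the improvement on slot filling.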

  • Research Article
  • Cited by: 23
  • 10.1016/j.buildenv.2022.108896
Imputing missing indoor air quality data with inverse mapping generative adversarial network
  • Feb 22, 2022
  • Building and Environment
  • Zejun Wu + 5 more

  • Research Article
  • Cited by: 11
  • 10.1109/jsen.2022.3198882
SEMG Onset Detection via Bidirectional Recurrent Neural Networks With Applications to Sports Science
  • Oct 1, 2022
  • IEEE Sensors Journal
  • Mert Ergeneci + 2 more

Surface electromyography (sEMG) provides physiological information that can be used in sports science. In many applications, sEMG signal activity, i.e., contractions, needs to be detected in the stream of sensor recordings. During sports exercises, the impact of any collision on the body due to an athlete’s movement (e.g., jump) forms an additive noise called motion-induced artifact (MIA) in sEMG recordings. This study proposes a bidirectional long short-term memory recurrent neural network (BLSTM-RNN) to automatically identify sEMG signal activity in measurements that include MIA. The proposed model is compared with the state-of-the-art techniques that are envelope, sample entropy (SampEn), modified adaptive linear energy detector (M-ALED), and adaptive contraction detection (ACD). As hamstring strain injuries (HSIs) are the most frequent and recurring injuries in professional football, this article uses sEMG data of different hamstring exercises performed by first-team players of the Leeds United Football Club. On data recorded using state-of-the-art sensors, the classification accuracy of the proposed solution is 96.73%, while the other methods reach 61.41% (sEMG envelope), 84.95% (SampEn), 58.86% (M-ALED), and 65.54% (ACD).

  • Research Article
  • Cited by: 51
  • 10.1109/tcbb.2007.1055
Cascaded Bidirectional Recurrent Neural Networks for Protein Secondary Structure Prediction
  • Oct 1, 2007
  • IEEE/ACM Transactions on Computational Biology and Bioinformatics
  • Jinmiao Chen + 1 more

Protein secondary structure (PSS) prediction is an important topic in bioinformatics. Our study on a large set of non-homologous proteins shows that long-range interactions commonly exist and negatively affect PSS prediction. We also reveal strong correlations between secondary structure (SS) elements. In order to take into account the long-range interactions and SS-SS correlations, we propose a novel prediction system based on a cascaded bidirectional recurrent neural network (BRNN). We compare the cascaded BRNN against two other BRNN architectures, namely the original BRNN architecture used for speech recognition and Pollastri's BRNN that was proposed for PSS prediction. Our cascaded BRNN achieves an overall three-state accuracy Q3 of 74.38% and reaches a high Segment OVerlap (SOV) score of 66.0455. It outperforms the original BRNN and Pollastri's BRNN in both Q3 and SOV, improving the SOV score by 4-6%.
