Abstract

MatDL: A Lightweight Deep Learning Library in MATLAB

Highlights

  • MatDL (Fayek 2017) is an open-source lightweight deep learning (LeCun, Bengio, and Hinton 2015; Goodfellow, Bengio, and Courville 2016) library, native to MATLAB, that implements some of the most commonly used deep learning algorithms.

  • The library comprises functions that implement the following: (1) basic building blocks of modern neural networks such as affine transformations, convolutions, nonlinear operations, dropout, batch normalization, etc.; (2) popular architectures such as deep neural networks (DNNs), convolutional neural networks (ConvNets), and recurrent neural networks (RNNs) and their variant, the long short-term memory (LSTM) RNNs; (3) optimizers such as stochastic gradient descent (SGD), RMSProp, and Adam; as well as (4) boilerplate functions for training, gradient checking, etc. Most of these functions can run on a CPU or a MATLAB-compatible CUDA-enabled GPU.

  • MatDL was inspired by Stanford’s CS231n (Fei-Fei and others 2017) and Torch (Collobert, Kavukcuoglu, and Farabet 2011), and is conceptually similar to Keras (Chollet and others 2015) and Lasagne (Dieleman et al. 2015), but unlike these libraries, it is natively implemented in MATLAB.
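The optimizers and gradient checking listed above are standard techniques rather than anything specific to MatDL. As an illustration only, here is a minimal framework-agnostic sketch in Python/NumPy of central-difference gradient checking and of the SGD and Adam update rules; none of these function names come from MatDL's actual API.

```python
import numpy as np

def numerical_gradient(f, x, eps=1e-6):
    """Central-difference numerical gradient of a scalar function f at x.

    Used to verify analytic (backpropagated) gradients, as in gradient checking.
    """
    grad = np.zeros_like(x)
    for i in np.ndindex(x.shape):
        orig = x[i]
        x[i] = orig + eps
        f_plus = f(x)
        x[i] = orig - eps
        f_minus = f(x)
        x[i] = orig  # restore the perturbed entry
        grad[i] = (f_plus - f_minus) / (2 * eps)
    return grad

def sgd_step(w, dw, lr=1e-2):
    """Plain stochastic gradient descent update."""
    return w - lr * dw

def adam_step(w, dw, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; `state` holds moment estimates m, v and step count t."""
    state['t'] += 1
    state['m'] = beta1 * state['m'] + (1 - beta1) * dw
    state['v'] = beta2 * state['v'] + (1 - beta2) * dw**2
    m_hat = state['m'] / (1 - beta1**state['t'])   # bias-corrected first moment
    v_hat = state['v'] / (1 - beta2**state['t'])   # bias-corrected second moment
    return w - lr * m_hat / (np.sqrt(v_hat) + eps)

# Gradient check on a toy loss: the analytic gradient of 0.5*||w||^2 is w itself.
w = np.random.randn(3, 2)
num_grad = numerical_gradient(lambda z: 0.5 * np.sum(z**2), w.copy())
assert np.allclose(num_grad, w, atol=1e-5)
```

In a library such as MatDL, a check like the last two lines is typically run once per layer against the layer's analytic backward pass, with a relative-error threshold, before training on real data.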



