Abstract
The clustering problem is solved using probabilistic programming languages from two families: languages that implement graphical models (Infer.NET) and languages that implement arbitrary computable generative models (Church). The features and efficiency of the two implementations are compared. The Infer.NET implementation is found to achieve higher accuracy and throughput, but it required the imperative component of the language, which goes beyond the scope of generative models. An autoencoder, a standard building block of deep-learning networks, has been implemented in Church without requiring specialized network-training methods. General-purpose probabilistic languages are shown to have great potential, although realizing it requires the development of efficient inference methods.
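To illustrate the generative-model view of clustering that the abstract refers to, the following sketch treats clustering as posterior inference in a two-component Gaussian mixture. This is a minimal illustration in plain Python, not the paper's Infer.NET or Church implementation; the mixture parameters and function names are hypothetical.

```python
import math

# Hypothetical illustration: clustering as inference in a generative
# model -- a two-component 1D Gaussian mixture with known parameters.
MEANS = [-2.0, 2.0]   # assumed component means (illustrative values)
SIGMA = 1.0           # shared standard deviation
WEIGHTS = [0.5, 0.5]  # mixing proportions

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def posterior_cluster(x):
    """Inference direction: posterior probability of each cluster given x.

    The generative direction would sample a cluster index z from WEIGHTS
    and then a point x ~ Normal(MEANS[z], SIGMA); here we invert that
    process with Bayes' rule to assign x to a cluster.
    """
    joint = [w * normal_pdf(x, m, SIGMA) for w, m in zip(WEIGHTS, MEANS)]
    total = sum(joint)
    return [j / total for j in joint]

# Points near -2 are assigned to cluster 0, points near +2 to cluster 1.
print(posterior_cluster(-2.0))
print(posterior_cluster(2.0))
```

In a probabilistic programming language, only the generative direction is written down; the inference direction is derived automatically by the language's inference engine, which is the contrast between Infer.NET and Church that the abstract discusses.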