Abstract

In this chapter, we consider constant-time algorithms for continuous optimization problems. Specifically, we consider quadratic function minimization and tensor decomposition, both of which have numerous applications in machine learning and data mining. The key component in our analysis is graph limit theory, which was originally developed to study graphs analytically.

Highlights

  • We turn our attention to constant-time algorithms for continuous optimization problems

  • This section reviews the basic concepts of graph limit theory

  • Prior work formulated the estimation problem as the minimization of a squared loss and showed that the Pearson divergence can be estimated from the minimum value (a sketch of this formulation follows this list)
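
The squared-loss formulation in the last highlight admits a compact derivation. The LaTeX sketch below records the standard least-squares density-ratio argument in our own notation (densities $p$ and $q$, a ratio model $r$); it is offered as background, not as the chapter's exact formulation.

    % Squared-loss estimation of the Pearson divergence (illustrative notation).
    % For densities p and q with true ratio r^*(x) = p(x)/q(x):
    \[
      \mathrm{PE}(p \,\|\, q) = \frac{1}{2} \int q(x)\,\bigl(r^*(x) - 1\bigr)^2 \,\mathrm{d}x .
    \]
    % Fitting a model r to the ratio under the squared loss gives, up to an
    % additive constant independent of r,
    \[
      \widehat{J}(r) = \frac{1}{2}\,\mathbb{E}_{q}\bigl[r^{2}\bigr] - \mathbb{E}_{p}\bigl[r\bigr],
    \]
    % which is minimized at r = r^*. Using E_q[(r^*)^2] = E_p[r^*] and expanding
    % the square in PE shows the divergence is recoverable from the minimum value:
    \[
      \mathrm{PE}(p \,\|\, q) = -\min_{r} \widehat{J}(r) - \frac{1}{2} .
    \]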

Introduction

We turn our attention to constant-time algorithms for continuous optimization problems. We consider quadratic function minimization and tensor decomposition, both of which have numerous applications in machine learning and data mining. The key component in our analysis is graph limit theory, which was originally developed to study graphs analytically. Throughout this chapter, we assume the real RAM model, in which we can perform basic algebraic operations on real numbers in one step. For a positive integer n, let [n] denote the set {1, 2, …, n}. The algorithms and analysis presented in this chapter are based on [5, 6].
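
As a concrete illustration of the constant-time flavor, the following Python sketch applies a sample-and-scale strategy to quadratic function minimization: draw a constant-size index set, minimize the induced subproblem, and rescale the minimum value. The objective p(v) = v^T A v + n·v^T diag(d) v + n·b^T v, the sample size k, and all names here are illustrative assumptions on our part, not the chapter's exact algorithm or guarantee.

    import numpy as np

    def quad_value(A, d, b, v, scale):
        # p(v) = v^T A v + scale * v^T diag(d) v + scale * b^T v
        return v @ A @ v + scale * (v @ (d * v)) + scale * (b @ v)

    def sampled_min(A, d, b, k, rng):
        # Toy constant-time-style estimator: minimize the quadratic induced
        # on a random k-subset of indices and rescale by (n/k)^2.
        n = A.shape[0]
        S = rng.choice(n, size=k, replace=False)
        A_S, d_S, b_S = A[np.ix_(S, S)], d[S], b[S]
        H = 2.0 * (A_S + k * np.diag(d_S))   # gradient of p_S is H v + k b_S
        v = np.linalg.solve(H, -k * b_S)     # stationary point (assumes H invertible)
        return (n / k) ** 2 * quad_value(A_S, d_S, b_S, v, k)

    # Toy usage: compare the sampled estimate with the exact minimum.
    rng = np.random.default_rng(0)
    n, k = 2000, 100
    M = rng.standard_normal((n, n))
    A = (M + M.T) / 2                        # symmetric quadratic term
    d = rng.uniform(1.0, 2.0, size=n)        # positive diagonal weights
    b = rng.uniform(-1.0, 1.0, size=n)
    v_star = np.linalg.solve(2.0 * (A + n * np.diag(d)), -n * b)
    z_star = quad_value(A, d, b, v_star, n)
    z_hat = sampled_min(A, d, b, k, rng)
    print(f"exact min / n^2: {z_star / n**2:.4f}, estimate / n^2: {z_hat / n**2:.4f}")

The point is that sampled_min touches only a k × k submatrix, so its cost depends on k but not on n; how large k must be for a given accuracy is the kind of question the graph limit machinery in this chapter answers.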

Graph Limit Theory
Background
Preliminaries
