Abstract

The goal of this paper is to establish average convergence (convergence in mean) and almost sure convergence for ND (negatively dependent) sequences of random variables in a sublinear expectation space. Using the basic definition of a sublinear expectation space together with the Markov inequality and the C_r inequality, we extend the average convergence and almost sure convergence theorems to ND sequences of random variables under sublinear expectation, and thereby provide an accessible entry point to this subject.
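
For orientation, the two tools named above take the following standard forms in this setting, where Ê denotes the sublinear expectation and V the capacity it induces (a sketch; the precise statements are those given in the paper's preliminaries). Markov's inequality: for any random variable X, ε > 0 and p > 0,

\[ \mathbb{V}(|X| \ge \varepsilon) \le \frac{\hat{\mathbb{E}}[|X|^{p}]}{\varepsilon^{p}}. \]

The C_r inequality: for random variables X, Y and r > 0,

\[ \hat{\mathbb{E}}[|X+Y|^{r}] \le c_{r}\bigl(\hat{\mathbb{E}}[|X|^{r}] + \hat{\mathbb{E}}[|Y|^{r}]\bigr), \qquad c_{r} = \begin{cases} 1, & 0 < r \le 1, \\ 2^{r-1}, & r > 1. \end{cases} \]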

Highlights

  • Classical probability theorems are widely used in many fields, but they hold only under model certainty

  • The limit theory of nonadditive probability and nonlinear expectation is a challenging question of interest

  • Under Peng's framework, many limit theorems have gradually been established. For example, Zhang [4,5,6,7,8] studied several inequalities under sublinear expectation spaces, some limit theorems for sublinear expectation spaces, and the Marcinkiewicz strong law of large numbers for nonlinear expectations; Bayraktar and Munk [9] obtained an α-stable limit theorem under sublinear expectation; Xu and Zhang [10] established a three-series theorem for independent random variables under sublinear expectations, with applications; and Wu and Jiang [11] studied the strong law of large numbers and Chover's law of the iterated logarithm under sublinear expectations

Introduction

Classical probability theorems are widely used in many fields, but they hold only under model certainty. The second part mainly introduces the definitions and lemmas of sublinear expectation spaces that we need to use. In a sublinear expectation space (Ω, H, Ê), a random vector X = (X_1, …, X_n) is an n-tuple of random variables with each component X_i belonging to H.
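
For completeness, the sublinear expectation itself is understood in Peng's standard sense (recalled here as a brief sketch; the exact version used is the one stated in the paper's second part): a functional Ê : H → ℝ on a linear space H of real functions on Ω is a sublinear expectation if, for all X, Y ∈ H,

\[
\begin{aligned}
&\text{(i) Monotonicity: } X \ge Y \ \Rightarrow\ \hat{\mathbb{E}}[X] \ge \hat{\mathbb{E}}[Y]; \\
&\text{(ii) Constant preserving: } \hat{\mathbb{E}}[c] = c \ \text{for all } c \in \mathbb{R}; \\
&\text{(iii) Sub-additivity: } \hat{\mathbb{E}}[X+Y] \le \hat{\mathbb{E}}[X] + \hat{\mathbb{E}}[Y]; \\
&\text{(iv) Positive homogeneity: } \hat{\mathbb{E}}[\lambda X] = \lambda \hat{\mathbb{E}}[X] \ \text{for all } \lambda \ge 0.
\end{aligned}
\]

The triple (Ω, H, Ê) is then called a sublinear expectation space.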

