This paper provides the first general technique for proving information lower bounds on two-party unbounded-rounds communication problems. We show that the discrepancy lower bound, which applies to randomized communication complexity, also applies to information complexity. More precisely, if the discrepancy of a two-party function $f$ with respect to a distribution $\mu$ is $Disc_\mu f$, then any two-party randomized protocol computing $f$ must reveal at least $\Omega(\log(1/Disc_\mu f))$ bits of information to the participants. As a corollary, we obtain that any two-party protocol for computing a random function on $\{0,1\}^n \times \{0,1\}^n$ must reveal $\Omega(n)$ bits of information to the participants. In addition, we prove that the discrepancy of the Greater-Than function is $\Omega(1/\sqrt{n})$, which provides an alternative proof to the recent proof of Viola (Proceedings of the Twenty-Fourth Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2013, New Orleans, LA, USA, 6–8 Jan 2013, pp. 632–651) of the $\Omega(\log n)$ lower bound on the communication complexity of this well-studied function and, combined with our main result, proves the tight $\Omega(\log n)$ lower bound on its information complexity. The proof of our main result develops a new simulation procedure that may be of independent interest. In follow-up breakthrough work of Kerenidis et al. (53rd Annual IEEE Symposium on Foundations of Computer Science, FOCS 2012, New Brunswick, NJ, USA, 20–23 Oct 2012, pp. 500–509), our simulation procedure served as a building block toward a proof that almost all known lower bound techniques for communication complexity (and not just discrepancy) apply to information complexity as well.
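For concreteness, a short worked calculation sketches how the stated bounds compose; the notation $IC_\mu(f)$ for the information cost of a protocol computing $f$ under $\mu$ is our shorthand here and is not introduced in the abstract itself. The main result can be written as
$$
IC_\mu(f) \;\ge\; \Omega\bigl(\log(1/Disc_\mu f)\bigr).
$$
Instantiating: a random function on $\{0,1\}^n \times \{0,1\}^n$ has discrepancy $2^{-\Omega(n)}$ with high probability, so $\log(1/Disc_\mu f) = \Omega(n)$, which yields the first corollary; and a Greater-Than discrepancy of order $1/\sqrt{n}$ gives
$$
\log(1/Disc_\mu f) \;=\; \log \sqrt{n} \;=\; \tfrac{1}{2}\log n \;=\; \Omega(\log n),
$$
matching the stated tight bound on the information complexity of Greater-Than.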