Abstract

We present a new framework in which agents with limited and heterogeneous cognitive ability, modeled as finite depths of reasoning, learn from their neighbors in social networks. Each agent tracks old information using Bayes-like formulas and uses a shortcut when reasoning on behalf of multiple neighbors exceeds her cognitive ability. Surprisingly, agents with moderate cognitive ability are capable of partialing out old information and learning correctly in social quilts, which are tree-like unions of cliques (fully connected subnetworks). Agents with low cognitive ability may fail to learn in any network, even when they receive a large number of signals. We also identify a critical cutoff level of cognitive ability, determined by the network structure, above which further increases in an agent's cognitive ability leave her learning outcome unchanged.
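The abstract describes social quilts as tree-like unions of cliques. One common way to formalize that structure is a graph in which every biconnected component is a clique (a block graph). The sketch below is an illustration under that reading, not code from the paper, and uses the networkx library to test the property.

```python
# Illustrative sketch (not from the paper): test whether a graph is a
# tree-like union of cliques, read here as "every biconnected component
# induces a clique".
import itertools

import networkx as nx


def is_tree_like_union_of_cliques(G: nx.Graph) -> bool:
    """Return True if every biconnected component of G is a clique."""
    for block in nx.biconnected_components(G):
        # A block is a clique iff every pair of its nodes is adjacent.
        if not all(G.has_edge(u, v)
                   for u, v in itertools.combinations(block, 2)):
            return False
    return True


if __name__ == "__main__":
    # Two triangles (cliques) sharing a single node satisfy the property;
    # adding the edge (2, 4) merges them into one biconnected component
    # that is not fully connected, so the property fails.
    quilt = nx.Graph([(1, 2), (2, 3), (1, 3), (3, 4), (4, 5), (3, 5)])
    print(is_tree_like_union_of_cliques(quilt))  # True
    quilt.add_edge(2, 4)
    print(is_tree_like_union_of_cliques(quilt))  # False
```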
