Abstract

Combinatorial discrepancy theory studies how far a given structure deviates from an ideal, perfectly balanced one. In its classical form, the discrepancy of a set system has been shown to be bounded in certain situations. These bounds may not hold for other variants of discrepancy, yet no mathematician has so far produced an example that violates them. Because the number of set systems is enormous (k^(mn) for m sets and n elements, where each element-set incidence takes one of k possible values), traditional techniques such as human reasoning or computer brute force are not viable for identifying the few set systems with large discrepancy. However, in a 2021 preprint, Adam Zsolt Wagner showed that the deep cross-entropy method, a popular neural-network (NN)-based reinforcement-learning (RL) technique, can find examples that disprove open conjectures in two other combinatorial subfields: pattern avoidance and graph theory. In light of these encouraging results, we asked whether Wagner's method could help us find examples that exceed the conjectured upper bounds on discrepancy variants. As a first step, we examined whether his approach can be extended to find set systems with large discrepancy, in both the classical setting and two variants (prefix and fractional discrepancy in our case). Our findings may benefit discrepancy theory itself, as well as approximation algorithms that employ discrepancy theory to tackle certain problems.
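To make the central quantity concrete, the following is a minimal sketch of classical discrepancy, assuming the standard definition: for a set system over a ground set of n elements, the discrepancy is the minimum, over all ±1 colourings of the elements, of the largest absolute colour sum inside any set. The brute-force search below (the function name and example are ours, for illustration only) enumerates all 2^n colourings, which is exactly the approach that becomes infeasible at the scales discussed above.

```python
from itertools import product

def discrepancy(sets, n):
    """Classical discrepancy of a set system over ground set {0, ..., n-1}:
    minimise, over all +/-1 colourings, the maximum |sum of colours| in any set.
    Brute force over all 2^n colourings -- only feasible for small n."""
    best = float("inf")
    for colouring in product((-1, 1), repeat=n):
        # Worst imbalance of this colouring across all sets in the system
        worst = max(abs(sum(colouring[i] for i in s)) for s in sets)
        best = min(best, worst)
    return best

# The three 2-element subsets of {0, 1, 2}: no colouring balances all of them,
# since the three pairwise constraints x0 = -x1, x0 = -x2, x1 = -x2 conflict.
print(discrepancy([{0, 1}, {0, 2}, {1, 2}], 3))  # → 2
```

Even this tiny triangle example has discrepancy 2, illustrating why interesting high-discrepancy systems are rare needles in an exponentially large haystack.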
