This paper proposes a neural generative model, namely Table2Seq, that generates a natural language sentence from a table. Specifically, the model maps a table to continuous vectors and then generates a natural language sentence by leveraging the semantics of the table. Since rare words, e.g., entities and values, frequently appear in tables, we develop a flexible copying mechanism that selectively replicates content from the table into the output sequence. We conduct extensive experiments to demonstrate the effectiveness of our Table2Seq model and the utility of the designed copying mechanism. On the WIKIBIO and SIMPLEQUESTIONS datasets, the Table2Seq model improves the state-of-the-art BLEU-4 scores from 34.70 to 40.26 and from 33.32 to 39.12, respectively. Moreover, we construct an open-domain dataset, WIKITABLETEXT, that includes 13,318 descriptive sentences for 4,962 tables. Our Table2Seq model achieves a BLEU-4 score of 38.23 on WIKITABLETEXT, outperforming template-based and language-model-based approaches. Furthermore, through experiments on one million table-query pairs from a search engine, a Table2Seq model that considers the structured parts of a table, i.e., table attributes and table cells, as additional information outperforms a sequence-to-sequence model that considers only the sequential part of a table, i.e., the table caption.
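To make the copying idea concrete, the sketch below shows one common way such a mechanism can mix a generation distribution with a copy distribution over table cells, so that out-of-vocabulary entities and values can be emitted verbatim. This is a minimal illustrative simplification, not the paper's exact formulation; the function name `copy_mechanism_step`, the scalar gate `p_gen`, and the toy vocabulary are all assumptions introduced here for illustration.

```python
import numpy as np

def copy_mechanism_step(p_gen, vocab_dist, attn_weights, cell_tokens, vocab):
    """Mix a generation distribution with a copy distribution over table cells.

    p_gen        -- scalar in [0, 1]: probability of generating from the vocabulary
    vocab_dist   -- softmax over the fixed vocabulary, shape (|V|,)
    attn_weights -- attention weights over the table's cells, shape (num_cells,)
    cell_tokens  -- the token in each cell (may be out-of-vocabulary)
    vocab        -- list of vocabulary tokens, aligned with vocab_dist
    """
    # Extended vocabulary: fixed vocab plus any out-of-vocabulary cell contents.
    extended = list(vocab) + [t for t in cell_tokens if t not in vocab]
    index = {tok: i for i, tok in enumerate(extended)}

    mixed = np.zeros(len(extended))
    mixed[: len(vocab)] = p_gen * vocab_dist
    # Copy probability mass: the attention weight on each cell is routed to
    # that cell's token, so rare entities/values can be copied verbatim.
    for weight, token in zip(attn_weights, cell_tokens):
        mixed[index[token]] += (1.0 - p_gen) * weight
    return extended, mixed

# Toy example: both cell contents are out-of-vocabulary, yet the copy
# branch still assigns them probability mass.
vocab = ["<unk>", "was", "born", "in"]
extended, dist = copy_mechanism_step(
    p_gen=0.4,
    vocab_dist=np.array([0.1, 0.3, 0.4, 0.2]),
    attn_weights=np.array([0.7, 0.3]),
    cell_tokens=["Einstein", "1879"],
    vocab=vocab,
)
print(extended[int(np.argmax(dist))])  # -> "Einstein"
```

In this sketch the gate decides how much probability mass goes to generation versus copying; a learned gate of this kind is one standard way to realize the "selective replication" of table contents described above.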