A linear programming (LP)-based framework is presented for obtaining converses for finite blocklength lossy joint source-channel coding problems. The framework applies to any loss criterion, generalizes certain previously known converses, and extends to multi-terminal settings. The finite blocklength problem is posed equivalently as a nonconvex optimization problem, and, using a lift-and-project-like method, a close but tractable LP relaxation of this problem is derived. Lower bounds on the original problem are obtained by constructing feasible points for the dual of the LP relaxation. A particular application of this approach leads to new converses that recover and improve on the converses of Kostina and Verdú for finite blocklength lossy joint source-channel coding and lossy source coding. For finite blocklength channel coding, the LP relaxation recovers the converse of Polyanskiy, Poor and Verdú and leads to a new improvement on the converse of Wolfowitz, thereby showing that our LP relaxation is asymptotically tight with increasing blocklengths for channel coding, lossless source coding, and joint source-channel coding with the excess distortion probability as the loss criterion. Using a duality-based argument, a new converse is derived for finite blocklength joint source-channel coding for a class of source-channel pairs. Employing this converse, the LP relaxation is also shown to be tight for all blocklengths for the minimization of the expected average symbolwise Hamming distortion of a $q$-ary uniform source over a $q$-ary symmetric memoryless channel for any $q \in \mathbb{N}$. The optimization formulation and the lift-and-project method are extended to networked settings and demonstrated by obtaining an improvement on a converse of Zhou et al. for the successive refinement problem for successively refinable source-distortion measure triplets.
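To make the duality argument concrete, the following is a minimal sketch of the weak-duality step, with generic placeholder data $(A, b, c)$ that are illustrative rather than the paper's actual constraint structure. The lift-and-project step replaces products of encoder and decoder variables with new lifted variables $z$ and retains only valid linear constraints, producing an LP whose value lower-bounds the optimal value $\mathrm{OPT}$ of the original nonconvex problem:
\[
\mathrm{LP} \;=\; \min_{z \ge 0} \left\{\, c^{\top} z \;:\; A z \ge b \,\right\} \;\le\; \mathrm{OPT}.
\]
By weak duality, any dual feasible point $y$ certifies a lower bound:
\[
A^{\top} y \le c,\;\; y \ge 0 \quad\Longrightarrow\quad b^{\top} y \;\le\; \mathrm{LP} \;\le\; \mathrm{OPT},
\]
so each explicitly constructed dual feasible point is itself a finite blocklength converse.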