Due to their flexibility and ability to incorporate non-linear relationships, Mixed-Integer Non-Linear Programming (MINLP) approaches are commonly used as a solution approach for real-world optimization problems. Within this context, piecewise linear (PWL) approximations of non-linear continuous functions are useful, in contrast to non-linear machine-learning-based approaches, since they enable the application of Mixed-Integer Linear Programming techniques within the MINLP framework while retaining important features of the approximated non-linear functions, such as convexity. In this work, we extend fast algorithmic approaches for modeling discrete data with PWL regression by adapting them to the modeling of continuous functions. We show that if the input function is convex, the resulting PWL function is guaranteed to be convex as well. An analysis of the runtime of the presented algorithm identifies which function characteristics affect the efficiency of the method and which classes of functions can be modeled very quickly. Experimental results show that the presented approach is significantly faster than five existing approaches for modeling non-linear functions from the literature: it is at least 11 times faster on the tested functions, with a maximum speedup of more than 328,000. The presented approach also solves six benchmark problems for the first time.
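To illustrate the general idea of PWL approximation with preserved convexity (not the algorithm presented in this work), the following minimal Python sketch greedily places breakpoints on a convex one-dimensional function so that linear interpolation stays within a tolerance, and then verifies that the segment slopes are non-decreasing; the function, tolerance, and grid size are illustrative assumptions.

```python
import numpy as np

def greedy_pwl_breakpoints(f, lo, hi, eps=1e-2, grid=1000):
    """Greedily place breakpoints so that linear interpolation of f
    between consecutive breakpoints stays within eps of f on a grid.
    (Illustrative sketch only, not the algorithm from this work.)"""
    xs = np.linspace(lo, hi, grid)
    ys = f(xs)
    bp = [0]                      # indices of breakpoints into xs
    start = 0
    for j in range(2, grid):
        # chord from the current breakpoint to the candidate end point j
        x0, y0, x1, y1 = xs[start], ys[start], xs[j], ys[j]
        seg_x = xs[start:j + 1]
        chord = y0 + (y1 - y0) * (seg_x - x0) / (x1 - x0)
        if np.max(np.abs(chord - ys[start:j + 1])) > eps:
            bp.append(j - 1)      # last index still within tolerance
            start = j - 1
    bp.append(grid - 1)
    return xs[bp], ys[bp]

def slopes_nondecreasing(bx, by):
    """Convexity check: interpolating a convex f at points on its graph
    yields segment slopes that never decrease."""
    s = np.diff(by) / np.diff(bx)
    return bool(np.all(np.diff(s) >= -1e-12))

if __name__ == "__main__":
    f = lambda x: x ** 2          # convex test function (an assumption)
    bx, by = greedy_pwl_breakpoints(f, -2.0, 2.0, eps=0.05)
    print(len(bx), "breakpoints; convex:", slopes_nondecreasing(bx, by))
```

Because the breakpoints lie on the graph of the convex input function, the secant slopes between consecutive breakpoints are automatically non-decreasing, which is the property the convexity check confirms.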