Federated learning (FL) is a machine learning setting that allows multiple participants to collaboratively train a model under the orchestration of a server without disclosing their local data. Vertical federated learning (VFL) is a special structure in FL that handles the situation where participants share the same sample ID space but hold different feature spaces. To guarantee the security and privacy of each participant's local data, homomorphic encryption (HE) is often used to protect the intermediate parameters transmitted during training. In most VFL frameworks, a trusted third-party server is necessary because the plaintexts of these parameters must be revealed for computation; however, such a credible entity is hard to find in the real world. Existing methods for removing the third party are either communication-intensive or unsuitable for multi-party scenarios. By combining secret sharing (SS) and HE, we propose EFMVFL, a novel VFL framework that requires no trusted third party: intermediate parameters can be transmitted among multiple parties without revealing their plaintexts. EFMVFL is applicable to generalized linear models (GLMs) and supports flexible extension to multiple participants. Extensive experiments on logistic regression and Poisson regression show that our framework reduces communication by 3.2×–6.8× and accelerates training by 1.6×–3.1×.
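As a loose illustration of the generic SS+HE idea the abstract refers to (not the EFMVFL protocol itself), the toy sketch below shows how two parties can turn an HE ciphertext into additive secret shares without a trusted third party: one party masks the homomorphic sum with a random value before the other decrypts, so neither side ever sees the other's plaintext. The use of the `phe` (python-paillier) package and all variable names are assumptions for this sketch.

```python
# Toy sketch of the generic SS+HE trick -- an assumption-based
# illustration, NOT the EFMVFL protocol. Requires: pip install phe
import random
from phe import paillier

# Party A holds an intermediate value u_a and the Paillier key pair.
pub, priv = paillier.generate_paillier_keypair(n_length=2048)
u_a = 7

# A sends Enc(u_a) to party B; B never sees u_a in plaintext.
enc_u_a = pub.encrypt(u_a)

# Party B holds u_b. It homomorphically computes Enc(u_a + u_b - r)
# with a fresh random mask r, keeping r as its own additive share.
u_b = 5
r = random.randrange(1, 10**6)
share_b = r
enc_masked = enc_u_a + u_b - r   # ciphertext-plaintext operations

# A decrypts the masked ciphertext; the result reveals nothing about
# u_b on its own, and A keeps it as its additive share.
share_a = priv.decrypt(enc_masked)   # = u_a + u_b - r

# The two shares reconstruct the true intermediate sum, yet no
# plaintext was revealed to the other party and no third party
# was involved.
assert share_a + share_b == u_a + u_b
```

Because decryption only ever happens on masked values, this pattern removes the need for a third-party server that would otherwise hold the decryption key and see plaintext parameters.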