Activities of daily living (ADLs) are a crucial aspect of human life, particularly in remote health monitoring and fall detection. Smartphones have become an integral part of daily routines, and their ability to perform complex computations, connect to the internet, and host a variety of sensors has inspired researchers to explore human activity recognition systems. This paper focuses on accelerometer and gyroscope data from iOS-based smartphones. We developed a data collection app to record fall types (e.g., Falling Right, Falling Left) and fall-like activities (e.g., Sitting Fast, Jumping). Volunteers carried the smartphones naturally in their pockets during the experiments, which introduced sensor noise but enhanced user comfort. We applied several machine learning algorithms (Decision Trees, Random Forest, Logistic Regression, k-Nearest Neighbors, XGBoost, LightGBM, and Neural Networks) to the collected dataset. In contrast to typical studies, our approach replicated real-world smartphone usage. The paper presents and analyzes the promising results of the study. Furthermore, we implemented the trained model as a real-time mobile application for potential users. This research illustrates the potential of smartphones for fall detection and paves the way for user-friendly solutions in remote health monitoring.