Inertial sensors in smartphones enable relative state measurement for pedestrian localization without global positioning system (GPS) signals or beacons. Any such navigation method must account for the reality that the phone is carried in uncontrolled, varying poses with respect to the human body. In this work, we focus on pedestrian position estimation from raw inertial measurement unit (IMU) measurements without any constraint on how the device is carried, and propose a novel deep inertial odometry solution. By introducing a continuous rotation representation, we remove the reliance on the unreliable orientation estimates provided by the sensor application programming interface (API). Moreover, we propose a novel loss formulation that represents velocity as an average speed magnitude and a moving direction. The proposed approach was assessed on the public RoNIN dataset. The localization performance of the network trained on this public dataset was then evaluated in real-world trials on the CUHK campus. Experimental results show that our model provides robust velocity estimates and generates more accurate trajectories than state-of-the-art inertial odometry methods. Specifically, the localization evaluation on the CUHK campus covers 60 min of inertial signals over a 3 km trajectory, on which the trained odometry network achieves a 30th percentile accuracy of 2.26 m and a 50th percentile accuracy of 4.98 m.
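For concreteness, the sketch below shows one plausible form of a magnitude-plus-direction velocity loss of the kind described above; the function name `velocity_loss`, the cosine-based direction term, and the equal weighting of the two terms are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def velocity_loss(pred_v: torch.Tensor, gt_v: torch.Tensor) -> torch.Tensor:
    """Hypothetical loss splitting a 2D velocity target into a
    speed-magnitude term and a moving-direction term (illustration
    only; the paper's actual loss may differ)."""
    # Speed term: mean squared error between the velocity magnitudes.
    speed_loss = F.mse_loss(pred_v.norm(dim=-1), gt_v.norm(dim=-1))
    # Direction term: 1 - cosine similarity between the velocity vectors,
    # penalizing heading error independently of speed.
    dir_loss = (1.0 - F.cosine_similarity(pred_v, gt_v, dim=-1)).mean()
    return speed_loss + dir_loss

# Usage sketch on a batch of predicted vs. ground-truth 2D velocities.
pred = torch.randn(32, 2)
gt = torch.randn(32, 2)
print(velocity_loss(pred, gt))
```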