Learning compliant movements for interaction tasks remains a great challenge for robots. A robot can readily acquire motion skills from a human tutor through kinesthetic demonstration; however, this becomes much more difficult for compliant skills. This paper proposes a two-stage approach to address the problem. In the first stage, the human tutor demonstrates to the robot how to perform a task, during which only motion trajectories are recorded, without any force sensing. A dynamical movement primitives (DMP) model, which can generate human-like motion, is then used to encode the kinematic data. In the second stage, a biomimetic controller, inspired by neuroscience findings on human motor learning, achieves the desired compliant robot behaviors by adapting the impedance profiles and the feedforward torques simultaneously online. Several tests are conducted to validate the effectiveness of the proposed approach.
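To make the first stage concrete, the following is a minimal single-degree-of-freedom sketch of DMP encoding and reproduction. It is not the paper's implementation: the gains (`alpha_z`, `beta_z`, `alpha_x`), the number of basis functions, and the demonstration trajectory are illustrative choices, and the weights are fit with the standard locally weighted regression used in the DMP literature.

```python
import numpy as np

def learn_dmp(y_demo, dt, n_basis=30, alpha_z=25.0, beta_z=6.25, alpha_x=8.0):
    """Fit DMP forcing-term weights to a demonstrated 1-D trajectory."""
    T = len(y_demo)
    tau = (T - 1) * dt                      # movement duration
    y0, g = y_demo[0], y_demo[-1]           # start and goal from the demo
    yd = np.gradient(y_demo, dt)
    ydd = np.gradient(yd, dt)
    # Forcing term the demo implies, from the transformation system
    # tau^2 * ydd = alpha_z * (beta_z * (g - y) - tau * yd) + f
    f_target = tau**2 * ydd - alpha_z * (beta_z * (g - y_demo) - tau * yd)
    # Canonical system x(t) = exp(-alpha_x * t / tau)
    t = np.arange(T) * dt
    x = np.exp(-alpha_x * t / tau)
    centers = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
    widths = n_basis**1.5 / centers / alpha_x   # common width heuristic
    psi = np.exp(-widths * (x[:, None] - centers[None, :])**2)
    # Locally weighted regression per basis function
    s = x * (g - y0)
    w = np.array([(s * psi[:, i] * f_target).sum()
                  / ((s**2 * psi[:, i]).sum() + 1e-10)
                  for i in range(n_basis)])
    return dict(w=w, centers=centers, widths=widths, y0=y0, g=g, tau=tau,
                alpha_z=alpha_z, beta_z=beta_z, alpha_x=alpha_x)

def rollout(dmp, dt, n_steps):
    """Reproduce the encoded motion by Euler integration of the DMP."""
    y, z, x = dmp["y0"], 0.0, 1.0
    g, tau = dmp["g"], dmp["tau"]
    ys = [y]
    for _ in range(n_steps - 1):
        psi = np.exp(-dmp["widths"] * (x - dmp["centers"])**2)
        f = (psi @ dmp["w"]) * x * (g - dmp["y0"]) / (psi.sum() + 1e-10)
        zd = (dmp["alpha_z"] * (dmp["beta_z"] * (g - y) - z) + f) / tau
        yd = z / tau
        z += zd * dt
        y += yd * dt
        x += -dmp["alpha_x"] * x / tau * dt
        ys.append(y)
    return np.array(ys)

# Illustrative demonstration: a minimum-jerk reach from 0 to 1 in one second
dt = 0.01
t = np.arange(0.0, 1.0 + dt, dt)
s = t / t[-1]
y_demo = 10 * s**3 - 15 * s**4 + 6 * s**5
dmp = learn_dmp(y_demo, dt)
y_run = rollout(dmp, dt, len(y_demo))
```

The reproduced trajectory `y_run` closely tracks the demonstration while the goal-attractor structure guarantees convergence to `g`, which is what allows the second stage to layer impedance and feedforward adaptation on top of a fixed kinematic plan.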