Abstract

Acoustic levitation has many applications, ranging from medical drug delivery to micro-particle sorting. One application that has recently gained attention is the creation of volumetric displays from small levitating objects. Advances in inexpensive ultrasonic phased arrays have increased the availability of dynamically controllable beamformers, which enable the levitating objects to be manipulated in time and space. The levitating objects can then be treated much like pixels in a conventional display, but with an additional spatial dimension. Most implementations so far are based on the Gor'kov formulation of the acoustic radiation force; coupled with numerical optimization, this formulation yields the drive phases of the individual elements in the array. By exploiting symmetries in the solution, an acoustic trap signature phase pattern can be imposed on top of simple focusing methods. Off-the-shelf mid-air ultrasonic haptics systems can then provide multiple focus points to which these phase patterns are applied, allowing real-time control of multiple levitating objects. Current systems are limited to a handful of individually controllable objects, so visualization is restricted to abstract information. We present an overview of the state of the art and discuss limitations and possibilities. [Work supported by Horizon 2020, No. 737087.]
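For readers unfamiliar with the quantity behind the Gor'kov formulation, the potential for a small rigid sphere of radius R (small compared with the wavelength) in an inviscid fluid is commonly written as

```latex
% Gor'kov potential U; the radiation force on the particle is F = -\nabla U.
U = 2\pi R^{3}\left[\frac{f_{1}}{3\rho_{0}c_{0}^{2}}\,\langle p^{2}\rangle
  - \frac{f_{2}\,\rho_{0}}{2}\,\langle|\mathbf{v}|^{2}\rangle\right],
\qquad
f_{1} = 1-\frac{\rho_{0}c_{0}^{2}}{\rho_{p}c_{p}^{2}},\qquad
f_{2} = \frac{2(\rho_{p}-\rho_{0})}{2\rho_{p}+\rho_{0}},
```

where ⟨p²⟩ and ⟨|v|²⟩ are the time-averaged squares of the first-order acoustic pressure and particle velocity at the particle position, ρ₀ and c₀ are the density and sound speed of the medium, and ρ_p and c_p those of the particle.

The signature-on-focus idea mentioned in the abstract can be illustrated with a short sketch. The snippet below is not taken from the paper or from any particular device API; the array geometry, drive frequency, and the choice of splitting the aperture along the x axis are assumptions made purely for illustration. It computes simple focusing phases for a planar array and then superimposes a twin-trap signature by adding π to the phases of half of the elements, which produces two pressure lobes with a low-pressure trapping region between them.

```python
import numpy as np

# Illustrative sketch only (not a specific device API): focus a flat ultrasonic
# phased array at a point, then superimpose a "twin trap" signature by adding
# pi to the phase of half of the elements. Geometry and frequency are assumed.

SPEED_OF_SOUND = 343.0       # m/s in air
FREQUENCY = 40e3             # Hz, typical for airborne ultrasonic arrays
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY
K = 2 * np.pi / WAVELENGTH   # wavenumber

def element_positions(n=16, pitch=0.0105):
    """Positions of an n x n grid of transducers in the z = 0 plane (metres)."""
    coords = (np.arange(n) - (n - 1) / 2) * pitch
    x, y = np.meshgrid(coords, coords)
    return np.stack([x.ravel(), y.ravel(), np.zeros(n * n)], axis=1)

def focus_phases(elements, focal_point):
    """Per-element phase so that all emissions arrive in phase at focal_point."""
    distances = np.linalg.norm(elements - focal_point, axis=1)
    return (-K * distances) % (2 * np.pi)

def twin_trap_phases(elements, focal_point):
    """Focusing phases plus a pi-step signature across the x axis, splitting the
    focus into two lobes with a low-pressure trap between them."""
    phases = focus_phases(elements, focal_point)
    signature = np.where(elements[:, 0] < 0, 0.0, np.pi)
    return (phases + signature) % (2 * np.pi)

if __name__ == "__main__":
    elems = element_positions()
    trap = twin_trap_phases(elems, focal_point=np.array([0.0, 0.0, 0.10]))
    print(trap[:8])  # drive phases (radians) for the first few elements
```

In a real system these phases would be quantized to the resolution of the driving electronics and updated at the array's refresh rate to move the trap; those details are device specific and omitted here.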
