Nonvisual natural user interfaces can facilitate gesture-based interaction without relying on a physical display, which may significantly increase the available interaction space on mobile devices, where screen real estate is limited. Interacting with invisible objects is challenging, however, as such techniques provide no spatial feedback and rely entirely on users’ visuospatial memory. This paper presents an interaction technique that appropriates a user’s arm, using haptic feedback to point out the location of nonvisual objects and thereby allow spatial interaction with them. User studies evaluate the effectiveness of two single-arm target-scanning strategies for selecting an object in 3D and two bimanual target-scanning strategies for selecting an object in 2D. Potentially useful applications of our techniques are outlined.