When intercepting a moving target, we typically rely on vision to determine where the target is and where it will soon be. The accuracy of visually guided interception can be captured by a model that combines the perceived position and velocity of the target to estimate when and where to hit it, and that guides the finger accordingly with a short delay. We might expect the accuracy of interception to depend in a similar way on haptic judgments of position and velocity. To test this, we conducted separate experiments to measure the precision of, and any biases in, tactile perception of position and velocity, and used our findings to predict the precision and biases that would arise in an interception task performed according to the principle described above. We then performed a tactile interception task to test our predictions. We found that interception of tactile targets is guided by principles similar to those that guide interception of visual targets.
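The extrapolation principle described above can be sketched as follows. This is a minimal illustration, not the authors' actual model: the function name, the constant delay, and the linear extrapolation are all assumptions made for clarity.

```python
def predicted_interception_point(perceived_pos, perceived_vel,
                                 time_to_contact, delay=0.1):
    """Extrapolate where the target will be at the moment of contact.

    perceived_pos   -- perceived target position (e.g., metres)
    perceived_vel   -- perceived target velocity (metres/second)
    time_to_contact -- remaining time until the planned hit (seconds)
    delay           -- assumed short sensorimotor delay (seconds); the
                       0.1 s value is illustrative, not from the source
    """
    # Linear extrapolation: the movement is aimed at the position the
    # target is expected to occupy, given its perceived velocity and
    # the delay with which the guidance acts.
    return perceived_pos + perceived_vel * (time_to_contact + delay)
```

On this account, any bias or imprecision in the perceived position or velocity propagates directly into the predicted interception point, which is what links the separate perceptual measurements to the interception task.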
- multisensory/cross-modal processing