Robots can be moved in various ways. For example, industrial robots with very specific tasks are typically provided with well-defined paths to be followed by end effectors in task (i.e., Cartesian) space. These paths may be converted to joint trajectories (in a process referred to as “inverse kinematics”) that are actually implemented by joints of the robot in order for the robot to move the end effector along the provided path. In many instances, the paths and/or corresponding joint actuator trajectories may be pre-calculated (e.g., tailored) to avoid singularities. However, robots may also be controlled interactively, e.g., by manually “jogging” the robot and/or its end effector in Cartesian space using a joystick or directional buttons. Because paths followed by end effectors of interactively-controlled robots may be unpredictable, it might not be feasible to pre-calculate or tailor joint actuator trajectories to avoid singularities. Consequently, when interactively-controlled robots approach or reach singularities, they may behave erratically and/or shut down (e.g., as a safety mechanism).
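The singularity problem above can be made concrete with a minimal sketch. The example below assumes a hypothetical two-link planar arm with link lengths `l1` and `l2`, for which the Jacobian determinant is known to be `l1 * l2 * sin(theta2)`; the determinant vanishes when the elbow joint angle `theta2` is 0 or π (arm fully straight or folded), which is exactly where mapping a Cartesian velocity command to joint velocities blows up. The `near_singularity` guard and its threshold `eps` are illustrative, not taken from any particular controller.

```python
import math

def jacobian_det(l1, l2, theta2):
    # For a two-link planar arm, det(J) = l1 * l2 * sin(theta2).
    # The Jacobian loses rank (a singularity) when theta2 = 0 or pi,
    # i.e., when the arm is fully extended or fully folded.
    return l1 * l2 * math.sin(theta2)

def near_singularity(l1, l2, theta2, eps=1e-3):
    # Hypothetical check an interactive ("jogging") controller might
    # run before converting a Cartesian velocity command into joint
    # velocities; near a singularity the required joint speeds grow
    # without bound, so the command should be scaled down or refused.
    return abs(jacobian_det(l1, l2, theta2)) < eps

print(near_singularity(1.0, 1.0, math.pi / 2))  # elbow bent: False
print(near_singularity(1.0, 1.0, 0.0))          # arm straight: True
```

In a pre-planned path, waypoints can be chosen offline so this determinant never approaches zero; under interactive jogging, the operator may steer the arm toward the singular configuration at any time, which is why a runtime check of this kind is needed instead.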