This thesis introduces the need for full-arm haptic rendering and reports its potential benefits for the evaluation of static models in confined, or workspace-limiting, virtual environments. Haptic rendering has been used in many applications to add the sense of touch to virtual models; however, most previous research has focused on tool or hand interactions. In this research, the SARCOS DTS Master makes haptic interactions possible across the entire human arm, allowing users to naturally assess the workspace and force limitations imposed by the arm when evaluating confined virtual environments. Simple polygonal models represent the virtual environment and the virtual arm. Fast collision detection is implemented using spatialized normal cone hierarchies and local descent algorithms, and an admittance control scheme commands the torques of the DTS Master's powerful hydraulic actuators. To test the benefits of full-arm haptic rendering, experiment participants are exposed to five virtual environments, once with hand-only haptic rendering and once with full-arm haptic rendering. Objective benefits are measured in terms of task completion time, maximum force application, and assessment accuracy. Participants also answer subjective questions about the ease of completing the task with and without full-arm haptic rendering. The experimental results show that in most confined environments completion times are reduced, applied forces at the target increase, and participants are more likely to find a path to the target with full-arm haptic rendering. Subjectively, participants prefer full-arm haptic rendering because it reduces their dependency on visual cues while reaching for and applying a sustained force to the target.