Statistical graphs, such as line graphs, are widely used in multimodal communication settings: language accompanies graphs, and humans produce gestures in the course of communication. For visually impaired people, haptic-audio interfaces provide perceptual access to graphical representations. However, the local and sequential character of haptic exploration limits the perception of hard-to-encode information, a limitation that can be mitigated by audio assistance. In this article, we first review multimodal interactions between gesture, language, and graphical representations. We then focus on methodologies for investigating hard-to-encode information in graph comprehension. Finally, we present a case study that provides insight for designing audio assistance.