To download the dissertation, click the link.


Arik, Engin. Ph.D., Purdue University, May, 2009.  Spatial Language: Insights from Sign and Spoken Languages.

Committee: Ronnie Wilbur, Diane Brentari, Elaine Francis, Myrdene Anderson, Dan I. Slobin.

This dissertation examined how sign and spoken languages represent space in their linguistic systems by proposing the Crossmodal Spatial Language Hypothesis (CSLH), which claims that features of the spatial input are not necessarily mapped onto spatial descriptions, regardless of modality and language. Moreover, the CSLH holds that the way languages convey spatial relations is bound to the representational system: Spatial Representations (SR), Reference Frames (RF), Temporal Representations (TR), Conceptual Structure (CS), and Linguistic Representations (LR).

To test the hypothesis, a systematic study of spatial language (sign, speech, and co-speech gesture) was conducted on data obtained from experiments and elicitation tasks in four sign languages (TID, HZJ, ASL, and ÖGS) and three spoken languages (Turkish, English, and Croatian). The findings uncovered a large amount of variation in the signed and spoken descriptions of both static and dynamic situations. Additionally, despite some shared characteristics of the two domains, the analyses indicated that space and time are encoded separately, in SR and TR respectively. The results provided supporting evidence for the CSLH.

The findings suggested that language users construct a spatial relation between objects at a given time, employ a reference frame, which may not be encoded in the message, and use the same conceptual structure, comprising BE-AT for static spatial situations and GO-BE-AT for dynamic spatial situations. Experimental results also showed that language users do not have to distinguish left/right from front/back, in/on from at, to from toward, cause from go, or cause to move from cause to move together in their descriptions. Interestingly, the descriptions involved go-type predicates (go, walk) for both static and dynamic situations.

Further analyses revealed not only a modality effect (signers > speakers) but also a language effect. Careful consideration of the data revealed similarities and differences both within and across modalities. Future studies can shed more light on these variations and patterns.

Some of the dissertation materials can be found at the links below.

Disclaimer: Copyright © Engin Arik 2009. The original research was supported in part by NSF grant BCS-0345314, awarded to Ronnie Wilbur, and by a Bilsland Dissertation Fellowship from Purdue University. These materials may be used for informational or research purposes only. They can be viewed or downloaded on a single computer for your non-commercial use. For instructions, you may consult me via email at L.enginarik AT gmail.com. Please notify me if you use these materials for research purposes.

Please cite as: 

Arik, E. (2009). Spatial language: Insights from sign and spoken languages. PhD Dissertation. Purdue University, West Lafayette, IN.

The materials were reduced in size and zipped.

Pictures for static-angular relations are here.

Pictures for static-angular/topological relations are here.

Movies for motion events are here.

Movies for motion event interactions are here.
