Rutgers Linguistics was well represented at this year’s LSA Annual Meeting, hosted in New York City from January 4 to 7.
Prof. Adam Jardine, together with collaborators Prof. Jane Chandlee of Haverford College and Scott Nelson of Stony Brook University, opened the LSA session on Formal Language Theory in Morphology and Phonology with a tutorial:
Title: “Tutorial on Morpho-phonological Analysis with Logic and Model Theory”
Abstract: The tutorial will be forty minutes in total, split into three sections: 1) defining phonological representations in model theory; 2) defining phonological processes with first-order translations; and 3) extending first-order logic with Boolean monadic recursive schemes, which can define iterative processes and capture elsewhere-condition-like effects. Participants will be given step-by-step instructions and upon completion will be able to provide a logical analysis of any data set of their choosing. The tutorial will also address bigger-picture questions related to computational complexity, such as the choice of representation and the type of logic used for the analysis.
Their slides are available here.
Fifth year graduate student Tatevik Yolyan also presented at the session on Formal Language Theory in Morphology and Phonology.
Title: Weak Determinism and Simultaneous Application via Boolean Monadic Recursive Schemes
Abstract: Weakly deterministic functions are a subregular class of functions that are hypothesized to describe the expressivity of natural language phonology. While there exists an informal and empirically-motivated notion of what constitutes a weakly deterministic pattern, there does not exist a consensus among phonologists on how to formalize the boundary between weakly deterministic and properly regular functions. This talk presents weakly deterministic functions through the framework of Boolean Monadic Recursive Schemes (BMRS), which provides a logical description of string functions. Within this framework, I formally define a ‘simultaneous application’ operator over two string functions, show that it can be used to model the computational nature of weakly deterministic functions, and explore the theoretical implications of using simultaneous application as a formal characterization of weakly deterministic maps.
Her slides are available here.
Fourth year graduate student Jiaxing Yu, together with Shannon Bryant, a postdoctoral associate at the Rutgers Center for Cognitive Science, presented a talk:
Title: “Comparing reflexive and personal pronouns in Chinese locative prepositional phrases”
Abstract: In many languages, both reflexive and personal pronouns within locative prepositional phrases (LPPs) can be co-construed with a local subject, making LPPs an ideal testing ground for nonsyntactic factors influencing pronoun use. Focusing on Chinese, we experimentally tested the extent to which acceptability of reflexives ziji and ta-ziji and personal pronoun ta depends on event type (motion/perception) and relation type (contact/non-contact). We find that effects follow the trend previously reported for English but affect different forms to different degrees. Along with advancing understanding of binding in LPPs, this work contributes to comparisons between ziji and ta-ziji and the typology of pronouns more broadly.
Third year graduate student Jiayuan Chen presented a poster:
Title: The distribution of the copula shi in Mandarin embedded and matrix sluicing
Abstract: In Mandarin sluicing, the copula shi may precede the wh-phrase, and it is sometimes optional. This study investigates experimentally exactly when shi is optional. Many existing analyses claim that shi is optional only when the wh-phrase is complex. I present two naturalness-rating experiments showing that shi is only optional in matrix sluicing, regardless of the wh-phrase. I argue that a general preference for complex wh-phrases gave the illusion that shi can be optional when the wh-phrase is complex. These findings challenge the assumptions existing analyses make about when shi is optional, thereby questioning their explanations for why shi is optional.