This software is available in the Treehouse. You can also install it on your home machines, but you'll need to be running linux. If you don't already have a linux machine, we suggest Ubuntu+LKB.
Natural language processing (NLP) enables computers to make use of data represented in human language (including the vast quantities of data available on the web) and allows people to interact with computers on human terms. Applications from machine translation to speech recognition and web-based information retrieval demand both precision and robustness from NLP technology. Meeting these demands will require better hand-built grammars of human languages combined with sophisticated statistical processing methods. This class focuses on the implementation of linguistic grammars, drawing on a combination of sound grammatical theory and engineering skills.
Class meetings will alternate between lectures and discussion sessions. We will cover the implementation of constraints in morphology, syntax and semantics within a unification-based lexicalist framework of grammar. Weekly exercises will focus on building up an implemented grammar for a language of your choice (everyone must work on a different language, so be prepared to work with a language you don't know well!), based on the LinGO Grammar Matrix. At the end of the quarter, we will use the various grammars in a machine translation task.
Prerequisites: Linguistics 566 or equivalent. No programming experience is required.
Note: To request academic accommodations due to a disability, please contact Disabled Student Services, 448 Schmitz, 206-543-8924 (V/TTY). If you have a letter from Disabled Student Services indicating that you have a disability which requires academic accommodations, please present the letter to the instructor so we can discuss the accommodations you might need in this class.
Weekly lab exercises, typically assigned on Mondays and due by Friday night. Course time on Thursdays will be used for discussion of the exercises, so please work on them ahead of time and bring questions. Lab exercises will require write-ups to explain the phenomena as manifested in your language and how you implemented your analysis. Active class participation will be viewed favorably when it comes to grading.
Everyone will complete Lab 1 individually, but students are expected to work in pairs starting with Lab 2. Partners will alternate doing the write-up portion of the labs, and the labs for which a student did the write-up will be weighted more heavily in that student's final course grade.
Lab exercises are to be turned in via Canvas.
Under construction---will be updated.
All course recordings will be posted on our Canvas page. If I'm slow to make them available there, please ping me over the Canvas discussions.
|1/9, 1/11||Testsuites, [incr tsdb()]||Lab 1:
||W1/10||Ch 4, 5|
Bender et al 2011 (in course Canvas)
|1/16, 1/18||The Grammar Matrix: Motivations, technical details||Lab 2: Testsuites/customization I:
||1/19||Bender et al 2010 (in Canvas)|
|1/23, 1/25||Morphotactics in the Grammar Matrix, Lab 3 phenomena||Lab 3: Testsuites/customization II:
|1/30, 2/1||Minimal Recursion Semantics||Lab 4: Testsuites/customization III:
||2/2||Copestake, Flickinger, Pollard, and Sag, 2005 (esp. Sec 3)|
|2/6, 2/8||Modification, Discourse status, Argument optionality; Precision grammars and corpus data||Lab 5:
||2/9||(Optional: Baldwin et al 2005)|
|2/13, 2/15||Clause types, Illocutionary force, Wh- questions||Lab 6:
|2/20, 2/22||VPM, the LOGON MT architecture||Lab 7:
||2/23||Oepen et al 2007|
|2/27, 3/1||MT continued||Lab 8:
|3/6, 3/8||The Grammar Matrix: Future directions||Machine Translation Extravaganza|