In computational linguistics, early systems relied heavily on dictionaries coupled with machine code, an approach that proved problematic because of poor maintainability and error-prone control structures. As a result, computational linguists shifted towards general, especially nondeterministic, algorithms for syntactic analysis, emphasizing the separation of program from linguistic data. This shift was also encouraged by the appeal of syntactic theories built on rule-based grammars, which provided a theoretical foundation for linguistic analysis.
The trend has since moved towards distributing responsibility among the lexicon, semantic components, and cognitive strategies, with emphasis on data structures and strategic control rather than rigid algorithms. This allows for more flexible and adaptive systems, particularly in speech understanding, where the boundaries between morphological, syntactic, and semantic processes are blurred.
Techniques such as Augmented Transition Networks (ATNs) represent grammatical facts in executable code, but that code is compiled from formalisms that permit only linguistically motivated operations. Nondeterministic procedures remain valuable, but they no longer depend on complex control structures. Syntactic processors now work with data structures such as charts and agendas, executing tasks on the basis of strategic considerations.
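To make the division of labour concrete, the following minimal Python sketch shows how a chart (a record of completed analyses) and an agenda (a pool of pending tasks) can cooperate; the toy grammar, the Edge record, and the last-in-first-out ordering of the agenda are illustrative assumptions of this sketch, not features of any particular system discussed here.

    from collections import namedtuple

    # An edge records that words[start:end] has been analysed as category.
    Edge = namedtuple("Edge", "start end category")

    # Toy grammar: each rule maps a pair of daughter categories to a mother.
    GRAMMAR = {("Det", "N"): "NP", ("NP", "VP"): "S", ("V", "NP"): "VP"}
    LEXICON = {"the": "Det", "dog": "N", "saw": "V", "cat": "N"}

    def parse(words):
        chart = set()                                   # completed edges (the data structure)
        agenda = [Edge(i, i + 1, LEXICON[w])            # pending tasks, seeded from the lexicon
                  for i, w in enumerate(words)]
        while agenda:
            edge = agenda.pop()                         # strategy: which task to do next
            if edge in chart:
                continue
            chart.add(edge)
            # Try to combine the new edge with adjacent edges already in the chart.
            for other in list(chart):
                for left, right in ((other, edge), (edge, other)):
                    if left.end == right.start:
                        mother = GRAMMAR.get((left.category, right.category))
                        if mother:
                            agenda.append(Edge(left.start, right.end, mother))
        return chart

    words = "the dog saw the cat".split()
    print(Edge(0, len(words), "S") in parse(words))     # True: a full sentence was found

Changing the order in which edges are taken from the agenda (last-in-first-out here, but best-first or semantically guided orderings are equally possible) alters the strategy without touching the chart itself, which is precisely the separation of data structure from control described above.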
The lexicon plays a crucial role in modern linguistic theory, and recent work suggests that transformational rules can be replaced by lexical devices. The papers in this section, though diverse, form a coherent set, exploring topics such as lexical gaps, word meanings, and the interaction between components in speech understanding systems. Together they highlight the importance of strategic components in linguistic theory and the need for flexible, data-driven approaches to syntactic processing.