
Computational Linguistics for the New Millennium:
Divergence or Synergy?
An International Symposium held at the University of Heidelberg,
21-22 July 2000


Computational Linguistics has become a mature science. Research activity in the field is growing rapidly, and an increasing number of so-called language technology products are on the market. Courses on computational linguistics also seem to be attracting an ever broader range of students. But does that in itself secure the quality of our discipline? Surely not!

Clearly, the main fields of our science, such as syntax, semantics, and pragmatics, are becoming more profound. But are we also building bridges between them, or are we widening the gaps, step by step? With competing theories, each new approach often seems to partly reinvent the wheel, just to solve some particular problem in a more concise manner. Should we not rather pool our research efforts to achieve greater efficacy?

On the other hand, funding institutions often force our research projects to be application-oriented. Is this the right way to promote synergy between theory and practice? Practitioners are said to be disappointed by researchers, and vice versa. As a result, so-called language technology products often incorporate hardly any computational linguistics theory at all. But who is to blame? The practitioner, who has to deliver a specified piece of language technology software, or the researcher, who wants to find a thorough solution to a theoretical problem?

Nonetheless, there are signs of synergy, both among computational linguistics theories and between theory and practice. Within academic research we can observe synergetic effects, for example the strong preference for feature-based approaches to syntax and logic-based formalisms for semantics. The empirical evaluation of new findings and theories has also become a broadly accepted commitment within the community.

These synergetic trends have been enabled in part by new research directions in computational linguistics. With large tagged corpora now available, statistical methods of natural language understanding have become more attractive. However, empirical methods have their limitations, and in the long run the quality of language technology products depends heavily on the quality of the research behind them. A research strategy that interleaves empirical and theoretical work therefore seems worth discussing.

But how far-reaching are these synergetic effects? Are they merely scratching the surface, or are they altering the deep structures of our science? Should we reinforce them?

There is one further important aspect to discuss: the curriculum. We are responsible for the next generation of researchers and practitioners, and what we teach is what we get! But the reality may be somewhat disappointing. Almost every university teaches its own idiosyncratic computational linguistics course, depending on where the department happens to sit within the university's historical structure, and teaching is obviously influenced by the research interests of the faculty. In addition, these idiosyncrasies often limit the mobility of students and faculty. Do we not in fact need an international curriculum for computational linguistics, one based on the answers to questions such as those posed during this symposium?