
Abstract

We will give an overview of Revita, a project in CALL – computer-aided (human) language learning. Specifically, we work on language learning beyond the elementary level – for intermediate and advanced learners (beginners are already well served by the myriad of existing services and applications). Revita aims to simulate a good teacher by modeling and assessing the learner's state and progress. One important aspect of our approach is that it lets users learn from authentic materials – arbitrary texts chosen by the users themselves.

According to surveys, Revita is the first AI-based system at scale that:

  • works beyond the elementary level: targets intermediate to advanced learners,
  • has a multi-lingual focus (an English-only system exists for advanced essay assessment),
  • is used in official university-level curricula – beyond “academic” experiments.

Our approach allows us to collect data for analyzing patterns of language learning in great depth and detail. From ongoing studies with actual learners, we collect data about the learning process – typical mistakes, paths of progress, etc. This data provides a rich playground for research problems, which we will discuss. I will focus on two sides of the research:

  • building the tools needed to collect data, and
  • methods for analyzing and applying the collected data, which include neural networks – in particular, translation and sequence-to-sequence models.

Bio

Roman Yangarber has led the Research Group in NLP at the Department of Computer Science, University of Helsinki, for the past 10 years. The group works on a variety of themes in NLP, researching how language works and how computers can better understand it. Research themes include analysis of news media and modeling of language evolution. The most recent line of research – AI support for language learning – has resulted in a system used by end-users at several universities; it also won a best-paper award at a Digital Humanities conference last year.