Personal Homepage

Personal Information

Associate Professor

Supervisor of Master's Candidates

E-Mail:

Date of Employment: 2025-05-21

School/Department: School of Software

Education Level: Doctoral (PhD)

Business Address: New Main Building C808, G517

Gender: Male

Contact Information: 18810578537

Degree: Doctoral Degree (PhD)

Status: Employed

Alma Mater: Beihang University

Discipline: Software Engineering; Computer Science and Technology

Junfan Chen


Paper

Implicit Word Reordering with Knowledge Distillation for Cross-Lingual Dependency Parsing

Venue: Thirty-Ninth AAAI Conference on Artificial Intelligence (AAAI), CCF-A
Abstract: Word order differences between source and target languages are a major obstacle to cross-lingual transfer, especially in dependency parsing. Existing work mostly relies on order-agnostic models or word reordering to mitigate this problem. However, such methods either fail to leverage the grammatical information naturally encoded in word order or are computationally expensive, as the permutation space grows exponentially with sentence length. Moreover, a reordered source sentence with an unnatural word order may act as noise that harms model learning. To this end, we propose an Implicit Word Reordering framework with Knowledge Distillation (IWR-KD), inspired by the observation that deep networks are good at learning feature linearizations corresponding to meaningful data transformations, such as word reordering. To realize this idea, we introduce a knowledge distillation framework composed of a word-reordering teacher model and a dependency parsing student model. We verify the proposed method on Universal Dependencies treebanks across 31 languages and show that it outperforms a series of competitors, with experimental analysis illustrating how the method trains a robust parser.
Co-authors: Zhuoran Li, Chunming Hu, Junfan Chen, Zhijun Chen, Richong Zhang
Indexed by: International Academic Conference
Page Number: 24530–24538
Translation or Not: No
Date of Publication: 2025-01-01
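
The abstract above frames IWR-KD as a teacher-student setup. As a rough, generic sketch of the soft-label distillation objective such setups typically build on (Hinton et al., 2015), and not the authors' released code, the student can be trained to match the teacher's temperature-softened output distribution; all names below are illustrative:

import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """Generic soft-label knowledge-distillation loss.

    Both logit tensors have shape (batch, num_classes); the student is
    pushed toward the teacher's temperature-softened distribution via
    KL divergence.
    """
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # Scaling by T^2 keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_student, soft_teacher,
                    reduction="batchmean") * temperature ** 2

In a parser-distillation setting of the kind the abstract describes, the "classes" would be, for example, candidate head positions for each token, and this term would be combined with the ordinary supervised parsing loss rather than used alone.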