Intro
Preface
Organization
Contents
Transn's Submission for CCMT 2023 Quality Estimation Task
1 Introduction
2 Related Work
3 Feature-Enhanced Estimator for Sentence-Level QE
3.1 Model Architecture
3.2 Pretraining Corpus Generation
3.3 Model Ensemble
4 Experiments
4.1 Datasets
4.2 Training and Evaluation
4.3 Results and Analysis
4.4 Model Ensemble
5 Conclusion
References
HW-TSC's Neural Machine Translation System for CCMT 2023
1 Introduction
2 Dataset
2.1 Data Size
2.2 Data Pre-processing
3 System Overview
3.1 Bilingual System
3.2 Low-Resource System
3.3 Multilingual System
3.4 Zero-Referencing System
4 Method
4.1 Regularized Dropout
4.2 Bidirectional Training
4.3 Data Diversification
4.4 Forward Translation
4.5 Back-Translation
4.6 Alternated Training
4.7 Curriculum Learning
4.8 Transductive Ensemble Learning
5 Experiments
5.1 Bilingual System Evaluation Results
5.2 Low-Resource System Evaluation Results
5.3 Multilingual System Evaluation Results
5.4 Zero-Referencing System Evaluation Results
6 Conclusion
References
CCMT 2023 Tibetan-Chinese Machine Translation Evaluation Technical Report
1 Introduction
2 Data Processing
2.1 Data
2.2 Data Preprocessing
3 Model
3.1 Model Selection
3.2 Model Ensemble
3.3 Iterative Fine-Tuning
4 Experiment
4.1 Experimental Environment
4.2 Experimental Setup
5 Results and Analysis
6 Summary
References
Korean-Chinese Machine Translation Method Based on Independent Language Features
1 Introduction
2 Related Work
2.1 Korean-to-Chinese Machine Translation
2.2 Multilingual Unsupervised and Supervised Embeddings
3 Method
3.1 Independent Language Feature Extraction Model
3.2 Translation Model
4 Experiment
4.1 Datasets
4.2 Settings
4.3 Main Results
5 Analysis
5.1 Ablation Experiments
5.2 Case Study
6 Conclusion
References
NJUNLP's Submission for CCMT 2023 Quality Estimation Task
1 Introduction
2 Methods
2.1 Unsupervised Methods
2.2 Supervised Methods
3 Experiments
3.1 Dataset
3.2 Settings
3.3 Single Model Results
3.4 Ensemble
4 Conclusion
References
HIT-MI&T Lab's Submission to CCMT 2023 Automatic Post-editing Task
1 Introduction
2 Architecture
3 Data Augmentation
3.1 Synthetic Data Generation
3.2 ChatGPT-Based Data Augmentation
4 Experiments
4.1 Set-Up
4.2 Results of Different Architectures
4.3 Results of Data Augmentation
4.4 Results of Multi-model Ensemble
5 Conclusion
References
A k-Nearest Neighbor Approach for Domain-Specific Translation Quality Estimation
1 Introduction
2 Proposed Method
2.1 Overall Architecture
2.2 XLM-R Encoder
2.3 Classifier
2.4 k-Nearest Neighbor
2.5 Loss Function
3 Experiments
3.1 Datasets
3.2 Settings
3.3 Results and Analysis
