Quality Estimation in Machine Translation

Slides and text of this presentation

Slide 1: Quality Estimation in Machine Translation
Andrew Golman
DIHT, 3rd-year bachelor student
18.06.2019

Slide 2: Plan
Why do we need Neural Machine Translation?
What are Seq2Seq models?
Can we apply the same techniques to the Quality Estimation task?
What have we achieved?

Slide 3: 1990s–2010s: Statistical Machine Translation
Alignment issues
Huge word- and phrase-level dictionaries
For every pair of languages!
Pictures from Stanford CS224n, lecture 8

Slide 4: 2014: Neural Machine Translation
A way to do Machine Translation with a single neural network
The most complex models have 200 million parameters
More fluent than SMT
Pick up the meaning first, then phrase it

Slide 5: RNN concept
[Diagram: source-sentence words are embedded ("encoded words") and fed through a chain of hidden states, each of which emits predicted word probabilities. Picture from Stanford CS224n, lecture 6]
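
To make the diagram concrete, here is a minimal RNN language-model sketch in PyTorch. The vocabulary size and layer dimensions are illustrative assumptions, not values from the talk.

# Minimal RNN sketch (PyTorch); all sizes are illustrative assumptions.
import torch
import torch.nn as nn

class RNNLanguageModel(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=256, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)   # word ids -> "encoded words"
        self.rnn = nn.RNN(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)     # hidden state -> vocabulary logits

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer word ids
        hidden_states, _ = self.rnn(self.embed(tokens))  # one hidden state per position
        return self.out(hidden_states)                   # (batch, seq_len, vocab_size)

model = RNNLanguageModel()
logits = model(torch.randint(0, 10000, (2, 7)))          # two sentences of 7 tokens
probs = logits.softmax(dim=-1)                           # predicted word probabilities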

Slide 6: Sequence-to-sequence translation
Picture from Stanford CS224n, lecture 8
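
The two-RNN design behind early NMT can be written down compactly: an encoder LSTM compresses the source sentence into its final state, which initializes a decoder LSTM over the target. This is a simplified sketch with assumed sizes, not the architecture of any specific system from the talk.

# Two-RNN sequence-to-sequence sketch (PyTorch); sizes are assumptions.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab=8000, tgt_vocab=8000, dim=256):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, dim)
        self.tgt_embed = nn.Embedding(tgt_vocab, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True)
        self.decoder = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, tgt_vocab)

    def forward(self, src_tokens, tgt_tokens):
        # Encode: the final (h, c) state summarizes the source sentence.
        _, state = self.encoder(self.src_embed(src_tokens))
        # Decode: target-side hidden states conditioned on that state.
        dec_hidden, _ = self.decoder(self.tgt_embed(tgt_tokens), state)
        return self.out(dec_hidden)  # logits over the target vocabulary

model = Seq2Seq()
logits = model(torch.randint(0, 8000, (1, 9)),   # source sentence
               torch.randint(0, 8000, (1, 6)))   # shifted target (teacher forcing)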

Slide 7: Evaluation
Compare with baseline translations (see the metric sketch below)
Pay assessors to mark errors
Build a neural system for error detection
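
For the first bullet, a common automatic way to compare a system against baseline translations is a corpus-level metric such as BLEU. The slide does not name a metric, so the use of the sacrebleu package below is an assumption.

# Comparing a candidate system against a baseline with corpus BLEU.
# The metric and the sacrebleu package are assumptions, not choices made in the talk.
import sacrebleu

references    = ["the cat sat on the mat"]
baseline_out  = ["the cat sat at the mat"]
candidate_out = ["the cat sat on the mat"]

baseline_bleu  = sacrebleu.corpus_bleu(baseline_out, [references])
candidate_bleu = sacrebleu.corpus_bleu(candidate_out, [references])
print(f"baseline {baseline_bleu.score:.1f} vs candidate {candidate_bleu.score:.1f}")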

Slide 8: Neural Quality Estimation

Slide 9: Similar model for Quality Estimation
First bidirectional LSTM
Second bidirectional LSTM
OK/BAD labels
Error classification
Picture from [2]
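
A heavily simplified sketch of the kind of tagger this slide describes: two stacked bidirectional LSTMs over the MT output, ending in a per-token OK/BAD classification. The sizes and the 0/1 label convention are assumptions; the actual predictor-estimator model is described in [2].

# Word-level QE tagger sketch (PyTorch); a simplification, not the model of [2].
import torch
import torch.nn as nn

class QETagger(nn.Module):
    def __init__(self, vocab_size=8000, dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.bilstm1 = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)
        self.bilstm2 = nn.LSTM(2 * dim, dim, batch_first=True, bidirectional=True)
        self.classify = nn.Linear(2 * dim, 2)  # logits for OK / BAD per token

    def forward(self, mt_tokens):
        h1, _ = self.bilstm1(self.embed(mt_tokens))  # first bidirectional LSTM
        h2, _ = self.bilstm2(h1)                     # second bidirectional LSTM
        return self.classify(h2)                     # (batch, seq_len, 2)

tagger = QETagger()
logits = tagger(torch.randint(0, 8000, (1, 12)))
labels = logits.argmax(dim=-1)  # assumed convention: 0 = OK, 1 = BAD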

Slide 10: WMT task
English-German, English-Spanish, English-Russian datasets
15,000 labelled sentences
Pretrained models are allowed
Results from [2]

Slide 11: Our experiments
Use CRFs (lattice-structured RNNs) for the final classification (decoding sketch below)
Try the transformer architecture
Results from [3]
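
To illustrate the CRF bullet: a minimal Viterbi decoder for a linear-chain CRF over per-token emission scores. This is a from-scratch sketch, not the code behind [3]; a real system would train the transition matrix and the emissions jointly.

# Linear-chain CRF Viterbi decoding sketch for OK/BAD tagging.
import torch

def viterbi_decode(emissions, transitions):
    # emissions: (seq_len, num_tags) per-token scores from the RNN
    # transitions: (num_tags, num_tags) score of moving from tag i to tag j
    seq_len, num_tags = emissions.shape
    score = emissions[0]               # best score of paths ending in each tag
    backpointers = []
    for t in range(1, seq_len):
        # candidate[i, j]: best path ending in tag i, then moving to tag j
        candidate = score.unsqueeze(1) + transitions + emissions[t].unsqueeze(0)
        score, best_prev = candidate.max(dim=0)
        backpointers.append(best_prev)
    # Walk back from the best final tag.
    tags = [int(score.argmax())]
    for best_prev in reversed(backpointers):
        tags.append(int(best_prev[tags[-1]]))
    return list(reversed(tags))

emissions = torch.randn(5, 2)          # 5 tokens, tags {OK, BAD}
transitions = torch.randn(2, 2)
print(viterbi_decode(emissions, transitions))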

Slide 12: Summary
NMT is more efficient than SMT
RNNs are used for text generation
The first NMT systems were based on two RNNs
Quality estimation models can be used to compare translation systems
Results are improving every year

Slide 13: References
[1] Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation. Yonghui Wu, Mike Schuster, Zhifeng Chen, Quoc V. Le, Mohammad Norouzi, 2016
[2] Predictor-Estimator Using Multilevel Task Learning with Stack Propagation for Neural Quality Estimation. Hyun Kim, Jong-Hyeok Lee and Seung-Hoon Na, 2017
[3] Comparison of Various Architectures for Word-Level Quality Estimation. Mikhail Mosyagin, Amir Yagudin, Andrey Golman, 2019

Slide 14: Thank you for your attention
golman.as@phystech.edu
