
MT Study Group / 2012

Machine translation at NAIST

  • We focus on statistical machine translation, with particular interests in:
    • training & decoding algorithms
    • integration of rich syntactic and semantic information
    • speech translation
  • Currently we hold two back-to-back meetings: the MT Study Group and the IWSLT team meeting.
    • The purpose of the MT Study Group is to read and discuss research papers.
    • The purpose of the IWSLT team meeting is to discuss progress and issues for the IWSLT 2012 evaluation campaign.

Information

  • Date: Thursday
  • Time: 13:00-14:00 IWSLT meeting / 14:00-15:00 MT Study Group
  • Location: Alternates weekly between the Matsumoto-lab seminar room (A707) and the Nakamura-lab seminar room.
  • Internal page

Schedule 2012

  • 4/18 (Wed) 10:00- Kickoff meeting for the IWSLT campaign
    • Learn about the KIT system from IWSLT 2011
  • 4/26 (Thu) @ Matsumoto-lab: Kickoff meeting for the MT Study Group
  • 5/10 @ Nakamura-lab: neubig & kevinduh
  • 5/17 @ Matsumoto-lab: shuhei-k
    • Dyer et al. Unsupervised Word Alignment with Arbitrary Features (ACL2011)
  • 5/24: takatomo-k
    • Koehn & Hoang. Factored Translation Models (EMNLP2007)
  • 5/31 katsuhiko-h
    • Haghighi, DeNero, Klein. Approximate Factoring for A* Search (NAACL2007)
  • 6/07 masaya-o
    • Gimpel & Smith. Structured Ramp Loss Minimization for Machine Translation (NAACL2012)
  • 6/15 tetsuo-s
    • Simianer et al. Joint Feature Selection in Distributed Stochastic Learning for Large-Scale Discriminative Training in SMT (ACL2012)
  • 6/21 NAACL review
    • kevinduh: Jagarlamudi & Daume, Low-dimensional Discriminative Re-ranking
    • masaya-o: Watanabe, Optimized Online Rank Learning for Machine Translation
    • neubig: Täckström+, Cross-lingual Word Clusters for Direct Transfer of Linguistic Structure; Ture, Encouraging Consistent Translation Choices
    • shuhei-k: Rush & Petrov, Vine Pruning for Efficient Multi-Pass Dependency Parsing
  • 6/28 ACL/EMNLP practice talks
    • kevinduh: Learning to Translate with Multiple Objectives
    • katsuhiko-h: Head-driven Transition-based Parsing with Top-down Prediction
    • neubig: Machine Translation without Words through Substring Alignment
    • neubig: Inducing a Discriminative Parser to Optimize Machine Translation Reordering
  • 7/05 tomoki-f
    • Bangalore+, Real-time Incremental Speech-to-Speech Translation of Dialogs (NAACL12)
  • 7/19 thichinh-t
    • Genzel, Automatically Learning Source-side Reordering Rules for Large Scale Machine Translation (COLING10)
  • 7/26 shuhei-k
    • Burkett+, Joint Parsing and Alignment with Weakly Synchronized Grammars (NAACL10)
  • 8/2 neubig & kevinduh (ACL/EMNLP review)
    • Burkett & Klein, Transforming Trees to Improve Syntactic Convergence
    • Eidelman, Boyd-Graber, Resnik, Topic Models for Dynamic Translation Model Adaptation
  • 8/30 takatomo-k
    • Aguero, Adell, Bonafonte, Prosody Generation for Speech-to-Speech Translation
  • 9/6 katsuhiko-h
    • Efficient CCG Parsing: A* versus Adaptive Supertagging
  • 9/13 masaya-o
    • Tuning as Linear Regression (NAACL2012)
  • 9/28 tetsuo-s
    • Exact Decoding of Phrase-Based Translation Models through Lagrangian Relaxation
  • 10/4 kevinduh
    • Continuous Space Translation Models with Neural Networks
  • 10/11 shuhei-k
    • Exact Decoding of Syntactic Translation Models through Lagrangian Relaxation
  • 10/18 neubig
    • A Spectral Algorithm for Learning Hidden Markov Models
  • 10/25 masaya-o, miao-y
    • Detection of Non-native Sentences using Machine-translated Training Data
  • 11/01 tomoki-f, hiroaki-sh
    • Automatic Sentence Segmentation and Punctuation Prediction for Spoken Language Translation
  • 11/08 takeyuki-k, takatomo-k
    • A Probabilistic Forest-to-String Model for Language Generation from Typed Lambda Calculus Expressions
  • 11/15 isao-ni, kazuya-ko
    • Syntactic Reordering in Preprocessing for Japanese→English Translation: MIT System Description for NTCIR-7 Patent Translation Task
  • 11/22 tetsuo-k
    • Optimal Search for Minimum Error Rate Training
  • 11/29 kevinduh
    • MERT by Sampling the Translation Lattice
  • 12/6 canceled
  • 12/13 shuhei-k
    • Adjoining Tree-to-String Translation
  • 12/20 neubig
    • Lattice-based Minimum Error Rate Training for Statistical Machine Translation
    • Efficient Minimum Error Rate Training and Minimum Bayes-Risk Decoding for Translation Hypergraphs and Lattices
  • 1/10/2013 COLING review
    • kevinduh: Simple and Effective Parameter Tuning for Domain Adaptation of Statistical Machine Translation
    • neubig: Bayesian Language Modelling of German Compounds
  • 1/17/2013 COLING review
    • shuhei-k: Stacking of Dependency and Phrase Structure Parsers
    • takatomo-k: Factored Language Model based on Recurrent Neural Network
    • masaya-o: Optimizing for Sentence-Level BLEU+1
    • hiroaki-sh: Active Error Detection and Resolution for Speech-to-Speech Translation (IWSLT2012)
    • isao-ni: Alignment by Bilingual Generation and Monolingual Derivation
    • tomoki-f: Overview of the IWSLT 2011 Evaluation Campaign
    • kazuya-ko: Tree-based Translation without Using Parse Trees
  • 1/24/2013 henning-s
    • Continuous Space Language Models using Restricted Boltzmann Machines
  • 1/31/2013 takatomo-k, hiroaki-sh
    • Left-to-Right Target Generation for Hierarchical Phrase-based Translation
  • 2/7/2013 canceled
  • 2/14/2013 isao-ni, kazuya-ko
    • Does more data always yield better translations?
  • 2/21/2013 masaya-o, tomoki-f
    • The HIT-LTRC Machine Translation System for IWSLT 2012