Simple Test Time Scaling for Machine Translation: Kaze-MT at the WMT25 General Translation Task

Open Access
Authors
Publication date 2025
Host editors
  • B. Haddow
  • T. Kocmi
  • P. Koehn
  • C. Monz
Book title Tenth Conference on Machine Translation: Proceedings of the Conference
Book subtitle WMT 2025: November 8-9, 2025
ISBN (electronic)
  • 9798891763418
Event 10th Conference on Machine Translation, WMT 2025
Pages (from-to) 651–656
Publisher Kerrville, TX: Association for Computational Linguistics
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
This paper describes the Kaze-MT submission to the WMT25 General Machine Translation task (Japanese–Chinese). Our system deliberately adopts a minimalist Test-Time Scaling (TTS) pipeline with three stages (sampling, scoring, and selection) while avoiding any task-specific fine-tuning, in-context exemplars, or bespoke decoding heuristics. In the sampling stage, we use the zero-shot Qwen2.5-72B-Instruct model to generate 512 candidate translations under a fixed temperature schedule designed to encourage lexical and syntactic diversity without sacrificing fluency. In the scoring stage, each candidate is evaluated by multiple reference-free quality estimation (QE) models: KIWI-22, MetricX-24 Hybrid-XXL, and Remedy-24-9B. The selection stage aggregates the metric-specific rankings and chooses the candidate with the lowest mean rank, which we found more stable than averaging raw scores across heterogeneous ranges. We submit to both the constrained and unconstrained tracks with minimal configuration changes. According to official preliminary results, our submissions are competitive on automatic metrics; in human evaluation, Kaze-MT falls within the 8–13 cluster, delivering performance comparable to CommandA-WMT and DeepSeek-V3 and outperforming large LLM baselines such as Mistral-Medium as well as several extensively tuned MT systems.
Document type Conference contribution
Language English
DOI https://doi.org/10.18653/v1/2025.wmt-1.40