Search results: 1-2 of 2 records found for "Literature · Reducing Training Data Mismatch". Query time: 0.062 s.
N-gram Weighting: Reducing Training Data Mismatch in Cross-Domain Language Model Estimation
Reducing Training Data Mismatch; Cross-Domain; Language Model Estimation
2015/3/10
N-gram Weighting: Reducing Training Data Mismatch in Cross-Domain Language Model Estimation.
N-gram Weighting: Reducing Training Data Mismatch in Cross-Domain Language Model Estimation
N-gram Weighting; Data Mismatch; Cross-Domain; Language Model
2014/11/27
In domains with insufficient matched training data, language models are often constructed by interpolating component models trained from partially matched corpora. Since the n-grams from such corpora m...
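To illustrate the interpolation setup the abstract describes, here is a minimal sketch (not the authors' implementation) of combining component n-gram models trained on different corpora with linear interpolation. The corpus contents, the fixed interpolation weights `lambdas`, and the add-alpha smoothing are assumptions for illustration; the paper's n-gram weighting would additionally adjust weights per n-gram based on relevance features.

```python
# Minimal sketch: cross-domain LM as a linear interpolation of component
# bigram models. All corpora, weights, and smoothing choices are illustrative.
from collections import Counter

def train_ngram_counts(corpus, n=2):
    """Count n-grams (bigrams by default) in a tokenized corpus."""
    counts, context_counts = Counter(), Counter()
    for sentence in corpus:
        tokens = ["<s>"] * (n - 1) + sentence + ["</s>"]
        for i in range(n - 1, len(tokens)):
            context = tuple(tokens[i - n + 1:i])
            counts[(context, tokens[i])] += 1
            context_counts[context] += 1
    return counts, context_counts

def component_prob(word, context, counts, context_counts, vocab_size, alpha=1.0):
    """Add-alpha smoothed P(word | context) from one component model."""
    c = counts.get((context, word), 0)
    return (c + alpha) / (context_counts.get(context, 0) + alpha * vocab_size)

def interpolated_prob(word, context, components, lambdas):
    """Linear interpolation of component models with fixed weights lambdas
    (summing to 1); n-gram weighting would make these weights depend on
    per-n-gram relevance to the target domain."""
    return sum(
        lam * component_prob(word, context, cnt, ctx, vocab)
        for lam, (cnt, ctx, vocab) in zip(lambdas, components)
    )

if __name__ == "__main__":
    in_domain = [["the", "meeting", "starts", "now"]]          # small matched corpus
    out_of_domain = [["the", "model", "estimates", "probabilities"]]  # larger mismatched corpus
    components = []
    for corpus in (in_domain, out_of_domain):
        counts, ctx = train_ngram_counts(corpus)
        vocab = {w for s in corpus for w in s} | {"<s>", "</s>"}
        components.append((counts, ctx, len(vocab)))
    p = interpolated_prob("meeting", ("the",), components, lambdas=[0.7, 0.3])
    print(f"P(meeting | the) = {p:.4f}")
```

In this toy example the in-domain component dominates via the larger lambda; the paper's contribution is to replace such fixed corpus-level weights with per-n-gram weights so that mismatched n-grams from the out-of-domain corpus contribute less.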