sample["net_input"]["src_tokens"] is really in tgt language.
if torch.cuda.is_available():
    s = utils.move_to_cuda(sample)
else:
    s = sample
self.backtranslation_generator.cuda()
input = s["net_input"]
After Change
sample. Note in this case, sample["target"] is None, and
sample["net_input"]["src_tokens"] is really in tgt language.
s = utils.move_to_cuda(sample) if self.cuda else sample
input = s["net_input"]
srclen = input["src_tokens"].size(1)
hypos = self.backtranslation_generator.generate(
    input,