BLEURT: Learning Robust Metrics for Text Generation (arXiv:2004.04696)
Google's T5. T5 was built by the Google team to create a general-purpose model that can understand text. The basic idea behind T5 is to treat every text-processing problem as a "text-to-text" problem, i.e. taking text as input and producing new text as output.
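A minimal sketch of the text-to-text framing: every task is expressed as plain text in, plain text out, distinguished only by a task prefix. The prefixes below appear in the T5 paper; the helper function name is our own.

```python
# Sketch of T5's "text-to-text" framing: any task becomes a single input
# string, marked with a task prefix, and the model emits an output string.
def make_t5_input(task_prefix, text):
    """Format an arbitrary task as one text-to-text input string."""
    return f"{task_prefix}: {text}"

# Summarization and translation use the same interface, only the prefix differs.
print(make_t5_input("summarize", "ToTTo pairs Wikipedia tables with one-sentence descriptions."))
print(make_t5_input("translate English to German", "That is good."))
```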
Baseline Preprocessing. This code repository serves as a supplement to the main repository and can be used to do basic preprocessing of the ToTTo dataset.
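To illustrate what such preprocessing involves, here is a hedged sketch (not the official baseline code) that linearizes a ToTTo-style example into a flat input string. The field names (`table`, `table_page_title`, `table_section_title`, `highlighted_cells`) follow the public ToTTo JSON format, but the output layout chosen here is an assumption.

```python
# Hypothetical linearization of a ToTTo-style example: flatten the page
# title, section title, and highlighted cells into one input string
# suitable for a text-to-text model.
def linearize(example):
    parts = [
        f"page: {example['table_page_title']}",
        f"section: {example['table_section_title']}",
    ]
    # highlighted_cells is a list of [row, col] indices into the table.
    for r, c in example["highlighted_cells"]:
        parts.append(f"cell: {example['table'][r][c]['value']}")
    return " | ".join(parts)

# Toy example in the ToTTo layout (values are invented for illustration).
example = {
    "table_page_title": "List of governors",
    "table_section_title": "Terms",
    "table": [
        [{"value": "Name"}, {"value": "Term"}],
        [{"value": "Jane Doe"}, {"value": "1990-1994"}],
    ],
    "highlighted_cells": [[1, 0], [1, 1]],
}
print(linearize(example))
# -> page: List of governors | section: Terms | cell: Jane Doe | cell: 1990-1994
```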
We fine-tuned a T5 conditional generation model on the ToTTo dataset for 24,000 steps, using BLEURT as the evaluation metric.