NLG using AR models
In the previous section, you learned how to train an AR model on your own corpus and, as a result, you trained your own version of GPT-2. However, the question of how to actually use it still remains. To answer that, let's proceed as follows:
- Let's start generating sentences from the model you have just trained, as follows:
def generate(start, model):
    # Encode the prompt and return TensorFlow tensors
    input_token_ids = tokenizer_gpt.encode(start, return_tensors='tf')
    # Generate up to 500 tokens with beam search
    output = model.generate(
        input_token_ids,
        max_length=500,
        num_beams=5,
        temperature=0.7,
        no_repeat_ngram_size=2,
        num_return_sequences=1
    )
    # Decode the highest-scoring sequence back into text
    return tokenizer_gpt.decode(output[0])
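Here, num_beams=5 enables beam search, no_repeat_ngram_size=2 blocks the model from repeating any 2-gram, and num_return_sequences=1 returns only the single best sequence. The following is a minimal usage sketch; it assumes that tokenizer_gpt and the fine-tuned model from the previous section are already in memory, and the prompt string is only an illustrative placeholder:

# Hypothetical usage: pass a short prompt and the fine-tuned model
# (assumes tokenizer_gpt and model were created in the previous section)
generated_text = generate("the snow", model)
print(generated_text)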