NLG using AR models
In the previous section, you learned how to train an AR model on your own corpus and, as a result, trained your own version of GPT-2. But the question "How can I use it?" remains. To answer it, let's proceed as follows:
- Let’s start generating sentences from the model you have just trained, as follows:
def generate(start, model):
    input_token_ids = tokenizer_gpt.encode(start, return_tensors='tf')
    output = model.generate(
        input_token_ids,
        max_length=500,
        num_beams=5,
        temperature=0.7,
        no_repeat_ngram_size=2,
        num_return_sequences=1
    )
    ...