sharaddition/paraphrase-gpt
About
A T5 model fine-tuned on a GPT-3.5-generated paraphrase corpus of 6.3 million unique sentences.

Example Output
Prompt:
"These practices, also known as “dark patterns,” are in the nature of unfair trade practices and are covered under the Consumer Protection Act, which is punishable, according to Union consumer affairs secretary Rohit Kumar Singh."
Output (five return sequences):
- According to Rohit Kumar Singh, the Union consumer affairs secretary, 'dark patterns' are unfair trade practices that fall under the Consumer Protection Act and can be punished.
- Rohit Kumar Singh, the Union consumer affairs secretary, has stated that 'dark patterns' are unfair trade practices and fall under the Consumer Protection Act, which is punishable by law.
- The 'dark patterns' are unfair trade practices that fall under the scope of the Consumer Protection Act and can be punished, as stated by Union consumer affairs secretary Rohit Kumar Singh.
- Union consumer affairs secretary Rohit Kumar Singh has stated that 'dark patterns' are unfair trade practices, which fall under the scope of the Consumer Protection Act and can be punished.
- 'Dark patterns,' also known as unfair trade practices, are covered by the Consumer Protection Act and can result in punishment, according to Union consumer affairs secretary Rohit Kumar Singh.
Performance Metrics
- Prediction Time: 2.37s
- Total Time: 2.54s
All Input Parameters
{
  "top_p": 1,
  "prompt": "These practices, also known as “dark patterns,” are in the nature of unfair trade practices and are covered under the Consumer Protection Act, which is punishable, according to Union consumer affairs secretary Rohit Kumar Singh.",
  "num_beams": 5,
  "max_length": 250,
  "temperature": 0.75,
  "num_beam_groups": 5,
  "diversity_penalty": 3,
  "repetition_penalty": 10,
  "no_repeat_ngram_size": 2,
  "num_return_sequences": 5
}
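These parameters can be sent to the model through the official `replicate` Python client. A minimal sketch (the short stand-in prompt is illustrative; the API call is guarded and assumes a valid `REPLICATE_API_TOKEN` in the environment):

```python
# Sketch: calling sharaddition/paraphrase-gpt via the Replicate Python
# client (pip install replicate). The version hash below is taken from
# the Version Details section of this page.
import os

input_params = {
    "prompt": "Dark patterns are unfair trade practices covered under the Consumer Protection Act.",
    "num_beams": 5,
    "num_beam_groups": 5,
    "num_return_sequences": 5,
    "diversity_penalty": 3,
    "repetition_penalty": 10,
    "no_repeat_ngram_size": 2,
    "temperature": 0.75,
    "top_p": 1,
    "max_length": 250,
}

# Only hit the API when a token is actually configured.
if os.environ.get("REPLICATE_API_TOKEN"):
    import replicate

    output = replicate.run(
        "sharaddition/paraphrase-gpt:3a66bc6c1327de5459cb18b2f10550693bc69662a5e29c67a971776f8574f1b1",
        input=input_params,
    )
    print(output)
```

Note that `num_beams` must be divisible by `num_beam_groups`, and `num_return_sequences` cannot exceed `num_beams`.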
Input Parameters
- top_p
- When sampling, restricts choices to the smallest set of most-likely tokens whose cumulative probability reaches p; lower values ignore less likely tokens.
- prompt (required)
- Prompt to send to the model.
- num_beams
- Number of beams used for beam search.
- max_length
- Maximum number of tokens to generate. A word is generally 2-3 tokens
- temperature
- Adjusts the randomness of outputs; values greater than 1 are more random, 0 is deterministic, and 0.75 is a good starting value.
- num_beam_groups
- Number of groups to divide the beams into for diverse beam search; must evenly divide num_beams.
- diversity_penalty
- Penalty subtracted from a beam's score when it generates the same token as another beam group at the same step; higher values produce more diverse outputs across groups. Only used with group beam search.
- repetition_penalty
- Penalty for repeated words in generated text; 1 is no penalty, values greater than 1 discourage repetition, less than 1 encourage it.
- no_repeat_ngram_size
- If set to a value n greater than 0, n-grams of that size can occur only once in the generated text.
- num_return_sequences
- Number of output sequences to generate
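The n-gram constraint above is the easiest of these knobs to see concretely. A minimal pure-Python sketch (an illustration of the rule, not the model's actual implementation) of the bigram blocking that no_repeat_ngram_size = 2 applies at each decoding step:

```python
def banned_tokens(sequence, ngram_size):
    """Return the tokens that may not follow `sequence`, because
    appending them would repeat an n-gram of length `ngram_size`."""
    if len(sequence) < ngram_size - 1:
        return set()
    # The last n-1 tokens form the prefix of the would-be n-gram.
    prefix = tuple(sequence[-(ngram_size - 1):])
    banned = set()
    # Scan every existing n-gram; if its first n-1 tokens match the
    # current prefix, its final token is disallowed as the next token.
    for i in range(len(sequence) - ngram_size + 1):
        ngram = tuple(sequence[i : i + ngram_size])
        if ngram[:-1] == prefix:
            banned.add(ngram[-1])
    return banned
```

For example, with the sequence `["dark", "patterns", "are", "dark"]` and `ngram_size=2`, the token `"patterns"` is banned next, since the bigram `("dark", "patterns")` has already occurred.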
Version Details
- Version ID: 3a66bc6c1327de5459cb18b2f10550693bc69662a5e29c67a971776f8574f1b1
- Version Created: May 13, 2023