seanoliver/bob-dylan-fun-tuning
About
Llama fine-tune-athon project: fine-tuning Llama 2 on Bob Dylan lyrics.
Example Output
Prompt:
"Write a song about RAG vs. Fine Tuning in the style of Bob Dylan"
Output
!
The RAG vs. Fine Tuning competition is now in its second year, and this year we’re doing things a little differently. We’ve got a great new prize for the winner, and we’re inviting people to write a song about the competition, in the style of Bob Dylan!
The prize for this year’s RAG vs. Fine Tuning competition is a free ticket to the UK RAG 2016, plus a free one-year subscription to the RAG. And we’re asking people to submit a song in the style of Bob Dylan about
Performance Metrics
- Prediction Time: 8.54s
- Total Time: 8.52s
All Input Parameters
{
  "debug": false,
  "top_k": 50,
  "top_p": 0.9,
  "prompt": "Write a song about RAG vs. Fine Tuning in the style of Bob Dylan",
  "temperature": 0.75,
  "max_new_tokens": 128,
  "min_new_tokens": -1
}
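For reference, here is a minimal sketch of invoking this model with the parameters above, assuming it is served on Replicate, the replicate Python client is installed, and REPLICATE_API_TOKEN is set. The version hash is the Version ID listed under Version Details below.

```python
import replicate

# Run the fine-tuned model with the same inputs as the example above.
# The version hash matches the Version ID under Version Details.
output = replicate.run(
    "seanoliver/bob-dylan-fun-tuning:b966bea61d365d56df4d04a131123ed897ae3c1c805bdbae9df42a7487ff96b7",
    input={
        "prompt": "Write a song about RAG vs. Fine Tuning in the style of Bob Dylan",
        "temperature": 0.75,
        "top_k": 50,
        "top_p": 0.9,
        "max_new_tokens": 128,
        "min_new_tokens": -1,
        "debug": False,
    },
)

# Llama-style models on Replicate typically stream tokens, so the result
# may be an iterator of strings rather than a single string.
print("".join(output))
```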
Input Parameters
- debug: Provide debugging output in logs.
- top_k: When decoding text, samples from the top k most likely tokens; lower to ignore less likely tokens.
- top_p: When decoding text, samples from the tokens making up the top p cumulative probability mass; lower to ignore less likely tokens (see the sampling sketch after this list).
- prompt (required): Prompt to send to the model.
- temperature: Adjusts randomness of outputs; greater than 1 is more random and 0 is deterministic. 0.75 is a good starting value.
- max_new_tokens: Maximum number of tokens to generate. A word is generally 2-3 tokens.
- min_new_tokens: Minimum number of tokens to generate. To disable, set to -1. A word is generally 2-3 tokens.
- stop_sequences: A comma-separated list of sequences at which to stop generation. For example, '<end>,<stop>' will stop generation at the first instance of '<end>' or '<stop>'.
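The sampling parameters above are applied inside the model server; as a rough illustration of how temperature, top_k, and top_p interact during decoding, here is a self-contained NumPy sketch. It is not the model's actual decoding code, just a reference implementation of the standard technique.

```python
import numpy as np

def sample_next_token(logits, temperature=0.75, top_k=50, top_p=0.9):
    """Illustrative temperature + top-k + top-p (nucleus) sampling."""
    logits = np.asarray(logits, dtype=np.float64)

    # temperature: rescales the logits. Values below 1 sharpen the
    # distribution; values above 1 flatten it toward uniform randomness.
    logits = logits / max(temperature, 1e-8)

    # top_k: keep only the k most likely tokens, drop the rest.
    k = min(top_k, logits.size)
    kth_best = np.sort(logits)[-k]
    logits = np.where(logits < kth_best, -np.inf, logits)

    # Softmax over the surviving tokens.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()

    # top_p: keep the smallest set of tokens whose cumulative probability
    # reaches p, then renormalize over that nucleus.
    order = np.argsort(probs)[::-1]
    cumulative = np.cumsum(probs[order])
    cutoff = int(np.searchsorted(cumulative, top_p)) + 1
    nucleus = np.zeros_like(probs)
    nucleus[order[:cutoff]] = probs[order[:cutoff]]
    nucleus /= nucleus.sum()

    return int(np.random.choice(len(nucleus), p=nucleus))
```

With the defaults shown above (temperature 0.75, top_k 50, top_p 0.9), decoding stays focused on high-probability tokens while leaving some room for variation; raising temperature or top_p makes the generated lyrics more adventurous.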
Example Execution Logs
Your formatted prompt is: Write a song about RAG vs. Fine Tuning in the style of Bob Dylan
Version Details
- Version ID: b966bea61d365d56df4d04a131123ed897ae3c1c805bdbae9df42a7487ff96b7
- Version Created: August 26, 2023