fofr/star-trek-gpt-j-6b
About
GPT-J-6B trained on the Memory Alpha Star Trek wiki.

Example Output
Prompt:
"What was the most popular holodeck program?"
Output:
The most popular holodecks were those at Starfleet Academy, and holographic programs created by Starfleet Command. The Academy featured a 'Star Trek'-themed program in its gymnasium, while other holosuites included a small zoo with various animals including Klingons, Cardassians, Ferengis, Bajorans, Yridians, Tellarites, Xindi-Reptilians, Vulcans, Romulans, Federation members, and several others from different Star Fleet organizations.
Performance Metrics
- Prediction Time: 5.32s
- Total Time: 5.45s
All Input Parameters
{ "top_k": 50, "top_p": 1, "prompt": "What was the most popular holodeck program?", "decoding": "top_p", "max_length": 500, "temperature": 0.75, "repetition_penalty": 1.2 }
Input Parameters
- top_k: Valid if you choose top_k decoding. The number of highest-probability vocabulary tokens to keep for top-k filtering.
- top_p: Valid if you choose top_p decoding. When decoding text, samples from the top p percentage of most likely tokens; lower this to ignore less likely tokens.
- prompt (required): Input prompt.
- decoding: Choose a decoding method, top_k or top_p (the sketch after this list shows how the choice is typically applied).
- max_length: Maximum number of tokens to generate. A word is generally 2-3 tokens.
- temperature: Adjusts the randomness of outputs; values greater than 1 are more random, 0 is deterministic, and 0.75 is a good starting value.
- repetition_penalty: Penalty for repeated words in generated text; 1 means no penalty, values greater than 1 discourage repetition, and values less than 1 encourage it.
Output Schema
Output
Example Execution Logs
The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results. Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
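This warning comes from Hugging Face `transformers` during open-ended generation. A minimal sketch of the pattern that avoids it, assuming the public `EleutherAI/gpt-j-6b` checkpoint as a stand-in for the fine-tuned weights (which are not published on the Hub here):

```python
# Passing an explicit attention_mask and pad_token_id silences the warning.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6b")
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6b", torch_dtype=torch.float16
)

inputs = tokenizer("What was the most popular holodeck program?",
                   return_tensors="pt")

output_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,  # explicit mask: no warning
    pad_token_id=tokenizer.eos_token_id,   # GPT-J has no pad token; reuse EOS (50256)
    do_sample=True,
    top_p=1.0,
    temperature=0.75,
    repetition_penalty=1.2,
    max_length=500,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```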
Version Details
- Version ID: af6aeca95b3448bdd320fef895082fb310979b2d605c7db8160d7ca9760822d9
- Version Created: April 11, 2023