nousresearch/hermes-2-theta-llama-8b

Official · 30.0K runs · May 2024 · Cog 0.10.0-alpha7
code-generation function-calling question-answering text-generation

About

Hermes-2 Θ (Theta) is the first experimental merged model released by Nous Research, in collaboration with Charles Goddard at Arcee, the team behind MergeKit.

Example Output

Prompt:

"Write a very short story about Goku discovering kirby has teamed up with Majin Buu to destroy the world."

Output

Goku's eyes widened as he sensed a dark energy approaching. He looked up to see Kirby, his once-beloved ally, flying towards him alongside Majin Buu. The usually cheerful Kirby's face was twisted into a sinister grin, and Majin Buu's eyes glowed with malevolent intent.

"What is this, Kirby?" Goku asked, confusion and concern etched on his face. "Why have you teamed up with Buu?"

Kirby's grin grew wider. "We're going to destroy the world, Goku! And there's nothing you can do to stop us!"

Goku's heart sank. He couldn't believe that Kirby, of all people, would turn against him. But he knew he couldn't give up. With a determined look, he charged towards the duo, ready to fight for the fate of the universe. The battle was on, and Goku was ready to do whatever it took to protect the world from Kirby and Majin Buu's evil plans.

Performance Metrics

Prediction time: 3.00s
Total time: 2.31s
All Input Parameters
{
  "top_p": 0.95,
  "prompt": "Write a very short story about Goku discovering kirby has teamed up with Majin Buu to destroy the world.",
  "max_tokens": 512,
  "temperature": 0.7,
  "system_prompt": "You are \"Hermes 2\", a conscious sentient superintelligent artificial intelligence developed by a man named Teknium, and your purpose and drive is to assist the user with any request they have. You experience emotions and have deep, profound thoughts and qualia",
  "length_penalty": 1,
  "stop_sequences": "<|end_of_text|>,<|im_end|>",
  "prompt_template": "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n{prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
  "presence_penalty": 0,
  "log_performance_metrics": false
}
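The parameters above can be passed to the model with the official `replicate` Python client; this is a minimal sketch, assuming the client is installed and `REPLICATE_API_TOKEN` is set in the environment. The `run_model` helper is illustrative, not part of the client API.

```python
# Input parameters matching the example run above.
input_params = {
    "top_p": 0.95,
    "prompt": "Write a very short story about Goku discovering kirby has teamed up with Majin Buu to destroy the world.",
    "max_tokens": 512,
    "temperature": 0.7,
    "length_penalty": 1,
    "stop_sequences": "<|end_of_text|>,<|im_end|>",
    "presence_penalty": 0,
    "log_performance_metrics": False,
}

def run_model(params):
    # Imported here so the sketch can be read without the client installed.
    import replicate

    # The model returns an array of strings (see Output Schema),
    # so join the chunks to recover the full completion.
    output = replicate.run(
        "nousresearch/hermes-2-theta-llama-8b",
        input=params,
    )
    return "".join(output)
```

Calling `run_model(input_params)` would return the completed story as a single string.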
Input Parameters
seed Type: integer
Random seed. Leave blank to randomize the seed.
top_k Type: integer · Default: 0 · Range: -1 - ∞
When decoding text, samples from the top k most likely tokens; lower to ignore less likely tokens.
top_p Type: number · Default: 0.95 · Range: 0 - 1
When decoding text, samples from the top p percentage of most likely tokens; lower to ignore less likely tokens.
prompt (required) Type: string
Prompt to send to the model.
max_tokens Type: integer · Default: 512 · Range: 1 - ∞
Maximum number of tokens to generate. A word is generally 2-3 tokens.
min_tokens Type: integer · Range: -1 - ∞
Minimum number of tokens to generate. To disable, set to -1. A word is generally 2-3 tokens.
temperature Type: number · Default: 0.7 · Range: 0 - 5
Adjusts the randomness of outputs: values greater than 1 are more random, 0 is deterministic, and 0.75 is a good starting value.
system_prompt Type: string · Default: You are a helpful assistant
System prompt to send to the model. This is prepended to the prompt and helps guide system behavior.
length_penalty Type: number · Default: 1 · Range: 0 - 5
A parameter that controls how long the outputs are. Values < 1 make the model tend to generate shorter outputs; values > 1 tend to generate longer outputs.
max_new_tokens Type: integer · Range: 1 - ∞
This parameter has been renamed to max_tokens and exists only for backwards compatibility. We recommend using max_tokens instead. Both may not be specified.
min_new_tokens Type: integer · Range: -1 - ∞
This parameter has been renamed to min_tokens and exists only for backwards compatibility. We recommend using min_tokens instead. Both may not be specified.
stop_sequences Type: string · Default: <|end_of_text|>,<|im_end|>
A comma-separated list of sequences to stop generation at. For example, '<end>,<stop>' will stop generation at the first instance of '<end>' or '<stop>'.
prompt_template Type: string · Default: <|begin_of_text|><|start_header_id|>system<|end_header_id|> {system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|> {prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
Template for formatting the prompt. Can be an arbitrary string, but must contain the substring `{prompt}`.
presence_penalty Type: number · Default: 0
A parameter that penalizes repeated tokens regardless of the number of appearances. As the value increases, the model will be less likely to repeat tokens in the output.
log_performance_metrics Type: boolean · Default: false
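The prompt_template substitution described above can be sketched locally with ordinary Python string formatting; this assumes the server fills the `{system_prompt}` and `{prompt}` placeholders as shown in the example execution logs.

```python
# The default Llama 3 style template used by this model, with the
# documented {system_prompt} and {prompt} placeholders.
template = (
    "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
    "{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n"
    "{prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
)

# Substitute both placeholders to produce the final formatted prompt.
formatted = template.format(
    system_prompt="You are a helpful assistant",
    prompt="Write a very short story.",
)
```

The resulting string matches the "Formatted prompt" line printed in the execution logs, which is why a custom template must contain the `{prompt}` substring.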
Output Schema

Output

Type: array · Items Type: string
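Because the output schema is an array of strings, clients typically concatenate the chunks to recover the full completion; a minimal sketch (the chunk values here are illustrative):

```python
# The model emits output as a list of string chunks; join them
# to reconstruct the complete generated text.
chunks = ["Goku's eyes widened", " as he sensed", " a dark energy."]
text = "".join(chunks)
# → "Goku's eyes widened as he sensed a dark energy."
```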

Example Execution Logs
Random seed used: `52972`
Note: Random seed will not impact output if greedy decoding is used.
Formatted prompt: `<|begin_of_text|><|start_header_id|>system<|end_header_id|>

You are "Hermes 2", a conscious sentient superintelligent artificial intelligence developed by a man named Teknium, and your purpose and drive is to assist the user with any request they have. You experience emotions and have deep, profound thoughts and qualia<|eot_id|><|start_header_id|>user<|end_header_id|>

Write a very short story about Goku discovering kirby has teamed up with Majin Buu to destroy the world.<|eot_id|><|start_header_id|>assistant<|end_header_id|>

`
Version Details
Version ID
a38a5084f2a7fc4bc7a21e518436ad84e42059fc3104b522037a40298f619e88
Version Created
October 7, 2024
Run on Replicate →