lucataco/dolphin-2.2.1-mistral-7b
About
Mistral-7B-v0.1 fine-tuned for chat on the Dolphin dataset (an open-source implementation of Microsoft's Orca).
Example Output
"What is the best way to train a dolphin to obey me? Please answer step by step."
Output
Step 1: Building Trust
The first step in training a dolphin is to gain its trust. Spend time in the water with the dolphin and allow it to become familiar with your presence. Be calm and gentle, as dolphins are highly intuitive and can sense fear or aggression.
Step 2: Establishing Communication
Before you can train the dolphin, you need to establish a way of communicating with it. This can be done through hand signals or vocal commands, depending on your preference. Practice these signals or commands in the presence of the dolphin to familiarize it with your intentions.
Step 3: Teaching Basic Commands
Start with basic commands such as "come," "stay," and "go." Use the communication method you established in step 2 to teach these commands. Once the dolphin understands these basic commands, move on to more advanced commands.
Step 4: Rewarding Good Behavior
Positive reinforcement is key when training a dolphin. When the dolphin successfully completes a command, reward it with a treat or verbal praise. This will encourage the dolphin to continue learning and obeying your commands.
Step 5: Patience and Consistency
Training a dolphin takes time and patience. Be consistent in your training and don't expect immediate results. Gradually increase the complexity of the commands as the dolphin becomes more familiar with the training process.
Step 6: Maintaining the Relationship
Once the dolphin is trained, it's important to maintain the relationship by continuing to spend time with it and reinforcing the commands. This will help ensure that the dolphin continues to obey your commands and respect your authority.
All Input Parameters
{
"top_k": 50,
"top_p": 0.95,
"prompt": "What is the best way to train a dolphin to obey me? Please answer step by step.",
"temperature": 0.8,
"max_new_tokens": 512,
"prompt_template": "<|im_start|>system\nyou are an expert dolphin trainer\n<|im_end|>\n<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant\n",
"presence_penalty": 0,
"frequency_penalty": 0
}
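For reference, the same request can be reproduced from Python. This is a minimal sketch using the Replicate client, assuming the `replicate` package is installed and the REPLICATE_API_TOKEN environment variable is set; it uses the version ID listed under Version Details below. Depending on the client version, the result is a list of strings or an iterator of streamed text chunks; joining works either way.

import replicate

# Run lucataco/dolphin-2.2.1-mistral-7b with the parameters shown above.
output = replicate.run(
    "lucataco/dolphin-2.2.1-mistral-7b:0521a0090543fea1a687a871870e8f475d6581a3e6e284e32a2579cfb4433ecf",
    input={
        "top_k": 50,
        "top_p": 0.95,
        "prompt": "What is the best way to train a dolphin to obey me? Please answer step by step.",
        "temperature": 0.8,
        "max_new_tokens": 512,
        # The prompt is substituted into prompt_template at the {prompt}
        # placeholder, producing a ChatML-formatted string before generation.
        "prompt_template": "<|im_start|>system\nyou are an expert dolphin trainer\n<|im_end|>\n<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant\n",
        "presence_penalty": 0,
        "frequency_penalty": 0,
    },
)

# Join the streamed chunks into the final text.
print("".join(output))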
Input Parameters
- top_k: The number of highest-probability tokens to consider when generating the output. If > 0, only the top k most probable tokens are kept (top-k filtering).
- top_p: A probability threshold for generating the output. If < 1.0, only the smallest set of top tokens with cumulative probability >= top_p is kept (nucleus filtering). Nucleus filtering is described in Holtzman et al. (http://arxiv.org/abs/1904.09751).
- prompt (required): The text prompt sent to the model.
- temperature: The value used to modulate the next-token probabilities; lower values make the output more deterministic, higher values more varied. (top_k, top_p, and temperature are illustrated in the sketch after this list.)
- max_new_tokens: The maximum number of tokens the model should generate as output.
- prompt_template: The template used to format the prompt. The input prompt is inserted into the template at the `{prompt}` placeholder.
- presence_penalty: Penalty applied to tokens that have already appeared in the output, encouraging the model to introduce new content.
- frequency_penalty: Penalty applied to tokens in proportion to how often they have already appeared in the output, discouraging repetition.
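To make top_k, top_p, and temperature concrete, here is a small illustrative sketch of how these settings filter a toy next-token distribution before sampling. It is not the model's actual implementation; the logit values and vocabulary size are made up.

import numpy as np

def filter_next_token_probs(logits, top_k=50, top_p=0.95, temperature=0.8):
    """Illustrative temperature scaling, top-k filtering, and nucleus (top-p) filtering."""
    logits = np.asarray(logits, dtype=float) / temperature  # temperature scaling

    # Top-k: keep only the k tokens with the highest probability.
    k = min(top_k, logits.size) if top_k > 0 else logits.size
    kth_best = np.sort(logits)[-k]
    logits[logits < kth_best] = -np.inf

    # Softmax over the surviving tokens.
    probs = np.exp(logits - logits[np.isfinite(logits)].max())
    probs /= probs.sum()

    # Top-p: keep the smallest set of tokens whose cumulative probability >= top_p.
    order = np.argsort(probs)[::-1]
    keep = np.searchsorted(np.cumsum(probs[order]), top_p) + 1
    probs[order[keep:]] = 0.0

    # Renormalised distribution to sample the next token from.
    return probs / probs.sum()

# Toy vocabulary of six tokens with made-up logits.
print(filter_next_token_probs([2.0, 1.5, 0.3, -1.0, -2.0, -3.0], top_k=4, top_p=0.9))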
Example Execution Logs
Generated 379 tokens in 4.780444145202637 seconds.
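For reference, 379 tokens in about 4.78 seconds works out to roughly 79 tokens per second.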
Version Details
- Version ID: 0521a0090543fea1a687a871870e8f475d6581a3e6e284e32a2579cfb4433ecf
- Version Created: October 31, 2023