lucataco/ollama-llama3.3-70b
About
Ollama Llama 3.3 70B

Example Output
Prompt:
"Who are you?"
Output:
I'm an artificial intelligence model known as Llama. Llama stands for "Large Language Model Meta AI."
Performance Metrics
- Prediction Time: 2.14s
- Total Time: 85.73s
All Input Parameters
{
  "top_p": 0.95,
  "prompt": "Who are you?",
  "max_tokens": 512,
  "temperature": 0.7
}
Input Parameters
- top_p: Controls diversity of the output. Lower values make the output more focused; higher values make it more diverse.
- prompt (required): Input text for the model.
- max_tokens: Maximum number of tokens to generate.
- temperature: Controls randomness. Lower values make the model more deterministic; higher values make it more random.
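To illustrate how temperature and top_p shape generation, here is a minimal sketch of temperature scaling followed by top-p (nucleus) sampling over a single next-token distribution. The function name, the toy logits, and the helper structure are hypothetical illustrations, not the model's actual implementation.

```python
import math
import random

def sample_next_token(logits, temperature=0.7, top_p=0.95, seed=0):
    """Hypothetical sketch: temperature + top-p sampling over a
    token -> logit mapping (not Replicate's or Ollama's code)."""
    rng = random.Random(seed)
    # Temperature divides the logits: values < 1 sharpen the
    # distribution (more deterministic), values > 1 flatten it.
    scaled = {t: l / temperature for t, l in logits.items()}
    # Softmax to turn scaled logits into probabilities.
    m = max(scaled.values())
    exps = {t: math.exp(l - m) for t, l in scaled.items()}
    z = sum(exps.values())
    probs = {t: e / z for t, e in exps.items()}
    # top_p: keep the smallest set of highest-probability tokens
    # whose cumulative probability reaches top_p, then renormalize.
    nucleus, cum = {}, 0.0
    for t, p in sorted(probs.items(), key=lambda kv: -kv[1]):
        nucleus[t] = p
        cum += p
        if cum >= top_p:
            break
    z = sum(nucleus.values())
    tokens = list(nucleus)
    weights = [nucleus[t] / z for t in tokens]
    return rng.choices(tokens, weights=weights)[0]

# With one token far more likely than the rest, a small top_p
# trims the nucleus to that single token.
print(sample_next_token({"a": 5.0, "b": 1.0, "c": 0.0}, top_p=0.5))
```

With the defaults above (temperature 0.7, top_p 0.95, matching this model's example inputs), most of the probability mass survives the cutoff; lowering either value narrows the set of tokens the model can emit.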
Output Schema
Output
Version Details
- Version ID: 29f7aa41293e897979d3e118ec8527542e5457417ae5d70e92b5f3f10033c5c3
- Version Created: December 17, 2024