lucataco/ollama-llama3.3-70b

16.8K runs · Dec 2024 · Cog 0.13.6 · GitHub · Paper · License
Tags: code-generation, text-generation, text-translation

About

Ollama Llama 3.3 70B

Example Output

Prompt:

"Who are you?"

Output:

I'm an artificial intelligence model known as Llama. Llama stands for "Large Language Model Meta AI."

Performance Metrics

Prediction time: 2.14s
Total time: 85.73s
All Input Parameters
{
  "top_p": 0.95,
  "prompt": "Who are you?",
  "max_tokens": 512,
  "temperature": 0.7
}
Input Parameters
top_p
Type: number · Default: 0.95 · Range: 0 - 1
Controls diversity of the output. Lower values make the output more focused, higher values make it more diverse.

prompt (required)
Type: string
Input text for the model

max_tokens
Type: integer · Default: 512 · Range: 1 - ∞
Maximum number of tokens to generate

temperature
Type: number · Default: 0.7 · Range: 0 - 1
Controls randomness. Lower values make the model more deterministic, higher values make it more random.
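The parameters above map directly onto the input dictionary passed to a prediction. Below is a minimal sketch using the Replicate Python client (pip install replicate); it assumes the REPLICATE_API_TOKEN environment variable is set, and the values shown are simply the documented defaults, not recommendations.

import replicate

# Run the model; only `prompt` is required, the rest fall back to their defaults.
# Assumes REPLICATE_API_TOKEN is set in the environment.
output = replicate.run(
    "lucataco/ollama-llama3.3-70b",
    input={
        "prompt": "Who are you?",
        "top_p": 0.95,        # nucleus sampling cutoff, 0 - 1
        "max_tokens": 512,    # upper bound on generated tokens
        "temperature": 0.7,   # lower = more deterministic
    },
)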
Output Schema

Output

Type: array
Items type: string
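Because the output schema is an array of strings, the generated text typically arrives as a sequence of chunks rather than one string. Continuing the sketch above, joining the chunks reconstructs the full response:

# `output` from the sketch above is a list of string chunks.
text = "".join(output)
print(text)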

Version Details
Version ID
29f7aa41293e897979d3e118ec8527542e5457417ae5d70e92b5f3f10033c5c3
Version Created
December 17, 2024
Run on Replicate →
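To reproduce results against this exact release, the version ID listed above can be appended to the model reference in the owner/name:version form accepted by the Replicate client. A short sketch, under the same assumptions as before:

import replicate

# Pin the prediction to the version listed under Version Details.
# Assumes REPLICATE_API_TOKEN is set in the environment.
output = replicate.run(
    "lucataco/ollama-llama3.3-70b:29f7aa41293e897979d3e118ec8527542e5457417ae5d70e92b5f3f10033c5c3",
    input={"prompt": "Who are you?"},
)
print("".join(output))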