justmalhar/meta-llama-3.2-1b

254 runs · Created Sep 2024 · Cog version 0.9.8
code-generation multilingual text-generation

About

Meta Llama 3.2 1B

Example Output

Prompt:

"How many r's are there in the word strawberry?"

Output

There are three R's in the word "strawberry".
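The answer is correct, and is easy to verify locally with Python's `str.count`:

```python
# "strawberry" contains three occurrences of the letter r
count = "strawberry".count("r")
print(count)  # 3
```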

Performance Metrics

0.45s Prediction Time
44.84s Total Time
All Input Parameters
{
  "top_p": 0.95,
  "prompt": "How many r's are there in the word strawberry?",
  "max_tokens": 512,
  "temperature": 0.6
}
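These parameters can be passed to the model through the Replicate Python client. A minimal sketch, assuming the `replicate` package is installed and a `REPLICATE_API_TOKEN` environment variable is set (the API call is skipped here when no token is present):

```python
import os

# Input parameters mirroring the example prediction above
model_input = {
    "prompt": "How many r's are there in the word strawberry?",
    "top_p": 0.95,
    "temperature": 0.6,
    "max_tokens": 512,
}

if os.environ.get("REPLICATE_API_TOKEN"):
    import replicate  # pip install replicate

    # The model returns its answer as an array of string chunks
    output = replicate.run("justmalhar/meta-llama-3.2-1b", input=model_input)
    print("".join(output))
```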
Input Parameters
top_p Type: number | Default: 0.95 | Range: 0 - 1
Controls diversity of the output. Lower values make the output more focused, higher values make it more diverse.
prompt (required) Type: string
Input text for the model
max_tokens Type: integer | Default: 512 | Range: 1 - ∞
Maximum number of tokens to generate
temperature Type: number | Default: 0.7 | Range: 0 - 1
Controls randomness. Lower values make the model more deterministic, higher values make it more random.
Output Schema

Output

Type: array | Items Type: string
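Since the output is an array of strings (typically streamed text chunks), the full response is usually reconstructed by concatenation. A sketch using hypothetical chunks:

```python
# Hypothetical chunks, as the model might stream them
chunks = ["There are three R's ", 'in the word "strawberry".']

# Join the array of strings into the complete response text
text = "".join(chunks)
print(text)  # There are three R's in the word "strawberry".
```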

Example Execution Logs
Total runtime: 0.1365213394165039
Version Details
Version ID
c4a7531b0c6f3701bd470744223fef3d26c8dd11e6f05733650bb329555543a7
Version Created
September 26, 2024
Run on Replicate →