lucataco/ollama-llama3-70b (Text → Text)

6.6K runs · Jul 2024 · Cog 0.9.12 · GitHub · Paper
Tags: code-generation, question-answering, text-generation

About

Cog wrapper for Ollama llama3:70b
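
For readers curious how such a wrapper is typically structured, below is a minimal, hypothetical sketch of a Cog predictor that forwards the prompt to a local Ollama server. Only the model name (llama3:70b) and the /api/generate endpoint are taken from this page; the port, the streaming details, and the class layout are assumptions, not this model's actual implementation.

    # Hypothetical sketch of a Cog predictor that proxies requests to a local
    # Ollama server. Only the model name (llama3:70b) and the /api/generate
    # endpoint come from this page; the port, streaming flag, and error
    # handling are assumptions.
    import json

    import requests
    from cog import BasePredictor, ConcatenateIterator, Input

    OLLAMA_URL = "http://127.0.0.1:11434/api/generate"  # default Ollama port (assumed)


    class Predictor(BasePredictor):
        def predict(
            self,
            prompt: str = Input(description="Input text for the model"),
        ) -> ConcatenateIterator[str]:
            # Stream tokens from Ollama and yield them one by one; Cog exposes
            # the result as an array of strings (see the output schema below).
            with requests.post(
                OLLAMA_URL,
                json={"model": "llama3:70b", "prompt": prompt, "stream": True},
                stream=True,
            ) as resp:
                resp.raise_for_status()
                for line in resp.iter_lines():
                    if line:
                        chunk = json.loads(line)
                        yield chunk.get("response", "")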

Example Output

Prompt:

"tell me a joke"

Output

Here's one:

Why couldn't the bicycle stand up by itself?

(Wait for it...)

Because it was two-tired!

Hope that made you laugh!

Performance Metrics

Prediction time: 1.38s
Total time: 126.21s

Input Parameters

prompt (string, required)
Input text for the model
Output Schema

Output

Type: array
Items type: string
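
A minimal usage sketch with the Replicate Python client is shown below; it assumes the standard replicate package and an API token in the REPLICATE_API_TOKEN environment variable. The version hash is the one listed under Version Details, and the prompt matches the example above.

    # Sketch of invoking this model with the Replicate Python client
    # (pip install replicate; REPLICATE_API_TOKEN must be set in the environment).
    import replicate

    # The output is an array of strings (see Output Schema), so join the pieces.
    output = replicate.run(
        "lucataco/ollama-llama3-70b:1ce61ccd1142369a00a4d3ef8304ff9031901e60eb8572e076690d91664101cc",
        input={"prompt": "tell me a joke"},
    )
    print("".join(output))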

Example Execution Logs
[GIN] 2024/07/09 - 03:28:03 | 200 |  1.370060953s |       127.0.0.1 | POST     "/api/generate"
Total runtime: 1.3725180625915527
Version Details

Version ID: 1ce61ccd1142369a00a4d3ef8304ff9031901e60eb8572e076690d91664101cc
Version created: July 9, 2024