lucataco/ollama-qwq 🔢📝 → 📝

▶️ 58 runs 📅 Dec 2024 ⚙️ Cog 0.13.6 🔗 GitHub 📄 Paper ⚖️ License
code-generation question-answering text-generation

About

QwQ 32B, the Qwen team's experimental reasoning model, served via Ollama.

Example Output

Prompt:

"How many r in strawberry"

Output

Let's see... the word is "strawberry." I need to find out how many 'r's are in it. Okay, let's spell it out slowly: S-T-R-A-W-B-E-R-R-Y. I can see that there are 'r's in there. Let me count them one by one.

First, I look at each letter:

S is not r,

T is not r,

R is an r—so that's one,

A is not r,

W is not r,

B is not r,

E is not r,

R is another r—so now that's two,

R is another r—so that makes three,

And Y is not r.

So, total 'r's in "strawberry" are three.

But wait, let me double-check. Sometimes I might miss one if I'm not careful.

Spell it again: S-T-R-A-W-B-E-R-R-Y.

Start counting again:

The first R is the third letter,

The second R is the ninth letter,

The third R is the tenth letter.

Yes, three 'r's in "strawberry."

I think that's correct.

Final Answer

\[ \boxed{3} \]
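As a quick sanity check on the count the model reasons through above (this snippet is not part of the model output, just a plain Python verification):

# Count the letter directly to confirm the model's answer
word = "strawberry"
print(word.count("r"))  # prints 3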

Performance Metrics

Prediction time: 8.79s
Total time: 81.92s
All Input Parameters
{
  "top_p": 0.95,
  "prompt": "How many r in strawberry",
  "max_tokens": 512,
  "temperature": 0.7
}
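For reference, a minimal sketch of reproducing this prediction with the Replicate Python client. It assumes the replicate package is installed and REPLICATE_API_TOKEN is set in the environment; you may also want to pin the version ID listed under Version Details below.

import replicate

# Same inputs as the example prediction above
output = replicate.run(
    "lucataco/ollama-qwq",
    input={
        "prompt": "How many r in strawberry",
        "top_p": 0.95,
        "max_tokens": 512,
        "temperature": 0.7,
    },
)

# The output schema is an array of strings, so join the chunks
print("".join(output))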
Input Parameters
top_p Type: number Default: 0.95 Range: 0 - 1
Controls diversity of the output: lower values make the output more focused; higher values make it more diverse.
prompt (required) Type: string
Input text for the model
max_tokens Type: integer Default: 512 Range: 1 - ∞
Maximum number of tokens to generate
temperature Type: number Default: 0.7 Range: 0 - 1
Controls randomness: lower values make the model more deterministic; higher values make it more random.
Output Schema

Output

Type: array Items Type: string
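Because the output is an array of string chunks rather than a single string, downstream code should concatenate (or stream) the pieces. A sketch of fetching the output with the exact version pinned, assuming a recent replicate Python client:

import replicate

# Pin the exact version listed under Version Details below
prediction = replicate.predictions.create(
    version="4bfc56cd5731ba68a0cbc95e40fe670b7780dd61f2c53e09411e4154f7a70207",
    input={"prompt": "How many r in strawberry"},
)
prediction.wait()  # block until the prediction reaches a terminal state

# prediction.output matches the schema: a list of strings
print("".join(prediction.output or []))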

Version Details
Version ID
4bfc56cd5731ba68a0cbc95e40fe670b7780dd61f2c53e09411e4154f7a70207
Version Created
December 17, 2024
Run on Replicate →