google-deepmind/gemma-2-2b-it 🔢📝 → 📝
About
Gemma 2 2B, the instruction-tuned variant by Google

Example Output
Prompt:
"Write me a poem about Machine Learning."
Output:
A labyrinth of data, vast and deep,
Where patterns hide, secrets to keep.
Machine learning, with algorithms bright,
Unveils the truth through logic's light.
From simple tasks, it starts its quest,
To solve complex problems, put them to test.
With each iteration, knowledge grows,
As models learn, their wisdom shows.
Neural networks, like brains they mimic,
Connecting nodes, information trickling.
Deep learning dives into unknown terrain,
Extracting insights from unseen domain.
Predictive power, in every stride,
Forecasting future, where trends reside
Performance Metrics
- Prediction Time: 8.10s
- Total Time: 8.11s
All Input Parameters
{
  "top_k": 50,
  "top_p": 0.9,
  "prompt": "Write me a poem about Machine Learning.",
  "temperature": 0.6,
  "max_new_tokens": 128,
  "repetition_penalty": 1.2
}
Input Parameters
- top_k
- When decoding text, samples only from the k most likely tokens; lower values ignore less likely tokens.
- top_p
- When decoding text, samples from the smallest set of tokens whose cumulative probability exceeds p; lower values ignore less likely tokens.
- prompt
- Prompt to send to the model.
- temperature
- Adjusts the randomness of outputs: values greater than 1 increase randomness, 0 is deterministic, and 0.75 is a good starting value.
- max_new_tokens
- Maximum number of tokens to generate. A word is generally 2-3 tokens.
- repetition_penalty
- Penalty applied to tokens that have already appeared; values greater than 1 discourage repetition, and 1.0 applies no penalty.
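To make the sampling parameters above concrete, here is a minimal sketch of how top_k, top_p, and temperature typically interact during decoding: temperature rescales the logits, top-k keeps only the k most likely tokens, and top-p (nucleus sampling) then keeps the smallest subset whose cumulative probability reaches p. This is an illustrative pure-Python toy over a tiny vocabulary, not the model's actual decoding code; the function name and example logits are invented for the demo.

```python
import math

def candidate_tokens(logits, top_k=50, top_p=0.9, temperature=0.6):
    """Return the token indices a sampler would draw from, given the
    top_k / top_p / temperature settings listed above (illustrative only)."""
    # Temperature: divide logits before softmax; lower values sharpen the
    # distribution toward the most likely token.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Rank tokens by probability, highest first, and apply the top-k cutoff.
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept = ranked[:top_k]

    # Top-p (nucleus) cutoff: keep the smallest prefix of the ranked tokens
    # whose cumulative probability reaches p.
    nucleus, cumulative = [], 0.0
    for i in kept:
        nucleus.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break
    return nucleus

# Toy 4-token vocabulary: after temperature scaling, token 0 dominates,
# and tokens 0 and 1 together already exceed p = 0.9.
print(candidate_tokens([2.0, 1.0, 0.5, 0.1], top_k=3, top_p=0.9))  # → [0, 1]
```

With the defaults used in the example request (top_k=50, top_p=0.9, temperature=0.6), the sampler draws only from this filtered set, which is why lowering either cutoff makes the output more conservative.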
Output Schema
Output
Version Details
- Version ID
ff924e24b20727e4e04b9721b403b1a75500b7b8b934714ed2b34afc6de69673
- Version Created
- August 1, 2024