meta/codellama-13b 🔢📝 → 📝
About
A 13-billion-parameter Llama model tuned for code completion

Example Output
Prompt:
"# function sum two integers
def"

Output:
add(a, b):
    return a + b;

# Driver Code
if __name__ == '__main__':
    # Taking input from user.
    print("Enter first integer: ")
    num1 = int(input())
    print("Enter second integer: ")
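
Because this is a completion-style model, the output continues the prompt rather than standing alone; the usable snippet is the prompt concatenated with the generated text. A minimal illustration (the variable names here are ours, not part of any API):

prompt = "# function sum two integers\ndef"
completion = " add(a, b):\n    return a + b"  # first lines of the output above
full_source = prompt + completion             # prompt + completion is the runnable snippet
print(full_source)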
Performance Metrics
- Prediction Time: 3.79s
- Total Time: 3.77s
All Input Parameters
{
  "debug": false,
  "top_k": 250,
  "top_p": 0.95,
  "prompt": "# function sum two integers\ndef",
  "temperature": 0.95,
  "max_new_tokens": 128,
  "min_new_tokens": -1,
  "repetition_penalty": 1.15,
  "repetition_penalty_sustain": 256,
  "token_repetition_penalty_decay": 128
}
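
These fields map directly onto the model's API input. As a rough sketch of how such a prediction could be created with the Replicate Python client (assuming the `replicate` package is installed and REPLICATE_API_TOKEN is set; the exact model reference and output shape may differ):

import replicate

# Illustrative call only; parameter names mirror the JSON above.
output = replicate.run(
    "meta/codellama-13b",
    input={
        "prompt": "# function sum two integers\ndef",
        "temperature": 0.95,
        "top_k": 250,
        "top_p": 0.95,
        "max_new_tokens": 128,
        "repetition_penalty": 1.15,
    },
)

# The output typically arrives as an iterable of text chunks; join them into the completion.
print("".join(output))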
Input Parameters
- top_k: Top K (how top_k, top_p, and temperature combine during sampling is sketched after this list)
- top_p: Top P
- prompt (required): Prompt
- max_tokens: Max number of tokens to return
- temperature: Temperature
- repeat_penalty: Repetition penalty
- presence_penalty: Presence penalty
- frequency_penalty: Frequency penalty
Output Schema
Example Execution Logs
Prompt: # function sum two integers def
Speed: 46.84 tokens/second
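
If the completion ran to the 128-token max_new_tokens cap, generation alone at 46.84 tokens/second accounts for roughly 128 / 46.84 ≈ 2.7 s of the 3.79 s prediction time, with the remainder spent on prompt processing and other overhead.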
Version Details
- Version ID: cc618fca92404570b9c10d1a4fb5321f4faff54a514189751ee8d6543db64c8f
- Version Created: September 28, 2023