arthur630-tech/mob 🔢📝 → 📝

▶️ 1.2K runs 📅 Dec 2024 ⚙️ Cog 0.13.6
multilingual text-generation text-translation

About

Example Output

Prompt:

"你好呀"

Output

你好呀!我叫李红,是中国上海的。上海是中国最大的城市,人口有14亿人。这里有很多现代化的建筑,上海有很多世界级的博物馆,例如上海博物馆。上海还非常擅

(Translation: "Hi there! My name is Li Hong, and I'm from Shanghai, China. Shanghai is China's largest city, with a population of 1.4 billion. There are many modern buildings here, and Shanghai has many world-class museums, such as the Shanghai Museum. Shanghai is also very good at..." The output is cut off at max_length.)

Performance Metrics

3.94s Prediction Time
149.40s Total Time
All Input Parameters
{
  "n": 1,
  "top_k": 100,
  "top_p": 1,
  "prompt": "你好呀",
  "max_length": 50,
  "temperature": 0.75,
  "repetition_penalty": 1
}
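Assuming the standard Replicate Python client (an assumption; this page does not show client code), the prediction above could be reproduced roughly like this. The call is only attempted when an API token is configured:

```python
import os

# The exact input payload shown above.
input_params = {
    "n": 1,
    "top_k": 100,
    "top_p": 1,
    "prompt": "你好呀",
    "max_length": 50,
    "temperature": 0.75,
    "repetition_penalty": 1,
}

# Pinned to the version ID listed under "Version Details" below.
MODEL = (
    "arthur630-tech/mob:"
    "209e2fde138326f89fdb246df01426ffd698560f66a4dc8d01736ecca98d2d97"
)

if os.environ.get("REPLICATE_API_TOKEN"):
    import replicate  # assumes `pip install replicate`

    output = replicate.run(MODEL, input=input_params)
    print(output)
```

Pinning the version hash (rather than using the bare model name) keeps results reproducible if the model is later updated.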
Input Parameters
n Type: integer | Default: 1 | Range: 1 - 5
Number of output sequences to generate
top_k Type: integer | Default: 100 | Range: 1 - ∞
When decoding text, samples only from the top k most likely next tokens in the model's probability distribution over each possible next token; lower to ignore less likely tokens
top_p Type: number | Default: 1 | Range: 0.01 - 1
When decoding text, samples from the top p percentage of most likely tokens; lower to ignore less likely tokens
prompt (required) Type: string
Text prompt to send to the model.
max_length Type: integer | Default: 50 | Range: 1 - ∞
Maximum number of tokens to generate. A word is generally 2-3 tokens.
temperature Type: number | Default: 0.75 | Range: 0.01 - 5
Adjusts the randomness of outputs: values greater than 1 are more random, values approaching 0 are nearly deterministic; 0.75 is a good starting value.
repetition_penalty Type: number | Default: 1 | Range: 0.01 - 5
Penalty for repeated words in generated text; 1 means no penalty, values greater than 1 discourage repetition, and values less than 1 encourage it.
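As an illustration of how top_k, top_p, and temperature interact during decoding (a generic sampling sketch, not this model's actual implementation), the filtering steps can be written in plain Python:

```python
import math
import random

def sample_next_token(logits, top_k=100, top_p=1.0, temperature=0.75, rng=None):
    """Pick a next-token id from raw logits.

    Generic temperature / top-k / top-p (nucleus) sampling, shown only to
    illustrate the parameters documented above.
    """
    rng = rng or random.Random()
    # Temperature scaling: values < 1 sharpen the distribution, > 1 flatten it.
    scaled = [l / temperature for l in logits]
    # Softmax over the scaled logits (shifted by the max for stability).
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [(i, e / total) for i, e in enumerate(exps)]
    # top_k: keep only the k most likely tokens.
    probs.sort(key=lambda ip: ip[1], reverse=True)
    probs = probs[:top_k]
    # top_p: keep the smallest prefix whose cumulative probability reaches p.
    kept, cum = [], 0.0
    for i, p in probs:
        kept.append((i, p))
        cum += p
        if cum >= top_p:
            break
    # Renormalise the surviving candidates and sample one.
    norm = sum(p for _, p in kept)
    r = rng.random() * norm
    for i, p in kept:
        r -= p
        if r <= 0:
            return i
    return kept[-1][0]
```

With temperature near the 0.01 lower bound, the scaled distribution collapses onto the single most likely token, matching the "nearly deterministic" behaviour described above; setting top_k to 1 has the same effect regardless of temperature.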
Output Schema

Output

Type: string

Example Execution Logs
start generate
end generate
Version Details
Version ID
209e2fde138326f89fdb246df01426ffd698560f66a4dc8d01736ecca98d2d97
Version Created
December 18, 2024
Run on Replicate →