nomagick/chatglm3-6b

32.1K runs · Oct 2023 · Cog 0.8.6 · GitHub · Paper · License
text-generation text-translation

About

A 6B-parameter open bilingual chat LLM | 开源双语对话语言模型 (open-source bilingual conversational language model)

Example Output

Prompt:

"<|system|>
You are a helpful assistant.
<|user|>
帮我更换一下桌面壁纸
<|assistant|> interpreter"

Output

from PIL import Image
import matplotlib.pyplot as plt

# Load the wallpaper image
wallpaper_path = "path/to/wallpaper.jpg"
wallpaper = Image.open(wallpaper_path)

# Set a new desktop wallpaper
def set_wallpaper(image):
    # Get the screen dimensions
    screen_width, screen_height = plt.get_size_height()
    # Compute the scaling factor for the image
    scale = min(screen_width/image.width, screen_height/image.height)
    # Compute the new image dimensions
    new_width, new_height = int(image.width * scale), int(image.height * scale)
    # Resize the image
    new_image = image.resize((new_width, new_height))
    # Set it as the desktop wallpaper
    plt.margins(0, 0, 0, 0)
    plt.savefig(new_image, format="png")
    plt.close()
    os.system("setwallpaper " + str(new_image))

# Set the wallpaper
set_wallpaper(wallpaper)
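
(Note: the script above is illustrative model output rather than runnable code; for example, plt.get_size_height() is not a real Matplotlib function and os is used without being imported.)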

Performance Metrics

13.12s Prediction Time
13.15s Total Time
All Input Parameters
{
  "top_p": 0.8,
  "prompt": "<|system|>\nYou are a helpful assistant. \n<|user|>\n帮我更换一下桌面壁纸\n<|assistant|> interpreter\n",
  "max_tokens": 2048,
  "temperature": 0.75
}
Input Parameters
top_p · Type: number · Default: 0.8 · Range: 0 - 1
Top-p (nucleus) sampling probability
prompt · Type: string · Default: <|system|> You are a helpful assistant <|user|> 请使用英文重复这段话:"为了使模型生成最优输出,当使用 ChatGLM3-6B 时需要使用特定的输入格式,请按照示例格式组织输入。" <|assistant|>
Prompt for completion. (The default user turn asks, in Chinese, to repeat the following in English: "For the model to generate optimal output, a specific input format must be used with ChatGLM3-6B; please organize the input following the example format.")
max_tokens · Type: integer · Default: 2048 · Range: 1 - 32768
Maximum number of new tokens to generate
temperature · Type: number · Default: 0.75 · Range: 0 - 5
Sampling temperature
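
The default prompt above shows the role-token format the model expects: <|system|>, <|user|>, and <|assistant|> markers, each followed by that turn's text, with the prompt ending at <|assistant|> so the model writes the reply. A minimal sketch of assembling such a prompt string in Python follows; build_prompt is a hypothetical helper and the exact newline placement is an assumption, not part of the model's documented API:

def build_prompt(system_message, user_message):
    # Assemble a ChatGLM3-style prompt, ending with <|assistant|> so the
    # model generates the assistant turn.
    return (
        f"<|system|>\n{system_message}\n"
        f"<|user|>\n{user_message}\n"
        "<|assistant|>\n"
    )

prompt = build_prompt(
    "You are a helpful assistant.",
    "Summarize the ChatGLM3-6B input format in one sentence.",
)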
Output Schema

Output

Type: array · Items type: string
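
Because the output is an array of strings (typically chunks of the generated text), callers usually join the pieces into a single completion. A rough sketch using the Replicate Python client, pinned to the version listed under Version Details below; it assumes REPLICATE_API_TOKEN is set in the environment, and the example prompt text is only illustrative:

import replicate

# Run the pinned version of nomagick/chatglm3-6b with inputs matching the
# parameter schema above; replicate.run returns the output array of strings.
output = replicate.run(
    "nomagick/chatglm3-6b:745fc66ffdc2cb4d82fbc18197beb86b2c86186dbb7744c9d65f293f0a370a98",
    input={
        "prompt": "<|system|>\nYou are a helpful assistant.\n<|user|>\nHello!\n<|assistant|>\n",
        "top_p": 0.8,
        "temperature": 0.75,
        "max_tokens": 2048,
    },
)

# Concatenate the string chunks into the full completion.
print("".join(output))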

Version Details
Version ID: 745fc66ffdc2cb4d82fbc18197beb86b2c86186dbb7744c9d65f293f0a370a98
Version Created: October 27, 2023