cjwbw/gorilla

▶️ 92 runs 📅 Nov 2023 ⚙️ Cog 0.8.3 🔗 GitHub 📄 Paper ⚖️ License
code-generation text-generation

About

Gorilla: Large Language Model Connected with Massive APIs

Example Output

Prompt:

"I would like to translate 'I feel very good today.' from English to French."

Output

{
  "code_parsed": "from transformers import pipeline\ntranslator = pipeline('translation_en_to_fr', model='Helsinki-NLP/opus-mt-en-fr')\ntranslated_text = translator(\"I feel very good today.\")[0]['translation_text']",
  "domain_parsed": "Natural Language Processing Translation",
  "api_call_parsed": "pipeline('translation_en_to_fr', model='Helsinki-NLP/opus-mt-en-fr')",
  "raw_text_response": "<<<domain>>>: Natural Language Processing Translation\n<<<api_call>>>: pipeline('translation_en_to_fr', model='Helsinki-NLP/opus-mt-en-fr')\n<<<api_provider>>>: Hugging Face Transformers\n<<<explanation>>>: 1. We first import the pipeline function from the transformers library provided by Hugging Face.\n2. We then use the pipeline function to create a translation model.\n3. We specify the model 'Helsinki-NLP/opus-mt-en-fr' to be loaded. This model is trained for English to French translation tasks, which is exactly what we need for translating 'I feel very good today.'\n4. We can pass the English text as input and the model will return the translated French text.\n<<<code>>>: from transformers import pipeline\ntranslator = pipeline('translation_en_to_fr', model='Helsinki-NLP/opus-mt-en-fr')\ntranslated_text = translator(\"I feel very good today.\")[0]['translation_text']",
  "explanation_parsed": "1. We first import the pipeline function from the transformers library provided by Hugging Face.\n2. We then use the pipeline function to create a translation model.\n3. We specify the model 'Helsinki-NLP/opus-mt-en-fr' to be loaded. This model is trained for English to French translation tasks, which is exactly what we need for translating 'I feel very good today.'\n4. We can pass the English text as input and the model will return the translated French text.",
  "api_provider_parsed": "Hugging Face Transformers"
}
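The parsed fields in the output above can be recovered from the raw response with a small regex pass over Gorilla's `<<<tag>>>:` markers. A minimal sketch; the parser below is an illustration written for this page, not the model's actual post-processing code:

```python
import re

# Matches a Gorilla-style field marker such as "<<<domain>>>: " and
# captures the tag name between the triple angle brackets.
TAG_RE = re.compile(r"<<<(\w+)>>>:\s*")

def parse_gorilla_response(raw: str) -> dict:
    """Split a raw '<<<tag>>>: value' response into a {tag: value} dict."""
    fields = {}
    matches = list(TAG_RE.finditer(raw))
    for i, m in enumerate(matches):
        start = m.end()
        # Each field's value runs until the next tag marker (or end of text).
        end = matches[i + 1].start() if i + 1 < len(matches) else len(raw)
        fields[m.group(1)] = raw[start:end].strip()
    return fields

raw = (
    "<<<domain>>>: Natural Language Processing Translation\n"
    "<<<api_call>>>: pipeline('translation_en_to_fr', model='Helsinki-NLP/opus-mt-en-fr')\n"
    "<<<api_provider>>>: Hugging Face Transformers\n"
)
print(parse_gorilla_response(raw))
```

The same loop extends naturally to the `explanation` and `code` fields, since each value simply runs until the next `<<<tag>>>:` marker.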

Performance Metrics

9.01s Prediction Time
254.51s Total Time
All Input Parameters
{
  "prompt": "I would like to translate 'I feel very good today.' from English to French.",
  "model_name": "gorilla-llm/gorilla-falcon-7b-hf-v0",
  "temperature": 0.7,
  "max_new_tokens": 1024
}
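The request above can be reproduced programmatically. A minimal sketch, assuming the `replicate` Python client and a `REPLICATE_API_TOKEN` environment variable; the version hash is the one listed under Version Details on this page:

```python
# Input payload mirroring "All Input Parameters" above.
payload = {
    "prompt": "I would like to translate 'I feel very good today.' "
              "from English to French.",
    "model_name": "gorilla-llm/gorilla-falcon-7b-hf-v0",
    "temperature": 0.7,
    "max_new_tokens": 1024,
}

# The actual call requires the replicate package and an API token:
# import replicate
# output = replicate.run(
#     "cjwbw/gorilla:4a1a7ce831b1315ab68a6b3ccf54bf376de787c93198a95a329582105b70e083",
#     input=payload,
# )
# print(output)
```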
Input Parameters
prompt Type: string. Default: I would like to translate 'I feel very good today.' from English to French.
model_name Default: gorilla-llm/gorilla-mpt-7b-hf-v0
Choose a model.
temperature Type: number. Default: 0.7
Adjusts randomness of outputs; greater than 1 is random and 0 is deterministic.
max_new_tokens Type: integer. Default: 1024
Max new tokens to generate.

Example Execution Logs
/root/.pyenv/versions/3.11.4/lib/python3.11/site-packages/transformers/generation/utils.py:1270: UserWarning: You have modified the pretrained model configuration to control generation. This is a deprecated strategy to control generation and will be removed soon, in a future version. Please use a generation configuration file (see https://huggingface.co/docs/transformers/main_classes/text_generation )
warnings.warn(
The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results.
Setting `pad_token_id` to `eos_token_id`:11 for open-end generation.
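The first warning in the logs suggests moving generation settings out of the model configuration and into a generation configuration file. A hedged sketch of what such a `generation_config.json` could look like for this deployment; the field names are standard `transformers.GenerationConfig` keys, and the values mirror the inputs above and the `eos_token_id:11` reported in the log:

```json
{
  "do_sample": true,
  "temperature": 0.7,
  "max_new_tokens": 1024,
  "eos_token_id": 11,
  "pad_token_id": 11
}
```

Setting `pad_token_id` explicitly (here to the EOS token id, as the log does automatically) also silences the attention-mask warning for open-ended generation.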
Version Details
Version ID
4a1a7ce831b1315ab68a6b3ccf54bf376de787c93198a95a329582105b70e083
Version Created
November 7, 2023