Thursday, August 17, 2023

OpenOrca-Platypus2 Step-by-Step Local Installation

This tutorial is a step-by-step guide to installing OpenOrca-Platypus2 quickly and easily on your own machine (here, a GPU instance on AWS).
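Before running anything, it is worth confirming that the instance actually exposes a CUDA GPU, since the quantized model below is loaded onto cuda:0. A minimal check, assuming PyTorch is already installed on the machine:

import torch

# The GPTQ model is loaded onto cuda:0 later, so a GPU must be visible here.
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))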



Commands Used:

!pip install transformers


# Build and install AutoGPTQ from source
!git clone https://github.com/PanQiWei/AutoGPTQ

%cd AutoGPTQ

!pip3 install .
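A quick, optional sanity check that the source build of AutoGPTQ succeeded; simply importing the package is enough, and the transformers version is printed alongside for reference (this check is not part of the original command list):

import transformers
import auto_gptq  # raises ImportError if the AutoGPTQ build above failed

print("transformers version:", transformers.__version__)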


from transformers import AutoTokenizer, pipeline, logging

from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig


model_name_or_path = "TheBloke/OpenOrca-Platypus2-13B-GPTQ"


use_triton = False  # set to True only if you have the Triton GPTQ kernels set up


tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
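As an optional check that the tokenizer files downloaded correctly, you can round-trip a short string (not part of the original commands, just a sanity test):

ids = tokenizer("Tell me about Stoics", return_tensors="pt").input_ids
print(ids.shape)                 # torch.Size([1, <number of tokens>])
print(tokenizer.decode(ids[0]))  # should reproduce the text (with a leading <s> token)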


# Load the GPTQ-quantized model onto the first GPU
model = AutoGPTQForCausalLM.from_quantized(model_name_or_path,
        use_safetensors=True,     # the weights are shipped as .safetensors
        trust_remote_code=False,
        device="cuda:0",
        use_triton=use_triton,
        quantize_config=None)     # quantization settings are read from the model repo
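Optionally, check how much GPU memory the quantized model occupies after loading; the exact figure depends on which GPTQ branch was downloaded, so treat this only as a rough indicator:

import torch

print(f"GPU memory allocated: {torch.cuda.memory_allocated(0) / 1024**3:.1f} GiB")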

prompt = "Tell me about Stoics"

prompt_template = f'''### Instruction:

{prompt}

### Response:
'''


print("\n\n*** Generate:")


logging.set_verbosity(logging.CRITICAL)


print("*** Pipeline:")

pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=512,
    temperature=0.7,
    top_p=0.95,
    repetition_penalty=1.15
)


print(pipe(prompt_template)[0]['generated_text'])
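If you prefer not to use the pipeline helper, the same quantized model can be driven directly through model.generate. This is a sketch that mirrors the sampling parameters used above; do_sample=True is added so that temperature and top_p actually take effect:

input_ids = tokenizer(prompt_template, return_tensors="pt").input_ids.to("cuda:0")
output = model.generate(
    inputs=input_ids,
    do_sample=True,
    temperature=0.7,
    top_p=0.95,
    repetition_penalty=1.15,
    max_new_tokens=512,
)
print(tokenizer.decode(output[0]))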
