Friday, September 1, 2023

Install and Train Model on AWS SageMaker - Step by Step

This tutorial shows, step by step, how to install Ludwig and use it to quickly fine-tune a quantized Llama 2 model on AWS SageMaker.



Commands Used:


# Install transformers and build AutoGPTQ from source at a pinned commit
!pip install transformers
!git clone https://github.com/PanQiWei/AutoGPTQ
%cd AutoGPTQ
!git checkout a7167b1
!pip3 install .

# Remove the preinstalled TensorFlow and install Ludwig with its LLM extras
!pip uninstall -y tensorflow --quiet
!pip install "ludwig[llm]" --quiet
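
As a quick sanity check before moving on, you can confirm that the key libraries import cleanly and that a GPU is visible to the notebook kernel. A minimal sketch (the versions printed will depend on what pip resolved above):

import torch
import ludwig
import transformers

# Print the resolved versions and confirm a CUDA GPU is visible to this kernel
print("ludwig:", ludwig.__version__)
print("transformers:", transformers.__version__)
print("CUDA available:", torch.cuda.is_available())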


!pip install huggingface_hub

from huggingface_hub import notebook_login

notebook_login()
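
If you are running this as a non-interactive script or SageMaker job rather than in a notebook, you can authenticate programmatically instead. A minimal sketch, assuming your Hugging Face token is exposed through an environment variable named HF_TOKEN (the variable name is just an example):

import os
from huggingface_hub import login

# Log in with a token instead of the interactive notebook widget
login(token=os.environ["HF_TOKEN"])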


import yaml


config_str = """
model_type: llm
base_model: TheBloke/Llama-2-7B-GPTQ

quantization:
  bits: 4

adapter:
  type: lora

prompt:
  template: |
    ### Instruction:
    {instruction}

    ### Input:
    {input}

    ### Response:

input_features:
  - name: prompt
    type: text

output_features:
  - name: output
    type: text

trainer:
  type: finetune
  learning_rate: 0.0001
  batch_size: 1
  gradient_accumulation_steps: 16
  epochs: 3
  learning_rate_scheduler:
    warmup_fraction: 0.01

preprocessing:
  sample_ratio: 0.1
"""


config = yaml.safe_load(config_str)
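
The prompt template in the config is filled in from columns of the training data. The Alpaca dataset used below provides instruction, input and output columns, so each row is rendered into the prompt input feature roughly as follows. An illustrative sketch with a made-up row, not an actual record from the dataset:

example_row = {
    "instruction": "Summarize the following text.",
    "input": "Ludwig is a declarative machine learning framework.",
}

# Substitute the row's columns into the prompt template, much as Ludwig does during preprocessing
template = (
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n"
)
print(template.format(**example_row))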



import logging
from ludwig.api import LudwigModel

# Build the model from the config and fine-tune it on Ludwig's built-in Alpaca dataset
model = LudwigModel(config=config, logging_level=logging.INFO)
results = model.train(dataset="ludwig://alpaca")
print(results)
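
Once training finishes, you can save the fine-tuned model and run a quick prediction to sanity-check it. A minimal sketch using the Ludwig API; the output path and the test prompt are just examples, and exact prediction behavior can vary between Ludwig versions:

import pandas as pd

# Persist the trained model (config, weights, and LoRA adapter) to a local directory
model.save("./fine_tuned_llama2")

# Run a quick prediction on a hand-written Alpaca-style example
test_df = pd.DataFrame([
    {"instruction": "Name the capital of France.", "input": ""}
])
predictions, _ = model.predict(dataset=test_df)
print(predictions)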

