LLMs In-Context Learning (ICL) — Hands-On Example

Buse Köseoğlu
7 min read · Apr 17, 2024

ICL was formally introduced with GPT-3. It obtains the desired output by giving the language model examples of one or more tasks in the prompt, without any training or gradient updates.

ICL is a prompt-engineering method. Instead of fine-tuning the model for a specific task, the training examples are given directly in the prompt. Model weights do not change in ICL.

Larger models have stronger ICL ability and give better results.
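
To make this concrete, here is a minimal sketch of how a one-shot prompt can be assembled by hand. The dialogue and summary below are made-up placeholders, not examples from the dataset used later.

# A hypothetical one-shot prompt: one solved example followed by the new task.
# The model imitates the pattern it sees; no weights are updated.
example_dialogue = "#Person1#: Are you free tomorrow? #Person2#: Yes, after 3 pm."
example_summary = "#Person2# is free after 3 pm tomorrow."
new_dialogue = "#Person1#: Can we move our meeting? #Person2#: Sure, let's do Friday."

one_shot_prompt = f"""Dialogue:

{example_dialogue}

What was going on?
{example_summary}

Dialogue:

{new_dialogue}

What was going on?
"""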

To work through the example, we first need to install the necessary libraries.

%pip install --upgrade pip
%pip install --disable-pip-version-check \
torch==1.13.1 \
torchdata==0.5.1 --quiet
%pip install \
transformers==4.27.2 \
datasets==2.11.0 --quiet

We will use Hugging Face’s datasets and transformers libraries.

  • Datasets hosted on the Hugging Face Hub can be loaded and processed through the datasets library API.
  • Pretrained models can be loaded, trained, and saved with the transformers library API.

from datasets import load_dataset
from transformers import AutoModelForSeq2SeqLM
from transformers import AutoTokenizer
from transformers import GenerationConfig
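
With the imports in place, a model and tokenizer can be loaded. Below is a minimal sketch, assuming the google/flan-t5-base checkpoint; the exact model used in the full article is not visible in this excerpt.

# Load a sequence-to-sequence model and its tokenizer.
# google/flan-t5-base is an assumption for illustration.
model_name = "google/flan-t5-base"
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Generate a completion for the one-shot prompt built earlier.
inputs = tokenizer(one_shot_prompt, return_tensors="pt")
output = model.generate(
    inputs["input_ids"],
    generation_config=GenerationConfig(max_new_tokens=50),
)
print(tokenizer.decode(output[0], skip_special_tokens=True))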

DialogSum is a large-scale dialogue summarization dataset, consisting of 13,460 dialogues (plus 100 holdout dialogues for topic generation).
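
The excerpt stops here, but as a sketch the dataset can be loaded from the Hugging Face Hub. The knkarthick/dialogsum Hub ID is an assumption, since the loading code itself is cut off.

# Load DialogSum from the Hub (knkarthick/dialogsum is assumed here).
huggingface_dataset_name = "knkarthick/dialogsum"
dataset = load_dataset(huggingface_dataset_name)
print(dataset)  # train/validation/test splits with dialogue, summary and topic fields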
