Model Card: GPT-2-TREC

An in-domain GPT-2 model, pre-trained from scratch on text from the TREC dataset.

Model Details

Description

This model is based on the GPT-2 architecture and was pre-trained from scratch (in-domain) on the text of the TREC dataset, excluding its test split.

Checkpoints

Intermediate checkpoints from the pre-training process are available and can be accessed using specific tags, which correspond to training epochs and steps:

Epoch   Step    Tags
1       51      epoch-1, step-51
5       255     epoch-5, step-255
10      511     epoch-10, step-511
20      1023    epoch-20, step-1023
40      2046    epoch-40, step-2046
60      3070    epoch-60, step-3070
80      4093    epoch-80, step-4093
100     5116    epoch-100, step-5116
120     6140    epoch-120, step-6140
140     7163    epoch-140, step-7163
160     8186    epoch-160, step-8186
180     9210    epoch-180, step-9210
199     10200   epoch-199, step-10200

To load a model from a specific intermediate checkpoint, use the revision parameter with the corresponding tag:

from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("cglez/gpt2-trec", revision="<checkpoint-tag>")
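
Once loaded, a checkpoint behaves like any GPT-2 causal language model. Below is a minimal usage sketch, assuming the repository ships its own tokenizer; the checkpoint tag is taken from the table above, while the prompt and generation settings are illustrative:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the final-epoch checkpoint (tag from the checkpoints table).
tokenizer = AutoTokenizer.from_pretrained("cglez/gpt2-trec")
model = AutoModelForCausalLM.from_pretrained("cglez/gpt2-trec", revision="epoch-199")

# TREC texts are short questions, so a question-like prompt is a natural fit.
inputs = tokenizer("Who is the author of", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))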

Sources

  • Paper: [Information pending]

Training Details

For more details on the training procedure, please refer to the base model's documentation: Training procedure.

Training Data

All texts from the TREC dataset, excluding its test partition.
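
A minimal sketch of how such a training corpus could be assembled, assuming the dataset is the trec dataset on the Hugging Face Hub and that its text lives in a column named text (both are assumptions, not confirmed by this card):

from datasets import load_dataset

# Load TREC and keep only the train split; the test split is excluded by design.
trec = load_dataset("trec")
train_texts = trec["train"]["text"]
print(len(train_texts), "training texts")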

Training Hyperparameters

  • Precision: fp16
  • Batch size: 8
  • Gradient accumulation steps: 12
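
These settings map onto transformers TrainingArguments roughly as shown below. This is a sketch, not the authors' actual configuration: only the three values listed above are documented, and everything else is an assumption.

from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gpt2-trec",          # assumption: illustrative output path
    fp16=True,                       # Precision: fp16
    per_device_train_batch_size=8,   # Batch size: 8
    gradient_accumulation_steps=12,  # Gradient accumulation steps: 12
    num_train_epochs=200,            # assumption: inferred from the final tag epoch-199 (zero-indexed)
)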

Uses

For typical use cases and limitations, please refer to the base model's guidance: Intended uses & limitations.

Bias, Risks, and Limitations

This model inherits potential risks and limitations from the base model. Refer to: Limitations and bias.

Environmental Impact

  • Hardware Type: NVIDIA A100 PCIe 40 GB
  • Runtime: 28.5 hours
  • Cluster Provider: Artemisa
  • Compute Region: EU
  • Carbon Emitted: 4.42 kg CO2 eq.

Citation

BibTeX:

[More Information Needed]
