Custom GPT Implementation
Developed a miniature GPT model implementing custom tokenization, causal self-attention, and stacked transformer blocks. Designed the neural architecture with positional embeddings, achieving coherent autoregressive text generation. Includes a training pipeline, evaluation metrics, and sample-generation scripts. Compared performance against open-source transformer models and visualized attention weights for interpretability.

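The self-attention step described above can be sketched minimally in NumPy. This is an illustrative single-head version, not the project's actual code: the function name, the weight matrices `Wq`, `Wk`, `Wv`, and the single-head simplification are all assumptions for clarity (a real GPT uses multiple heads and learned projections inside transformer blocks).

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the row max for numerical stability before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product attention with a causal mask.

    x: (T, d) sequence of token embeddings (positional info already added).
    Wq, Wk, Wv: (d, d) projection matrices (illustrative names).
    Returns the attended values (T, d) and the attention weights (T, T).
    """
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = (q @ k.T) / np.sqrt(d)          # scaled dot products, (T, T)
    # causal mask: position t may only attend to positions <= t,
    # which is what makes generation autoregressive
    causal = np.tril(np.ones((T, T), dtype=bool))
    scores = np.where(causal, scores, -np.inf)
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ v, weights

# tiny usage example with random weights
rng = np.random.default_rng(0)
T, d = 5, 8
x = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out, w = causal_self_attention(x, Wq, Wk, Wv)
```

The returned `weights` matrix is exactly what the project's attention-weight visualizations would plot: entry `(t, s)` shows how much position `t` attends to position `s`, and the causal mask forces everything above the diagonal to zero.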