
About

Hi, I am Ubaada. I am a Software Engineer based in New Zealand.

Projects

Word Map:
Maps how machine learning models associate words with different cultures and countries in vector space.

Original Transformer Reimplementation:
Reimplementation and training of the original transformer model from the 2017 paper "Attention Is All You Need".

VBE tool:
A simple tool to convert decimal numbers to and from Variable Byte Encoding.
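As a rough illustration of what such a tool computes (not the tool's actual code), here is one common Variable Byte Encoding convention: each byte carries 7 payload bits, and the high bit marks the final byte of a number. Other conventions flip the marker bit, so treat this as a sketch of one variant.

```python
def vbe_encode(n):
    # Split n into 7-bit groups, least significant group first.
    out = []
    while True:
        out.append(n & 0x7F)
        n >>= 7
        if n == 0:
            break
    # Mark the last byte with the high bit (stop-bit convention).
    out[-1] |= 0x80
    return bytes(out)

def vbe_decode(data):
    # Accumulate 7 bits per byte until the stop bit is seen.
    n, shift = 0, 0
    for b in data:
        n |= (b & 0x7F) << shift
        shift += 7
        if b & 0x80:
            break
    return n
```

For example, 5 fits in one byte (0x85 with the stop bit set), while 300 needs two bytes; small numbers staying small is what makes VBE useful for compressing posting lists.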

Cleaned BookSum dataset:
A cleaned version of the BookSum dataset published on HuggingFace. The dataset is a collection of book chapters, whole books, and their summaries. BookSum dataset is used for training and evaluating summarization machine learning models.

Summarization LLM Models:
Some Efficient Attention Transformer models fine-tuned on the BookSum dataset above for summarization. Efficient Attention, as opposed to the regular attention mechanism used in Transformer language models like ChatGPT, allows us to process longer sequences of text more efficiently with fewer hardware resources.
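To make the efficiency point concrete, here is a toy NumPy sketch (assumed for illustration, not these models' actual mechanism): full attention builds an n-by-n score matrix, while a sliding-window variant lets each position attend only to a local window of earlier positions, so memory grows linearly with sequence length.

```python
import numpy as np

def full_attention(q, k, v):
    # Standard scaled dot-product attention; the n x n score
    # matrix is the quadratic bottleneck for long sequences.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def windowed_attention(q, k, v, window=4):
    # Each position attends only to itself and the `window`
    # positions before it: linear memory instead of quadratic.
    out = np.zeros_like(q)
    for i in range(len(q)):
        lo = max(0, i - window)
        out[i] = full_attention(q[i:i+1], k[lo:i+1], v[lo:i+1])
    return out
```

With a window as long as the sequence, the last position sees every key, so it matches the last row of full attention; shrinking the window trades a little context for much cheaper long-sequence processing.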

Search Engine:
An information retriever (search engine) written in C for parsing and searching the WSJ collection using an inverted index.
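The core idea behind an inverted index can be sketched in a few lines (a minimal Python illustration, not the C implementation): map each term to the set of documents containing it, then answer a query by intersecting those posting sets.

```python
from collections import defaultdict

def build_index(docs):
    # Map each term to the set of document IDs that contain it.
    index = defaultdict(set)
    for doc_id, text in enumerate(docs):
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    # AND semantics: intersect the posting sets of all query terms.
    postings = [index.get(t, set()) for t in query.lower().split()]
    return set.intersection(*postings) if postings else set()
```

A real engine like the WSJ one would add tokenization, compression of the posting lists (e.g. VBE), and ranking, but the index-then-intersect structure is the same.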

GitHub Stats

PHP - 31.9%
Java - 23.0%
C - 12.7%
Python - 10.5%
HTML - 9.8%
JavaScript - 7.4%
CSS - 4.6%
TSQL - 0.1%
Makefile - 0.1%