Hi! I'm Shreeya, an Applied Scientist specializing in machine learning and natural language processing (NLP), with a particular passion for low-resource NLP, especially in South Asian languages.
Welcome to #icodeformyभाषा! In this blog I will be exploring the intricacies of Nepali grammar. I am fascinated by the rich morphology of Nepali, and that passion is what fuels #icodeformyभाषा. In addition, I will also write on the architecture and foundations of LLMs and discuss my research and findings in low-resource NLP.
You can also find me on LinkedIn!
Here are some recent posts:
Low-Resource NLP in the Era of LLMs - Introduction
There have undoubtedly been drastic shifts in the landscape of Natural Language Processing (NLP) research and development with the breakthrough of Large Language Models (LLMs) like ChatGPT. However, the majority of LLMs are optimized for a few high-resource languages such as English. This is because these LLMs are pretrained on large corpora of text f…
Do LLMs Engage in True Reasoning?
Can LLMs “truly” reason? Whether LLMs are truly capable of reasoning is one of the most widely discussed questions in the field of AI today. There are a significant number of studies and claims both for and against LLMs exhibiting reasoning capabilities. This discussion is not limited to the AI research community but extends…
Strawberry (o1) - Does changing language affect its reasoning?
On September 12, OpenAI released its new series of AI models, called o1, trained with reinforcement learning to perform complex reasoning. Two versions of the model were released, o1-preview and o1-mini. These models are trained to “think” before answering, that is, to generate a long chain of thought (CoT) before answering. This allows the model…
Please reach out via DMs if you are interested in collaborating on the opinion piece or collecting reading materials!
