
Real-Time Retrieval Augmentation for Large Language Models
Udit Raj
Indian Institute of Technology Patna, Bihta - 801106

Corresponding Author: [email protected]


Abstract

We are in the midst of a race to see who can scrape the most of the internet and put it on their servers in the shortest timeframe. That is how language models are conventionally built. However, this process is slow and outdated: the resulting models hallucinate and do not know what is happening in the real world. That is why the conventional process needs a replacement. Real-Time Retrieval Augmentation (RTRA) is the alternative architecture that new-generation LLMs need to adopt. With RTRA, a model with just 7 billion parameters can beat models trained with hundreds of billions of parameters. This paper marks a paradigm shift by challenging our dependence on huge computational resources to build a precise and efficient model, introducing a novel approach to training and developing language models.
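
The abstract frames RTRA around fetching current information at query time rather than baking it into model weights. As a rough illustration of that retrieve-then-generate pattern (not the paper's implementation; the toy corpus, keyword-overlap scoring, and prompt format below are assumptions for the sketch):

```python
# Minimal sketch of real-time retrieval augmentation: relevant documents are
# fetched at query time and placed in the prompt, so even a small model can
# answer with up-to-date facts. All names here are illustrative assumptions.

from collections import Counter
from typing import List


def score(query: str, doc: str) -> float:
    """Crude keyword-overlap score standing in for a real retriever."""
    q_terms = Counter(query.lower().split())
    d_terms = Counter(doc.lower().split())
    return sum((q_terms & d_terms).values())


def retrieve(query: str, corpus: List[str], k: int = 2) -> List[str]:
    """Return the k corpus documents that best match the query."""
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]


def build_prompt(query: str, corpus: List[str]) -> str:
    """Augment the user query with freshly retrieved context."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"


if __name__ == "__main__":
    # Stand-in for a live, continuously updated document store.
    corpus = [
        "The 2024 AI safety summit was held in Seoul.",
        "Transformers were introduced in 2017.",
        "Retrieval augmentation supplies current facts at inference time.",
    ]
    prompt = build_prompt("Where was the 2024 AI safety summit held?", corpus)
    print(prompt)  # This prompt would then be passed to a (small) language model.
```

In a full system, the in-memory list would be replaced by a live index over the web or another continuously refreshed source, which is what lets a 7B-parameter model stay current without retraining.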
Submitted to TechRxiv: 25 Apr 2024
Published in TechRxiv: 02 May 2024