VRINDA PROPRIETARY LICENSE AGREEMENT
VRINDA-8B Release Date: December 18, 2024
"Agreement" means the terms and conditions for use, reproduction, distribution and modification of the VRINDA Materials set forth herein.
"Documentation" means the specifications, manuals and documentation accompanying VRINDA-8B distributed by Ekoaham Dutivnasti at https://vrinda.ekoahamdutivnasti.com.
"Licensee" or "you" means you, or your employer or any other person or entity (if you are entering into this Agreement on such person or entity's behalf), of the applicable age required to consent to the terms of this Agreement.
"VRINDA Materials" means, collectively, Ekoaham Dutivnasti's proprietary VRINDA-8B model and any Documentation made available under this Agreement.
1. License Rights and Redistribution
a. Grant of Rights. Subject to the terms of this Agreement, Ekoaham Dutivnasti hereby grants you a non-exclusive, non-transferable, revocable license to:
- Use the VRINDA Materials for personal, non-commercial purposes
- Use the VRINDA Materials for academic research and evaluation
- Use the VRINDA Materials for educational purposes
b. Restrictions. You shall NOT:
- Use the VRINDA Materials for any commercial purpose without written permission
- Modify, fine-tune, or create derivative works from the VRINDA Materials
- Redistribute, share, or transfer the VRINDA Materials to any third party
- Sell, sublicense, or commercially exploit the VRINDA Materials
- Use the VRINDA Materials to train other AI models
- Reverse engineer or extract components from the VRINDA Materials
- Use the VRINDA Materials for any illegal, harmful, or malicious purposes
2. Additional Commercial Terms
If you wish to use VRINDA Materials for commercial purposes, you must obtain a separate commercial license from Ekoaham Dutivnasti. Contact: [email protected]
3. Disclaimer of Warranty
THE VRINDA MATERIALS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED. EKOAHAM DUTIVNASTI DISCLAIMS ALL WARRANTIES INCLUDING MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.
4. Limitation of Liability
IN NO EVENT SHALL EKOAHAM DUTIVNASTI BE LIABLE FOR ANY INDIRECT, INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES ARISING FROM USE OF THE VRINDA MATERIALS.
5. Termination
This Agreement terminates automatically upon any violation of its terms. Upon termination, you must delete all copies of the VRINDA Materials.
6. Governing Law
This Agreement shall be governed by the laws of India.
By clicking "Submit" below, you accept the terms of this Agreement and acknowledge that you have read our Privacy Policy.
Overview
VRINDA-8B is an 8-billion-parameter large language model designed for seamless bilingual conversation in English and Hindi. Built with a focus on natural dialogue, emotional intelligence, and cultural understanding, VRINDA is an AI assistant tailored for the Indian subcontinent.
Highlights
| Feature | Description |
|---|---|
| Architecture | Transformer-based decoder-only model |
| Parameters | 8 Billion |
| Context Window | 32,768 tokens |
| Languages | English, Hindi, Hinglish |
| Precision | BFloat16 |
Model Capabilities
🗣️ Bilingual Fluency
Native-level understanding and generation in both English and Hindi, with natural code-switching (Hinglish) support.
🧠 Advanced Reasoning
Strong performance on mathematical reasoning, logical deduction, and multi-step problem solving.
💬 Conversational Excellence
Designed for natural, engaging dialogue with personality and emotional awareness.
🇮🇳 Cultural Intelligence
Deep understanding of Indian context, expressions, idioms, and cultural nuances.
Quick Start
Installation
```shell
pip install transformers torch accelerate bitsandbytes  # bitsandbytes is only needed for 4-bit loading
```
Basic Usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_id = "ekoahamdutivnasti/VRINDA"

# Load model and tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True
)

# Create conversation
messages = [
    {"role": "system", "content": "You are VRINDA, a helpful AI assistant."},
    {"role": "user", "content": "Hello VRINDA, how are you?"}
]

# Generate response
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(text, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=512,
    temperature=0.7,
    top_p=0.9,
    do_sample=True
)

# Decode only the newly generated tokens, not the echoed prompt
response = tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
print(response)
```
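For multi-turn conversation, the standard pattern is to append each assistant reply back into `messages` so the next turn sees the full history. A minimal sketch of that loop is below; `generate_reply` is a hypothetical placeholder standing in for the tokenize/generate/decode steps shown above, stubbed here so the bookkeeping is visible on its own.

```python
# Sketch of a multi-turn chat loop. `generate_reply` stands in for the
# tokenize -> model.generate -> decode pipeline from Basic Usage.

def chat_turn(messages, user_text, generate_reply):
    """Append a user turn, obtain a reply, and record it in the history."""
    messages.append({"role": "user", "content": user_text})
    reply = generate_reply(messages)
    messages.append({"role": "assistant", "content": reply})
    return reply

# Example with a stubbed backend (swap in the real generation pipeline):
history = [{"role": "system", "content": "You are VRINDA, a helpful AI assistant."}]
echo = lambda msgs: f"(reply to: {msgs[-1]['content']})"
chat_turn(history, "Namaste VRINDA!", echo)
chat_turn(history, "Mujhe Hindi mein ek sher sunao.", echo)
print(len(history))  # 5: one system turn, two user turns, two assistant turns
```

Keeping the full history in `messages` is what lets `apply_chat_template` render the whole conversation each turn; trim the oldest turns if the rendered prompt approaches the context window.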
Quantized Inference (Low VRAM)
```python
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig
import torch

quantization_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4"
)

tokenizer = AutoTokenizer.from_pretrained("ekoahamdutivnasti/VRINDA", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "ekoahamdutivnasti/VRINDA",
    quantization_config=quantization_config,
    device_map="auto",
    trust_remote_code=True
)
# Generation then proceeds exactly as in Basic Usage above.
```
Technical Specifications
| Property | Value |
|---|---|
| Model Type | Causal Language Model (Decoder-only) |
| Total Parameters | 8,000,000,000 |
| Hidden Size | 4,096 |
| Intermediate Size | 12,288 |
| Number of Layers | 36 |
| Attention Heads | 32 |
| Key-Value Heads | 8 (GQA) |
| Head Dimension | 128 |
| Vocabulary Size | 151,936 |
| Max Position Embeddings | 40,960 |
| RoPE Theta | 1,000,000 |
| Activation Function | SiLU |
| Normalization | RMSNorm (eps=1e-6) |
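The GQA entry in the table has a concrete memory implication: caching only 8 key-value heads instead of all 32 attention heads shrinks the KV cache fourfold. The sketch below derives the cache size directly from the table values (bf16 = 2 bytes per value); the full-MHA line is a hypothetical comparison, not a supported configuration.

```python
# Per-token KV-cache size: 2 tensors (K and V) x layers x KV heads
# x head dim x bytes per value, using the specification table above.
layers, head_dim, bytes_per_val = 36, 128, 2

def kv_cache_gib(seq_len, n_kv_heads):
    """KV-cache size in GiB for a given sequence length and KV head count."""
    per_token = 2 * layers * n_kv_heads * head_dim * bytes_per_val
    return seq_len * per_token / 2**30

print(f"GQA (8 KV heads), 32k context  : {kv_cache_gib(32_768, 8):.1f} GiB")
print(f"Full MHA (32 heads), 32k context: {kv_cache_gib(32_768, 32):.1f} GiB")
```

At the advertised 32,768-token context, GQA keeps the cache around 4.5 GiB rather than the 18 GiB full multi-head attention would need.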
Training
VRINDA-8B was trained on carefully curated datasets encompassing:
- Conversational Data: Natural dialogue patterns in English and Hindi
- Emotional Intelligence: Empathetic response generation
- Cultural Content: Indian context, expressions, and cultural knowledge
- Reasoning Tasks: Mathematical and logical problem-solving
- Bilingual Corpus: Code-switching and translation capabilities
Intended Use
Recommended Applications
- Personal AI assistants
- Customer support chatbots (Hindi/English)
- Educational tools
- Content generation
- Research and development
Out-of-Scope Use
- Medical, legal, or financial advice
- Critical decision-making systems
- Generating harmful or misleading content
- Any application violating the license terms
Limitations
- May occasionally generate factually incorrect information
- Performance may vary on highly specialized technical domains
- Should not be used as a sole source for critical decisions
- May reflect biases present in training data
License
This model is released under the VRINDA Proprietary License.
| Permission | Status |
|---|---|
| Personal Use | ✅ Allowed |
| Research Use | ✅ Allowed |
| Commercial Use | ❌ Requires License |
| Modification | ❌ Prohibited |
| Redistribution | ❌ Prohibited |
See LICENSE for complete terms and conditions.
Contact
| Resource | Details |
|---|---|
| Organization | Ekoaham Dutivnasti |
| Business Inquiries | [email protected] |
| Website | ekoahamdutivnasti.com |
| VRINDA Web App | vrinda.ekoahamdutivnasti.com |
| Android App | Google Play Store |
| Privacy Policy | Privacy Policy |
Citation
```bibtex
@misc{vrinda8b2024,
  title={VRINDA-8B: A Bilingual Large Language Model for English and Hindi},
  author={Ekoaham Dutivnasti},
  year={2024},
  publisher={HuggingFace},
  url={https://huggingface.co/ekoahamdutivnasti/VRINDA}
}
```
VRINDA-8B — Bridging Languages, Understanding Cultures
© 2024 Ekoaham Dutivnasti. All rights reserved.
Built upon open-source foundation model architecture. We acknowledge the contributions of the open-source AI community.