JibayAI jibay-s1-en Language Model

Model Overview

jibay-s1-en is a 270-million-parameter conversational AI model developed by JibayAI. Built on the Gemma 3 architecture, it is designed specifically for chat and dialogue applications while maintaining strong general language understanding.


Key Features

  • Architecture: Based on the Gemma 3 architecture
  • Parameters: 270 million, balancing efficiency and performance
  • Primary Use Case: Chat and conversational AI applications
  • Flexibility: Supports fine-tuning and retraining for specific tasks
  • Language: English

Technical Specifications

  • Framework: PyTorch
  • Library: Transformers
  • Weights: Safetensors, stored in BF16
  • License: Apache License 2.0
  • Model Format: Compatible with Hugging Face Transformers
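
Since the published weights are stored as BF16 safetensors, loading them in that precision roughly halves memory use compared with float32. A minimal sketch; the torch_dtype argument is standard Transformers usage, not anything specific to this model:

import torch
from transformers import AutoModelForCausalLM

# Load the weights in bfloat16 to match their stored precision
model = AutoModelForCausalLM.from_pretrained(
    "JibayAI/jibay-s1-en",
    torch_dtype=torch.bfloat16,
)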

Intended Use

The model is designed for:

  • Conversational AI and chatbots
  • Dialogue systems
  • Customer support automation
  • Educational assistants
  • Creative writing assistance
  • General-purpose Q&A systems

Usage

Basic Inference

from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "JibayAI/jibay-s1-en"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Tokenize a prompt, generate up to 64 new tokens, and decode the result
inputs = tokenizer("Hello! What can you help me with?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
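
Because the model targets dialogue, prompts are best wrapped in the tokenizer's chat template rather than passed as raw text. The sketch below assumes the repository ships a Gemma-style chat template (check the tokenizer config to confirm) and reuses model and tokenizer from the snippet above:

# Format a conversation with the tokenizer's chat template, then generate
messages = [
    {"role": "user", "content": "Explain what a tokenizer does in one sentence."},
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
)
outputs = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))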

Fine-tuning

The model supports fine-tuning for specific domains or tasks using standard PyTorch and Transformers workflows.
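
As one illustration, a minimal causal-LM fine-tuning loop with the Hugging Face Trainer could look like the sketch below. The train.txt file, sequence length, and hyperparameters are placeholders, not recommendations from JibayAI:

from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

model_name = "JibayAI/jibay-s1-en"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Placeholder dataset: replace train.txt with your domain-specific corpus
dataset = load_dataset("text", data_files={"train": "train.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="jibay-s1-finetuned", num_train_epochs=1),
    train_dataset=tokenized,
    # Causal LM collator: labels are the input ids, shifted inside the model
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()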

License

This model is released under the Apache License 2.0. See the LICENSE file for details.

Requirements

  • PyTorch >= 1.9.0
  • Transformers >= 4.25.0
  • Python >= 3.8
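
A quick self-check from Python that the environment meets these minimums (the packaging module ships as a Transformers dependency):

import sys

import torch
import transformers
from packaging import version

# Verify the minimum versions listed above
assert sys.version_info >= (3, 8), "Python >= 3.8 required"
assert version.parse(torch.__version__) >= version.parse("1.9.0")
assert version.parse(transformers.__version__) >= version.parse("4.25.0")
print("Environment OK")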

Limitations

  • Primarily trained on English data
  • May require fine-tuning for specialized domains
  • Standard LLM limitations apply (potential for biased or incorrect outputs)

Ethical Considerations

Users should:

  • Implement appropriate content filtering (a minimal sketch follows below)
  • Monitor outputs for potential biases
  • Use responsibly and in accordance with ethical AI guidelines
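
As one concrete illustration of the first point, below is a deliberately simple keyword blocklist applied after generation. The blocked_terms list and is_allowed helper are hypothetical placeholders; production systems should rely on a dedicated moderation model or service rather than string matching:

# Minimal post-generation keyword filter (illustrative only).
# blocked_terms and is_allowed are hypothetical placeholders; real
# deployments should use a dedicated moderation model or service.
blocked_terms = ["example-banned-phrase", "another-banned-phrase"]

def is_allowed(text: str) -> bool:
    lowered = text.lower()
    return not any(term in lowered for term in blocked_terms)

reply = "A harmless example reply."
print(reply if is_allowed(reply) else "[response withheld by content filter]")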

Support

For questions and issues regarding the model, please open an issue in the project repository.

Citation

If you use this model in your research or applications, please cite JibayAI and reference the Gemma 3 architecture.


Model developed by JibayAI | Based on Gemma 3

© 2025 JibayAI
