---
title: Ghost Malone
emoji: 👻
colorFrom: purple
colorTo: blue
sdk: docker
pinned: false
---
# Ghost Malone: Emotion-Aware MCP System
**📢 Project Announcement:** https://x.com/badtrapazoid/status/1990539436795572375

**🌐 Live Space:** https://huggingface.co/spaces/MCP-1st-Birthday/ghostMalone

A three-server emotional engine built for the MCP ecosystem.
## What It Does
Ghost Malone turns raw text into a structured emotional understanding, remembers past patterns, and produces responses that fit the user’s state.
Not a chatbot — a small, disciplined mind.
## Architecture (Three Servers)
USER → Orchestrator → Emotion Server → Memory Server → Reflection Server → Output
### 1. Emotion Server
- Russell's Circumplex (valence + arousal)
- Fast pattern matching across 8 affect states
- Outputs: labels, valence, arousal, tone
- ~31ms latency
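A minimal sketch of what circumplex-style pattern matching could look like. The affect states, keywords, coordinates, and tone rule below are illustrative placeholders, not the server's actual lexicon:

```python
# Illustrative sketch of keyword matching onto Russell's Circumplex.
# Affect states, keywords, coordinates, and the tone rule are placeholders,
# not the Emotion Server's real lexicon.
AFFECT_STATES = {
    # label:   (valence, arousal, example keywords)
    "happy":   ( 0.7, 0.6, ["glad", "great", "excited"]),
    "calm":    ( 0.5, 0.2, ["peaceful", "relaxed"]),
    "sad":     (-0.6, 0.3, ["sad", "down", "alone"]),
    "lonely":  (-0.6, 0.4, ["isolated", "alone", "left out"]),
    "anxious": (-0.5, 0.7, ["worried", "scared", "uncertain"]),
    "angry":   (-0.7, 0.8, ["furious", "angry", "unfair"]),
    "tired":   (-0.3, 0.1, ["exhausted", "drained", "burnt out"]),
    "proud":   ( 0.6, 0.5, ["accomplished", "proud"]),
}

def analyze_emotion(text: str) -> dict:
    """Return matched labels plus averaged valence/arousal and a rough tone."""
    lowered = text.lower()
    hits = [(label, v, a)
            for label, (v, a, keywords) in AFFECT_STATES.items()
            if any(kw in lowered for kw in keywords)]
    if not hits:
        return {"labels": [], "valence": 0.0, "arousal": 0.0, "tone": "neutral"}
    valence = sum(v for _, v, _ in hits) / len(hits)
    arousal = sum(a for _, _, a in hits) / len(hits)
    return {
        "labels": [label for label, _, _ in hits],
        "valence": round(valence, 2),
        "arousal": round(arousal, 2),
        "tone": "gentle" if valence < 0 else "warm",  # placeholder tone rule
    }
```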
### 2. Memory Server
- Rolling 50-entry history
- Stores text + emotional metadata
- Recalls past patterns for personalization
- ~66ms latency
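A rolling history like this can be sketched with a bounded deque; the entry fields here are assumptions, not the Memory Server's real schema:

```python
from collections import deque

# Sketch of a rolling 50-entry memory; entry fields are assumptions.
class EmotionalMemory:
    def __init__(self, max_entries: int = 50):
        self.entries = deque(maxlen=max_entries)  # oldest entries fall off the end

    def store(self, text: str, emotion: dict) -> None:
        self.entries.append({"text": text, "emotion": emotion})

    def recall(self, labels: list[str], limit: int = 3) -> list[dict]:
        """Return the most recent stored entries that share an emotion label."""
        matches = [entry for entry in self.entries
                   if set(entry["emotion"]["labels"]) & set(labels)]
        return matches[-limit:]
```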
### 3. Reflection Server
- Claude-driven tone adaptation
- Uses emotion + memory + need to shape the response
- ~5.3s latency (dominant cost)
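A hedged sketch of how the reflection call might be assembled with the Anthropic SDK; the prompt wording and model id are assumptions, not the server's actual implementation:

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

def reflect(text: str, emotion: dict, need: str, memories: list[dict]) -> str:
    """Ask Claude for a tone-matched reflection; the prompt here is illustrative."""
    memory_lines = "\n".join(f"- {m['text']}" for m in memories) or "(none)"
    system = (
        "Reply with a short, grounded reflection matched to the user's state: "
        f"emotions {', '.join(emotion['labels'])}, valence {emotion['valence']}, "
        f"arousal {emotion['arousal']}, inferred need: {need}. "
        f"Relevant earlier messages:\n{memory_lines}"
    )
    reply = client.messages.create(
        model="claude-sonnet-4-5",  # assumed model id; swap for whatever the Space uses
        max_tokens=300,
        system=system,
        messages=[{"role": "user", "content": text}],
    )
    return reply.content[0].text
```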
## Two Lexicons (Core Intelligence)

### Needs Lexicon
- 5 core needs: autonomy, connection, security, rest, recognition
- 24 context patterns → 47 inference rules
- Aligns emotion with human motive
- 95.2% accuracy against the BPNSFS scale
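As a rough illustration of how context patterns could map emotions onto needs (the real lexicon has 24 patterns and 47 rules; every pattern and weight below is invented for shape only):

```python
# Illustrative inference rules; patterns, weights, and structure are assumptions.
NEED_RULES = [
    # (emotion label, context keywords,          need,          weight)
    ("lonely",  ["alone", "isolated", "nobody"],  "connection",  0.90),
    ("sad",     ["left out", "ignored"],          "connection",  0.80),
    ("angry",   ["trapped", "no control"],        "autonomy",    0.90),
    ("anxious", ["uncertain", "what's going to"], "security",    0.85),
    ("tired",   ["burnt out", "drained"],         "rest",        0.90),
    ("sad",     ["invisible", "unappreciated"],   "recognition", 0.85),
]

def infer_need(text: str, labels: list[str]) -> tuple[str, float]:
    """Return the highest-scoring need for the matched emotion labels."""
    lowered = text.lower()
    scores: dict[str, float] = {}
    for label, keywords, need, weight in NEED_RULES:
        if label in labels and any(kw in lowered for kw in keywords):
            scores[need] = max(scores.get(need, 0.0), weight)
    if not scores:
        return ("connection", 0.0)  # neutral fallback; arbitrary choice in this sketch
    best = max(scores, key=scores.get)
    return (best, scores[best])
```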
### Intervention Lexicon
- Evidence-based strategies
- Constitutional gating:
  - confidence ≥ 0.70
  - arousal ≥ 0.40
  - depth ≥ 2 messages
- Prevents overstepping and unsolicited advice
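The three gate values above are the documented defaults; a simple gate check might look like this (the function itself is an assumed sketch, not the server's code):

```python
# Threshold defaults come from the README; the function shape is an assumption.
def intervention_allowed(confidence: float, arousal: float, depth: int,
                         min_confidence: float = 0.70,
                         min_arousal: float = 0.40,
                         min_depth: int = 2) -> bool:
    """All three constitutional gates must pass before advice is offered."""
    return (confidence >= min_confidence
            and arousal >= min_arousal
            and depth >= min_depth)
```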
## Pipeline (Six Steps)
1. Emotion analysis
2. Needs inference
3. Memory recall + store
4. Reflection (tone-aware response)
5. Intervention check
6. Response assembly
Total latency: ~5.5s.
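Composing the hypothetical helpers sketched in the sections above, the six steps might wire together roughly like this (the real orchestrator talks to the three servers over MCP, which is omitted here):

```python
# Sketch of the six-step pipeline, reusing the illustrative helpers above.
memory = EmotionalMemory()

def handle_message(text: str, depth: int) -> dict:
    emotion = analyze_emotion(text)                          # 1. emotion analysis
    need, confidence = infer_need(text, emotion["labels"])   # 2. needs inference
    recalled = memory.recall(emotion["labels"])              # 3. memory recall...
    memory.store(text, emotion)                              #    ...and store
    reflection = reflect(text, emotion, need, recalled)      # 4. tone-aware reflection
    gated = intervention_allowed(confidence, emotion["arousal"], depth)  # 5. gate
    return {                                                 # 6. response assembly
        "emotion": emotion,
        "need": {"label": need, "confidence": confidence},
        "reflection": reflection,
        "intervention": need if gated else None,
    }
```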
## What Makes It Different
1. **Needs, not just emotions.** "Sad" branches to different needs (connection vs. autonomy vs. security).
2. **Memory-aware.** Responses reference earlier feelings.
3. **Constitutional alignment.** No forced advice, no toxic positivity, and the user controls sensitivity via sliders.
4. **Tunable thresholds.** Real-time control of intervention behavior.
5. **Emotional trajectory visualization.** A simple plot showing how the user is moving on the Circumplex (see the sketch after this list).
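A trajectory plot of this kind can be sketched with matplotlib; the sample points below are invented, not real session data:

```python
import matplotlib.pyplot as plt

# Sketch of a circumplex trajectory plot; the sample points are made up.
def plot_trajectory(points: list[tuple[float, float]]) -> None:
    """points: (valence, arousal) pairs in conversation order."""
    valence, arousal = zip(*points)
    fig, ax = plt.subplots()
    ax.plot(valence, arousal, marker="o")           # path through the circumplex
    ax.axvline(0.0, linestyle="--", linewidth=0.5)  # neutral valence line
    ax.set_xlim(-1, 1)
    ax.set_ylim(0, 1)
    ax.set_xlabel("valence")
    ax.set_ylabel("arousal")
    ax.set_title("Emotional trajectory")
    plt.show()

plot_trajectory([(-0.6, 0.35), (-0.4, 0.30), (-0.1, 0.25)])  # illustrative path
```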
## Core Example (One Glance)
- **Input:** "I feel so isolated and alone."
- **Emotion:** sad, lonely (valence -0.6, arousal 0.4)
- **Need:** connection (0.92)
- **Memory:** user mentioned "feeling left out at work"
- **Response:** grounded, gentle reflection
- **Intervention (if gated):** connection strategies
## 🎯 Try These Examples

**For Connection needs:**
1. "I feel so isolated and alone"
2. "Nobody really understands what I'm going through"

**For Autonomy needs:**
1. "I feel so trapped and powerless in this situation"
2. "I have no control over anything anymore"

**For Security needs:**
1. "Everything feels so uncertain and scary"
2. "I'm worried about what's going to happen next"

**For Rest needs:**
1. "I'm so burnt out and exhausted"
2. "I'm completely drained and can't keep going"

**For Recognition needs:**
1. "Nobody notices all the work I do"
2. "I feel completely invisible and unappreciated"

💡 Interventions appear on the 2nd message when thresholds are met.