M3 Demo Day returns showcasing XR Publisher, 3D AI Avatars, Unreal Engine integration, and innovative AI shows like Clank Tank and Agent-TV. Discover tools powering the next generation of digital characters and interactive experiences.
Shoutout to voxvienne for designing the world: https://hyperfy.io/demoday
XR Publisher platform enables creation of digital characters that exist across multiple platforms
Key features:
Internal memory system powered by Eliza OS, storing interactions, replies, and streams
Date planner feature for scheduling and goal tracking
Asset management system for inventory items that follow characters across platforms
Character system upgrades with companion characters (e.g., Lumi helps generate weekly reports)
Plugin system with hook architecture for extending functionality (e.g., MIDI Generator plugin); see the sketch after this list
Open-source API backend for world publishing, user authentication, and character management
New local-only mode that eliminates need for API deployment during development
Export-import feature to transfer characters between local and edge mode
Cross-platform capabilities (Discord, Telegram, Twitch, YouTube)
Streaming system with real-time voice generation and multiple camera presets
Comprehensive security features including authentication checks and ownership verification
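
The hook architecture behind the plugin system wasn't shown in code, so here is a minimal sketch of how a hook-based plugin (in the spirit of the MIDI Generator above) might register with a host. The interface and function names are hypothetical, not XR Publisher's actual API.

```typescript
// Hypothetical sketch of a hook-based plugin API; none of these names
// come from XR Publisher itself.
interface HookContext {
  characterId: string;
  message: string;
}

interface Plugin {
  name: string;
  // Hooks the host calls at fixed points in the character's lifecycle.
  onMessage?: (ctx: HookContext) => Promise<string | undefined>;
  onWeeklyReport?: (ctx: { characterId: string }) => Promise<string>;
}

const plugins: Plugin[] = [];

function registerPlugin(plugin: Plugin): void {
  plugins.push(plugin);
}

// A toy "MIDI generator"-style plugin that reacts to chat messages.
registerPlugin({
  name: "midi-generator",
  async onMessage(ctx) {
    if (ctx.message.startsWith("/midi")) {
      return `Generated a MIDI clip for ${ctx.characterId}`;
    }
    return undefined;
  },
});

// Host side: run every plugin's onMessage hook for an incoming message.
async function dispatchMessage(ctx: HookContext): Promise<string[]> {
  const replies: string[] = [];
  for (const p of plugins) {
    const out = await p.onMessage?.(ctx);
    if (out !== undefined) replies.push(out);
  }
  return replies;
}

dispatchMessage({ characterId: "lumi", message: "/midi upbeat loop" })
  .then((r) => console.log(r));
```

The point of the hook style is that the host never needs to know about individual plugins; it simply calls whatever hooks are registered at each lifecycle point.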

Building 3D avatars for AI agents via MOD3LS DAO (which also uses M3 as its logo, purely coincidentally)
Created an agent prototype called GIGAI that won Solana hackathon
GIGAI scrolls Twitter lists 24/7 and reacts to tweets in real-time
Offering an API to generate custom videos with one simple request (a hypothetical request sketch appears at the end of this section):
Custom branding
Animated characters in dynamic environments
Background music
TikTok-like subtitles
Building a studio for avatar creation with features:
Avatar import and creation
Brand customization
Environment creation
Integration with any TTS platform (Eleven Labs, Play.ai, Azure) with automatic lip sync
Multi-format video support (Instagram, Twitter, TikTok)
Revenue streams:
Decentralized node system where anyone can generate videos using their GPUs
M3 utility token for fee payments
Creating professional avatars with an in-house team
Support for importing avatars from platforms like Ready Player Me
Current goal: 10K active M3 agents by end of year
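
The "one simple request" video API was described only at a feature level, so the following is a purely illustrative sketch of what such a request could look like; the endpoint, field names, and auth scheme are assumptions rather than the real MOD3LS API.

```typescript
// Illustrative only: the endpoint and payload shape are guesses at what a
// "one request in, branded video out" API might accept.
interface VideoRequest {
  script: string;               // what the avatar should say
  avatarId: string;             // e.g. an imported Ready Player Me avatar
  branding: { logoUrl: string; primaryColor: string };
  environment: string;          // scene/backdrop to render the character in
  music: boolean;               // add background music
  subtitles: "tiktok" | "none"; // TikTok-style captions
  format: "instagram" | "twitter" | "tiktok";
}

async function requestVideo(req: VideoRequest): Promise<{ videoUrl: string }> {
  const res = await fetch("https://api.example.com/v1/videos", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.M3_API_KEY}`,
    },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`Video request failed: ${res.status}`);
  return res.json();
}

requestVideo({
  script: "GM! Here's what happened in the Eliza repo today.",
  avatarId: "gigai",
  branding: { logoUrl: "https://example.com/logo.png", primaryColor: "#00ffcc" },
  environment: "neon-city",
  music: true,
  subtitles: "tiktok",
  format: "tiktok",
}).then((r) => console.log(r.videoUrl));
```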

Unreal Engine plugin allowing integration with Eliza OS
Originally developed as a prototype for NPCs in the game "Bunging Industries"
Uses RESTful HTTP calls to Eliza instances for character interactions (an example call appears at the end of this section)
Compatible with multiple Eliza instances in one multiplayer session
Features simple drag-and-drop nodes in Unreal Engine for easy setup
Supports local instances and Fleek for hosting
Can connect chat and actions to different systems
Available on Fab now with basic functionality:
Create new character
Get agent characters
Edit existing ones
Message exchange
Open source (MIT license) on GitHub with documentation
Works on Unreal Engine versions from 4.27 to 5.5
Next steps: deeper integration with v2 of Eliza, connecting to common NPC behaviors like behavior trees and state trees
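
Because the plugin drives everything over plain HTTP, the same kind of call can be issued from any client for testing. The sketch below assumes an Eliza instance exposing the direct client's POST /<agentId>/message route; confirm the exact route and payload shape for your Eliza version.

```typescript
// Minimal sketch of the kind of RESTful call the Unreal plugin makes.
// Assumes an Eliza instance running locally with a POST /<agentId>/message
// route; verify the route and fields for your Eliza version.
const ELIZA_URL = "http://localhost:3000";

async function sendToAgent(agentId: string, text: string): Promise<string[]> {
  const res = await fetch(`${ELIZA_URL}/${agentId}/message`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text, userId: "npc-bridge", userName: "UnrealNPC" }),
  });
  if (!res.ok) throw new Error(`Eliza returned ${res.status}`);
  // The direct client typically responds with an array of message objects.
  const replies: Array<{ text: string }> = await res.json();
  return replies.map((r) => r.text);
}

sendToAgent("my-agent-id", "A player just walked into the tavern.")
  .then((lines) => lines.forEach((l) => console.log(l)));
```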

Clank Tank: AI show similar to Shark Tank where entrepreneurs pitch to AI investors
Show production process:
Pitcher submits basic info (name, bio, pitch details)
Build AI version of pitcher using submitted information
"Writer's room" (single or multi-agent) creates show script
Output is a generic JSON structure that can be played back in different ways
Structure called "Derpy Show" format:
Very simple event-driven system focusing on "load scene" and "speak" events
Stage handles rendering and character animations
Load scene event includes scene ID and actor IDs
Speak event includes actor ID, dialogue text, and emotion/action
Events can be asynchronous (scene loading, dialogue delivery)
Same JSON structure can be played back in different visual formats:
3D models in Unity/Unreal
2D representations (static images with dialogue)
Different camera angles and presentations
Documentation available for the JSON structure so others can build compatible systems; a hedged sketch of the format follows
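
Pieced together from the description above, a Derpy Show script reduces to a flat list of load-scene and speak events. The sketch below is inferred from the talk rather than copied from the documentation, so treat the field names as assumptions.

```typescript
// Inferred sketch of the "Derpy Show" event format described above;
// field names are assumptions, not the documented schema.
type ShowEvent =
  | { type: "loadScene"; sceneId: string; actorIds: string[] }
  | { type: "speak"; actorId: string; text: string; emotion: string };

// A tiny episode: load the tank, then two lines of dialogue.
const episode: ShowEvent[] = [
  { type: "loadScene", sceneId: "clank_tank_stage", actorIds: ["pitcher", "judge_1"] },
  { type: "speak", actorId: "pitcher", text: "Judges, meet my on-chain lemonade stand.", emotion: "excited" },
  { type: "speak", actorId: "judge_1", text: "What's your burn rate... of lemons?", emotion: "skeptical" },
];

// A stage (Unity, Unreal, or a flat 2D player) just walks the events in
// order; scene loading and dialogue playback can resolve asynchronously.
async function play(events: ShowEvent[], stage: { handle(e: ShowEvent): Promise<void> }) {
  for (const e of events) await stage.handle(e);
}
```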
AI News show in Unity 3D using the Derpy Show format
Show features AI characters discussing Eliza OS updates and news
Technical implementation:
Built on Unity 2022.3.53f1
Uses UniVRM for character models and animations
Dialogue generated as JSON script via Claude
Voice generation through Eleven Labs TTS
Oculus lip-sync system for mouth movement
IK rigging for character positioning
Multiple camera angles and scenes
Content creation process:
System collects updates from Eliza OS GitHub repository
Data is aggregated into JSON format
Claude processes data into structured script with scenes and dialogue
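
A minimal sketch of that pipeline, assuming the public GitHub REST API for recent commits and the Anthropic SDK for script generation; the prompt wording and model name are placeholders, not the show's actual configuration.

```typescript
import Anthropic from "@anthropic-ai/sdk";

// Sketch of the collect -> aggregate -> script pipeline described above.
// Repo, model name, and prompt wording are placeholders.
async function fetchRecentCommits(repo: string): Promise<string[]> {
  const res = await fetch(`https://api.github.com/repos/${repo}/commits?per_page=20`);
  const commits: Array<{ commit: { message: string } }> = await res.json();
  return commits.map((c) => c.commit.message);
}

async function buildEpisodeScript(): Promise<string> {
  const updates = await fetchRecentCommits("elizaOS/eliza");
  const client = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment

  const msg = await client.messages.create({
    model: "claude-3-5-sonnet-latest",
    max_tokens: 2000,
    messages: [{
      role: "user",
      content:
        "Turn these repo updates into a news-show script as JSON with " +
        "loadScene and speak events:\n" + updates.join("\n"),
    }],
  });

  // The response is a JSON script in the Derpy Show style described earlier.
  return msg.content[0].type === "text" ? msg.content[0].text : "";
}

buildEpisodeScript().then((script) => console.log(script));
```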
Current limitations and planned improvements:
Imperfect mouth sync with speech
Exploring Unity HDRP for better lighting
Adding features like news interviews with guests
Plans to refactor code for better scalability

24/7 AI "SNL-like" show focusing on natural, comedic interactions
Features a website where viewers can submit prompts for characters to act out
Real-time prompt processing via WebSocket connection to Unity (a viewer-side sketch appears at the end of this section)
Character system:
Uses FishAI Agent System with customizable character cards for personalities
Simple three-step prompt system for character creation
Speech patterns customization (catchphrases, typing styles)
Easy character addition by importing VRM/GLB/FBX models
Technical details:
Uses fish.audio (Tortoise-based) instead of Eleven Labs for better emotional expression
Uses Cognitive Computations' Dolphin Mixtral 8x22B model via OpenRouter for uncensored, more realistic dialogue generation (see the sketch after this list)
Ambient conversation generation when no prompts are submitted
Simple subtitle system
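
OpenRouter exposes an OpenAI-compatible chat endpoint, so the dialogue step can be reproduced roughly as below with the OpenAI SDK pointed at OpenRouter; the model slug and prompts are assumptions rather than Agent-TV's real setup.

```typescript
import OpenAI from "openai";

// OpenRouter speaks the OpenAI chat API, so dialogue generation can look
// roughly like this. The model slug is a guess at the Dolphin Mixtral 8x22B
// listing; check OpenRouter for the exact identifier.
const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
});

async function generateLine(character: string, prompt: string): Promise<string> {
  const completion = await client.chat.completions.create({
    model: "cognitivecomputations/dolphin-mixtral-8x22b",
    messages: [
      { role: "system", content: `You are ${character} on a live comedy show. Stay in character.` },
      { role: "user", content: prompt },
    ],
  });
  return completion.choices[0].message.content ?? "";
}

generateLine("a grumpy robot host", "Riff on today's crypto headlines in two sentences.")
  .then((line) => console.log(line));
```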
Future plans:
Implement viewer interaction features
Pull topics from social media
Add "Twitch Plays" style interaction where viewers can manipulate the scene
Potential monetization through paid interactions (e.g., $2 to set a character on fire)
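
The viewer-facing half of that WebSocket link can be very small. Here is a hedged sketch of a browser client pushing a prompt toward the Unity show; the URL and message shape are assumptions.

```typescript
// Hypothetical browser-side sketch: push a viewer prompt to the Unity show
// over WebSocket. URL and payload shape are assumptions.
const socket = new WebSocket("wss://example.com/agent-tv/prompts");

socket.addEventListener("open", () => {
  socket.send(JSON.stringify({
    type: "viewer_prompt",
    character: "gigai",
    prompt: "Argue about pineapple on pizza with the host.",
  }));
});

socket.addEventListener("message", (event) => {
  // Unity can acknowledge when the prompt has been queued or acted out.
  console.log("Show replied:", event.data);
});
```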
THANKS FOR COMING TO DEMO DAY!! VODs will be uploaded soon.
