The HXA079 family of hybrid models combines RWKV recurrent architectures with Transformer-based attention, and is designed for efficient long-context processing.
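The sketch below illustrates the general idea of such a hybrid: some decoder layers use an RWKV-style linear recurrence with a fixed-size state, while others use full softmax attention. The layer layout, module names, and recurrence form here are assumptions for illustration only, not the actual HXA079 implementation.

```python
# Minimal hybrid-decoder sketch (illustrative only; not the HXA079 code).
import torch
import torch.nn as nn


class SimpleRecurrentMix(nn.Module):
    """Toy RWKV-style time mixing: a per-channel exponential moving average."""

    def __init__(self, dim: int):
        super().__init__()
        self.decay = nn.Parameter(torch.zeros(dim))  # per-channel decay logit
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, dim); the scan carries a constant-size state per step.
        w = torch.sigmoid(self.decay)
        state = torch.zeros_like(x[:, 0])
        outs = []
        for t in range(x.size(1)):
            state = w * state + (1 - w) * x[:, t]
            outs.append(state)
        return self.proj(torch.stack(outs, dim=1))


class HybridBlock(nn.Module):
    """One decoder block: recurrent mixing or full attention, followed by an MLP."""

    def __init__(self, dim: int, n_heads: int, use_attention: bool):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.use_attention = use_attention
        if use_attention:
            self.mixer = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        else:
            self.mixer = SimpleRecurrentMix(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.norm1(x)
        if self.use_attention:
            # Causal mask keeps the attention layers autoregressive.
            mask = torch.triu(torch.full((x.size(1), x.size(1)), float("-inf")), diagonal=1)
            h, _ = self.mixer(h, h, h, attn_mask=mask)
        else:
            h = self.mixer(h)
        x = x + h
        return x + self.mlp(self.norm2(x))


if __name__ == "__main__":
    dim, n_heads, n_layers = 64, 4, 6
    # Hypothetical layout: one attention layer for every three recurrent layers.
    blocks = nn.Sequential(*[HybridBlock(dim, n_heads, use_attention=(i % 3 == 2))
                             for i in range(n_layers)])
    x = torch.randn(2, 16, dim)
    print(blocks(x).shape)  # torch.Size([2, 16, 64])
```

The recurrent layers keep per-token cost and memory constant with sequence length, while the interleaved attention layers retain exact token-to-token lookups, which is the usual motivation for this kind of hybrid.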
OpenMOSE
AI & ML interests
Can love be expressed as a tensor?
Recent Activity
Updated a model about 12 hours ago: OpenMOSE/RWKV-Qwen3-30B-A3B-Instruct-hxa07d
Published a model about 13 hours ago: OpenMOSE/RWKV-Qwen3-30B-A3B-Instruct-hxa07d
New activity 10 days ago: OpenMOSE/Qwen3-VL-REAP-145B-A22B: 50% prune?
Organizations
None yet