Contributing to RWKV

Support the project financially via Ko-fi

For donations, use the Ko-fi link: https://ko-fi.com/rwkv_lm

Help port X language bindings for RWKV.cpp

Help bring RWKV to new places by implementing bindings for the RWKV.cpp library in your language of choice. See the RWKV.cpp project.

For an example language binding, see the NodeJS project rwkv-cpp-node.
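
To illustrate what a binding involves, here is a minimal sketch of calling rwkv.cpp's C API from Python via ctypes. The shared library name, model path, and the function names and signatures used here (rwkv_init_from_file, rwkv_eval, rwkv_get_state_len, rwkv_get_logits_len) are assumptions about the rwkv.cpp header, not a verified API reference; check rwkv.h in the RWKV.cpp project for the exact interface before building on this.

```python
# Illustrative sketch only: function names/signatures are assumptions based on rwkv.h.
# Verify them against the header in the RWKV.cpp repository before use.
import ctypes

lib = ctypes.CDLL("./librwkv.so")  # assumed path to the compiled rwkv.cpp shared library

# Assumed C signatures (see rwkv.h):
#   struct rwkv_context * rwkv_init_from_file(const char * path, uint32_t n_threads);
#   bool rwkv_eval(struct rwkv_context *, uint32_t token,
#                  const float * state_in, float * state_out, float * logits_out);
lib.rwkv_init_from_file.restype = ctypes.c_void_p
lib.rwkv_init_from_file.argtypes = [ctypes.c_char_p, ctypes.c_uint32]
lib.rwkv_eval.restype = ctypes.c_bool
lib.rwkv_eval.argtypes = [
    ctypes.c_void_p, ctypes.c_uint32,
    ctypes.POINTER(ctypes.c_float),  # state_in (NULL for the first token)
    ctypes.POINTER(ctypes.c_float),  # state_out
    ctypes.POINTER(ctypes.c_float),  # logits_out
]
lib.rwkv_get_state_len.restype = ctypes.c_size_t
lib.rwkv_get_state_len.argtypes = [ctypes.c_void_p]
lib.rwkv_get_logits_len.restype = ctypes.c_size_t
lib.rwkv_get_logits_len.argtypes = [ctypes.c_void_p]

ctx = lib.rwkv_init_from_file(b"rwkv-model.bin", 4)  # hypothetical model path, 4 threads

# Allocate the state and logits buffers the model expects.
state = (ctypes.c_float * lib.rwkv_get_state_len(ctx))()
logits = (ctypes.c_float * lib.rwkv_get_logits_len(ctx))()

# Feed one token (id 0 as a placeholder); None for state_in means "start from the initial state".
ok = lib.rwkv_eval(ctx, 0, None, state, logits)
print("eval ok:", ok, "logits[0]:", logits[0])
```

A complete binding would typically layer tokenization and sampling on top of this raw eval loop, which is the kind of work the NodeJS binding does for its language.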

Help improve the RWKV.cpp / rwkv-cpp-cuda libraries

These are generally the recommended means of running RWKV models. Help improve their performance, or the ease of use of the libraries.

Help work on datasets

RWKV's long-term goal is to make an AI model for everyone, in every language. Help us build the multilingual (i.e. not only English) datasets that make this possible, in the #dataset channel on the Discord.

Help work on eval benchmarks

RWKV would benefit from a more consistent and easily replicable set of benchmarks. Help us build and run such benchmarks to better compare RWKV against existing open-source models.

Just use it

Help find use cases for RWKV and implement them. Help us find and fix bugs. Help us improve the documentation and make it easier for others to use RWKV.

Help work on multimodal support

Based on MiniGPT-4, the RWKV state can be used as glue between various models. Help implement multimodal support for RWKV.
