Claude Skill
RightNow-AI/picolm
Open-source C project for running billion-parameter LLMs on low-cost embedded hardware. Enables AI inference on $10 boards with only 256MB of RAM. Supports ARM and RISC-V.
Overview
Repository
🚀 Install this Skill
openclaw install RightNow-AI/picolm
Summary
Picolm is a C-based project that enables running a 1-billion-parameter large language model on low-cost embedded hardware with only 256MB of RAM, such as a $10 development board.
Run a 1-billion-parameter large language model on a $10 development board with 256MB of RAM.
Key features
- Runs a 1B-parameter LLM on embedded hardware
- Minimal memory footprint (256MB RAM)
- Optimized for low-cost ($10) development boards
- Written in C for efficiency and portability
- Supports ARM and RISC-V architectures
Use cases
- On-device AI inference for IoT devices
- Educational projects on resource-constrained hardware
- Prototyping LLM applications on Raspberry Pi
- Edge computing with large language models
- Demonstrating efficient model quantization techniques