xichen1997/opencode

OpenCode

An open-source AI-powered code assistant that runs locally

Privacy-first • Secure • Customizable


Overview

OpenCode is a full-featured, open-source alternative to Claude Code that runs entirely on your local machine using Ollama. It provides a powerful CLI for AI-assisted coding, code analysis, and project management while ensuring complete privacy and security.

Key Features

  • 🔒 Privacy-First: All processing happens locally - no data leaves your machine
  • 🛡️ Secure: File operations restricted to current working directory
  • 🤖 AI-Powered: Uses local Ollama models for code assistance
  • 💻 Interactive CLI: Rich terminal interface with conversation history
  • 🔧 Tool System: File operations, search, Git integration, and command execution
  • 📦 Easy Installation: One-command bootstrap setup
  • 🎛️ Customizable: Flexible configuration and model selection

Quick Start

One-Click Installation 🚀

Linux/macOS:

git clone https://github.com/your-username/openCode.git
cd openCode
./scripts/bootstrap.sh

Windows:

git clone https://github.com/your-username/openCode.git
cd openCode
scripts\bootstrap.bat

Cross-platform:

git clone https://github.com/your-username/openCode.git
cd openCode
python scripts/bootstrap.py

The bootstrap script will:

  • Install Ollama automatically
  • Download and set up an AI model
  • Install OpenCode as a system command
  • Run tests to verify everything works

Manual Installation

If you prefer manual installation, see Installation Guide.

Usage

Interactive Mode (Recommended)

opencode

Start a conversation with the AI assistant:

🔧 > Create a Python function to calculate fibonacci numbers
🔧 > read main.py
🔧 > Add error handling to the parseConfig function
🔧 > run the tests
🔧 > git status

Direct Commands

# Generate code
opencode "Create a REST API endpoint for user authentication"

# Analyze files
opencode --file main.py "Review this code for potential bugs"

# Use specific model
opencode --model deepseek-coder:6.7b "Optimize this algorithm"

Available Commands

File Operations:

  • read config.py - Read and analyze files
  • write to new_file.py - Create new files
  • edit the main function - Modify existing code

Code Analysis:

  • search for TODO comments - Find patterns in codebase
  • explain this function - Code documentation
  • review for security issues - Code auditing

Git Integration:

  • git status - Check repository status
  • git diff - Review changes
  • show recent commits - Git history

Command Execution:

  • run python tests.py - Execute commands
  • install dependencies - Package management
  • build the project - Build automation

Security Features

OpenCode prioritizes security with several protective measures:

🔒 Working Directory Restriction

  • All file operations are restricted to the current working directory
  • No access to system files (/etc/passwd, user directories, etc.)
  • Blocks directory traversal attacks using ../ paths
  • Safe by default - operates only within your project
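
The traversal check described above can be sketched in a few lines of Python. This is a minimal illustration of the technique, not OpenCode's actual implementation; `is_within_working_dir` is a hypothetical helper name:

```python
from pathlib import Path

def is_within_working_dir(requested: str, base: str = ".") -> bool:
    """Reject any path that resolves outside the working directory."""
    base_dir = Path(base).resolve()
    # Joining with an absolute path discards base_dir entirely, so a
    # request like /etc/passwd resolves to itself and fails the check below.
    target = (base_dir / requested).resolve()
    return target == base_dir or base_dir in target.parents

print(is_within_working_dir("src/main.py"))    # True: stays inside the project
print(is_within_working_dir("../etc/passwd"))  # False: traversal attempt
```

Resolving before comparing is the key step: `../` segments and symlinks are collapsed first, so a path cannot smuggle itself outside the sandbox textually.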

πŸ›‘οΈ Command Execution Safety

  • Allowed commands only (git, npm, pip, python, node, etc.)
  • Timeout protection prevents hanging processes
  • Working directory scoped execution
  • No shell invocation - commands are parsed directly, preventing shell injection
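
The allowlist-plus-timeout approach can be sketched as follows. This is an illustrative sketch only; `run_safe` is a hypothetical helper, and the real allowlist is configurable:

```python
import shlex
import subprocess

# Allowlist mirroring the bullet above; the real set is configurable.
ALLOWED_COMMANDS = {"git", "npm", "pip", "python", "node"}

def run_safe(command: str, timeout: float = 30.0) -> subprocess.CompletedProcess:
    """Run an allowlisted command with no shell and a hard timeout."""
    argv = shlex.split(command)  # we parse ourselves; shell=False blocks injection
    if not argv or argv[0] not in ALLOWED_COMMANDS:
        raise PermissionError(f"command not allowed: {command!r}")
    return subprocess.run(argv, capture_output=True, text=True,
                          timeout=timeout, cwd=".")
```

`subprocess.run` raises `subprocess.TimeoutExpired` when the process exceeds `timeout`, which is how hanging processes get cut off.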

πŸ” Privacy Protection

  • 100% local processing - no data sent to external servers
  • No telemetry or usage tracking
  • Open source - fully auditable codebase
  • Configurable - control all aspects of operation

For complete security details, see Security Features.

Models

OpenCode supports various AI models via Ollama:

Recommended Models

| Model | Size | Best For | Speed |
|-------|------|----------|-------|
| llama3.2:3b | 2GB | General coding, fast responses | ⚡⚡⚡ |
| deepseek-coder:6.7b | 4GB | Code generation, debugging | ⚡⚡ |
| codestral:7b | 4GB | Code completion, refactoring | ⚡⚡ |
| llama3.1:8b | 5GB | Complex reasoning, documentation | ⚡ |

Model Management

# List available models
ollama list

# Download new model
ollama pull deepseek-coder:6.7b

# Switch models in OpenCode
opencode --model deepseek-coder:6.7b "your prompt"
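
Under the hood, a client like `ollama_client.py` talks to Ollama's local HTTP API (port 11434 by default). Here is a minimal, stdlib-only sketch of a non-streaming call to Ollama's `/api/generate` endpoint; `build_generate_request` and `generate` are illustrative names, not OpenCode's actual functions:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address

def build_generate_request(prompt: str, model: str = "llama3.2:3b") -> dict:
    """Payload for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "llama3.2:3b") -> str:
    payload = json.dumps(build_generate_request(prompt, model)).encode()
    req = urllib.request.Request(f"{OLLAMA_URL}/api/generate", data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:  # requires `ollama serve` running
        return json.loads(resp.read())["response"]
```

Because everything goes over localhost, the prompt and the model's answer never leave the machine.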

Configuration

OpenCode is highly configurable via config.yaml:

model:
  name: "llama3.2:3b"
  temperature: 0.1
  max_tokens: 4096

tools:
  enabled:
    - file_read
    - file_write
    - bash_execute
    - grep_search
    - git_operations
  
  restrict_to_working_dir: true  # Security feature

interface:
  syntax_highlighting: true
  auto_save: false
  confirm_destructive: true
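
Loading such a file usually means overlaying user values onto built-in defaults. A stdlib-only sketch of that merge step (the real loader would first parse config.yaml with a YAML library such as PyYAML; `merge_config` is a hypothetical helper):

```python
# Defaults mirroring the config.yaml example above.
DEFAULTS = {
    "model": {"name": "llama3.2:3b", "temperature": 0.1, "max_tokens": 4096},
    "tools": {"restrict_to_working_dir": True},
    "interface": {"syntax_highlighting": True, "confirm_destructive": True},
}

def merge_config(defaults: dict, overrides: dict) -> dict:
    """Recursively overlay user settings on the defaults."""
    merged = dict(defaults)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_config(merged[key], value)
        else:
            merged[key] = value
    return merged

# A user file that only sets model.temperature keeps every other default.
cfg = merge_config(DEFAULTS, {"model": {"temperature": 0.3}})
```

Recursive merging means a user config can override a single nested key without restating the whole section.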

Documentation

| Document | Description |
|----------|-------------|
| Installation Guide | Detailed installation instructions |
| Usage Guide | Comprehensive usage examples |
| Bootstrap Guide | Automated installation documentation |
| Security Features | Security implementation details |

Architecture

openCode/
├── opencode/           # Main package
│   ├── cli.py         # Command-line interface
│   ├── interactive.py # Interactive mode
│   ├── ollama_client.py # Ollama integration
│   ├── tools.py       # Tool system
│   └── config.py      # Configuration management
├── scripts/           # Installation scripts
│   ├── bootstrap.sh   # Linux/macOS installer
│   ├── bootstrap.py   # Cross-platform installer
│   └── bootstrap.bat  # Windows installer
├── tests/            # Test suite
├── docs/             # Documentation
└── examples/         # Usage examples

Contributing

We welcome contributions! OpenCode is designed to be:

  • Extensible - Easy to add new tools and features
  • Secure - All changes must maintain security standards
  • Tested - Comprehensive test suite for reliability
  • Documented - Clear documentation for all features
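
The extensibility goal above - adding new tools by name - can be sketched as a small registry. These are hypothetical names for illustration, not OpenCode's actual tool API:

```python
from typing import Callable, Dict

TOOLS: Dict[str, Callable[..., str]] = {}

def tool(name: str):
    """Decorator that registers a function as a named tool."""
    def register(fn: Callable[..., str]) -> Callable[..., str]:
        TOOLS[name] = fn
        return fn
    return register

@tool("file_read")
def file_read(path: str) -> str:
    with open(path, encoding="utf-8") as f:
        return f.read()

def dispatch(name: str, **kwargs) -> str:
    """Look up a registered tool and invoke it."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**kwargs)
```

A new tool is then just a decorated function; the dispatcher and the rest of the system need no changes.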

Requirements

  • Python 3.8+
  • Ollama (automatically installed by bootstrap)
  • 4GB+ RAM (for AI models)
  • 2-10GB storage (depends on models)

Advantages Over Cloud Solutions

| Feature | OpenCode | Cloud Services |
|---------|----------|----------------|
| Privacy | ✅ 100% local | ❌ Data sent to servers |
| Security | ✅ Sandboxed operation | ❌ Potential data breaches |
| Cost | ✅ Free after setup | ❌ Ongoing subscription |
| Offline | ✅ Works without internet | ❌ Requires connection |
| Customization | ✅ Full control | ❌ Limited options |
| Transparency | ✅ Open source | ❌ Proprietary |

Performance

OpenCode is optimized for local execution:

  • Response time: 1-5 seconds (depends on model and hardware)
  • Memory usage: 2-8GB (depends on model size)
  • CPU usage: Efficient inference with modern hardware
  • Storage: Models cached locally for fast access

Troubleshooting

Common Issues

Command not found:

# Restart terminal or check PATH
echo $PATH

Ollama not running:

# Start Ollama service
ollama serve

Model not available:

# Pull required model
ollama pull llama3.2:3b

Security restrictions:

# Check you're in the correct directory
pwd
# OpenCode works within current directory only

For more troubleshooting, see Installation Guide.

Testing

Run the test suite to verify installation:

# Security tests
python tests/test_security.py

# Installation tests
python tests/test_installation.py

License

OpenCode is released under the MIT License. See LICENSE for details.

Support

  • Documentation: Check the docs/ directory
  • Issues: Open an issue on GitHub
  • Security: Report security issues privately

Roadmap

  • Plugin system for custom tools
  • Web interface option
  • Advanced code analysis features
  • Team collaboration features
  • IDE integrations

OpenCode - Your local AI coding assistant

Built with privacy, security, and productivity in mind
