
18-02-2025

Building an AI Coding Assistant: DeepSeek VS Code Extension

In the era of AI-powered development tools, privacy concerns often take a backseat to functionality. Many popular coding assistants require sending your code to remote servers for processing. Today, I'm excited to share a project that takes a different approach: a VS Code extension that brings the power of DeepSeek's AI right to your local environment.

What is DeepSeek Assistant?

DeepSeek Assistant is an open-source VS Code extension that provides AI-powered coding assistance without any cloud dependencies. It integrates DeepSeek-R1, a powerful language model, through Ollama to run completely offline on your machine.

Key Features

Privacy-First Architecture

Unlike many AI coding assistants, DeepSeek Assistant processes everything locally. Your code never leaves your machine, making it a natural fit for privacy-sensitive codebases.

Seamless Integration

The assistant lives inside VS Code itself: highlight some code, invoke the chat command, and the conversation opens in a panel right beside your editor.

Flexible Usage

You can ask free-form questions or select code first so the selection is sent along as context for the model.

Technical Deep Dive

Architecture Overview

The extension is built on three main components:

  1. VS Code Extension Host
    • Handles user interactions and commands
    • Manages WebView communication
    • Processes text selection and context gathering
  2. WebView Interface
    • Built with React for smooth user experience
    • Handles message formatting and display
    • Manages chat history and user input
  3. Ollama Integration
    • Local model inference using DeepSeek-R1
    • Streaming responses for real-time feedback
    • Efficient prompt handling and context management
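
To make the Ollama side concrete, here is a minimal sketch of how the extension host can stream a completion from Ollama's local HTTP API (POST /api/generate on the default port 11434) and hand each partial response to a callback. The helper name and the onToken callback are illustrative assumptions, not code from the extension itself:

// Sketch: stream tokens from a local Ollama server. Assumes Ollama is
// running on its default port and deepseek-r1:8b has been pulled;
// `onToken` is a hypothetical callback, not part of the extension's API.
async function streamFromOllama(prompt: string, onToken: (token: string) => void): Promise<void> {
    const response = await fetch('http://localhost:11434/api/generate', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ model: 'deepseek-r1:8b', prompt, stream: true })
    });

    // Ollama streams newline-delimited JSON objects, each carrying a
    // partial `response` string and a `done` flag on the final chunk.
    const reader = response.body!.getReader();
    const decoder = new TextDecoder();
    let buffered = '';

    while (true) {
        const { value, done } = await reader.read();
        if (done) break;
        buffered += decoder.decode(value, { stream: true });

        const lines = buffered.split('\n');
        buffered = lines.pop() ?? ''; // keep any incomplete trailing line
        for (const line of lines) {
            if (line.trim()) {
                const chunk = JSON.parse(line);
                if (chunk.response) onToken(chunk.response);
            }
        }
    }
}

The same thing could be done with Ollama's JavaScript client library; the raw HTTP call is shown here only to make the streaming protocol visible.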

Key Implementation Details

Extension Activation

export function activate(context: vscode.ExtensionContext) {
    const disposable = vscode.commands.registerCommand('personalassistant-ext.start', async () => {
        const editor = vscode.window.activeTextEditor;
        const selection = editor?.selection;
        // Capture any highlighted text so it can be passed to the model as context
        const highlightedText = editor && selection && !selection.isEmpty
            ? editor.document.getText(selection)
            : '';
        
        // Create WebView panel and handle communication
        const panel = vscode.window.createWebviewPanel(
            'deepChat',
            'DeepSeek Chat',
            vscode.ViewColumn.Two,
            {
                enableScripts: true,
                localResourceRoots: [vscode.Uri.joinPath(context.extensionUri, 'media')]
            }
        );
        // ... WebView setup and message handling
    });

    // Dispose the command registration when the extension is deactivated
    context.subscriptions.push(disposable);
}
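
The elided setup is where the WebView and the model get wired together. Here is a hedged sketch of what that bridge might look like inside the command callback; the 'chat' command name and the streamFromOllama helper from the earlier sketch are assumptions, and only 'chatResponse' is taken from the chat UI shown in the next section:

// Sketch of the WebView -> extension -> Ollama bridge (names other than
// 'chatResponse' are assumptions, not the extension's actual protocol).
panel.webview.onDidReceiveMessage(async (message) => {
    if (message.command === 'chat') {
        // Prepend any highlighted editor text as context for the model.
        const prompt = highlightedText
            ? `Context:\n${highlightedText}\n\nQuestion: ${message.text}`
            : message.text;

        let responseText = '';
        await streamFromOllama(prompt, (token) => {
            responseText += token;
            // Push the growing response to the WebView for real-time display.
            panel.webview.postMessage({ command: 'chatResponse', text: responseText });
        });
    }
});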

Chat Interface

The chat interface is implemented using React and handles both user input and AI responses:

function ChatApp() {
    const [messages, setMessages] = React.useState([]);
    const [input, setInput] = React.useState('');
    
    // Handle incoming messages from the extension
    React.useEffect(() => {
        const handleMessage = (event) => {
            const message = event.data;
            if (message.command === 'chatResponse') {
                setMessages(prev => [...prev, {
                    role: 'bot',
                    content: formatMessage(message.text)
                }]);
            }
        };
        
        window.addEventListener('message', handleMessage);
        return () => window.removeEventListener('message', handleMessage);
    }, []);
    
    // ... message handling and UI rendering
}
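
Sending input in the other direction goes through the acquireVsCodeApi bridge that VS Code injects into every WebView. A minimal sketch, assuming the same hypothetical 'chat' command used above:

// acquireVsCodeApi() is injected by VS Code and may only be called once
// per WebView session, so it lives outside the component.
const vscode = acquireVsCodeApi();

// Inside ChatApp: submit handler that shows the user's message locally,
// forwards it to the extension host, and clears the input box.
const handleSend = () => {
    setMessages(prev => [...prev, { role: 'user', content: input }]);
    vscode.postMessage({ command: 'chat', text: input });
    setInput('');
};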

Message Processing

Messages are processed with markdown formatting and syntax highlighting:

function formatMessage(text) {
    // Handle code blocks with language specification
    text = text.replace(/```(\w+)?\n([\s\S]+?)```/g, (match, lang, code) => {
        return '<div class="code-block">' +
            '<div class="code-header">' +
                '<span class="language-label">' + (lang || 'text') + '</span>' +
                '<button class="copy-button" onclick="copyCodeBlock(this)">Copy</button>' +
            '</div>' +
            '<pre><code>' + 
                code.replace(/&/g, '&amp;')
                    .replace(/</g, '&lt;')
                    .replace(/>/g, '&gt;') + 
            '</code></pre>' +
        '</div>';
    });
    
    // ... additional formatting
    return text;
}
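
The Copy button rendered into each code header calls a global copyCodeBlock helper on the WebView side. A plausible implementation (an illustration, not the extension's exact code) uses the Clipboard API:

// Copies the contents of the code block whose header holds the clicked button.
window.copyCodeBlock = function (button) {
    const block = button.closest('.code-block');
    const code = block && block.querySelector('code');
    if (!code) return;

    navigator.clipboard.writeText(code.innerText).then(() => {
        // Brief visual confirmation, then restore the original label.
        button.textContent = 'Copied!';
        setTimeout(() => { button.textContent = 'Copy'; }, 1500);
    });
};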

Getting Started

Prerequisites

  1. Install VS Code
  2. Install Ollama from the official repository
  3. Download the DeepSeek-R1 model (8B parameters recommended):
    ollama run deepseek-r1:8b
    

Installation

  1. Download the extension from VS Code Marketplace or build from source
  2. Install in VS Code
  3. Press Cmd+E (Mac) or Ctrl+E (Windows/Linux) to start using the assistant

Future Development

The project is actively developed and welcomes contributions, with more features planned.

Contributing

The extension is open-source and accepts contributions of all kinds.

Visit the GitHub repository to get started.

Conclusion

DeepSeek Assistant demonstrates that we can have powerful AI coding assistance without compromising on privacy. By leveraging local processing and open-source models, it gives developers a tool that respects the confidentiality of their code without sacrificing capability.

Try it out and let me know what you think! Your feedback and contributions are welcome as we continue to improve the extension.

#VSCode #AI #OpenSource #DeepSeek #DeveloperTools #Privacy #CodingAssistant