Description

LocalIntelligence - Your Private AI Assistant

LocalIntelligence brings the power of large language models directly to your desktop through Ollama. Chat with AI models privately and securely: everything runs locally on your device, with no cloud dependency or data sharing.

Core Features

Ollama Integration
Connect seamlessly to your local Ollama instance. Browse and select from all your installed models, with automatic detection of model capabilities including vision, tools, thinking, and embedding support. Each model displays detailed runtime information including quantization level, context window size, and default generation parameters.
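
As an illustration of how this kind of model discovery can work, the sketch below queries Ollama's /api/tags endpoint on its default port (11434) and reads the reported quantization level; the Swift struct and function names are illustrative, not the app's actual code.

    import Foundation

    // Sketch: list locally installed Ollama models and their reported details.
    struct ModelDetails: Decodable {
        let quantization_level: String?
        let parameter_size: String?
        let family: String?
        let format: String?
    }

    struct ModelEntry: Decodable {
        let name: String
        let size: Int64
        let modified_at: String
        let details: ModelDetails?
    }

    struct TagsResponse: Decodable {
        let models: [ModelEntry]
    }

    func listInstalledModels() async throws -> [ModelEntry] {
        let url = URL(string: "http://localhost:11434/api/tags")!
        let (data, _) = try await URLSession.shared.data(from: url)
        return try JSONDecoder().decode(TagsResponse.self, from: data).models
    }

    // Example (inside an async context):
    // for m in try await listInstalledModels() {
    //     print(m.name, m.details?.quantization_level ?? "unknown")
    // }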

Vision Support
Vision-capable models can analyze images. Simply drag and drop an image onto the photo icon or use the file picker to attach photos, screenshots, or graphics. Ask questions about what you see, extract text, analyze diagrams, or get creative descriptions—all processed locally on your device.
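
As a rough sketch of how image input can reach a local model: Ollama's /api/chat accepts base64-encoded images attached to a message. The model name ("llava"), helper name, and response parsing below are placeholders rather than the app's actual implementation.

    import Foundation

    // Sketch: ask a vision-capable model a question about a local image file.
    func askAboutImage(path: String, question: String) async throws -> String {
        let imageData = try Data(contentsOf: URL(fileURLWithPath: path))
        let body: [String: Any] = [
            "model": "llava",          // placeholder: any installed vision-capable model
            "stream": false,
            "messages": [[
                "role": "user",
                "content": question,
                "images": [imageData.base64EncodedString()]
            ] as [String: Any]]
        ]
        var request = URLRequest(url: URL(string: "http://localhost:11434/api/chat")!)
        request.httpMethod = "POST"
        request.httpBody = try JSONSerialization.data(withJSONObject: body)
        let (data, _) = try await URLSession.shared.data(for: request)
        let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
        let message = json?["message"] as? [String: Any]
        return message?["content"] as? String ?? ""
    }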

MCP (Model Context Protocol) Integration
Extend your AI's capabilities with MCP tools. Connect to TCP/IP MCP servers. Tools are automatically discovered and made available to compatible models, enabling your AI to fetch real-time data, perform calculations, access APIs, and interact with external services. The app handles tool execution, result processing, and multi-turn conversations seamlessly.
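
For readers curious about the protocol itself: MCP is based on JSON-RPC 2.0, and tool discovery is a "tools/list" call. The sketch below assumes an HTTP endpoint for simplicity; real servers may use stdio or SSE transports, and a complete client performs the initialize handshake first.

    import Foundation

    // Sketch: discover the tools an MCP server exposes via a JSON-RPC "tools/list" request.
    func listMCPTools(endpoint: URL) async throws -> [[String: Any]] {
        let rpc: [String: Any] = [
            "jsonrpc": "2.0",
            "id": 1,
            "method": "tools/list",
            "params": [:] as [String: Any]
        ]
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONSerialization.data(withJSONObject: rpc)
        let (data, _) = try await URLSession.shared.data(for: request)
        let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
        let result = json?["result"] as? [String: Any]
        return result?["tools"] as? [[String: Any]] ?? []   // each tool has a name, description, and input schema
    }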

Fine-Tuned Control
Adjust generation parameters on the fly with an intuitive interface (see the request sketch after this list):
• Temperature: Control randomness and creativity (0.0 - 2.0)
• Top P: Nucleus sampling for vocabulary focus
• Top K: Limit token choices for consistency
• Seed: Set fixed seeds for reproducible outputs
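
A minimal sketch of how these parameters can map onto the "options" object of an Ollama /api/chat request; the model name and values are placeholders.

    import Foundation

    // Sketch: generation parameters travel in the "options" field of a chat request.
    let chatRequestBody: [String: Any] = [
        "model": "llama3.1",    // placeholder model name
        "stream": false,
        "messages": [["role": "user", "content": "Write a haiku about autumn."]],
        "options": [
            "temperature": 0.7, // randomness / creativity (0.0 - 2.0)
            "top_p": 0.9,       // nucleus sampling cutoff
            "top_k": 40,        // limit on candidate tokens
            "seed": 42          // fixed seed for reproducible output
        ] as [String: Any]
    ]
    // Serialize with JSONSerialization and POST to http://localhost:11434/api/chat.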

Performance Monitoring
Track model performance with detailed metrics (optional verbose mode; see the sketch after this list):
• Prompt evaluation speed (tokens/second)
• Generation speed (tokens/second)
• Context window usage with visual indicator
• Load times and total duration
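
These figures can be derived from the timing fields Ollama includes in a non-streamed chat response (durations are reported in nanoseconds); the sketch below assumes that response shape.

    import Foundation

    // Sketch: decode Ollama's timing fields and convert them to tokens per second.
    struct ChatMetrics: Decodable {
        let prompt_eval_count: Int      // tokens in the prompt
        let prompt_eval_duration: Int64 // ns spent evaluating the prompt
        let eval_count: Int             // tokens generated
        let eval_duration: Int64        // ns spent generating
        let load_duration: Int64        // ns spent loading the model
        let total_duration: Int64       // ns for the whole request
    }

    func tokensPerSecond(count: Int, durationNs: Int64) -> Double {
        guard durationNs > 0 else { return 0 }
        return Double(count) / (Double(durationNs) / 1_000_000_000)
    }

    // Example: eval_count = 256 over eval_duration = 4_000_000_000 ns -> 64 tokens/second.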

Conversation Management
• Export conversations in plain text or Markdown format (see the export sketch after this list)
• Copy to clipboard or save as files
• Clear conversations with confirmation dialogs
• Context window tracking prevents truncation issues
• Model switching automatically clears conversations to prevent confusion
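
A minimal sketch of what a Markdown export can look like; the message type and role labels are illustrative rather than the app's actual data model.

    import Foundation

    // Sketch: render a conversation as Markdown for copying or saving.
    struct ChatMessage {
        let role: String      // "user" or "assistant"
        let content: String
    }

    func exportAsMarkdown(_ messages: [ChatMessage], model: String) -> String {
        var lines = ["# Conversation with \(model)", ""]
        for message in messages {
            lines.append(message.role == "user" ? "**You:**" : "**\(model):**")
            lines.append("")
            lines.append(message.content)
            lines.append("")
        }
        return lines.joined(separator: "\n")
    }

    // The resulting string can go to the clipboard (NSPasteboard) or be written to a file.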

Modern, Native Experience
• Clean, intuitive interface
• Dark mode support
• Keyboard shortcuts and native controls
• Smooth animations and transitions
• Efficient memory usage

Privacy & Security
• 100% local processing - no data leaves your Mac
• No telemetry or tracking
• Your conversations stay private

Advanced Features

Model Information
View comprehensive details about each model (see the query sketch after this list):
• Quantization type with explanatory guide (Q4_K_M, Q8_0, F16, etc.)
• Current and maximum context window sizes
• Parameter family and format
• Size and modification date
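
A sketch of where such details can come from: Ollama's /api/show endpoint returns a "details" block (quantization, family, format) and a "model_info" block whose keys vary by architecture (e.g. "llama.context_length"); the function below is illustrative, not the app's exact parsing code.

    import Foundation

    // Sketch: fetch per-model metadata from Ollama's /api/show endpoint.
    func showModel(named name: String) async throws -> [String: Any] {
        var request = URLRequest(url: URL(string: "http://localhost:11434/api/show")!)
        request.httpMethod = "POST"
        request.httpBody = try JSONSerialization.data(withJSONObject: ["model": name])
        let (data, _) = try await URLSession.shared.data(for: request)
        return try JSONSerialization.jsonObject(with: data) as? [String: Any] ?? [:]
    }

    // Example (inside an async context):
    // let info = try await showModel(named: "llama3.1")
    // let details = info["details"] as? [String: Any]        // quantization_level, family, format, ...
    // let modelInfo = info["model_info"] as? [String: Any]   // e.g. "llama.context_length"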

Context Window Management
Visual indicator shows real-time context usage (see the calculation sketch after this list):
• Automatic warnings when limits are reached
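
One way such an indicator can be computed, assuming Ollama's reported token counts and a known num_ctx value; the function is a sketch, not the app's exact logic.

    import Foundation

    // Sketch: estimate how full the context window is after a response.
    func contextUsage(promptEvalCount: Int, evalCount: Int, numCtx: Int) -> Double {
        guard numCtx > 0 else { return 0 }
        return Double(promptEvalCount + evalCount) / Double(numCtx)
    }

    // Example: 3_200 prompt tokens + 800 generated tokens with num_ctx = 8_192
    // gives roughly 0.49, i.e. about half the context window is in use.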

MCP Server Management
Configure and test MCP connections (see the connection-test sketch after this list):
• Connect to remote HTTP/SSE servers
• Real-time connection testing with detailed diagnostics
• Visual status indicators (connected, failed, testing)
• Automatic tool discovery and listing
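
A sketch of what a connection test can look like: an MCP initialize request over an assumed HTTP transport. The protocol version and client info are illustrative, and a complete client would follow up with the initialized notification and a tools/list call.

    import Foundation

    // Sketch: treat a successful JSON-RPC "initialize" round trip as a passed connection test.
    func testMCPConnection(endpoint: URL) async throws -> Bool {
        let rpc: [String: Any] = [
            "jsonrpc": "2.0",
            "id": 1,
            "method": "initialize",
            "params": [
                "protocolVersion": "2024-11-05",    // illustrative version string
                "capabilities": [:] as [String: Any],
                "clientInfo": ["name": "ExampleClient", "version": "0.1"]
            ] as [String: Any]
        ]
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONSerialization.data(withJSONObject: rpc)
        let (data, response) = try await URLSession.shared.data(for: request)
        guard let http = response as? HTTPURLResponse, http.statusCode == 200 else { return false }
        guard let json = try JSONSerialization.jsonObject(with: data) as? [String: Any] else { return false }
        return json["result"] != nil
    }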

Tool Execution
When models with tool support call functions (see the tool-loop sketch after this list):
• Visual indication of executing tools
• Display of tool results inline
• Multi-turn conversations with context preservation
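
A sketch of the multi-turn tool loop against Ollama's chat API: when a reply carries tool_calls, each call is executed and its output is appended as a "tool" message before the model is asked again. The runTool closure and the history handling are placeholders.

    import Foundation

    // Sketch: execute any tool calls in a chat reply and extend the message history.
    // Returns true when the caller should re-send `history` to /api/chat for the next turn.
    func handleToolCalls(reply: [String: Any],
                         history: inout [[String: Any]],
                         runTool: (String, [String: Any]) -> String) -> Bool {
        guard let message = reply["message"] as? [String: Any],
              let toolCalls = message["tool_calls"] as? [[String: Any]],
              !toolCalls.isEmpty
        else { return false }                    // plain answer, nothing to execute

        history.append(message)                  // keep the assistant turn that requested the tools
        for call in toolCalls {
            let function = call["function"] as? [String: Any] ?? [:]
            let name = function["name"] as? String ?? ""
            let arguments = function["arguments"] as? [String: Any] ?? [:]
            let output = runTool(name, arguments)          // e.g. forwarded to an MCP server
            history.append(["role": "tool", "content": output])
        }
        return true
    }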

System Requirements
• Ollama installed and running locally
• Recommended: Apple Silicon for optimal performance

Perfect For
• Developers testing and debugging models
• Researchers exploring AI capabilities
• Privacy-conscious users who want local AI
• Anyone who wants full control over their AI interactions

LocalIntelligence puts powerful AI tools in your hands while respecting your privacy and giving you complete control over the experience.

Screenshots

9 screenshots of LocalIntelligence (macOS) by Huibert Aalbers

What's New

  • Version: 1.2
  • Updated:
  • Added a slider in the General pane of the Settings window to define the default context window size that applies to all models. Previously, this value could only be set on a per-model basis.

Price

  • Today: Free
  • Lowest: Free
  • Highest: Free

Developer

Huibert Aalbers

Additional Information

  • Mac App Store
