Developer: Huibert Aalbers
Price: Free
Mac App Store

Description

LocalIntelligence - Your Private AI Assistant with MCP support

LocalIntelligence is a powerful application that brings the power of large language models directly to your desktop through Ollama. Chat with AI models privately and securely—everything runs locally on your device, with no cloud dependency or data sharing.

Core Features

Ollama Integration
Connect seamlessly to your local Ollama instance. Browse and select from all your installed models, with automatic detection of model capabilities including vision, tools, thinking, and embedding support. Each model displays detailed runtime information including quantization level, context window size, and default generation parameters.
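The model browsing described above maps onto Ollama's documented REST API: `GET /api/tags` lists installed models, and `GET /api/show` returns per-model details. A minimal sketch of summarizing that listing, using a hand-written sample response (the field names follow Ollama's API docs; the values are illustrative, not taken from the app):

```python
import json

# Sample shaped like Ollama's GET /api/tags response
# (illustrative values, not from LocalIntelligence itself).
sample = json.loads("""
{
  "models": [
    {"name": "llama3.2:3b",
     "details": {"family": "llama", "parameter_size": "3.2B",
                 "quantization_level": "Q4_K_M"}}
  ]
}
""")

def summarize_models(tags: dict) -> list[str]:
    """Return one human-readable line per installed model."""
    lines = []
    for m in tags["models"]:
        d = m.get("details", {})
        lines.append(f"{m['name']}  family={d.get('family')}  "
                     f"quant={d.get('quantization_level')}")
    return lines
```

In a live setup the same summary would be built from the JSON returned by `http://localhost:11434/api/tags`.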

Vision Support
Vision-capable models can analyze images. Simply drag and drop an image onto the photo icon or use the file picker to attach photos, screenshots, or graphics. Ask questions about what you see, extract text, analyze diagrams, or get creative descriptions—all processed locally on your device.

MCP (Model Context Protocol) Integration
Extend your AI's capabilities with MCP tools. Connect to TCP/IP MCP servers. Tools are automatically discovered and made available to compatible models, enabling your AI to fetch real-time data, perform calculations, access APIs, and interact with external services. The app handles tool execution, result processing, and multi-turn conversations seamlessly.

Fine-Tuned Control
Adjust generation parameters on the fly with an intuitive interface:
• Temperature: Control randomness and creativity (0.0 - 2.0)
• Top P: Nucleus sampling for vocabulary focus
• Top K: Limit token choices for consistency
• Seed: Set fixed seeds for reproducible outputs

Performance Monitoring
Track model performance with detailed metrics (optional verbose mode):
• Prompt evaluation speed (tokens/second)
• Generation speed (tokens/second)
• Context window usage with visual indicator
• Load times and total duration
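Ollama's final streamed response reports these metrics directly: `prompt_eval_count`, `eval_count`, and the matching durations in nanoseconds. A minimal sketch of deriving tokens/second from those fields (the response values below are made up for illustration):

```python
def tokens_per_second(count: int, duration_ns: int) -> float:
    """Ollama reports durations in nanoseconds; convert to tokens/second."""
    return count / duration_ns * 1e9 if duration_ns else 0.0

# Illustrative final-response fields from /api/generate (values made up):
resp = {"prompt_eval_count": 26, "prompt_eval_duration": 130_000_000,
        "eval_count": 290, "eval_duration": 4_700_000_000}

prompt_tps = tokens_per_second(resp["prompt_eval_count"],
                               resp["prompt_eval_duration"])  # ≈ 200 tokens/s
gen_tps = tokens_per_second(resp["eval_count"], resp["eval_duration"])
```

`load_duration` and `total_duration` arrive in the same response and need only the same nanosecond-to-second conversion.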

Conversation Management
• Export conversations in plain text or Markdown format
• Copy to clipboard or save as files
• Clear conversations with confirmation dialogs
• Context window tracking prevents truncation issues
• Model switching automatically clears conversations to prevent confusion

Modern, Native Experience
• Clean, intuitive interface
• Dark mode support
• Keyboard shortcuts and native controls
• Smooth animations and transitions
• Efficient memory usage

Privacy & Security
• 100% local processing - no data leaves your Mac
• No telemetry or tracking
• Your conversations stay private

Advanced Features

Model Information
View comprehensive details about each model:
• Quantization type with explanatory guide (Q4_K_M, Q8_0, F16, etc.)
• Current and maximum context window sizes
• Parameter family and format
• Size and modification date

Context Window Management
Visual indicator shows real-time context usage:
• Automatic warnings when limits are reached
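The arithmetic behind such an indicator is simple: compare the tokens consumed so far against the model's `num_ctx`. A minimal sketch, assuming a 90% warning threshold (the app's actual cutoff is not published):

```python
def context_usage(used_tokens: int, num_ctx: int,
                  warn_at: float = 0.9) -> tuple[float, bool]:
    """Fraction of the context window consumed, and whether to warn.

    warn_at is an assumed threshold, not the app's documented value.
    """
    frac = used_tokens / num_ctx
    return frac, frac >= warn_at
```

Warning before the window fills is what prevents silent truncation of early conversation turns.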

MCP Server Management
Configure and test MCP connections:
• Connect to remote HTTP/SSE servers
• Real-time connection testing with detailed diagnostics
• Visual status indicators (connected, failed, testing)
• Automatic tool discovery and listing

Tool Execution
When models with tool support call functions:
• Visual indication of executing tools
• Display of tool results inline
• Multi-turn conversations with context preservation
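The multi-turn loop follows Ollama's chat API convention: an assistant message may carry `tool_calls`, and each result is appended as a `role="tool"` message before the model is called again. A minimal sketch of that bookkeeping (the `add` tool in the test is hypothetical; error handling is omitted):

```python
import json

def apply_tool_calls(messages: list, assistant_msg: dict, tools: dict) -> bool:
    """Execute tool calls from an /api/chat assistant message and append
    the results as role="tool" messages. Returns True if the model
    should be invoked again with the extended history."""
    messages.append(assistant_msg)
    calls = assistant_msg.get("tool_calls", [])
    for call in calls:
        fn = call["function"]
        result = tools[fn["name"]](**fn["arguments"])
        messages.append({"role": "tool", "content": json.dumps(result)})
    return bool(calls)
```

Because the tool results live in the message history, context is preserved across however many tool-call rounds the model needs.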

System Requirements
• Ollama installed and running locally
• Recommended: Apple Silicon for optimal performance

Perfect For
• Developers testing and debugging models
• Researchers exploring AI capabilities
• Privacy-conscious users who want local AI
• Anyone who wants full control over their AI interactions

LocalIntelligence puts powerful AI tools in your hands while respecting your privacy and giving you complete control over the experience.

Screenshots

Ten screenshots of LocalIntelligence on macOS.

Price History

  • Today: Free
  • Minimum: Free
  • Maximum: Free

What's New

  • Version: 1.3
  • Updated:
  • Model Library & Management
    Pull new models directly from the Ollama library with a built-in browser featuring popular models like Llama, Gemma, Qwen, DeepSeek, and more. Track download progress in real time with visual indicators. Check for model updates and easily refresh to the latest versions. Remove unused models with a simple right-click context menu.

    Improved UX
    New keyboard shortcuts
    Simpler onboarding for users with no Ollama experience
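Model pulls like these stream progress as newline-delimited JSON from Ollama's `/api/pull` endpoint, each line carrying a `status` and, during downloads, `total` and `completed` byte counts. A minimal sketch of rendering one such line as a progress string (the sample line is illustrative):

```python
import json

def pull_progress(line: str) -> str:
    """Render one streamed status line from Ollama's /api/pull endpoint."""
    event = json.loads(line)
    total, done = event.get("total"), event.get("completed")
    if total and done is not None:
        return f"{event['status']}: {done * 100 // total}%"
    return event.get("status", "")

# e.g. pull_progress('{"status":"downloading","total":200,"completed":50}')
```

Feeding each streamed line through a function like this is enough to drive the kind of real-time progress indicator described above.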


Additional Information

"LocalIntelligence". Platform: macOS. Category: Developers. Developer: Huibert Aalbers. First version: . Last updated: . Current price: free. This title has not yet received ratings or reviews on AppAgg. Available languages: English. AppAgg tracks price history, ratings, and user comments for "LocalIntelligence". Follow future discounts and updates via RSS. AppAgg does not host applications or distribute software. All trademarks, logos, and screenshots belong to their respective owners.
