Feature #16787
Add optional local LLM (e.g., Ollama) integration for natural language system management and real-time network insights
Description
I did use AI to write this for me.
Problem:
pfSense provides powerful networking and security capabilities, but configuration, monitoring, and troubleshooting still require significant expertise and manual navigation across multiple interfaces (GUI, logs, shell, packages).
For many users—especially in complex environments—understanding system state, diagnosing issues, or making configuration changes requires deep familiarity with pfSense internals.
At the same time, modern local large language models (LLMs), for example those run via Ollama, can interpret system data and provide natural-language interaction without requiring cloud connectivity.
Proposed Solution:
Introduce optional integration with a locally hosted LLM (such as Ollama) to enable:
1. Natural language querying of system state
2. Assisted configuration changes
3. Real-time summarization of logs, traffic, and threats
This would function as an optional feature or package, with all processing performed locally for security.
Example capabilities:
- "Show me a summary of current WAN/LAN traffic and usage"
- "Explain why my VPN throughput is slow"
- "List active firewall rules affecting this IP"
- "Summarize recent intrusion detection alerts"
- "What changed in my configuration recently?"
- "Help me configure a WireGuard tunnel step-by-step"
Advanced (optional, permission-based):
- Apply configuration changes via guided confirmation:
- "Create a firewall rule to allow port 443 from this subnet"
- "Restart WireGuard and verify connectivity"
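A query of the kind listed above could be served by wrapping structured system state in a prompt and sending it to a local Ollama instance. The sketch below assumes Ollama's default REST endpoint at localhost:11434 and an arbitrary model name; the helper names and the read-only framing are illustrative, not an existing pfSense API.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local REST endpoint


def build_prompt(question: str, system_data: dict) -> str:
    """Wrap a natural-language question with structured system state so the
    model answers from real data rather than guessing."""
    context = json.dumps(system_data, indent=2)
    return (
        "You are a read-only pfSense assistant. Answer using only the "
        f"system data below.\n\nSystem data:\n{context}\n\nQuestion: {question}"
    )


def ask_local_llm(question: str, system_data: dict, model: str = "llama3") -> str:
    """Send the prompt to the locally hosted model; no external API calls."""
    payload = json.dumps({
        "model": model,
        "prompt": build_prompt(question, system_data),
        "stream": False,  # return one complete JSON response
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running Ollama instance):
# ask_local_llm(
#     "Summarize current WAN/LAN traffic",
#     {"wan": {"in_mbps": 42.1, "out_mbps": 8.3}, "lan": {"clients": 17}},
# )
```

Because the model only sees the data placed in the prompt, the integration controls exactly what system state is exposed per query.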
Use Case:
In advanced environments (homelabs, research labs, AI-integrated networks, high-throughput deployments), administrators often need rapid situational awareness.
A local LLM could:
- Aggregate data from multiple subsystems (pf, logs, interfaces, VPN, IDS/IPS)
- Present human-readable summaries
- Reduce time to diagnose issues
- Improve accessibility without sacrificing control
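The aggregation step above could be sketched as a set of collectors, each turning one subsystem's output into a field of a single snapshot handed to the LLM. The commands and paths below are placeholders; the actual pfSense data sources would differ.

```python
import subprocess

# Hypothetical collectors: each maps a subsystem to a command whose output
# becomes LLM context. Real pfSense integration would use proper data sources.
COLLECTORS = {
    "firewall_rules": ["pfctl", "-sr"],
    "interfaces": ["ifconfig"],
    "recent_logs": ["tail", "-n", "50", "/var/log/filter.log"],
}


def collect_system_state() -> dict:
    """Run each collector and bundle the raw output into one snapshot.
    Failures are recorded in place so the summary can mention them."""
    state = {}
    for name, cmd in COLLECTORS.items():
        try:
            out = subprocess.run(
                cmd, capture_output=True, text=True, timeout=10, check=True
            ).stdout
        except (OSError, subprocess.SubprocessError) as exc:
            out = f"<collection failed: {exc}>"
        state[name] = out
    return state
```

Keeping collection separate from the model call means the same snapshot can feed summaries, diagnostics, and audit records.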
This is particularly useful when combined with:
- SSH + tmux workflows
- Remote administration
- High-frequency troubleshooting scenarios
Security Considerations:
- Must be strictly local (no external API calls required)
- Optional and disabled by default
- Role-based access control for any configuration changes
- Read-only mode available
- Full audit logging of LLM-suggested or executed actions
Implementation Approach (high-level):
- Provide a package or plugin interface for LLM integration
- Allow connection to local LLM endpoints (e.g., Ollama REST API)
- Expose structured system data (logs, metrics, config state) to the LLM
- Include a safe command translation layer for configuration changes
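The safe command translation layer could be built around an explicit allowlist: the LLM may only name vetted actions, mutating actions are blocked in read-only mode, and every decision is written to the audit trail. The action names and commands below are hypothetical placeholders.

```python
import datetime

# Hypothetical allowlist mapping LLM-suggested actions to vetted commands.
# The boolean marks whether the action mutates system state.
ALLOWED_ACTIONS = {
    "show_rules": (["pfctl", "-sr"], False),
    "restart_wireguard": (["service", "wireguard", "restart"], True),
}

AUDIT_LOG = []  # in practice persisted, e.g. via syslog


def translate(action: str, read_only: bool = True):
    """Map an LLM-suggested action to a command, honoring read-only mode
    and recording every decision (allowed or not) for auditing."""
    entry = ALLOWED_ACTIONS.get(action)
    if entry is None:
        decision, cmd = "rejected: unknown action", None
    else:
        cmd, mutates = entry
        if read_only and mutates:
            decision, cmd = "rejected: read-only mode", None
        else:
            decision = "allowed"
    AUDIT_LOG.append({
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "action": action,
        "decision": decision,
    })
    return cmd
```

Only actions that pass this layer would ever reach a confirmation prompt, so the LLM can never execute arbitrary shell commands directly.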
Additional Benefits:
- Improves usability for both new and advanced users
- Reduces complexity of system interaction
- Enables faster troubleshooting and insight generation
- Aligns pfSense with emerging trends in local AI-assisted system management
Notes:
This feature would be optional and modular, ensuring no impact on systems that do not require it. It leverages local AI capabilities while maintaining pfSense's security-first design.
Updated by luckman212 2 days ago
I did use AI to write this for me
I'd expect this request to receive about as much effort from Netgate as you put into it.