# Neurosymbolic AI (Prolog + LLM)
Combine the pattern recognition of large language models with the logical rigor of symbolic reasoning to build AI systems that are both intelligent and verifiable.
## Why This Matters
LLMs excel at understanding context and generating natural language, but they can hallucinate facts and struggle with complex logical reasoning. Prolog provides deterministic, explainable inference chains that ground LLM outputs in verified facts and rules. By combining both paradigms, you get AI agents that can understand nuance while maintaining logical correctness.
## Quick Example
```yaml
name: reasoning-agent
nodes:
  - name: classify
    uses: llm.call
    with:
      messages:
        - role: user
          content: "Classify: {{ state.input }}"
    output: classification
  - name: reason
    language: prolog
    run: |
      state(classification, Class),
      rule(Class, Action),
      return(action, Action).
```
This simple pattern demonstrates the core idea: an LLM classifies input, then Prolog applies domain rules to determine the correct action. The classification is neural; the decision logic is symbolic and auditable.
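
For illustration, the `rule/2` facts the `reason` node consults can be ordinary Prolog clauses in the knowledge base. The sketch below assumes a support-ticket domain; the class and action names are hypothetical and not part of the framework:

```prolog
% Hypothetical domain rules mapping an LLM-produced class to an action.
:- dynamic trusted_customer/0.

rule(billing_question, route_to_billing).
rule(bug_report,       open_ticket).
rule(spam,             discard).

% Rules can also gate the action on further symbolic conditions:
rule(refund_request, Action) :-
    (   trusted_customer
    ->  Action = auto_refund
    ;   Action = escalate_to_human
    ).
```

Because the mapping is explicit Prolog, every decision the agent takes can be traced back to a specific clause.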
## Key Features
| Feature | Description |
|---|---|
| Knowledge Graphs | Represent domain knowledge as Prolog facts and query relationships, ancestors, and paths with built-in inference |
| Constraint Solving | Use CLP(FD) to solve scheduling, allocation, and optimization problems with declarative constraints (see the sketch after this table) |
| Inference Chains | Build multi-step reasoning pipelines where each step derives new facts from previous conclusions |
| Grounding Validation | Validate LLM extractions against semantic probes to catch hallucinations before they propagate |
| Thread-Local State | Parallel execution branches maintain isolated state via Prolog's native thread-local predicates |

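To make the constraint-solving row concrete, here is a minimal CLP(FD) sketch in plain SWI-Prolog. The tasks, durations, and the 8-hour window are made up; only `library(clpfd)` built-ins (`ins/2`, `serialized/2`, `#=</2`, `label/1`) are used:

```prolog
:- use_module(library(clpfd)).

% Hypothetical example: choose integer start hours for three tasks with
% durations 2, 3 and 1 inside an 8-hour window so that no two overlap.
schedule(Starts) :-
    Starts = [A, B, C],
    Starts ins 0..8,
    serialized(Starts, [2, 3, 1]),  % pairwise non-overlap given the durations
    A + 2 #=< 8,                    % each task must finish inside the window
    B + 3 #=< 8,
    C + 1 #=< 8,
    label(Starts).
```

Querying `?- schedule(S).` yields `S = [0, 2, 5]` first, with the remaining feasible schedules enumerated on backtracking.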
## Available Actions
| Action | Description |
|---|---|
|  | Execute a Prolog query and return variable bindings |
|  | Add facts to the knowledge base dynamically |
|  | Remove facts from the knowledge base |
|  | Load external Prolog files |

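For context, plain SWI-Prolog exposes the same four operations as built-in predicates. The facts and the file name below are hypothetical, and the snippet is a REPL transcript rather than a workflow definition:

```prolog
?- assertz(parent(tom, ann)).     % add a fact to the knowledge base
true.

?- parent(tom, Child).            % run a query and read the bindings
Child = ann.

?- retract(parent(tom, ann)).     % remove the fact again
true.

?- consult('family_rules.pl').    % load an external Prolog file (hypothetical path)
true.
```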
## Inline Prolog Execution

For most use cases, you define Prolog logic directly in the `run:` block with `language: prolog`:
```yaml
- name: infer_relationships
  language: prolog
  run: |
    % Access workflow state
    state(person, Person),
    % Query knowledge base
    ancestor(Person, Ancestor),
    % Return results to state
    return(ancestors, Ancestor).
```
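
The `ancestor/2` goal above only succeeds if matching facts and rules exist in the knowledge base. A minimal sketch, assuming hypothetical family members and the usual recursive definition, with the argument order chosen so that `Ancestor` is an ancestor of `Person` as in the node above:

```prolog
% Hypothetical facts: parent(P, C) means P is a parent of C.
parent(mary, john).
parent(john, sue).

% Ancestor is an ancestor of Person, directly or transitively.
ancestor(Person, Ancestor) :- parent(Ancestor, Person).
ancestor(Person, Ancestor) :-
    parent(Parent, Person),
    ancestor(Parent, Ancestor).
```

With `person` set to `sue` in workflow state, the node binds `Ancestor` to `john` first and to `mary` on backtracking.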
## Examples

### Neurosymbolic Patterns
- Family Reasoning with LLM + Prolog - Full pipeline with 3-layer validation (structural, semantic, grounding)
- Knowledge Graph Reasoning - Inference over relationships: grandparent, sibling, and ancestor
- Multi-Step Reasoning Chain - Medical symptom checker with chained inference
- CLP(FD) Scheduling - Constraint-based task scheduling
### Basic Prolog

- Simple Prolog Agent - State access, arithmetic, and inline rules
- CLP(FD) Constraints - Basic constraint satisfaction
## Learn More

- Prolog Integration Epic - Architecture overview and story status
- Python Prolog Support - Python implementation details
- Rust Prolog Support - Rust implementation details
- Cross-Runtime Parity - Python/Rust compatibility testing
## Installation

Prolog support is optional and requires a system installation of SWI-Prolog 9.1+:
```bash
# Ubuntu/Debian
apt install swi-prolog

# macOS
brew install swi-prolog

# Python TEA with Prolog
pip install the-edge-agent[prolog]
```

```toml
# Rust Cargo.toml
[dependencies]
the_edge_agent = { version = "0.8", features = ["prolog"] }
```
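
A quick way to confirm the system dependency is in place before enabling the feature (the exact version string will vary by platform):

```bash
swipl --version
# Expect something like: SWI-Prolog version 9.2.x for x86_64-linux (any 9.1+ build works)
```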