Parsimony converts English sentences into structured semantic trees that capture meaning, not just grammar. Unlike dependency parsers that show syntactic relationships, Parsimony produces trees with semantic roles (who did what to whom) and logic operators (negation, quantifiers, modals, knowledge, belief, conditionals).
The result is a machine-readable representation of what a sentence means — not just how it's structured. This makes it useful for reasoning, fact-checking, knowledge extraction, and giving AI agents the ability to understand natural language logically.
Here's what "The cat sat on the mat" looks like as a Parsimony tree:
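As a rough hand-written sketch (field values here are illustrative, following the node format described in the Tree Structure section below), the tree pairs the verb root with role-labeled children:

```json
{
  "ID": "sit_V_1.0",
  "POS": "V",
  "children": [
    { "ID": "cat_N_1.0", "POS": "N", "role": "agent" },
    { "ID": "mat_N_1.0", "POS": "N", "role": "location" }
  ]
}
```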
Type any English sentence below and press Parse to see its semantic tree. Start simple — subject-verb-object sentences work great.
Every sentence produces a tree rooted at the main verb. Children are connected by labeled edges that describe their semantic role: agent (who does it), patient (who it's done to), location (where), and more.
Every node in the tree is a JSON object with these key fields:
{
  "ID": "chase_V_1.0",        // word_POS_version
  "POS": "V",                 // part of speech
  "role": "agent",            // semantic role (on the edge from parent)
  "definition": "to pursue",  // word sense definition
  "wiki": "Chase_(action)",   // Wikipedia link (when available)
  "children": [ ... ],        // child nodes
  "marks": [ ... ],           // logic operators attached to this node
  "pointer": "cat_N_1.0"      // coreference pointer (pronouns)
}
The ID encodes the word, its part of speech, and a version
number for word sense disambiguation (e.g., bank_N_1.0 for a
river bank vs. bank_N_2.0 for a financial institution).
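The ID splits cleanly on underscores. A minimal helper for pulling the parts back out (a hypothetical utility, not part of the API; it assumes the `word_POS` and `word_POS_version` forms described in the reference below):

```python
def split_node_id(node_id: str):
    """Split a node ID like 'bank_N_2.0' into (word, POS, version).

    IDs may omit the version ('cat_N'), in which case version is None.
    Words themselves may contain underscores ('ice_cream_N_1.0'),
    so we split from the right.
    """
    parts = node_id.rsplit("_", 2)
    if len(parts) == 3 and parts[2][:1].isdigit():
        word, pos, version = parts
        return word, pos, version
    word, pos = node_id.rsplit("_", 1)
    return word, pos, None
```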
Each node has a POS (part of speech) that determines its color
in the visualization:
Try a sentence with different word types to see the colors in action:
The edges between nodes carry semantic roles — they describe how a child relates to its parent:
Parsimony captures scope of negation precisely. "The cat is not sleeping" puts the NOT operator on the verb's subtree only. "Not every student passed" scopes differently than "Every student did not pass." Try both:
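Because scope is explicit in the tree, negation can be checked mechanically downstream. A minimal walker, assuming `marks` is a list of operator strings as shown in the Tree Structure section (a sketch, not official client code):

```python
def negated_nodes(node):
    """Return the IDs of all nodes in a tree that carry a NOT mark."""
    found = []
    if "NOT" in node.get("marks", []):
        found.append(node["ID"])
    for child in node.get("children", []):
        found.extend(negated_nodes(child))
    return found
```

On the tree for "The cat is not sleeping" one would expect the NOT mark on the verb node only, not anywhere inside the noun phrase.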
Universal (all, every) and existential (some, a, few) quantifiers scope over their noun phrase. This is critical for logical reasoning — "All cats like fish" means something very different from "Some cats like fish."
Modal operators capture possibility (might, could, can) and necessity (must, should, has to). Parsimony attaches these as marks on the clause they scope over.
Epistemic operators are central to reasoning: "John knows that the earth is round" vs. "John believes that the earth is flat." Parsimony nests the embedded clause under the knowledge/belief verb, preserving the logical distinction.
"If P then Q" structures are captured with explicit conditional marks on the antecedent and consequent clauses. This enables downstream systems to reason about hypotheticals.
The real power emerges when operators combine. Natural language routinely nests negation inside quantifiers inside knowledge claims. Parsimony preserves these nested scopes faithfully.
Notice how the negation (nobody), the knowledge operator (knows), the question (whether), the universal quantifier (all), and the modal (can) are each scoped correctly in the output tree.
The Parsimony API has one primary endpoint:
POST https://api.p7y.ai/parse
Content-Type: application/json
X-API-Key: YOUR_API_KEY
{
  "text": "The cat sat on the mat."
}
The response is a JSON array of trees (one per clause in the input). Each tree is a nested object with the structure shown in the Tree Structure section above.
Pricing: $40 per million words. Buy credits at the console after signing in.
Call the API from any language. Here's the same request in several popular languages:
curl -X POST https://api.p7y.ai/parse \
-H "Content-Type: application/json" \
-H "X-API-Key: YOUR_API_KEY" \
-d '{"text": "The cat sat on the mat."}'
import requests

response = requests.post(
    "https://api.p7y.ai/parse",
    headers={
        "Content-Type": "application/json",
        "X-API-Key": "YOUR_API_KEY",
    },
    json={"text": "The cat sat on the mat."},
)
trees = response.json()
print(trees)
const response = await fetch("https://api.p7y.ai/parse", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "X-API-Key": "YOUR_API_KEY",
  },
  body: JSON.stringify({ text: "The cat sat on the mat." }),
});
const trees = await response.json();
console.log(trees);
package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "io"
    "net/http"
)

func main() {
    body, _ := json.Marshal(map[string]string{
        "text": "The cat sat on the mat.",
    })
    req, _ := http.NewRequest("POST", "https://api.p7y.ai/parse",
        bytes.NewReader(body))
    req.Header.Set("Content-Type", "application/json")
    req.Header.Set("X-API-Key", "YOUR_API_KEY")
    resp, _ := http.DefaultClient.Do(req)
    defer resp.Body.Close()
    data, _ := io.ReadAll(resp.Body)
    fmt.Println(string(data))
}
require "net/http"
require "json"
require "uri"
uri = URI("https://api.p7y.ai/parse")
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
request = Net::HTTP::Post.new(uri)
request["Content-Type"] = "application/json"
request["X-API-Key"] = "YOUR_API_KEY"
request.body = { text: "The cat sat on the mat." }.to_json
response = http.request(request)
puts JSON.parse(response.body)
use reqwest::header::{HeaderMap, HeaderValue, CONTENT_TYPE};
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut headers = HeaderMap::new();
    headers.insert(CONTENT_TYPE, HeaderValue::from_static("application/json"));
    headers.insert("X-API-Key", HeaderValue::from_static("YOUR_API_KEY"));

    let client = reqwest::Client::new();
    let res = client
        .post("https://api.p7y.ai/parse")
        .headers(headers)
        .json(&json!({"text": "The cat sat on the mat."}))
        .send()
        .await?;
    println!("{}", res.text().await?);
    Ok(())
}
import java.net.URI;
import java.net.http.*;

public class ParseExample {
    public static void main(String[] args) throws Exception {
        var client = HttpClient.newHttpClient();
        var request = HttpRequest.newBuilder()
            .uri(URI.create("https://api.p7y.ai/parse"))
            .header("Content-Type", "application/json")
            .header("X-API-Key", "YOUR_API_KEY")
            .POST(HttpRequest.BodyPublishers.ofString(
                "{\"text\": \"The cat sat on the mat.\"}"))
            .build();
        var response = client.send(request,
            HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
The API accepts two authentication methods:
Generate an API key from the console (sign in → API Keys). Include it in every request as:
X-API-Key: pk_live_abc123...
If you're building a web app, use the P7Y logins library to handle Firebase Auth and App Check. It generates the correct Authorization and X-Firebase-AppCheck headers automatically via auth.getAuthHeaders().
The Model Context Protocol (MCP) lets LLM agents call external tools. You can give any MCP-compatible agent (Claude, GPT, etc.) the ability to parse sentences by registering Parsimony as an MCP tool.
LLMs process text statistically. By calling Parsimony, an agent gets a structured logical representation it can reason over: checking negation scope, verifying quantifier interactions, detecting contradictions, or extracting entity relationships with precision that pure text processing can't match.
Add this to your MCP server configuration to expose Parsimony as a tool:
{
  "tools": [{
    "name": "parsimony_parse",
    "description": "Parse an English sentence into a semantic tree with logic operators (negation, quantifiers, modals, knowledge, belief, conditionals). Returns structured JSON with semantic roles (agent, patient, theme, location) and word sense disambiguation.",
    "input_schema": {
      "type": "object",
      "properties": {
        "text": {
          "type": "string",
          "description": "The English sentence to parse"
        }
      },
      "required": ["text"]
    }
  }]
}
from mcp.server.fastmcp import FastMCP
import httpx

mcp = FastMCP("parsimony")
API_KEY = "YOUR_API_KEY"

@mcp.tool()
async def parsimony_parse(text: str) -> dict:
    """Parse an English sentence into a semantic tree with
    logic operators and semantic roles."""
    async with httpx.AsyncClient() as client:
        resp = await client.post(
            "https://api.p7y.ai/parse",
            headers={
                "Content-Type": "application/json",
                "X-API-Key": API_KEY,
            },
            json={"text": text},
        )
        return resp.json()

if __name__ == "__main__":
    mcp.run()
Contradiction detector: Parse two claims, compare their logic trees. If one has NOT(P) where the other has P under the same scope, flag a contradiction.
Knowledge graph builder: Parse a document sentence by sentence. Extract (agent, verb, patient) triples from each tree. Accumulate into a graph database for structured querying.
Fact verification agent: Parse a claim, identify its logical structure (universal? conditional? negated?), then verify each atomic proposition against a knowledge base.
Legal clause analyzer: Parse contract clauses, extract conditional obligations ("If X then Party A must Y"), identify negations and exceptions, and build a structured obligations graph.
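The knowledge-graph recipe above can be sketched in a few lines. This is illustrative only, assuming the `POS`, `role`, and `children` fields described in the Tree Structure section:

```python
def triples(tree):
    """Collect (agent, verb, patient) triples from a parse tree.

    Walks the tree; at each verb node, pairs its agent child with
    its patient child when both are present.
    """
    out = []

    def walk(node):
        if node.get("POS") == "V":
            roles = {c.get("role"): c["ID"] for c in node.get("children", [])}
            if "agent" in roles and "patient" in roles:
                out.append((roles["agent"], node["ID"], roles["patient"]))
        for child in node.get("children", []):
            walk(child)

    walk(tree)
    return out
```

Each triple maps directly onto a graph edge, with node IDs doubling as disambiguated entity keys.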
Parsimony's structured semantic output is most valuable where logical precision matters:
Contract clauses are full of nested conditions, negations, and obligations. Parsimony decomposes them into trees where each operator's scope is explicit, enabling automated compliance checking and conflict detection.
A claim like "No European country has more than 100 million people" combines negation with a universal quantifier. Parsimony exposes this structure so a downstream system can verify the atomic predicate against data.
Extract structured (subject, relation, object) triples from text at scale. The semantic roles (agent, patient, theme) map directly to graph edges, and word sense disambiguation prevents conflation of homonyms.
Use Parsimony as a first stage before sentiment analysis, summarization, or question answering. The structured output reduces ambiguity that downstream models would otherwise have to resolve statistically.
Teach sentence structure, formal logic, and semantics interactively. Students type sentences and immediately see how meaning is composed from parts — making abstract linguistic concepts concrete.
Build systems that reason over natural language by converting text to logic trees, then applying inference rules. Parsimony's explicit quantifier and modal scoping makes this tractable where raw text would be ambiguous.
POST /parse — Full semantic tree
Accepts {"text": "..."} and returns a JSON array of trees (one per clause).
ID           string   Word identifier: word_POS or word_POS_version
POS          string   Part of speech: V, N, A, M
role         string   Semantic role: agent, patient, theme, location, mark, combinator, manner, instrument, ...
definition   string   Word sense definition
wiki         string   Wikipedia article name (optional)
children     array    Child nodes
marks        array    Logic operators: NOT, ALL, SOME, IF, ...
pointer      string   Coreference target ID (resolved pronouns)
400  Bad request — missing or empty "text" field
401  Unauthorized — invalid or missing API key
402  Payment required — insufficient credits
429  Rate limited — too many requests
500  Internal server error — parsing failed
The API allows 60 requests per minute per API key. For higher throughput, contact us. Pricing is $40 per million words (40 microdollars per word).
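Batch jobs can stay under the limit with simple client-side pacing. A hypothetical helper (not part of any SDK), assuming the 60 requests/minute figure above:

```python
import time


class RateLimiter:
    """Space out calls so at most `per_minute` requests are issued per minute."""

    def __init__(self, per_minute: int = 60):
        self.interval = 60.0 / per_minute  # seconds between requests
        self.next_ok = 0.0

    def wait(self):
        """Block until the next request is allowed, then reserve the next slot."""
        now = time.monotonic()
        if now < self.next_ok:
            time.sleep(self.next_ok - now)
            now = self.next_ok
        self.next_ok = now + self.interval
```

Call `limiter.wait()` before each POST to /parse; a 429 response would still warrant backing off and retrying.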