curl / REST Client Reference¶
This page lists every endpoint in Beyond Retrieval v2 as a copy-paste curl command. Commands are organized by workflow, so you can follow them top to bottom for a complete integration.
Environment Setup¶
Set these variables once per session:
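All examples assume two shell variables; the values below are placeholders for a local development setup:

```shell
# Base URL of the API and a bearer token (example values; adjust for your deployment)
export BASE_URL="http://localhost:8000/api"
export TOKEN="dev"
```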
Common Patterns¶
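Every authenticated call on this page shares the same header pattern, and responses wrap their payload in a `data` field (as the pipeline script at the bottom of this page assumes). A minimal template:

```shell
# Generic request shape used throughout this reference
curl "$BASE_URL/notebooks/" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json"
# Pipe through jq to extract the payload, e.g. append: | jq '.data'
```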
1. Notebook Management¶
Create a Notebook¶
NOTEBOOK_ID=$(uuidgen | tr '[:upper:]' '[:lower:]')
curl -X POST "$BASE_URL/notebooks/" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d "{
\"notebook_id\": \"$NOTEBOOK_ID\",
\"notebook_title\": \"Customer Support KB\",
\"user_id\": \"dev-user\",
\"embedding_model\": \"openai/text-embedding-3-small\"
}"
List All Notebooks¶
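Assuming the collection endpoint mirrors the create call above:

```shell
curl "$BASE_URL/notebooks/" \
  -H "Authorization: Bearer $TOKEN"
</imports>
```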
Get a Single Notebook¶
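Assuming the resource path matches the update and delete calls in this section:

```shell
curl "$BASE_URL/notebooks/$NOTEBOOK_ID" \
  -H "Authorization: Bearer $TOKEN"
```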
Get Notebook Status¶
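A sketch; the `/status` sub-path is an assumption inferred from the heading:

```shell
# Path is an assumption; verify against your API schema
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/status" \
  -H "Authorization: Bearer $TOKEN"
```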
Update a Notebook¶
curl -X PATCH "$BASE_URL/notebooks/$NOTEBOOK_ID" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"notebook_title": "Updated Title", "notebook_description": "New description"}'
Delete a Notebook¶
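Assuming the standard resource path used elsewhere in this section:

```shell
curl -X DELETE "$BASE_URL/notebooks/$NOTEBOOK_ID" \
  -H "Authorization: Bearer $TOKEN"
```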
Delete All Notebooks¶
Destructive Operation
This permanently deletes all notebooks, documents, conversations, and vector data. There is no undo.
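Assuming a DELETE on the collection root (path inferred from the single-notebook delete above):

```shell
# Irreversible: removes every notebook and all associated data
curl -X DELETE "$BASE_URL/notebooks/" \
  -H "Authorization: Bearer $TOKEN"
```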
2. Document Pipeline¶
Upload Files¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/upload" \
-H "Authorization: Bearer $TOKEN" \
-F "files=@handbook.pdf" \
-F "files=@faq.docx"
List Storage Sources¶
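A sketch; the `/sources` sub-path is an assumption based on the heading:

```shell
# Path is an assumption; verify against your API schema
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/sources" \
  -H "Authorization: Bearer $TOKEN"
```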
Start Ingestion¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/ingest" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d "{
\"files\": [
{
\"file_id\": \"$FILE_ID\",
\"file_name\": \"handbook.pdf\",
\"file_path\": \"$NOTEBOOK_ID/$FILE_ID/handbook.pdf\"
}
],
\"settings\": {
\"parser\": \"Docling Parser\",
\"chunking_strategy\": \"Recursive Chunking\",
\"chunk_size\": 1000,
\"chunk_overlap\": 200
},
\"notebook_name\": \"Customer Support KB\"
}"
Poll Ingestion Progress¶
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/$FILE_ID/stage" \
-H "Authorization: Bearer $TOKEN"
List All Documents¶
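Assuming the documents collection root, consistent with the sub-paths used elsewhere in this section:

```shell
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/documents" \
  -H "Authorization: Bearer $TOKEN"
```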
Get Ingestion Settings for a File¶
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/settings?file_id=$FILE_ID" \
-H "Authorization: Bearer $TOKEN"
Get Context State¶
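A sketch; the `/context` sub-path is a guess based on the heading:

```shell
# Path is an assumption; verify against your API schema
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/context" \
  -H "Authorization: Bearer $TOKEN"
```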
List Ingestion Errors¶
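A sketch; the `/errors` sub-path is a guess based on the heading:

```shell
# Path is an assumption; verify against your API schema
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/errors" \
  -H "Authorization: Bearer $TOKEN"
```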
Re-Ingest a Single File¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/$FILE_ID/reingest" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"settings": {"parser": "Mistral OCR", "chunk_size": 800}}'
Batch Re-Ingest¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/reingest-batch" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d "{\"file_ids\": [\"$FILE_ID_1\", \"$FILE_ID_2\"], \"settings\": {\"parser\": \"Docling Parser\"}}"
Delete a Document¶
curl -X DELETE "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/$FILE_ID" \
-H "Authorization: Bearer $TOKEN"
Batch Delete Documents¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/delete-batch" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d "{\"file_ids\": [\"$FILE_ID_1\", \"$FILE_ID_2\"]}"
Mark a File as Errored¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/$FILE_ID/mark-error" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"error_message": "Manual abort: file too large", "error_stage": "upload"}'
3. Chat and Conversations¶
Create a Conversation¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"title": "Support Chat", "chat_mode": "rag"}'
List Conversations¶
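Same endpoint as the archived variant below, without the query parameter:

```shell
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations" \
  -H "Authorization: Bearer $TOKEN"
```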
List Conversations (Include Archived)¶
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations?include_archived=true" \
-H "Authorization: Bearer $TOKEN"
Send a RAG Message¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations/$CONV_ID/messages" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{
"content": "What is the refund policy?",
"chat_mode": "rag",
"strategy_id": "fusion",
"persona": "professional",
"language": "en"
}'
Get Message History¶
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations/$CONV_ID/messages" \
-H "Authorization: Bearer $TOKEN"
Update a Conversation¶
curl -X PUT "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations/$CONV_ID" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"title": "Renamed Chat"}'
Pin / Unpin a Conversation¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations/$CONV_ID/pin" \
-H "Authorization: Bearer $TOKEN"
Submit Feedback¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/messages/$MSG_ID/feedback" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"is_positive": true, "feedback_text": "Very helpful answer"}'
Remove Feedback¶
curl -X DELETE "$BASE_URL/notebooks/$NOTEBOOK_ID/messages/$MSG_ID/feedback" \
-H "Authorization: Bearer $TOKEN"
Clear Chat History¶
curl -X DELETE "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations/$CONV_ID/messages" \
-H "Authorization: Bearer $TOKEN"
Delete a Conversation¶
curl -X DELETE "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations/$CONV_ID" \
-H "Authorization: Bearer $TOKEN"
Delete All Conversations¶
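Assuming a DELETE on the conversations collection root (inferred from the single-conversation delete above):

```shell
curl -X DELETE "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations" \
  -H "Authorization: Bearer $TOKEN"
```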
4. Search and Retrieval¶
List Available Strategies¶
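A sketch; the `/strategies` sub-path is an assumption inferred from the retrieval paths below:

```shell
# Path is an assumption; verify against your API schema
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/retrieval/strategies" \
  -H "Authorization: Bearer $TOKEN"
```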
Execute a Search¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/retrieval/retrieve" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{
"query": "What is the refund policy?",
"strategy_id": "fusion",
"top_k": 10,
"full_text_weight": 1.0,
"semantic_weight": 1.0,
"rrf_k": 60
}'
Save Search to History¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/retrieval/history" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{
"mode": "single",
"strategy_id": "fusion",
"strategy_name": "Fusion (RRF)",
"user_query": "What is the refund policy?",
"total_results": 5,
"execution_time_ms": 450
}'
List Search History¶
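Assuming a GET on the same history path used by the save and clear calls in this section:

```shell
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/retrieval/history" \
  -H "Authorization: Bearer $TOKEN"
```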
Get History Entry Detail¶
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/retrieval/history/$ENTRY_ID" \
-H "Authorization: Bearer $TOKEN"
Get Compare Group¶
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/retrieval/history/compare/$GROUP_ID" \
-H "Authorization: Bearer $TOKEN"
Delete History Entry¶
curl -X DELETE "$BASE_URL/notebooks/$NOTEBOOK_ID/retrieval/history/$ENTRY_ID" \
-H "Authorization: Bearer $TOKEN"
Delete Compare Group¶
curl -X DELETE "$BASE_URL/notebooks/$NOTEBOOK_ID/retrieval/history/compare/$GROUP_ID" \
-H "Authorization: Bearer $TOKEN"
Clear All History¶
curl -X DELETE "$BASE_URL/notebooks/$NOTEBOOK_ID/retrieval/history" \
-H "Authorization: Bearer $TOKEN"
5. AI Enhancement¶
List Files for Enhancement¶
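Assuming the files collection root, inferred from the per-file chunk listing below:

```shell
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/files" \
  -H "Authorization: Bearer $TOKEN"
```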
List Chunks for a File¶
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/files/$FILE_ID/chunks?limit=50&offset=0" \
-H "Authorization: Bearer $TOKEN"
Get Chunk Detail¶
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/chunks/$CHUNK_ID" \
-H "Authorization: Bearer $TOKEN"
Get Enhancement Counts¶
# Notebook-wide
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/count" \
-H "Authorization: Bearer $TOKEN"
# Per file
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/count?file_id=$FILE_ID" \
-H "Authorization: Bearer $TOKEN"
Start Enhancement¶
# By file IDs
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d "{\"file_ids\": [\"$FILE_ID\"]}"
# By chunk IDs
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"chunk_ids": ["chunk-1234", "chunk-5678"]}'
Poll Enhancement Status¶
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/status?file_id=$FILE_ID" \
-H "Authorization: Bearer $TOKEN"
Publish Enhanced Chunks (File-Level)¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/publish" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d "{
\"file_id\": \"$FILE_ID\",
\"file_name\": \"handbook.pdf\",
\"notebook_title\": \"Customer Support KB\"
}"
Publish Enhanced Chunks (Chunk-Level)¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/publish-chunks" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d "{
\"chunk_ids\": [\"chunk-1234\", \"chunk-5678\"],
\"file_id\": \"$FILE_ID\",
\"file_name\": \"handbook.pdf\",
\"notebook_title\": \"Customer Support KB\"
}"
Reset Failed Chunks¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/reset" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d "{\"file_id\": \"$FILE_ID\", \"trigger_enhancement\": true}"
Backfill Enhancement Table¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/populate" \
-H "Authorization: Bearer $TOKEN"
Repair Chunk Metadata¶
# All files
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/repair-metadata" \
-H "Authorization: Bearer $TOKEN"
# Specific file
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/enhance/repair-metadata?file_id=$FILE_ID" \
-H "Authorization: Bearer $TOKEN"
6. Health and Monitoring¶
Run Health Check¶
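A sketch; both the `/health` path and the GET method are assumptions based on the heading:

```shell
# Path and method are assumptions; verify against your API schema
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/health" \
  -H "Authorization: Bearer $TOKEN"
```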
Clean Up Duplicates¶
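A sketch; the `/health/cleanup-duplicates` path is a guess based on the heading:

```shell
# Path is an assumption; verify against your API schema
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/health/cleanup-duplicates" \
  -H "Authorization: Bearer $TOKEN"
```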
7. Settings and Admin¶
Get Notebook Settings¶
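Assuming a GET on the same path as the update call below:

```shell
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/settings/" \
  -H "Authorization: Bearer $TOKEN"
```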
Update Notebook Settings¶
curl -X PUT "$BASE_URL/notebooks/$NOTEBOOK_ID/settings/" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{
"inference_provider": "openrouter",
"inference_model": "openai/gpt-4o-mini",
"inference_temperature": 0.4,
"judge_enabled": true,
"active_strategy_id": "fusion"
}'
Get Default System Prompts¶
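A sketch; the `/settings/prompts` path is a guess based on the heading:

```shell
# Path is an assumption; verify against your API schema
curl "$BASE_URL/settings/prompts" \
  -H "Authorization: Bearer $TOKEN"
```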
List API Key Status¶
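Assuming a GET on the same path as the save call below:

```shell
curl "$BASE_URL/settings/api-keys" \
  -H "Authorization: Bearer $TOKEN"
```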
Save API Keys¶
curl -X PUT "$BASE_URL/settings/api-keys" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"keys": {"openrouter_api_key": "sk-or-v1-abc123"}}'
Delete an API Key¶
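A sketch; addressing the key by service name in the path is an assumption (service names match those in the test call below):

```shell
# Service-name-in-path is an assumption; verify against your API schema
curl -X DELETE "$BASE_URL/settings/api-keys/openrouter" \
  -H "Authorization: Bearer $TOKEN"
```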
Test an API Key¶
curl -X POST "$BASE_URL/settings/api-keys/test" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"service": "openrouter", "api_key": "sk-or-v1-abc123"}'
Get Storage Provider¶
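Assuming a GET on the same path as the set call below:

```shell
curl "$BASE_URL/settings/storage" \
  -H "Authorization: Bearer $TOKEN"
```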
Set Storage Provider¶
curl -X PUT "$BASE_URL/settings/storage" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"provider": "supabase"}'
Test S3 Connection¶
curl -X POST "$BASE_URL/settings/storage/test-s3" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{
"endpoint": "https://s3.example.com",
"access_key": "AKIAIOSFODNN7EXAMPLE",
"secret_key": "wJalrXUtnFEMI...",
"bucket": "my-bucket"
}'
Test Local Storage¶
curl -X POST "$BASE_URL/settings/storage/test-local" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"path": "/data/storage"}'
Get Database Type¶
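Assuming a GET on the same path as the switch call below:

```shell
curl "$BASE_URL/settings/db-type" \
  -H "Authorization: Bearer $TOKEN"
```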
Switch Database Type¶
curl -X PUT "$BASE_URL/settings/db-type" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"db_type": "local"}'
Test Local Database Connection¶
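A sketch; the `test-local` sub-path mirrors the storage test endpoints above but is an assumption:

```shell
# Path is an assumption; verify against your API schema
curl -X POST "$BASE_URL/settings/db-type/test-local" \
  -H "Authorization: Bearer $TOKEN"
```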
List LLM Models (OpenRouter)¶
List Embedding Models¶
List Ollama Models¶
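The three model-listing endpoints above likely share a common prefix; all paths below are assumptions:

```shell
# Paths are assumptions; verify against your API schema
# LLM models available via OpenRouter
curl "$BASE_URL/settings/models/llm" -H "Authorization: Bearer $TOKEN"
# Embedding models
curl "$BASE_URL/settings/models/embedding" -H "Authorization: Bearer $TOKEN"
# Models served by a local Ollama instance
curl "$BASE_URL/settings/models/ollama" -H "Authorization: Bearer $TOKEN"
```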
8. Sharing and Collaboration¶
Create an Invite Link¶
curl -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/invites" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"expires_at": "2026-12-31T00:00:00Z"}'
List Invites¶
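Assuming a GET on the same path as the create call above:

```shell
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/invites" \
  -H "Authorization: Bearer $TOKEN"
```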
Revoke an Invite¶
curl -X DELETE "$BASE_URL/notebooks/$NOTEBOOK_ID/invites/$INVITE_CODE" \
-H "Authorization: Bearer $TOKEN"
Get Invite Info (Public)¶
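A sketch; the top-level `/invites/{code}` path is an assumption, and since the heading says public, no auth header is sent:

```shell
# Path is an assumption; no Authorization header since the endpoint is public
curl "$BASE_URL/invites/$INVITE_CODE"
```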
Redeem an Invite¶
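A sketch; the `/redeem` sub-path is an assumption, and redeeming presumably requires the caller's own token:

```shell
# Path is an assumption; verify against your API schema
curl -X POST "$BASE_URL/invites/$INVITE_CODE/redeem" \
  -H "Authorization: Bearer $TOKEN"
```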
List Users with Access¶
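Assuming a GET on the same `/access` path as the revoke call below:

```shell
curl "$BASE_URL/notebooks/$NOTEBOOK_ID/access" \
  -H "Authorization: Bearer $TOKEN"
```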
Revoke User Access¶
curl -X DELETE "$BASE_URL/notebooks/$NOTEBOOK_ID/access/$USER_ID" \
-H "Authorization: Bearer $TOKEN"
9. Authentication¶
Check Auth Configuration¶
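A sketch; the `/auth/config` path is a guess based on the heading, and a config probe is likely unauthenticated:

```shell
# Path is an assumption; verify against your API schema
curl "$BASE_URL/auth/config"
```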
Shell Script: Full Pipeline¶
A complete shell script that runs the entire pipeline from notebook creation to RAG chat:
#!/usr/bin/env bash
set -euo pipefail
BASE_URL="http://localhost:8000/api"
TOKEN="dev"
AUTH="Authorization: Bearer $TOKEN"
CT="Content-Type: application/json"
# 1. Create notebook
NOTEBOOK_ID=$(uuidgen | tr '[:upper:]' '[:lower:]')
echo "Creating notebook: $NOTEBOOK_ID"
curl -s -X POST "$BASE_URL/notebooks/" \
-H "$AUTH" -H "$CT" \
-d "{\"notebook_id\":\"$NOTEBOOK_ID\",\"notebook_title\":\"Shell Test\",\"user_id\":\"dev-user\"}" \
| jq -r '.data.notebook_title'
# 2. Upload
echo "Uploading file..."
UPLOAD=$(curl -s -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/upload" \
-H "$AUTH" -F "files=@handbook.pdf")
FILE_ID=$(echo "$UPLOAD" | jq -r '.data[0].file_id')
FILE_PATH=$(echo "$UPLOAD" | jq -r '.data[0].storage_path')
echo "File ID: $FILE_ID"
# 3. Ingest
echo "Starting ingestion..."
curl -s -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/ingest" \
-H "$AUTH" -H "$CT" \
-d "{\"files\":[{\"file_id\":\"$FILE_ID\",\"file_name\":\"handbook.pdf\",\"file_path\":\"$FILE_PATH\"}],\"notebook_name\":\"Shell Test\"}" \
| jq '.data.jobs | length'
# 4. Poll
echo "Polling ingestion status..."
while true; do
STATUS=$(curl -s "$BASE_URL/notebooks/$NOTEBOOK_ID/documents/$FILE_ID/stage" \
-H "$AUTH" | jq -r '.data.status')
echo " Status: $STATUS"
# An if-statement keeps 'set -e' from aborting the script while the status is still pending
if [ "$STATUS" = "success" ] || [ "$STATUS" = "error" ]; then break; fi
sleep 3
done
# 5. Create conversation
echo "Creating conversation..."
CONV_ID=$(curl -s -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations" \
-H "$AUTH" -H "$CT" \
-d '{"title":"Shell Chat"}' | jq -r '.data.conversation_id')
echo "Conversation: $CONV_ID"
# 6. Chat
echo "Sending message..."
curl -s -X POST "$BASE_URL/notebooks/$NOTEBOOK_ID/conversations/$CONV_ID/messages" \
-H "$AUTH" -H "$CT" \
-d '{"content":"Summarize the document","strategy_id":"fusion"}' \
| jq -r '.data.assistant_message.content' | head -c 200
echo -e "\n\nDone."