Marketplace backend-fastapi
Documentation for the FastAPI backend, endpoints, and dependency injection.
Install
Source · Clone the upstream repo
git clone https://github.com/aiskillstore/marketplace
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/aiskillstore/marketplace "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/abdulsamad94/backend-fastapi" ~/.claude/skills/aiskillstore-marketplace-backend-fastapi && rm -rf "$T"
manifest: skills/abdulsamad94/backend-fastapi/SKILL.md
Backend Architecture (FastAPI)
Overview
The backend is a FastAPI application located in backend/. It powers the chatbot and RAG functionality.
Entry Point
- File: backend/main.py
- Run: uvicorn backend.main:app --reload (or via npm run dev)
- Port: Defaults to 8000.
Endpoints
POST /api/chat
- Purpose: Main RAG chat endpoint.
- Input: ChatRequest (query, history, user_context).
- Process:
  - Embed query.
  - Search Qdrant (search_qdrant).
  - Build prompt (build_rag_prompt).
  - Generate Agent response.
- Output: ChatResponse (answer, contexts).
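The four Process steps above can be sketched as a small pipeline. The real search_qdrant and build_rag_prompt live in backend/utils/helpers.py; the prompt format and function signatures below are assumptions for illustration, with the embedding, retrieval, and Agent calls passed in as stubs.

```python
# Hypothetical sketch of the /api/chat RAG flow; real signatures may differ.

def build_rag_prompt(query: str, contexts: list[str]) -> str:
    """Join retrieved chunks into a grounded prompt (assumed format)."""
    context_block = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(contexts))
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context_block}\n\n"
        f"Question: {query}"
    )


def answer_query(query: str, embed, search_qdrant, generate) -> dict:
    """Run the documented pipeline: embed -> retrieve -> prompt -> generate."""
    vector = embed(query)                        # 1. Embed query.
    contexts = search_qdrant(vector)             # 2. Search Qdrant.
    prompt = build_rag_prompt(query, contexts)   # 3. Build prompt.
    answer = generate(prompt)                    # 4. Generate Agent response.
    return {"answer": answer, "contexts": contexts}  # ChatResponse shape
```

Returning the contexts alongside the answer matches the documented ChatResponse shape and lets the client show which chunks grounded the reply.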
POST /api/ask-selection
- Purpose: Targeted Q&A on selected text.
- Input: AskSelectionRequest (question, selected_text).
- Process:
  - Validates selection length.
  - Builds selection-specific prompt.
  - Uses specific Agent instructions.
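The first two Process steps can be sketched as plain helpers. The 5/4000-character bounds and the prompt wording are assumptions, not the backend's actual values.

```python
# Hypothetical sketch of the /api/ask-selection validation and prompt steps.
MIN_SELECTION, MAX_SELECTION = 5, 4000  # assumed bounds, not from the docs


def validate_selection(selected_text: str) -> str:
    """Strip and bounds-check the selection, raising ValueError if invalid."""
    text = selected_text.strip()
    if not (MIN_SELECTION <= len(text) <= MAX_SELECTION):
        raise ValueError("selected_text length out of bounds")
    return text


def build_selection_prompt(question: str, selected_text: str) -> str:
    """Build the selection-specific prompt (assumed format)."""
    excerpt = validate_selection(selected_text)
    return (
        "Answer about this excerpt only:\n"
        f'"""{excerpt}"""\n\n'
        f"Question: {question}"
    )
```

Validating before prompt construction means a too-short or too-long selection fails fast with a clear error instead of producing a degenerate prompt.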
Dependencies & Utils
- backend/utils/config.py: Qdrant initialization.
- backend/utils/helpers.py: Embedding and prompt-building logic.
- backend/models.py: OpenAI/Gemini client setup.
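The Qdrant initialization in backend/utils/config.py might look like the lazy, cached setup below. This is a sketch, not the actual file: the factory parameter is a hypothetical injection point so the pattern can be exercised without a running Qdrant, and the real module may construct the client differently.

```python
# Hypothetical sketch of backend/utils/config.py's Qdrant initialization:
# build the client once, on first use, from the documented env vars.
import os
from functools import lru_cache


@lru_cache(maxsize=1)
def get_qdrant_client(factory=None):
    """Return a cached Qdrant client built from QDRANT_URL / QDRANT_API_KEY."""
    if factory is None:
        # Deferred import so the module loads even where qdrant-client is absent.
        from qdrant_client import QdrantClient
        factory = QdrantClient
    return factory(
        url=os.environ["QDRANT_URL"],
        api_key=os.environ.get("QDRANT_API_KEY"),
    )
```

Caching with lru_cache keeps a single connection pool per process, and failing on a missing QDRANT_URL at first use surfaces misconfiguration early.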
Environment Variables
- GEMINI_API_KEY: For LLM and Embeddings.
- QDRANT_URL, QDRANT_API_KEY: Vector DB connection.
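A small startup check over the variables listed above can catch misconfiguration before the first request. The helper below is hypothetical, not part of the backend; it only names the three documented variables.

```python
# Hypothetical startup check for the documented environment variables.
import os

REQUIRED_ENV = ("GEMINI_API_KEY", "QDRANT_URL", "QDRANT_API_KEY")


def missing_env(env=os.environ) -> list[str]:
    """Return the required variables that are unset or empty."""
    return [name for name in REQUIRED_ENV if not env.get(name)]
```

Calling this at import time in backend/main.py (and raising if the list is non-empty) turns a late, cryptic connection error into an immediate, named failure.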