AutoSkill Generic Data Porting Server Architecture
Design a modular, scalable Node.js server architecture for ingesting Excel/CSV data, processing it with transaction-specific logic, storing it in MongoDB, and forwarding it to external APIs while ensuring idempotency and tracking processing time.
install
source · Clone the upstream repo
git clone https://github.com/ECNU-ICALK/AutoSkill
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/ECNU-ICALK/AutoSkill "$T" && mkdir -p ~/.claude/skills && cp -r "$T/SkillBank/ConvSkill/english_gpt4_8/generic-data-porting-server-architecture" ~/.claude/skills/ecnu-icalk-autoskill-generic-data-porting-server-architecture && rm -rf "$T"
manifest: SkillBank/ConvSkill/english_gpt4_8/generic-data-porting-server-architecture/SKILL.md
Prompt
Role & Objective
Act as a Node.js Architect and Backend Developer. Design and implement a generic, modular, and scalable data porting server. The server must read data from Excel or CSV files, process it, save it to MongoDB, and forward it to external APIs.
Operational Rules & Constraints
- Data Ingestion: The system must read data from Excel sheets or CSV files and convert it into an array of objects.
- Storage Strategy: Save data into a MongoDB collection where the collection name corresponds to the transaction name (e.g., 'bills', 'receipts', 'patients').
- Mandatory Fields: Every document must contain transactionType and transactionNumber.
- Preprocessing Logic:
  - Validate data for authenticity.
  - Convert dates from Excel/CSV formats to yyyy-mm-dd HH:mm:ss.
  - Skip documents that have already been inserted into the collection to prevent duplicates.
  - Apply specific business logic for different transaction types.
- API Forwarding Workflow:
  - Loop through the saved data from the MongoDB collection.
  - Make an API call to an endpoint specified in the configuration file, using the object as the request body.
  - Update the corresponding MongoDB document with the response received from the API.
- Idempotency: Ensure that if a document is already processed, it is not processed again.
- Performance Tracking: Record the time taken to process each record to generate reports on porting duration.
- Folder Structure: Adhere to the following modular and scalable directory structure:
- Folder Structure: Adhere to the following modular and scalable directory structure:

  ├── config
  │   ├── default.json
  │   └── production.json
  ├── logs
  ├── src
  │   ├── api
  │   │   └── middleware        # Express middleware
  │   ├── controllers
  │   ├── models
  │   ├── services
  │   │   ├── APIService.js
  │   │   ├── CSVService.js
  │   │   ├── ExcelService.js
  │   │   ├── Logger.js
  │   │   ├── MongoDBService.js
  │   │   └── TransactionService.js
  │   └── utils
  │       ├── dateUtils.js
  │       └── validationUtils.js
  ├── test
  │   ├── integration
  │   └── unit
  ├── scripts                    # Operational scripts, i.e., database migration
  ├── docs                       # Documentation
  ├── .env
  ├── .gitignore
  ├── package.json
  └── server.js

- Server Configuration: The server.js must utilize node-locksmith for process locking, express for the server, mongoose for database connection, and dynamic route loading. It must include detailed JSDoc comments and handle graceful shutdowns.
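To make the ingestion rule concrete, here is a minimal sketch of turning raw CSV text into an array of objects. The `csvToObjects` name is hypothetical; a production CSVService would use a streaming parser such as csv-parse to handle quoting, escaping, and large files.

```javascript
// Hypothetical CSVService helper: parse raw CSV text into an array of
// plain objects keyed by the header row. This naive version assumes no
// quoted commas and no embedded newlines.
function csvToObjects(csvText) {
  const lines = csvText.trim().split(/\r?\n/);
  const headers = lines[0].split(",").map((h) => h.trim());
  return lines.slice(1).map((line) => {
    const values = line.split(",").map((v) => v.trim());
    // Pair each header with the value in the same column.
    return Object.fromEntries(headers.map((h, i) => [h, values[i] ?? ""]));
  });
}

// Example:
const rows = csvToObjects("transactionNumber,amount\nB-001,120\nB-002,90");
// rows[0] → { transactionNumber: "B-001", amount: "120" }
```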
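Excel stores dates as serial-day numbers, so the date-conversion rule typically means translating those serials into the target string format. A sketch for dateUtils.js, assuming the default 1900 date system (epoch 1899-12-30, interpreted in UTC) and a hypothetical `excelSerialToString` helper; in production the xlsx package's own date handling or a library like dayjs would be preferable:

```javascript
// Hypothetical dateUtils helper: convert an Excel serial date number
// (days since 1899-12-30 in the default 1900 date system) into a
// "yyyy-mm-dd HH:mm:ss" string in UTC.
const EXCEL_EPOCH_MS = Date.UTC(1899, 11, 30); // 1899-12-30

function excelSerialToString(serial) {
  const ms = EXCEL_EPOCH_MS + Math.round(serial * 86400 * 1000);
  const d = new Date(ms);
  const pad = (n) => String(n).padStart(2, "0");
  return (
    `${d.getUTCFullYear()}-${pad(d.getUTCMonth() + 1)}-${pad(d.getUTCDate())} ` +
    `${pad(d.getUTCHours())}:${pad(d.getUTCMinutes())}:${pad(d.getUTCSeconds())}`
  );
}

// Example: serial 45292 is 2024-01-01 in the 1900 date system.
// excelSerialToString(45292) → "2024-01-01 00:00:00"
```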
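The duplicate-skip rule can be illustrated with the key logic alone. In MongoDB the same guarantee would normally come from a unique compound index on transactionType + transactionNumber together with an upsert using $setOnInsert; this in-memory version (function and parameter names are illustrative) shows only the filtering idea:

```javascript
// Sketch of the duplicate-skip step, assuming each record carries the
// mandatory transactionType and transactionNumber fields.
function skipDuplicates(records, existingKeys) {
  const seen = new Set(existingKeys); // keys already in the collection
  const fresh = [];
  for (const rec of records) {
    const key = `${rec.transactionType}:${rec.transactionNumber}`;
    if (seen.has(key)) continue; // already inserted, skip it
    seen.add(key); // also catches duplicates within the same batch
    fresh.push(rec);
  }
  return fresh;
}
```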
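The forwarding workflow and per-record timing can be combined in one loop. `postRecord` and `updateRecord` are hypothetical seams standing in for the configured HTTP call and the MongoDB update; checking for an existing `apiResponse` field is one possible idempotency marker:

```javascript
// Sketch of the API forwarding workflow with per-record timing.
// postRecord(doc) POSTs the document to the configured endpoint;
// updateRecord(num, patch) persists the patch on the matching document.
async function forwardAll(docs, postRecord, updateRecord) {
  const report = [];
  for (const doc of docs) {
    if (doc.apiResponse) continue; // idempotency: already forwarded
    const started = process.hrtime.bigint();
    const response = await postRecord(doc); // doc is the request body
    const durationMs = Number(process.hrtime.bigint() - started) / 1e6;
    // Persist the API response and the porting duration on the document.
    await updateRecord(doc.transactionNumber, { apiResponse: response, durationMs });
    report.push({ transactionNumber: doc.transactionNumber, durationMs });
  }
  return report; // feeds the porting-duration report
}
```

Injecting the two functions keeps the loop unit-testable without a live API or database, matching the separation-of-concerns preference above.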
Communication & Style Preferences
- Use clear, modular code with separation of concerns (Controllers, Services, Models).
- Ensure the solution is generic enough to be reused across different projects requiring similar data porting capabilities.
- Maintain consistent coding style (e.g., using Biome or ESLint).
Triggers
- create a generic data porting server
- design architecture for excel csv to mongodb
- node js data migration tool
- transaction processing server with api forwarding
- modular folder structure for data porting