mc-agent-toolkit connection-auth-rules
Builds a Connection Auth Rules config for a Monte Carlo connection type. Fetches live connector schemas and transform steps from the apollo-agent repo.
git clone https://github.com/monte-carlo-data/mc-agent-toolkit
T=$(mktemp -d) && git clone --depth=1 https://github.com/monte-carlo-data/mc-agent-toolkit "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/connection-auth-rules" ~/.claude/skills/monte-carlo-data-mc-agent-toolkit-connection-auth-rules && rm -rf "$T"
skills/connection-auth-rules/SKILL.md

Connection Auth Rules Builder
Use this skill when the user wants to build a Connection Auth Rules config (stored as ctp_config) for a Monte Carlo connection. The config is stored on the Connection object in the monolith and tells the Apollo agent how to transform flat credentials into the driver-specific connect_args format.
When to activate this skill
Activate when the user:
- Asks to create, build, or generate a Connection Auth Rules config
- Asks what fields are needed for a connection type's Connection Auth Rules
- Wants to customize credential transformation for a connection
- Asks about CtpConfig, MapperConfig, or TransformStep
- Says things like "help me write Connection Auth Rules for X", "what's the connection auth rules format for X"
When NOT to activate this skill
Do not activate when the user is:
- Creating monitors (use the monitor-creation skill)
- Investigating data incidents (use the analyze-root-cause skill)
- Setting up a connection in the UI (this skill builds the JSON config, not UI flows)
Step 1 — List available connection types
Locate the companion script with Bash:
find -L ~/.claude . -name fetch_schema.py -path "*/connection-auth-rules/*" 2>/dev/null | head -1
Then run it:
python3 <script_path> --list
The script outputs JSON. Parse
result.connectors — each entry has a name field. Present the names to the user and ask which connection type they want to build a config for.
If the script fails: Show the error output and offer to retry. Do not proceed until you have the connector list.
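The list-and-parse flow above can be wrapped in a small helper. This is a sketch: list_connectors is an illustrative name, and the JSON shape simply follows the result.connectors description above.

```python
import json
import subprocess


def list_connectors(script_path: str) -> list[str]:
    """Run fetch_schema.py with --list and return connector names.

    Assumes the script prints JSON shaped like
    {"result": {"connectors": [{"name": "..."}]}} as described above.
    """
    proc = subprocess.run(
        ["python3", script_path, "--list"],
        capture_output=True, text=True, check=True,
    )
    payload = json.loads(proc.stdout)
    return [c["name"] for c in payload["result"]["connectors"]]
```

A check=True failure surfaces the script's stderr, which matches the guidance to show the error and retry rather than proceed without the list.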
Step 2 — Fetch the connector schema
Once the user selects a connection type, run the script with that connector name:
python3 <script_path> --connector <name>
The script outputs JSON. Parse result.schema:
- output_keys — the driver-level connect_args keys the mapper must produce (from the connector's TypedDict)
- default_field_map — the existing default mapping (credential field → Jinja2 template)
- default_steps — any default transform steps already configured
Present a summary to the user:
- The output keys
- The default mapper field_map entries
- Any existing steps with their types
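Turning a parsed result.schema into that summary might look like the sketch below; summarize_schema and the sample key values are illustrative, and the real schema may carry more detail.

```python
def summarize_schema(schema: dict) -> str:
    """Render the Step 2 summary from a parsed result.schema dict.

    Assumes the output_keys / default_field_map / default_steps keys
    described above.
    """
    lines = ["Output keys: " + ", ".join(schema["output_keys"])]
    lines.append("Default field_map:")
    for key, template in schema["default_field_map"].items():
        lines.append(f"  {key}: {template}")
    steps = schema.get("default_steps") or []
    if steps:
        lines.append("Default steps: " + ", ".join(s["type"] for s in steps))
    else:
        lines.append("Default steps: none")
    return "\n".join(lines)
```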
Step 3 — Optionally fetch available transform steps
If the connector's default config (from Step 2) already includes steps, or if the user indicates they need custom transform steps, run:
python3 <script_path> --connector <name> --transforms
Parse result.transforms — each entry has:
- name — the step type string used in "type"
- step_input — fields the step reads from the pipeline state
- step_output — derived fields the step writes, referenceable as {{ derived.<key> }} in the mapper
- step_field_map — typical mapper entry to wire the step's output into connect_args
Present the available steps with their full contracts (input, output, and field_map hint).
If the script fails: Tell the user and offer to retry. You can continue without step data — just describe steps as unknown and ask the user to specify them manually.
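Presenting those contracts could be as simple as the sketch below. describe_transforms is an illustrative helper that assumes the name / step_input / step_output / step_field_map keys described above.

```python
def describe_transforms(transforms: list[dict]) -> str:
    """Format result.transforms entries with their full contracts."""
    blocks = []
    for t in transforms:
        blocks.append(
            f"{t['name']}:\n"
            # sorted() works whether step_input/step_output arrive as
            # dicts (keys are sorted) or plain lists of field names
            f"  reads:  {sorted(t.get('step_input', {}))}\n"
            f"  writes: {sorted(t.get('step_output', {}))}\n"
            f"  field_map hint: {t.get('step_field_map', {})}"
        )
    return "\n".join(blocks)
```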
Step 4 — Build the mapper
Walk the user through each output key in the TypedDict:
- Show the default template from the connector's MapperConfig (if one exists).
- Ask if they want to keep the default or customize it.
- For custom values, help the user write a Jinja2 template expression.
Jinja2 template help
The template context has two namespaces:
- raw — the flat credential dict as received. Use {{ raw.field_name }} to reference a credential field directly. Example: {{ raw.client_id }}
- derived — fields added by transform steps. Use {{ derived.field_name }} to reference a step's output. Example: {{ derived.private_key_pem }}
Common patterns:
- Simple field reference: "{{ raw.username }}"
- Conditional/default: "{{ raw.port | default('1433') }}"
- Concatenation: "{{ raw.host }}:{{ raw.port }}"
When the user doesn't know their credential field names, remind them these come from the Data Collector's credential dict — the keys are whatever the DC sends for that connection type.
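These patterns can be exercised locally with the jinja2 package — an assumption, since the agent's actual rendering environment is not shown here; the field names below are illustrative.

```python
from jinja2 import Template

raw = {"username": "svc_mc", "host": "db.internal"}  # illustrative credentials
derived = {}  # would be filled by transform steps

# Simple field reference
assert Template("{{ raw.username }}").render(raw=raw, derived=derived) == "svc_mc"

# Conditional/default: raw.port is absent, so the default filter applies
assert Template("{{ raw.port | default('1433') }}").render(raw=raw) == "1433"

# Concatenation of two context fields
combined = Template("{{ raw.host }}:{{ raw.port | default('1433') }}").render(raw=raw)
assert combined == "db.internal:1433"
```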
Step 5 — Configure transform steps (optional)
If the connector needs steps (e.g. decoding a PEM certificate, constructing a derived field), help the user configure each step. A step dict has these fields:
| Field | Required | Description |
|---|---|---|
| type | yes | Step type name (e.g. load_private_key) |
| input | yes | Dict of template strings the step reads (e.g. {"pem": "{{ raw.private_key_pem }}"}) |
| output | yes | Dict mapping the step's logical output names to derived key names (e.g. {"private_key": "private_key_der"}) |
| when | no | Jinja2 boolean expression — step only runs if this evaluates to true (e.g. raw.private_key_pem is defined) |
| field_map | no | Mapper entries contributed only when this step runs — useful for conditional fields |
Walk the user through
type, input, and output for each step. Ask about when if the step should only run under certain credential conditions (e.g. when an optional SSL cert is present).
Steps run in order before the mapper. The mapper can reference step outputs via
{{ derived.<key> }}.
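The steps-then-mapper order can be modeled in a few lines. This is a toy sketch assuming the jinja2 package; run_pipeline, step_impls, and the per-step field_map merge behavior are illustrative, not the Apollo agent's real code.

```python
import jinja2


def run_pipeline(config: dict, raw: dict, step_impls: dict) -> dict:
    """Toy model: steps run in order, then the mapper renders field_map.

    step_impls maps a step "type" to a callable taking the rendered input
    dict and returning the step's logical outputs (assumption for
    illustration).
    """
    env = jinja2.Environment()
    derived: dict = {}

    def ctx() -> dict:
        return {"raw": raw, "derived": derived}

    field_map = dict(config["mapper"]["field_map"])
    for step in config.get("steps", []):
        when = step.get("when")
        if when and not env.compile_expression(when)(**ctx()):
            continue  # conditional step skipped
        rendered_input = {
            k: env.from_string(v).render(**ctx())
            for k, v in step["input"].items()
        }
        outputs = step_impls[step["type"]](rendered_input)
        # "output" maps logical output name -> derived key name
        for logical, derived_key in step["output"].items():
            derived[derived_key] = outputs[logical]
        # per-step field_map entries apply only when the step ran (assumption)
        field_map.update(step.get("field_map", {}))
    # the mapper runs last and can see {{ derived.<key> }}
    return {k: env.from_string(v).render(**ctx()) for k, v in field_map.items()}
```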
Step 6 — Output the final config
Produce the complete Connection Auth Rules as a Python dict (ready to serialize to JSON for storage). This is stored as
ctp_config on the Connection model:
```python
{
    "steps": [
        # each step as a dict, e.g.:
        {
            "type": "load_private_key",
            "input": {"pem": "{{ raw.private_key_pem }}"},
            "output": {"private_key": "private_key_der"},
            # optional: "when": "raw.private_key_pem is defined"
        }
    ],
    "mapper": {
        "field_map": {
            "output_key": "{{ raw.credential_field }}",
            # step output referenced as:
            "private_key": "{{ derived.private_key_der }}",
            # ...
        }
    }
}
```
Also show the equivalent JSON, since this is what gets stored in the monolith's
Connection.ctp_config field and entered in the "Connection auth rules" field in the UI.
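A minimal round trip from the Python dict to the stored JSON form, as a sketch with illustrative values:

```python
import json

# a minimal ctp_config dict (illustrative values)
ctp_config = {
    "steps": [],
    "mapper": {"field_map": {"user": "{{ raw.username }}"}},
}

stored = json.dumps(ctp_config, indent=2)  # what goes in the UI field
assert json.loads(stored) == ctp_config    # lossless round trip
```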
Remind the user that validation happens server-side via
validateConnectionCtpConfig — they should test the config through that mutation (or the Validate button in the UI) after saving it.
Notes
- No in-skill validation. The skill helps construct the config but does not execute or validate it. The user validates via the monolith's validateConnectionCtpConfig GraphQL mutation or the Validate button in the "Connection auth rules" UI section.
- is not None pattern. An empty field_map ({}) is valid — do not treat it as missing. The monolith checks ctp_config is not None, not truthiness.
- Steps are optional. Most simple connectors use steps: []. Only add steps when the user needs credential transformation (e.g. PEM decoding, composite field construction).
- Fetch failures are recoverable. If the GitHub API fetch fails, tell the user exactly what failed and offer to retry. Do not silently fall back to guessed schemas.
- Naming: The user-facing name for this feature is "Connection auth rules". The underlying field and backend model remain ctp_config / CtpConfig.
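The presence check described in the notes amounts to the following sketch; has_connection_auth_rules is an illustrative name, not the monolith's actual function.

```python
def has_connection_auth_rules(ctp_config) -> bool:
    # Explicit None comparison, not truthiness: an empty {} config
    # (or an empty field_map) still counts as configured.
    return ctp_config is not None


assert has_connection_auth_rules({}) is True      # empty but configured
assert has_connection_auth_rules(None) is False   # genuinely missing
```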