skilllibrary · gcp
Provision and operate GCP services — deploy to Cloud Run, query BigQuery datasets, configure IAM roles and service accounts, manage GCS buckets, wire Pub/Sub topics, set up Cloud Build pipelines, and push to Artifact Registry. Use when tasks involve gcloud CLI, GCP console configuration, or GCP service integration. Do not use for Firebase-specific features (prefer firebase skill) or AWS/Azure services.
install
source · Clone the upstream repo
git clone https://github.com/merceralex397-collab/skilllibrary
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/merceralex397-collab/skilllibrary "$T" && mkdir -p ~/.claude/skills && cp -r "$T/14-cloud-platform-devops/gcp" ~/.claude/skills/merceralex397-collab-skilllibrary-gcp && rm -rf "$T"
manifest:
14-cloud-platform-devops/gcp/SKILL.md · source content
Purpose
Provision, configure, and operate Google Cloud Platform services — deploy containerized services to Cloud Run, query and manage BigQuery datasets, configure IAM roles and service accounts, manage GCS buckets, wire Pub/Sub topics and subscriptions, set up Cloud Build CI/CD pipelines, and push container images to Artifact Registry.
When to use this skill
- Deploying a container to Cloud Run with `gcloud run deploy`.
- Creating or querying BigQuery datasets, tables, or scheduled queries.
- Configuring IAM roles, service accounts, and workload identity federation.
- Creating GCS buckets, setting lifecycle policies, or configuring public access.
- Setting up Pub/Sub topics, subscriptions, and push/pull configurations.
- Writing `cloudbuild.yaml` for Cloud Build CI/CD pipelines.
- Pushing or pulling container images from Artifact Registry.
- Configuring VPC connectors, Cloud NAT, or private service connections.
- Using `gcloud` CLI commands for any GCP resource provisioning.
Do not use this skill when
- The task involves Firebase-specific features (Auth, Firestore rules, Hosting, Emulators) — prefer `firebase`.
- The target is AWS or Azure — prefer `aws` or the relevant Azure skill.
- The task is about generic deployment strategy (blue-green, canary) — prefer `cloud-deploy`.
- Infrastructure is managed via Terraform — prefer `terraform-iac` for HCL authoring, but use this skill for `gcloud` validation and GCP-specific decisions.
Operating procedure
- Identify the GCP project and region. Confirm the active project with `gcloud config get-value project`. Set the target region with `gcloud config set run/region <region>`. Verify billing is enabled on the project.
- Enable required APIs. Run `gcloud services enable <api>` for each service needed (e.g., `run.googleapis.com`, `bigquery.googleapis.com`, `pubsub.googleapis.com`, `cloudbuild.googleapis.com`, `artifactregistry.googleapis.com`).
- Configure IAM. Create a dedicated service account: `gcloud iam service-accounts create <name> --display-name="<description>"`. Grant the minimum required roles: `gcloud projects add-iam-policy-binding <project> --member="serviceAccount:<sa>" --role="roles/<role>"`. Prefer predefined roles over primitive roles (Viewer/Editor/Owner).
- Deploy to Cloud Run. Build the container: `gcloud builds submit --tag <region>-docker.pkg.dev/<project>/<repo>/<image>:<tag>`. Deploy: `gcloud run deploy <service> --image=<image> --region=<region> --service-account=<sa> --allow-unauthenticated` (or `--no-allow-unauthenticated` for private services). Set environment variables with `--set-env-vars` and secrets with `--set-secrets`. An end-to-end deployment sketch follows this list.
- Set up Artifact Registry. Create a repository: `gcloud artifacts repositories create <repo> --repository-format=docker --location=<region>`. Configure Docker auth: `gcloud auth configure-docker <region>-docker.pkg.dev`.
- Configure BigQuery. Create a dataset: `bq mk --dataset <project>:<dataset>`. Create tables with schema: `bq mk --table <dataset>.<table> schema.json`. Load data: `bq load --source_format=CSV <dataset>.<table> gs://<bucket>/<file>`. Schedule queries: `bq query --schedule='every 24 hours' --display_name="<name>" '<SQL>'`.
- Manage GCS buckets. Create a bucket: `gcloud storage buckets create gs://<bucket> --location=<region> --uniform-bucket-level-access`. Set lifecycle rules: create a `lifecycle.json` with age-based deletion or storage class transitions, then apply it with `gcloud storage buckets update gs://<bucket> --lifecycle-file=lifecycle.json`. A combined BigQuery/GCS sketch follows this list.
- Wire Pub/Sub. Create a topic: `gcloud pubsub topics create <topic>`. Create a push subscription: `gcloud pubsub subscriptions create <sub> --topic=<topic> --push-endpoint=<url> --ack-deadline=60`. Create a pull subscription: `gcloud pubsub subscriptions create <sub> --topic=<topic> --ack-deadline=60`. Set a dead-letter topic with `--dead-letter-topic=<dlq>` and `--max-delivery-attempts=5` (see the Pub/Sub sketch after this list).
- Set up Cloud Build. Write `cloudbuild.yaml` with steps to build the container, push to Artifact Registry, and deploy to Cloud Run (see the Cloud Build sketch after this list). Configure build triggers: `gcloud builds triggers create github --repo-name=<repo> --branch-pattern="^main$" --build-config=cloudbuild.yaml`.
- Verify the deployment. For Cloud Run: `gcloud run services describe <service> --region=<region>` and `curl <service-url>`. For BigQuery: run a test query. For Pub/Sub: publish a test message with `gcloud pubsub topics publish <topic> --message="test"` and verify delivery.
- Set up monitoring. Configure uptime checks in Cloud Monitoring for Cloud Run URLs. Set alert policies for error rate (>1%), latency (p99 >2s), and instance count. Enable Cloud Logging and create log-based metrics for application errors.
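The commands below are a minimal end-to-end sketch of the enable-APIs, service-account, Artifact Registry, build, and deploy flow described above. The project ID `my-project`, region `us-central1`, repository `app-images`, service name `my-app`, and the `roles/secretmanager.secretAccessor` grant are illustrative assumptions, not values from this skill; substitute your own.

```bash
#!/usr/bin/env bash
set -euo pipefail

# Placeholder values -- adjust for your project.
PROJECT=my-project
REGION=us-central1
REPO=app-images
SERVICE=my-app
SA_NAME=my-app-runner
SA="${SA_NAME}@${PROJECT}.iam.gserviceaccount.com"
IMAGE="${REGION}-docker.pkg.dev/${PROJECT}/${REPO}/${SERVICE}:v1"

# Enable the APIs this flow depends on (safe to re-run).
gcloud services enable run.googleapis.com artifactregistry.googleapis.com \
  cloudbuild.googleapis.com --project "$PROJECT"

# Dedicated runtime service account with a narrowly scoped example role.
gcloud iam service-accounts create "$SA_NAME" \
  --project "$PROJECT" --display-name "Runtime SA for ${SERVICE}"
gcloud projects add-iam-policy-binding "$PROJECT" \
  --member "serviceAccount:${SA}" --role "roles/secretmanager.secretAccessor"

# Docker repository in Artifact Registry plus local auth for pushes.
gcloud artifacts repositories create "$REPO" \
  --project "$PROJECT" --repository-format docker --location "$REGION"
gcloud auth configure-docker "${REGION}-docker.pkg.dev"

# Build remotely with Cloud Build, then deploy a private Cloud Run service.
gcloud builds submit --project "$PROJECT" --tag "$IMAGE" .
gcloud run deploy "$SERVICE" \
  --project "$PROJECT" --region "$REGION" --image "$IMAGE" \
  --service-account "$SA" --no-allow-unauthenticated \
  --memory 512Mi --cpu 1 --set-env-vars "LOG_LEVEL=info"

# Quick verification: print the service URL.
gcloud run services describe "$SERVICE" --project "$PROJECT" \
  --region "$REGION" --format 'value(status.url)'
```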
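The BigQuery and GCS steps could look like the following sketch. The dataset `analytics`, table `events`, local `schema.json`, bucket `my-project-raw-data`, and the 30-day deletion rule are all placeholder assumptions.

```bash
#!/usr/bin/env bash
set -euo pipefail

PROJECT=my-project          # placeholder project ID
DATASET=analytics           # placeholder dataset name
BUCKET=my-project-raw-data  # placeholder bucket name

# Bucket with uniform access and a 30-day deletion lifecycle rule.
gcloud storage buckets create "gs://${BUCKET}" \
  --project "$PROJECT" --location us-central1 --uniform-bucket-level-access
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 30}}
  ]
}
EOF
gcloud storage buckets update "gs://${BUCKET}" --lifecycle-file=lifecycle.json

# Dataset and table; schema.json holds the column definitions.
bq mk --dataset "${PROJECT}:${DATASET}"
bq mk --table "${PROJECT}:${DATASET}.events" ./schema.json

# Load a CSV previously uploaded to the bucket, skipping the header row.
bq load --source_format=CSV --skip_leading_rows=1 \
  "${PROJECT}:${DATASET}.events" "gs://${BUCKET}/events.csv"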
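For the Pub/Sub step, this sketch wires a topic, a pull subscription, and a dead-letter topic. Topic and subscription names are placeholders; the 60-second ack deadline and 5 delivery attempts mirror the values used above.

```bash
#!/usr/bin/env bash
set -euo pipefail

PROJECT=my-project   # placeholder project ID
TOPIC=orders         # placeholder topic
SUB=orders-worker    # placeholder subscription
DLQ=orders-dlq       # placeholder dead-letter topic

# Main topic and the dead-letter topic that receives poisoned messages.
gcloud pubsub topics create "$TOPIC" --project "$PROJECT"
gcloud pubsub topics create "$DLQ" --project "$PROJECT"

# Pull subscription: 60s ack deadline, route to the DLQ after 5 failed attempts.
# Note: the Pub/Sub service agent also needs publisher rights on the DLQ topic
# and subscriber rights on this subscription for dead lettering to work.
gcloud pubsub subscriptions create "$SUB" --project "$PROJECT" \
  --topic "$TOPIC" --ack-deadline 60 \
  --dead-letter-topic "$DLQ" --max-delivery-attempts 5

# Smoke test: publish one message, then pull it back.
gcloud pubsub topics publish "$TOPIC" --project "$PROJECT" --message "test"
gcloud pubsub subscriptions pull "$SUB" --project "$PROJECT" --limit 1 --auto-ack
```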
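And a sketch of the Cloud Build step: a minimal `cloudbuild.yaml` that builds, pushes, and deploys, written via a heredoc so the whole flow stays in shell. The repository owner/name, image path, and `us-central1` region are assumptions; `$PROJECT_ID` and `$SHORT_SHA` are built-in Cloud Build substitutions.

```bash
# Minimal cloudbuild.yaml: build, push to Artifact Registry, deploy to Cloud Run.
# The Cloud Build service account needs permission to deploy to Cloud Run and to
# act as the runtime service account.
cat > cloudbuild.yaml <<'EOF'
steps:
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'us-central1-docker.pkg.dev/$PROJECT_ID/app-images/my-app:$SHORT_SHA', '.']
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'us-central1-docker.pkg.dev/$PROJECT_ID/app-images/my-app:$SHORT_SHA']
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: gcloud
    args:
      - run
      - deploy
      - my-app
      - --image=us-central1-docker.pkg.dev/$PROJECT_ID/app-images/my-app:$SHORT_SHA
      - --region=us-central1
      - --quiet
images:
  - 'us-central1-docker.pkg.dev/$PROJECT_ID/app-images/my-app:$SHORT_SHA'
EOF

# Trigger the pipeline on pushes to main (requires the GitHub app connection).
gcloud builds triggers create github \
  --repo-owner my-org --repo-name my-repo \
  --branch-pattern '^main$' --build-config cloudbuild.yaml
```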
Decision rules
- Use Cloud Run for stateless HTTP services and containers — it scales to zero and requires no cluster management.
- Use Cloud Functions (2nd gen) only for event-driven triggers that do not need custom container images.
- Use BigQuery for analytical queries over large datasets — do not use it as a transactional database.
- Use Pub/Sub for async messaging between services — set ack deadlines based on expected processing time.
- Always use dedicated service accounts per service — never use the default compute service account in production.
- Prefer Artifact Registry over Container Registry (deprecated) for container images.
- Use workload identity federation over exported service account keys when authenticating from external systems (GitHub Actions, other clouds).
- Set Cloud Run `--min-instances` above zero for latency-sensitive services to avoid cold starts (see the sketch after this list).
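As a small illustration of the cold-start rule, a hypothetical latency-sensitive service named `checkout-api` in `us-central1` could keep one warm instance like this:

```bash
# Keep at least one instance warm to avoid cold starts (adds idle cost).
gcloud run services update checkout-api \
  --region us-central1 --min-instances 1
```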
Output requirements
- Service configuration — the `gcloud` commands or `cloudbuild.yaml` used to provision and deploy.
- IAM configuration — service accounts created, roles granted, and the least-privilege rationale for each grant.
- Verification result — confirmed the service is reachable, queries return expected results, or messages are delivered.
- Monitoring setup — uptime checks, alert policies, and log-based metrics configured.
- Rollback path — previous Cloud Run revision ID or Artifact Registry image tag to revert to (see the rollback sketch below).
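To make the rollback path concrete, the sketch below lists recent revisions and shifts all traffic back to a known-good one; the service name, region, and revision ID are placeholders.

```bash
# Find the previous revision of the service.
gcloud run revisions list --service my-app --region us-central1

# Route 100% of traffic back to a known-good revision (name is illustrative).
gcloud run services update-traffic my-app \
  --region us-central1 --to-revisions my-app-00042-abc=100
```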
References
- Cloud Run documentation: https://cloud.google.com/run/docs
- BigQuery documentation: https://cloud.google.com/bigquery/docs
- IAM best practices: https://cloud.google.com/iam/docs/using-iam-securely
- Pub/Sub documentation: https://cloud.google.com/pubsub/docs
- Cloud Build documentation: https://cloud.google.com/build/docs
- Artifact Registry: https://cloud.google.com/artifact-registry/docs
references/preflight-checklist.md
Related skills
- `firebase` — Firebase Auth, Firestore, Hosting, Cloud Functions within the Firebase SDK.
- `aws` — AWS service equivalents (Lambda, SQS, S3, ECR).
- `terraform-iac` — managing GCP resources via Terraform HCL.
Anti-patterns
- Using the default compute service account for Cloud Run services — it has overly broad permissions.
- Granting `roles/owner` or `roles/editor` to service accounts — use specific predefined roles.
- Exporting service account keys when workload identity federation is available.
- Using Container Registry (`gcr.io`) for new projects — it is deprecated in favor of Artifact Registry.
- Deploying Cloud Run services without setting memory and CPU limits — leads to unexpected costs (see the sketch after this list).
- Skipping API enablement — `gcloud` commands fail with confusing errors when the service API is not enabled.
- Hardcoding project IDs in `cloudbuild.yaml` — use the `$PROJECT_ID` substitution variable.
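For the resource-limits anti-pattern, a sketch of pinning explicit limits on an existing service; the service name, region, and limit values are placeholders to tune per workload.

```bash
# Explicit memory/CPU limits and an instance cap keep costs predictable.
gcloud run services update my-app --region us-central1 \
  --memory 512Mi --cpu 1 --concurrency 80 --max-instances 10
```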
Failure handling
- If `gcloud run deploy` fails with permission errors, verify the deployer has `roles/run.admin` and `roles/iam.serviceAccountUser` on the runtime service account; grant `roles/run.invoker` to identities that must call a private service (see the IAM sketch after this list).
- If BigQuery queries fail with access denied, check that the querying identity has `roles/bigquery.dataViewer` on the dataset and `roles/bigquery.jobUser` on the project.
- If Pub/Sub messages are not being delivered, check the subscription ack deadline, push endpoint health, and the dead-letter queue for failed deliveries.
- If Cloud Build triggers do not fire, verify the GitHub connection is authorized and the branch pattern matches.
- If the task involves Firebase-specific features (Firestore rules, Auth providers, Emulator Suite), redirect to the `firebase` skill.
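When debugging the permission and delivery failures above, two read-only checks are often enough. The project, service-account address, and dead-letter subscription name below are placeholders.

```bash
# List every role currently bound to a given service account in the project.
gcloud projects get-iam-policy my-project \
  --flatten='bindings[].members' \
  --filter='bindings.members:serviceAccount:my-app-runner@my-project.iam.gserviceaccount.com' \
  --format='table(bindings.role)'

# Inspect messages that landed in a subscription attached to the dead-letter topic.
gcloud pubsub subscriptions pull orders-dlq-sub --limit 5 --auto-ack
```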