Claude-skill-registry django-cloud-sql-postgres

```bash
git clone https://github.com/majiayu000/claude-skill-registry
```

```bash
T=$(mktemp -d) && git clone --depth=1 https://github.com/majiayu000/claude-skill-registry "$T" \
  && mkdir -p ~/.claude/skills \
  && cp -r "$T/skills/data/django-cloud-sql-postgres" ~/.claude/skills/majiayu000-claude-skill-registry-django-cloud-sql-postgres-594806 \
  && rm -rf "$T"
```

skills/data/django-cloud-sql-postgres/SKILL.md

Django on Google Cloud SQL PostgreSQL
Status: Production Ready
Last Updated: 2026-01-24
Dependencies: None
Latest Versions: Django@5.1, psycopg2-binary@2.9.9, gunicorn@23.0.0, cloud-sql-python-connector@1.12.0
Quick Start (10 Minutes)
1. Install Dependencies
```bash
pip install Django psycopg2-binary gunicorn
```
For Cloud SQL Python Connector (recommended for local dev):
```bash
pip install "cloud-sql-python-connector[pg8000]"
```
Why this matters:
- psycopg2-binary is the PostgreSQL adapter for Django
- gunicorn is required for App Engine Standard (Python 3.10+)
- Cloud SQL Python Connector provides secure connections without SSH tunneling
2. Configure Django Settings
settings.py (production with Unix socket):
```python
import os

# Detect App Engine environment
IS_APP_ENGINE = os.getenv('GAE_APPLICATION', None)

if IS_APP_ENGINE:
    # Production: Connect via Unix socket
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': os.environ['DB_NAME'],
            'USER': os.environ['DB_USER'],
            'PASSWORD': os.environ['DB_PASSWORD'],
            'HOST': f"/cloudsql/{os.environ['CLOUD_SQL_CONNECTION_NAME']}",
            'PORT': '',  # Empty for Unix socket
        }
    }
else:
    # Local development: Connect via Cloud SQL Proxy
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': os.environ.get('DB_NAME', 'mydb'),
            'USER': os.environ.get('DB_USER', 'postgres'),
            'PASSWORD': os.environ.get('DB_PASSWORD', ''),
            'HOST': '127.0.0.1',
            'PORT': '5432',
        }
    }
```
CRITICAL:
- App Engine connects via Unix socket at /cloudsql/PROJECT:REGION:INSTANCE
- Local development requires Cloud SQL Auth Proxy on 127.0.0.1:5432
- Never hardcode connection strings - use environment variables
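The socket-vs-TCP decision above can be condensed into one small helper (a sketch; the function name is illustrative, the env vars match the settings above):

```python
import os

def database_host_and_port():
    """Return (HOST, PORT) for Django's DATABASES setting.

    On App Engine (GAE_APPLICATION is set) use the Unix socket under
    /cloudsql/; locally, point at the Cloud SQL Auth Proxy over TCP.
    """
    if os.getenv('GAE_APPLICATION'):
        conn = os.environ['CLOUD_SQL_CONNECTION_NAME']  # PROJECT:REGION:INSTANCE
        return f"/cloudsql/{conn}", ''   # PORT must be the empty string for a socket
    return '127.0.0.1', '5432'           # Cloud SQL Auth Proxy default

# Example: simulate the App Engine environment
os.environ['GAE_APPLICATION'] = 'e~my-project'
os.environ['CLOUD_SQL_CONNECTION_NAME'] = 'my-project:us-central1:myinstance'
host, port = database_host_and_port()
print(host, repr(port))
```

Keeping this logic in one function makes the production/dev split testable without touching Django itself.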
3. Create app.yaml
```yaml
runtime: python310
entrypoint: gunicorn -b :$PORT myproject.wsgi:application

env_variables:
  DB_NAME: "mydb"
  DB_USER: "postgres"
  CLOUD_SQL_CONNECTION_NAME: "project-id:region:instance-name"

# Cloud SQL connection
beta_settings:
  cloud_sql_instances: "project-id:region:instance-name"

handlers:
- url: /static
  static_dir: static/
- url: /.*
  script: auto
  secure: always
```
CRITICAL:
- beta_settings.cloud_sql_instances enables the Unix socket at /cloudsql/...
- DB_PASSWORD should come from Secret Manager or an uncommitted env_variables include file, not from app.yaml
The 6-Step Setup Process
Step 1: Create Cloud SQL Instance
```bash
# Create PostgreSQL instance
gcloud sql instances create myinstance \
  --database-version=POSTGRES_15 \
  --tier=db-f1-micro \
  --region=us-central1

# Create database
gcloud sql databases create mydb --instance=myinstance

# Create user
gcloud sql users create postgres \
  --instance=myinstance \
  --password=YOUR_SECURE_PASSWORD
```
Key Points:
- Use POSTGRES_15 or later for best compatibility
- db-f1-micro is cheapest for dev ($7-10/month); use db-g1-small or higher for production
- Note the connection name: PROJECT_ID:REGION:INSTANCE_NAME
Step 2: Configure Django Project
requirements.txt:
```
Django>=5.1,<6.0
psycopg2-binary>=2.9.9
gunicorn>=23.0.0
whitenoise>=6.7.0
```
settings.py additions:
```python
import os

# Security settings for production
DEBUG = os.environ.get('DEBUG', 'False') == 'True'

ALLOWED_HOSTS = [
    '.appspot.com',
    '.run.app',
    'localhost',
    '127.0.0.1',
]

# Static files with WhiteNoise
STATIC_URL = '/static/'
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
MIDDLEWARE.insert(1, 'whitenoise.middleware.WhiteNoiseMiddleware')

# Django 5.1 removed the STATICFILES_STORAGE setting; use STORAGES instead
STORAGES = {
    'default': {'BACKEND': 'django.core.files.storage.FileSystemStorage'},
    'staticfiles': {'BACKEND': 'whitenoise.storage.CompressedManifestStaticFilesStorage'},
}

# Database connection pooling
DATABASES['default']['CONN_MAX_AGE'] = 60  # Keep connections open for 60 seconds
```
Why these settings:
- CONN_MAX_AGE=60 reduces connection overhead (Cloud SQL has connection limits)
- WhiteNoise serves static files without Cloud Storage
- ALLOWED_HOSTS must include .appspot.com for App Engine
Step 3: Set Up Local Development with Cloud SQL Proxy
Install Cloud SQL Auth Proxy:
```bash
# macOS
brew install cloud-sql-proxy

# Linux
curl -o cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.14.1/cloud-sql-proxy.linux.amd64
chmod +x cloud-sql-proxy
```
Run the proxy:
```bash
# Authenticate first
gcloud auth application-default login

# Start proxy (runs on 127.0.0.1:5432)
./cloud-sql-proxy PROJECT_ID:REGION:INSTANCE_NAME

# Or with specific port
./cloud-sql-proxy PROJECT_ID:REGION:INSTANCE_NAME --port=5432
```
Set environment variables for local dev:
```bash
export DB_NAME=mydb
export DB_USER=postgres
export DB_PASSWORD=your_password
export DEBUG=True
```
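If you prefer a file over shell exports, python-dotenv (listed under recommended dependencies) does this for you; a dependency-free sketch of the same idea, simplified to plain KEY=VALUE lines without quoting support:

```python
import os

def load_env_file(path):
    """Load KEY=VALUE lines from a .env-style file into os.environ.

    Existing variables win (setdefault); blank lines and '#' comments are
    skipped. Simplified: no quoting or multi-line values, unlike python-dotenv.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#') or '=' not in line:
                continue
            key, _, value = line.partition('=')
            os.environ.setdefault(key.strip(), value.strip())

# Example usage (hypothetical file):
# load_env_file('.env')  # then run: python manage.py runserver
```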
Key Points:
- Proxy creates a secure tunnel to Cloud SQL
- No need to whitelist your IP address
- Works with both password and IAM authentication
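Before starting Django you can confirm the proxy is actually listening; a stdlib-only check (the function name is illustrative):

```python
import socket

def proxy_listening(host='127.0.0.1', port=5432, timeout=1.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == '__main__':
    if proxy_listening():
        print("Cloud SQL Auth Proxy is up on 127.0.0.1:5432")
    else:
        print("Proxy not reachable - start cloud-sql-proxy first")
```

This catches the most common local failure (Issue #2 below) before Django raises an OperationalError.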
Step 4: Run Migrations
```bash
# Local (with proxy running)
python manage.py migrate

# Verify connection
python manage.py dbshell
```
For production migrations (via Cloud Build or local with proxy):
```bash
# Option 1: Run locally with proxy
./cloud-sql-proxy PROJECT:REGION:INSTANCE &
python manage.py migrate

# Option 2: Use Cloud Build (recommended)
# See references/cloud-build-migrations.md
```
Step 5: Configure Gunicorn
gunicorn.conf.py (optional, for fine-tuning):
```python
import os

# Workers
workers = 2  # App Engine Standard limits this
threads = 4
worker_class = 'gthread'

# Timeout (App Engine has 60s limit for standard, 3600s for flexible)
timeout = 55

# Logging
accesslog = '-'
errorlog = '-'
loglevel = 'info'

# Bind (App Engine sets $PORT)
bind = f"0.0.0.0:{os.environ.get('PORT', '8080')}"
```
app.yaml entrypoint options:
```yaml
# Simple (recommended for most cases)
entrypoint: gunicorn -b :$PORT myproject.wsgi:application

# With config file
entrypoint: gunicorn -c gunicorn.conf.py myproject.wsgi:application

# With workers and timeout
entrypoint: gunicorn -b :$PORT -w 2 -t 55 myproject.wsgi:application
```
Key Points:
- App Engine Standard limits workers (F1: 1 worker, F2: 2 workers)
- Use the gthread worker class for I/O-bound Django apps
- Set timeout < 60s to avoid App Engine killing requests
Step 6: Deploy to App Engine
```bash
# Collect static files
python manage.py collectstatic --noinput

# Deploy
gcloud app deploy app.yaml

# Note: gcloud app deploy has no env-var flag; supply DB_PASSWORD via
# Secret Manager or an uncommitted env_variables include file rather
# than committing it to app.yaml

# View logs
gcloud app logs tail -s default
```
Verify deployment:
```bash
# Open in browser
gcloud app browse

# Check database connection
gcloud app logs read --service=default | grep -i database
```
Critical Rules (Django + Cloud SQL)
ALWAYS DO:
- Use Unix socket path /cloudsql/PROJECT:REGION:INSTANCE on App Engine
- Set PORT='' (empty string) for Unix socket connections
- Use Cloud SQL Auth Proxy for local development
- Set CONN_MAX_AGE for connection pooling (30-60 seconds)
- Use environment variables for database credentials
- Use Secret Manager for production passwords
NEVER DO:
- Put database password in app.yaml (use Secret Manager or env vars at deploy time)
- Use HOST='localhost' on App Engine (must use Unix socket path)
- Forget beta_settings.cloud_sql_instances in app.yaml
- Set CONN_MAX_AGE=None (unlimited) - can exhaust connection pool
- Skip SSL on Cloud SQL (it's enforced by default)
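The ALWAYS/NEVER rules above can be enforced with a small sanity check run at startup or in a test (a sketch; the function name is illustrative):

```python
def check_cloudsql_config(db, on_app_engine):
    """Return a list of rule violations for a Django DATABASES['default'] dict.

    Encodes the rules above: Unix socket + empty PORT on App Engine,
    and no unlimited CONN_MAX_AGE.
    """
    problems = []
    if on_app_engine:
        if not db.get('HOST', '').startswith('/cloudsql/'):
            problems.append("HOST must be a /cloudsql/... socket path on App Engine")
        if db.get('PORT') != '':
            problems.append("PORT must be '' for Unix socket connections")
    if 'CONN_MAX_AGE' in db and db['CONN_MAX_AGE'] is None:
        problems.append("CONN_MAX_AGE=None can exhaust the connection pool")
    return problems

# Example: a misconfigured production database
bad = {'HOST': 'localhost', 'PORT': '5432', 'CONN_MAX_AGE': None}
print(check_cloudsql_config(bad, on_app_engine=True))
```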
Known Issues Prevention
This skill prevents 12 documented issues:
Issue #1: OperationalError - No such file or directory
Error:
django.db.utils.OperationalError: could not connect to server: No such file or directory
Source: https://cloud.google.com/sql/docs/postgres/connect-app-engine-standard
Why It Happens: App Engine can't find the Unix socket because beta_settings.cloud_sql_instances is missing in app.yaml
Prevention: Always include beta_settings.cloud_sql_instances: "PROJECT:REGION:INSTANCE" in app.yaml
Issue #2: Connection Refused on Local Development
Error:
django.db.utils.OperationalError: connection refused or could not connect to server: Connection refused
Source: https://cloud.google.com/sql/docs/postgres/connect-auth-proxy
Why It Happens: Cloud SQL Auth Proxy is not running or bound to wrong port
Prevention: Start cloud-sql-proxy PROJECT:REGION:INSTANCE before running Django locally. Verify it's running on port 5432.
Issue #3: FATAL: password authentication failed
Error:
FATAL: password authentication failed for user "postgres"
Source: https://cloud.google.com/sql/docs/postgres/create-manage-users
Why It Happens: Wrong password in environment variable, or user doesn't exist in Cloud SQL instance
Prevention: Verify password with gcloud sql users list --instance=INSTANCE. Reset if needed: gcloud sql users set-password postgres --instance=INSTANCE --password=NEW_PASSWORD
Issue #4: Too Many Connections
Error:
FATAL: too many connections for role "postgres" or remaining connection slots are reserved
Source: https://cloud.google.com/sql/docs/postgres/quotas#connection_limits
Why It Happens: Each request opens a new connection without pooling, exhausting the 25-100 connection limit
Prevention: Set CONN_MAX_AGE = 60 in Django settings. For high traffic, use PgBouncer or django-db-connection-pool.
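A quick way to reason about the limit: with CONN_MAX_AGE > 0, each worker thread can hold one open connection, so peak usage is roughly instances × workers × threads (a sketch; the 25-connection figure is an example, check your tier's actual quota):

```python
def connection_budget(max_instances, workers, threads, conn_limit):
    """Estimate peak DB connections and whether they fit the Cloud SQL limit.

    With connection reuse enabled, every worker thread can pin one
    connection, so peak usage is about instances * workers * threads.
    """
    peak = max_instances * workers * threads
    return peak, peak <= conn_limit

# Example: 2 instances x 2 workers x 4 threads against a 25-connection limit
peak, fits = connection_budget(max_instances=2, workers=2, threads=4, conn_limit=25)
print(peak, fits)  # 16 connections, within the limit
```

If the budget doesn't fit, lower max_instances/threads or move pooling out of Django (PgBouncer, below).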
Issue #5: App Engine Request Timeout
Error:
DeadlineExceededError or request terminated after 60 seconds
Source: https://cloud.google.com/appengine/docs/standard/python3/how-requests-are-handled
Why It Happens: Database query or migration takes longer than App Engine's 60-second limit
Prevention: Set Gunicorn timeout to 55 seconds. For long-running tasks, use Cloud Tasks or Cloud Run Jobs.
Issue #6: Static Files 404 in Production
Error: Static files return 404, CSS/JS not loading
Source: https://cloud.google.com/appengine/docs/standard/serving-static-files
Why It Happens: Missing static/ handler in app.yaml, or collectstatic not run before deploy
Prevention: Run python manage.py collectstatic --noinput before deploy. Include static handler in app.yaml.
Issue #7: CSRF Verification Failed
Error:
Forbidden (403) CSRF verification failed
Source: Django documentation on CSRF
Why It Happens: CSRF_TRUSTED_ORIGINS not set for appspot.com domain
Prevention: Add CSRF_TRUSTED_ORIGINS = ['https://*.appspot.com'] to settings.py
Issue #8: Database Not Found After Deployment
Error:
django.db.utils.OperationalError: FATAL: database "mydb" does not exist
Source: https://cloud.google.com/sql/docs/postgres/create-manage-databases
Why It Happens: Database exists in Cloud SQL but DB_NAME environment variable is wrong
Prevention: Verify database name: gcloud sql databases list --instance=INSTANCE. Ensure DB_NAME matches exactly.
Issue #9: IAM Authentication Failure
Error:
FATAL: Cloud SQL IAM user authentication failed
Source: https://cloud.google.com/sql/docs/postgres/iam-logins
Why It Happens: App Engine service account doesn't have roles/cloudsql.instanceUser role
Prevention: Grant role: gcloud projects add-iam-policy-binding PROJECT --member="serviceAccount:PROJECT@appspot.gserviceaccount.com" --role="roles/cloudsql.instanceUser"
Issue #10: Migrations Fail in Production
Error: Migrations timeout or can't run during deployment
Source: https://cloud.google.com/sql/docs/postgres/connect-build
Why It Happens: App Engine deploy doesn't provide a migration step; running migrations in the entrypoint times out
Prevention: Run migrations separately via Cloud Build, or locally with Cloud SQL Proxy before deploying.
Issue #11: SECRET_KEY Exposed
Error: Security warning about hardcoded SECRET_KEY
Source: Django deployment checklist
Why It Happens: SECRET_KEY is hardcoded in settings.py instead of coming from an environment variable or Secret Manager
Prevention: Use SECRET_KEY = os.environ.get('SECRET_KEY') and supply the value via Secret Manager or an uncommitted env_variables include file.
Issue #12: Cold Start Database Timeout
Error: First request after idle period times out
Source: https://cloud.google.com/appengine/docs/standard/how-instances-are-managed
Why It Happens: App Engine instance cold start + Cloud SQL activation delay (if using "on demand" activation)
Prevention: Use App Engine warmup requests, or keep the Cloud SQL instance always on (increases cost). On the Python 3 runtime, enable warmup with inbound_services: warmup in app.yaml and add a /_ah/warmup handler.
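The warmup handler itself only needs to touch the database and return 200; a framework-free sketch of that logic (names are illustrative; in Django this would be a view wired to the /_ah/warmup URL):

```python
def warmup_handler(ping_db):
    """Handle an App Engine warmup request.

    ping_db is any callable that opens a DB connection (e.g. runs SELECT 1);
    warming it here means the first real request pays no cold-start penalty.
    """
    try:
        ping_db()
        return 200, 'warmed'
    except Exception as exc:
        # Sketch choice: still return 200 so a transient DB hiccup during
        # warmup doesn't mark the fresh instance as unhealthy
        return 200, f'warmup db ping failed: {exc}'

status, body = warmup_handler(lambda: None)
print(status, body)
```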
Configuration Files Reference
settings.py (Complete Production Example)
```python
import os
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent

# Security
SECRET_KEY = os.environ.get('SECRET_KEY', 'dev-key-change-in-production')
DEBUG = os.environ.get('DEBUG', 'False') == 'True'

ALLOWED_HOSTS = [
    '.appspot.com',
    '.run.app',
    'localhost',
    '127.0.0.1',
]

# CSRF for App Engine
CSRF_TRUSTED_ORIGINS = [
    'https://*.appspot.com',
    'https://*.run.app',
]

# Application definition
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    # Your apps here
]

MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'whitenoise.middleware.WhiteNoiseMiddleware',  # Static files
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
]

ROOT_URLCONF = 'myproject.urls'

TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [BASE_DIR / 'templates'],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
            ],
        },
    },
]

WSGI_APPLICATION = 'myproject.wsgi.application'

# Database
IS_APP_ENGINE = os.getenv('GAE_APPLICATION', None)

if IS_APP_ENGINE:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': os.environ['DB_NAME'],
            'USER': os.environ['DB_USER'],
            'PASSWORD': os.environ['DB_PASSWORD'],
            'HOST': f"/cloudsql/{os.environ['CLOUD_SQL_CONNECTION_NAME']}",
            'PORT': '',
            'CONN_MAX_AGE': 60,
            'OPTIONS': {
                'connect_timeout': 10,
            },
        }
    }
else:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': os.environ.get('DB_NAME', 'mydb'),
            'USER': os.environ.get('DB_USER', 'postgres'),
            'PASSWORD': os.environ.get('DB_PASSWORD', ''),
            'HOST': os.environ.get('DB_HOST', '127.0.0.1'),
            'PORT': os.environ.get('DB_PORT', '5432'),
            'CONN_MAX_AGE': 60,
        }
    }

# Static files (Django 5.1 removed STATICFILES_STORAGE; use STORAGES)
STATIC_URL = '/static/'
STATIC_ROOT = BASE_DIR / 'static'
STORAGES = {
    'default': {'BACKEND': 'django.core.files.storage.FileSystemStorage'},
    'staticfiles': {'BACKEND': 'whitenoise.storage.CompressedManifestStaticFilesStorage'},
}

# Default primary key field type
DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'

# Logging
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
        },
    },
    'root': {
        'handlers': ['console'],
        'level': 'INFO',
    },
    'loggers': {
        'django': {
            'handlers': ['console'],
            'level': os.getenv('DJANGO_LOG_LEVEL', 'INFO'),
            'propagate': False,
        },
    },
}
```
app.yaml (Complete Example)
```yaml
runtime: python310
entrypoint: gunicorn -b :$PORT -w 2 -t 55 --threads 4 myproject.wsgi:application
instance_class: F2

env_variables:
  DB_NAME: "mydb"
  DB_USER: "postgres"
  CLOUD_SQL_CONNECTION_NAME: "project-id:us-central1:instance-name"
  DEBUG: "False"

beta_settings:
  cloud_sql_instances: "project-id:us-central1:instance-name"

handlers:
- url: /static
  static_dir: static/
  secure: always
- url: /.*
  script: auto
  secure: always

automatic_scaling:
  min_instances: 0
  max_instances: 2
  target_cpu_utilization: 0.65
```
Why these settings:
- F2 instance class allows 2 Gunicorn workers
- min_instances: 0 saves costs when idle
- target_cpu_utilization: 0.65 scales before overload
requirements.txt
```
Django>=5.1,<6.0
psycopg2-binary>=2.9.9
gunicorn>=23.0.0
whitenoise>=6.7.0
```
Common Patterns
Pattern 1: Environment-Aware Database Configuration
```python
import os

def get_database_config():
    """Return database config based on environment."""
    is_production = os.getenv('GAE_APPLICATION', None)

    if is_production:
        return {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': os.environ['DB_NAME'],
            'USER': os.environ['DB_USER'],
            'PASSWORD': os.environ['DB_PASSWORD'],
            'HOST': f"/cloudsql/{os.environ['CLOUD_SQL_CONNECTION_NAME']}",
            'PORT': '',
            'CONN_MAX_AGE': 60,
        }
    else:
        return {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': os.environ.get('DB_NAME', 'mydb'),
            'USER': os.environ.get('DB_USER', 'postgres'),
            'PASSWORD': os.environ.get('DB_PASSWORD', ''),
            'HOST': '127.0.0.1',
            'PORT': '5432',
            'CONN_MAX_AGE': 60,
        }

DATABASES = {'default': get_database_config()}
```
When to use: Standard Django project needing local/production parity
Pattern 2: Secret Manager Integration
```python
import os

from google.cloud import secretmanager

def get_secret(secret_id, version_id='latest'):
    """Retrieve secret from Google Secret Manager."""
    client = secretmanager.SecretManagerServiceClient()
    project_id = os.environ.get('GOOGLE_CLOUD_PROJECT')
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version_id}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode('UTF-8')

# Usage in settings.py
if os.getenv('GAE_APPLICATION'):
    SECRET_KEY = get_secret('django-secret-key')
    DB_PASSWORD = get_secret('db-password')
```
When to use: Production deployments requiring proper secret management
Pattern 3: Cloud SQL Python Connector (Alternative to Proxy)
```python
# For local development without Cloud SQL Auth Proxy
import os

from google.cloud.sql.connector import Connector

def get_db_connection():
    connector = Connector()
    return connector.connect(
        "project:region:instance",
        "pg8000",
        user="postgres",
        password=os.environ["DB_PASSWORD"],
        db="mydb",
    )

# CAUTION: Django's built-in postgresql backend requires psycopg and does not
# accept a 'get_conn' option; wiring the connector into Django needs a custom
# database backend (or just use the Auth Proxy). The block below is illustrative.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'mydb',
        'USER': 'postgres',
        'OPTIONS': {
            'get_conn': get_db_connection,  # not a standard Django option
        },
    }
}
```
When to use: Local development when you can't install Cloud SQL Auth Proxy
Pattern 4: Health Check Endpoint with Database Verification
```python
# views.py
from django.http import JsonResponse
from django.db import connection

def health_check(request):
    """Health check endpoint for App Engine."""
    try:
        with connection.cursor() as cursor:
            cursor.execute("SELECT 1")
        return JsonResponse({
            'status': 'healthy',
            'database': 'connected',
        })
    except Exception as e:
        return JsonResponse({
            'status': 'unhealthy',
            'database': str(e),
        }, status=503)

# urls.py
from django.urls import path
from .views import health_check

urlpatterns = [
    path('_ah/health', health_check, name='health_check'),
]
```
When to use: Load balancer health checks, monitoring database connectivity
Using Bundled Resources
Templates (templates/)
- templates/settings_snippet.py - Copy-paste database configuration
- templates/app.yaml - Complete App Engine configuration
- templates/requirements.txt - Production dependencies
References (references/)
- references/cloud-sql-proxy-setup.md - Detailed proxy installation and usage
- references/iam-authentication.md - IAM-based authentication (no passwords)
- references/secret-manager.md - Storing secrets properly
- references/migrations-in-production.md - Running migrations safely
When Claude should load these:
- Load cloud-sql-proxy-setup.md when user has local connection issues
- Load iam-authentication.md when setting up passwordless auth
- Load migrations-in-production.md before first production deployment
Advanced Topics
IAM Database Authentication
Instead of password authentication, use IAM for service-to-service auth:
```bash
# Enable IAM authentication on instance
gcloud sql instances patch INSTANCE \
  --database-flags=cloudsql.iam_authentication=on

# Create IAM user
gcloud sql users create SERVICE_ACCOUNT@PROJECT.iam \
  --instance=INSTANCE \
  --type=CLOUD_IAM_SERVICE_ACCOUNT

# Grant connect permission
gcloud projects add-iam-policy-binding PROJECT \
  --member="serviceAccount:PROJECT@appspot.gserviceaccount.com" \
  --role="roles/cloudsql.instanceUser"
```
Django settings for IAM auth:
```python
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ['DB_NAME'],
        'USER': f"{os.environ['SERVICE_ACCOUNT']}@{os.environ['PROJECT_ID']}.iam",
        'HOST': f"/cloudsql/{os.environ['CLOUD_SQL_CONNECTION_NAME']}",
        'PORT': '',
        # No PASSWORD needed with IAM auth
    }
}
```
Connection Pooling with PgBouncer
For high-traffic applications, deploy PgBouncer on Cloud Run:
```
# Cloud Run service for PgBouncer
# See references/pgbouncer-setup.md for full configuration
```
Why PgBouncer:
- Cloud SQL limits connections (25-4000 depending on tier)
- Django's CONN_MAX_AGE helps but doesn't pool across processes
- PgBouncer provides true connection pooling
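Pointing Django at PgBouncer is mostly a HOST/PORT change, plus one setting that matters: with PgBouncer's transaction-level pooling, server-side cursors break, so Django's DISABLE_SERVER_SIDE_CURSORS must be set. A sketch (the PgBouncer host name is illustrative):

```python
import os

# Sketch: route Django through a PgBouncer instance instead of Cloud SQL directly
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('DB_NAME', 'mydb'),
        'USER': os.environ.get('DB_USER', 'postgres'),
        'PASSWORD': os.environ.get('DB_PASSWORD', ''),
        'HOST': os.environ.get('PGBOUNCER_HOST', 'pgbouncer.internal'),  # illustrative
        'PORT': os.environ.get('PGBOUNCER_PORT', '6432'),  # PgBouncer's default port
        'CONN_MAX_AGE': 0,  # let PgBouncer own pooling, not Django
        # Required with PgBouncer's transaction pooling mode (see Django docs)
        'DISABLE_SERVER_SIDE_CURSORS': True,
    }
}
```

Setting CONN_MAX_AGE to 0 here is deliberate: holding connections in Django and in PgBouncer double-counts against the Cloud SQL limit.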
Running Migrations Safely
Option 1: Cloud Build (Recommended)
```yaml
# cloudbuild.yaml
# Note: the build step needs a route to Cloud SQL (e.g. a cloud-sql-proxy
# sidecar step); see references/cloud-build-migrations.md for the full setup.
steps:
- name: 'python:3.10'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    pip install -r requirements.txt
    python manage.py migrate --noinput
  env:
  - 'DB_NAME=mydb'
  - 'DB_USER=postgres'
  - 'DB_HOST=/cloudsql/PROJECT:REGION:INSTANCE'
  secretEnv: ['DB_PASSWORD']

availableSecrets:
  secretManager:
  - versionName: projects/PROJECT/secrets/db-password/versions/latest
    env: 'DB_PASSWORD'
```
Option 2: Local with Proxy (Simple)
```bash
./cloud-sql-proxy PROJECT:REGION:INSTANCE &
python manage.py migrate
```
Dependencies
Required:
- Django>=5.1 - Web framework
- psycopg2-binary>=2.9.9 - PostgreSQL adapter
- gunicorn>=23.0.0 - WSGI server for App Engine

Recommended:
- whitenoise>=6.7.0 - Static file serving
- python-dotenv>=1.0.0 - Local environment variables

Optional:
- google-cloud-secret-manager>=2.20.0 - Secret Manager integration
- cloud-sql-python-connector[pg8000]>=1.12.0 - Python-native Cloud SQL connector
- django-db-connection-pool>=1.2.5 - Connection pooling (alternative to CONN_MAX_AGE)
Official Documentation
- Cloud SQL for PostgreSQL: https://cloud.google.com/sql/docs/postgres
- App Engine Python 3: https://cloud.google.com/appengine/docs/standard/python3
- Cloud SQL Auth Proxy: https://cloud.google.com/sql/docs/postgres/connect-auth-proxy
- Django on App Engine: https://cloud.google.com/python/django/appengine
- Cloud SQL Python Connector: https://github.com/GoogleCloudPlatform/cloud-sql-python-connector
- Secret Manager: https://cloud.google.com/secret-manager/docs
Package Versions (Verified 2026-01-24)
```json
{
  "dependencies": {
    "Django": ">=5.1,<6.0",
    "psycopg2-binary": ">=2.9.9",
    "gunicorn": ">=23.0.0",
    "whitenoise": ">=6.7.0"
  },
  "optional": {
    "google-cloud-secret-manager": ">=2.20.0",
    "cloud-sql-python-connector": ">=1.12.0"
  }
}
```
Production Example
This skill is based on production Django deployments on App Engine:
- Use Case: Multi-tenant SaaS application
- Traffic: 10K+ daily requests
- Database: Cloud SQL PostgreSQL (db-g1-small, 25 connections)
- Errors: 0 (all 12 known issues prevented)
- Validation: Unix socket connection, connection pooling, static files, CSRF
Troubleshooting
Problem: "No such file or directory" for socket

Solution:
- Verify beta_settings.cloud_sql_instances in app.yaml
- Check connection name format: PROJECT:REGION:INSTANCE
- Ensure Cloud SQL instance exists: gcloud sql instances list
Problem: Connection works locally but fails on App Engine

Solution:
- Verify HOST uses the Unix socket path, not 127.0.0.1
- Check environment variables are set in app.yaml
- Verify the App Engine service account has the Cloud SQL Client role
Problem: Migrations timeout during deployment
Solution:
- Don't run migrations in app.yaml entrypoint
- Use Cloud Build or run locally with proxy before deploying
- For large migrations, increase Cloud SQL tier temporarily
Problem: Static files 404 after deploy

Solution:
- Run python manage.py collectstatic --noinput before deploy
- Verify the static/ handler in app.yaml points to the correct directory
- Check WhiteNoise is in the MIDDLEWARE list
Complete Setup Checklist
- Cloud SQL PostgreSQL instance created
- Database and user created in Cloud SQL
- Cloud SQL Auth Proxy installed locally
- Django settings configured for Unix socket (production) and localhost (dev)
- beta_settings.cloud_sql_instances in app.yaml
- CONN_MAX_AGE set for connection pooling
- Environment variables configured (DB_NAME, DB_USER, etc.)
- DB_PASSWORD stored securely (not in app.yaml)
- Static files collected and handler configured
- CSRF_TRUSTED_ORIGINS includes *.appspot.com
- Migrations run (locally with proxy) before first deploy
- Deployed with gcloud app deploy
- Verified database connection in production logs
Questions? Issues?
- Check the troubleshooting section above
- Verify all environment variables are set correctly
- Check official docs: https://cloud.google.com/python/django/appengine
- Ensure Cloud SQL instance is running: gcloud sql instances list
Last verified: 2026-01-24 | Skill version: 1.0.0