Schwab
Lone Tree, Colorado, United States (on-site)
Posted 1 day ago
Job Type: Full-Time
Data Platform Engineer
Description
Your Opportunity
Schwab Asset Management (SAM) is a leading asset manager supporting mutual funds, ETFs, and managed account products governed by stringent regulatory and compliance requirements. SAM operates in a multi-cloud, multi-custodian, multi-vendor ecosystem, relying on a diverse set of external platforms, such as Vestmark, Aladdin, and Eagle, to serve its investment, operational, and regulatory functions.
This role sits directly within SAM Data Platform Engineering, the team responsible for designing, building, operating, and enhancing the shared platform capabilities underpinning the SAM Data (SAMDA) platform. As a junior platform engineer, you will help expand and operate the cloud infrastructure, CI/CD tooling, orchestration layers, and Snowflake resources that support all SAMDA tenants.
Role Summary
The SAM Data Platform Engineering team builds and operates the Schwab Asset Management Data platform: a unified, multi-tenant, cloud-native data platform that supports regulatory, operational, and analytical workloads across SAM. The platform leverages GCP services and Snowflake to deliver scalable data ingestion, transformation, governance, and consumption capabilities.
As a Data Platform Engineer (Level 55), you will contribute to the build-out, automation, and operation of the SAMDA platform. This junior role is ideal for early-career engineers who want to work across cloud infrastructure, DevOps tooling, Snowflake platform configuration, and data-pipeline enablement while learning modern cloud-native data engineering patterns.
What You Will Do (Responsibilities)
Platform Infrastructure & Environment Engineering
- Support creation and configuration of GCP infrastructure including Cloud Storage, Composer, Cloud Run, and IAM roles.
- Assist in provisioning Snowflake resources (databases, schemas, compute warehouses, RBAC roles, service accounts) aligned with tenant isolation and platform governance models (see the sketch after this list).
- Follow platform patterns for networking and secure connectivity, including Private Service Connect and controlled access paths between Snowflake and GCP.
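For illustration only, a minimal sketch of the kind of per-tenant Snowflake provisioning this work involves, using snowflake-connector-python. Every tenant, database, role, and warehouse name below is an assumption for the example, not SAM's actual footprint or governance model:

```python
# Illustrative per-tenant Snowflake provisioning (all names hypothetical).
# Assumes snowflake-connector-python and a service account with sufficient
# privileges (e.g., SYSADMIN/SECURITYADMIN) to create these objects.
import snowflake.connector

TENANT = "TENANT_A"  # hypothetical tenant identifier

DDL = [
    # One database per tenant keeps tenant storage isolated.
    f"CREATE DATABASE IF NOT EXISTS {TENANT}_DB",
    f"CREATE SCHEMA IF NOT EXISTS {TENANT}_DB.RAW",
    f"CREATE SCHEMA IF NOT EXISTS {TENANT}_DB.CURATED",
    # A dedicated warehouse so tenant compute is metered separately.
    f"CREATE WAREHOUSE IF NOT EXISTS {TENANT}_WH "
    "WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60",
    # A functional role scoped to this tenant only (tenant isolation).
    f"CREATE ROLE IF NOT EXISTS {TENANT}_ENGINEER",
    f"GRANT USAGE ON DATABASE {TENANT}_DB TO ROLE {TENANT}_ENGINEER",
    f"GRANT USAGE ON WAREHOUSE {TENANT}_WH TO ROLE {TENANT}_ENGINEER",
    f"GRANT ALL ON SCHEMA {TENANT}_DB.RAW TO ROLE {TENANT}_ENGINEER",
]

def provision(conn: "snowflake.connector.SnowflakeConnection") -> None:
    """Apply the tenant DDL; IF NOT EXISTS makes reruns idempotent."""
    cur = conn.cursor()
    try:
        for stmt in DDL:
            cur.execute(stmt)
    finally:
        cur.close()
```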
DevOps, CI/CD & Automation
- Contribute to Git repository setup, branching strategies, and automated CI/CD pipelines for data pipelines, Snowflake DDL, configuration, and platform components.
- Help build automation templates for tenant-specific resources, notifications, dashboards, and deployment patterns (a sketch of one such deployment step follows this list).
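As one hedged example of what "CI/CD for Snowflake DDL" can look like, here is a sketch of a pipeline step that applies versioned DDL files from a Git repository in order. The migrations/ layout and the SCHEMA_MIGRATIONS bookkeeping table are assumptions, not the team's actual conventions:

```python
# Hypothetical CI step: apply versioned Snowflake DDL files in order.
# Assumes migrations live in the repo as migrations/0001_xxx.sql, 0002_xxx.sql,
# and a SCHEMA_MIGRATIONS table records which versions were already applied.
from pathlib import Path
import snowflake.connector

MIGRATIONS_DIR = Path("migrations")  # assumed repository layout

def applied_versions(cur) -> set[str]:
    cur.execute(
        "CREATE TABLE IF NOT EXISTS SCHEMA_MIGRATIONS ("
        "  version STRING, applied_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP())"
    )
    cur.execute("SELECT version FROM SCHEMA_MIGRATIONS")
    return {row[0] for row in cur.fetchall()}

def run_migrations(conn: "snowflake.connector.SnowflakeConnection") -> None:
    cur = conn.cursor()
    try:
        done = applied_versions(cur)
        for sql_file in sorted(MIGRATIONS_DIR.glob("*.sql")):
            version = sql_file.stem.split("_")[0]
            if version in done:
                continue  # already applied by an earlier pipeline run
            # execute_string handles files containing multiple statements
            conn.execute_string(sql_file.read_text())
            cur.execute(
                "INSERT INTO SCHEMA_MIGRATIONS (version) VALUES (%s)", (version,)
            )
    finally:
        cur.close()
```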
Data Platform Capabilities
- Support ingestion workflows for file-based ingestion, vendor Snowflake secure data shares, and internal source ingestion into SAMDA raw/curated data zones.
- Assist in implementing transformations using Snowflake SQL, Python, and Composer as part of the PLT (Push-Load-Transform) orchestration model (see the DAG sketch after this list).
- Support standardized data quality checks baked into platform pipelines and tenant workflows.
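To make the PLT flow concrete, here is a minimal Cloud Composer DAG sketch (assuming Airflow 2.4+ and the apache-airflow-providers-snowflake package): files already pushed to a GCS-backed stage are loaded into a raw zone, transformed into a curated zone, and gated by a row-count quality check. The connection ID, stage, and table names are invented for illustration:

```python
# A minimal Airflow (Cloud Composer) DAG sketching Push-Load-Transform.
from datetime import datetime
from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="tenant_a_plt_daily",  # hypothetical tenant pipeline
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load: files pushed to a GCS-backed external stage land in RAW.
    load_raw = SnowflakeOperator(
        task_id="load_raw",
        snowflake_conn_id="snowflake_tenant_a",  # assumed Airflow connection
        sql="""
            COPY INTO TENANT_A_DB.RAW.TRADES
            FROM @TENANT_A_DB.RAW.GCS_STAGE/trades/
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """,
    )

    # Transform: SQL promotes raw rows into the curated zone.
    transform = SnowflakeOperator(
        task_id="transform_curated",
        snowflake_conn_id="snowflake_tenant_a",
        sql="""
            INSERT INTO TENANT_A_DB.CURATED.TRADES
            SELECT * FROM TENANT_A_DB.RAW.TRADES
            WHERE trade_date = CURRENT_DATE()
        """,
    )

    # Quality gate: fail the run if the curated load produced zero rows
    # (dividing by a zero count raises a Snowflake error and fails the task).
    dq_check = SnowflakeOperator(
        task_id="dq_row_count",
        snowflake_conn_id="snowflake_tenant_a",
        sql="""
            SELECT 1 / COUNT(*) FROM TENANT_A_DB.CURATED.TRADES
            WHERE trade_date = CURRENT_DATE()
        """,
    )

    load_raw >> transform >> dq_check
```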
Observability, Monitoring & Platform Operations
- Configure monitoring and alerting using GCP Operations, Snowflake usage monitoring, and platform dashboards for pipeline health and SLAs (a usage-monitoring sketch follows this list).
- Collaborate with production support on incident triage, pipeline monitoring, and environment troubleshooting.
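As a sketch of Snowflake usage monitoring, here is a check built on the documented SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view; the SLA threshold and what you do with the results are assumptions for the example:

```python
# Hypothetical health check: flag Snowflake queries that exceeded an assumed
# SLA threshold in the last 24 hours, using the ACCOUNT_USAGE.QUERY_HISTORY view.
import snowflake.connector

SLA_SECONDS = 900  # illustrative 15-minute SLA

SQL = """
    SELECT query_id, warehouse_name, total_elapsed_time / 1000 AS seconds
    FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
    WHERE start_time >= DATEADD(hour, -24, CURRENT_TIMESTAMP())
      AND total_elapsed_time > %s
    ORDER BY total_elapsed_time DESC
"""

def slow_queries(conn: "snowflake.connector.SnowflakeConnection"):
    """Return (query_id, warehouse, seconds) rows breaching the assumed SLA."""
    cur = conn.cursor()
    try:
        cur.execute(SQL, (SLA_SECONDS * 1000,))  # view reports milliseconds
        return cur.fetchall()
    finally:
        cur.close()
```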
Platform Enablement
- Help onboard new tenant applications, including infra creation, repository setup, Snowflake footprint provisioning, RBAC setup, and pipeline patterns (see the manifest sketch after this list).
- Learn and apply platform governance covering data segmentation, tenancy isolation, information barriers, and security guardrails.
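One way to picture tenant onboarding, purely as a hypothetical sketch: capture the footprint a new SAMDA tenant needs in a single declarative manifest, so the infra, repository, Snowflake, and RBAC steps above can be driven from one config. Every field and naming convention here is assumed:

```python
# Hypothetical declarative manifest for onboarding a new SAMDA tenant.
from dataclasses import dataclass, field

@dataclass
class TenantManifest:
    tenant_id: str                      # e.g. "tenant_a"
    gcs_buckets: list[str] = field(default_factory=list)
    composer_dags_prefix: str = ""      # namespace for the tenant's DAGs
    snowflake_database: str = ""
    snowflake_roles: list[str] = field(default_factory=list)
    git_repo: str = ""                  # pipeline code repository

def default_manifest(tenant_id: str) -> TenantManifest:
    """Derive a conventional footprint from the tenant id (naming is assumed)."""
    return TenantManifest(
        tenant_id=tenant_id,
        gcs_buckets=[f"samda-{tenant_id}-landing"],
        composer_dags_prefix=f"{tenant_id}_",
        snowflake_database=f"{tenant_id.upper()}_DB",
        snowflake_roles=[
            f"{tenant_id.upper()}_ENGINEER",
            f"{tenant_id.upper()}_READER",
        ],
        git_repo=f"samda-pipelines-{tenant_id}",
    )
```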
What You Have
Required Qualifications
- Bachelor's degree in Computer Science, IT, Engineering, or related field, or equivalent practical experience.
- 1-2 years of experience or relevant project work with cloud platforms (GCP, AWS, Azure).
- Foundational experience with Python, SQL, Git, and CI/CD concepts.
- Exposure to cloud-native services (GCS, Cloud Run, Composer, Pub/Sub) or Snowflake.
- Strong analytical, troubleshooting, and communication skills; ability to learn quickly and collaborate in a team environment.
Preferred Qualifications
- Exposure to Snowflake or other cloud data warehouses (BigQuery, Redshift).
- Experience with IaC tools such as Terraform or Deployment Manager.
- Understanding of data governance, secure data handling, or enterprise RBAC models.
- Familiarity with observability tools (Stackdriver/Cloud Operations, Prometheus, etc.).
Requisition #: 2026-118759
Job ID: 82337877

About Schwab
Schwab is a leader in financial services, helping millions of people make the most of their money. Most Schwab careers are based in one of our two main operating segments, Investor Services or Institutional Services. But across the entire Schwab organization, more than 12,000 employees share a passion for fulfilling our corporate purpose: to help everyone be financially fit.