Schwab
Southlake, Texas, United States
(on-site)
Posted
3 days ago
Job Type
Full-Time
Cloud Data Platform Engineer
Description
Your Opportunity
At Schwab, you're empowered to make an impact on your career. Here, innovative thought meets creative problem solving, helping us "challenge the status quo" and transform the finance industry together.
We believe in the importance of in-office collaboration and fully intend for the selected candidate for this role to work on site in the specified location(s).
Schwab Asset Management (SAM) is a leading asset manager supporting mutual funds, ETFs, and managed account products governed under stringent regulatory and compliance requirements. SAM operates in a multi-cloud, multi-custodian, multi-vendor ecosystem, relying on a diverse set of external platforms such as Vestmark, Aladdin, Eagle, and others to serve its investment, operational, and regulatory functions.
This role sits within the SAM Data team, the team responsible for designing, building, operating, and enhancing SAM Data products and the platform capabilities underpinning the SAM Data platform.
The Cloud Data Platform Engineer is part of the SAM Data Technology organization, with a strong emphasis on the Investment Data domain. The role owns end-to-end data capabilities that power Schwab Asset Management's investment, operational, analytical, and regulatory use cases, spanning ingestion, domain modeling, platform frameworks, Data APIs, and Python-based visualization/UI layers.
This position is expected to operate as a senior technical contributor and platform leader, defining reusable frameworks, setting engineering standards, and driving consistency across the SAMDA ecosystem. The engineer partners closely with architects, product owners, and investment stakeholders to ensure scalable, governed, and business-aligned data solutions.
Responsibilities:
Platform Frameworks & Engineering Standards
- Design and build reusable data platform frameworks for ingestion, transformation, validation, and consumption.
- Establish standardized patterns for data pipelines, APIs, and visualization layers across SAMDA.
- Define best practices for schema evolution, versioning, error handling, and observability.
- Influence platform roadmap through hands-on engineering leadership.
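As an illustration of the kind of reusable validation building block such frameworks standardize, here is a minimal Python sketch; all names (fields, classes) are hypothetical, not SAM's actual conventions:

```python
"""Minimal sketch of a reusable row-validation step for an ingestion
framework. Field names and the dict-per-row contract are illustrative."""
from dataclasses import dataclass, field


@dataclass
class ValidationResult:
    valid: list = field(default_factory=list)   # rows that passed
    errors: list = field(default_factory=list)  # per-row error reports


def validate_rows(rows, required_fields):
    """Split rows into valid records and structured error reports,
    so downstream stages can route failures to a quarantine sink."""
    result = ValidationResult()
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            result.errors.append({"row": i, "missing": missing})
        else:
            result.valid.append(row)
    return result


rows = [{"cusip": "037833100", "qty": 100}, {"cusip": None, "qty": 5}]
res = validate_rows(rows, ["cusip", "qty"])
```

Returning a structured result (rather than raising on the first bad row) is what lets the same step be reused across pipelines with different error-handling policies.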
Design & Build Advanced Data Pipelines
- Architect and implement complex, cloud-native ETL/ELT pipelines supporting investment and analytical data.
- Build reliable workflows using GCS, Dataproc, Cloud Dataflow, Composer (Airflow), and Pub/Sub.
- Implement scalable transformations and curated layers in Snowflake and cloud data warehouses.
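The ETL/ELT pattern above can be sketched in plain Python as composable extract/transform/load stages; the generator stages here stand in for Dataflow or Composer operators, and the inline records stand in for reads from GCS or Pub/Sub:

```python
"""Hedged sketch of an ETL pipeline as composed Python generators.
Sources, sinks, and field names are illustrative stand-ins."""


def extract():
    # Stand-in for reading raw records from GCS or a Pub/Sub subscription.
    yield from [
        {"ticker": "SCHW", "price": "61.50"},
        {"ticker": "AAPL", "price": "190.0"},
    ]


def transform(records):
    # Stand-in for a typed, curated-layer transformation.
    for r in records:
        yield {**r, "price": float(r["price"])}


def load(records, sink):
    # Stand-in for writing to a Snowflake curated table.
    sink.extend(records)
    return sink


curated = load(transform(extract()), [])
```

Keeping each stage a pure callable makes the pipeline unit-testable before it is wrapped in orchestration such as an Airflow DAG.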
Investment Data Domain Focus
This role has deep ownership within the Investment Data domain, including but not limited to:
- Holdings, positions, transactions, cash flows, and security master data
- Portfolio, account, and instrument hierarchies
- Performance, risk, exposure, and attribution datasets
- Reference data, taxonomies, and domain models that support downstream analytics and reporting
The engineer is responsible for designing domain-driven data models and operational data stores that accurately represent investment concepts and scale across multiple SAM products and platforms.
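A domain-driven position model of the kind described might look like the following sketch; the entities and fields are hypothetical simplifications, not SAM's canonical model:

```python
"""Illustrative domain model for positions against a security master.
Entity and field names are assumptions for the sketch only."""
from dataclasses import dataclass
from datetime import date
from decimal import Decimal


@dataclass(frozen=True)
class Instrument:
    """Security-master entry: identifier plus classification."""
    instrument_id: str
    asset_class: str


@dataclass(frozen=True)
class Position:
    """A holding of one instrument in one portfolio as of a date."""
    portfolio_id: str
    instrument: Instrument
    quantity: Decimal
    as_of: date

    def market_value(self, price: Decimal) -> Decimal:
        # Decimal avoids float rounding in financial quantities.
        return self.quantity * price


pos = Position(
    portfolio_id="PF-1",
    instrument=Instrument("US0378331005", "EQUITY"),
    quantity=Decimal("100"),
    as_of=date(2026, 1, 2),
)
```

Frozen dataclasses give value-object semantics, which suits reference and holdings data that should be immutable once loaded.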
Investment Data Modeling & Domain Engineering
- Design enterprise-grade investment data models using Kimball, relational, and domain-driven design principles.
- Create operational and analytical data stores from the ground up, including taxonomies and canonical models.
- Ensure models support regulatory, performance, and investment analytics use cases.
Data APIs & Service Layer
- Design and implement Data APIs using Python (FastAPI / Flask) to expose curated investment datasets.
- Build scalable, secure RESTful services for analytical and operational consumers.
- Apply governance, access control, and data protection standards aligned with regulated environments.
Python Dashboards & Data Visualization
- Develop Python-based dashboards and UI applications using Streamlit, Dash, Panel, or similar frameworks.
- Create interactive visualizations using Plotly, Matplotlib, and Seaborn to support investment insights.
- Translate complex investment data into intuitive, self-service analytical experiences.
Advanced Cloud & Application Engineering
- Build data and application services using Cloud Run, Cloud Functions, and Cloud SQL.
- Apply distributed processing frameworks such as Apache Spark, Beam, and Flink.
- Package and deploy data services, APIs, and UI components using Docker.
DevOps, CI/CD & Automation
- Lead CI/CD design for pipelines, APIs, and visualization apps using Git, Bitbucket, Bamboo, Jenkins, and GitHub Actions.
- Implement automated testing, deployment, and release management patterns.
- Drive infrastructure automation using Terraform or Google Cloud Deployment Manager.
Data Quality, Reliability & Observability
- Define and implement data quality frameworks, reconciliation checks, and monitoring standards.
- Proactively identify and resolve complex data, platform, and application issues.
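A reconciliation check of the kind referenced above can be sketched as a comparison of keyed totals between two sources (say, custodian feed vs. warehouse); the key, value, and tolerance here are illustrative assumptions:

```python
"""Sketch of a keyed reconciliation between two record sets.
Field names and the tolerance are illustrative, not a real SAM check."""


def reconcile(source, target, key, value, tolerance=0.01):
    """Return the keys whose values diverge by more than the tolerance,
    treating a missing key on either side as a zero value."""
    a = {r[key]: r[value] for r in source}
    b = {r[key]: r[value] for r in target}
    breaks = []
    for k in sorted(a.keys() | b.keys()):
        va, vb = a.get(k, 0.0), b.get(k, 0.0)
        if abs(va - vb) > tolerance:
            breaks.append({"key": k, "source": va, "target": vb})
    return breaks


breaks = reconcile(
    [{"acct": "A1", "mv": 100.00}],
    [{"acct": "A1", "mv": 100.50}],
    key="acct", value="mv",
)
```

Emitting a list of breaks (rather than a pass/fail flag) is what lets monitoring surface exactly which accounts need investigation.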
Technical Leadership & Collaboration
- Act as a senior technical leader and mentor for data engineers across SAMDA.
- Lead design reviews and influence cross-team engineering decisions.
- Communicate complex platform and investment data concepts to technical and business stakeholders.
What you have
Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, or equivalent practical experience.
- 6-8 years of experience building cloud-based data platforms and enterprise data solutions.
- Strong experience in the Investment or Asset Management data domain.
- Hands-on expertise with Snowflake and GCP services (GCS, Cloud Run, Cloud Functions, Pub/Sub, Composer, Cloud SQL).
- Advanced proficiency in Python for data engineering, API development, and visualization.
- Proven experience building REST APIs using Python frameworks (FastAPI, Flask, or equivalent).
- Experience with Python visualization and UI frameworks (Streamlit, Dash, Panel, or similar).
- Strong background with distributed processing frameworks (Spark, Beam, or Flink).
- Expertise in CI/CD, containerization (Docker), and infrastructure as code (Terraform or GCP Deployment Manager).
Preferred Qualifications
- Experience designing platform-level frameworks adopted by multiple engineering teams.
- Deep understanding of regulated data environments, governance, lineage, and auditability.
- Strong grasp of modern data architecture and self-service analytics patterns.
- Ability to influence platform strategy and mentor senior engineers.
- Excellent documentation and executive-level communication skills.
In addition to the salary range, this role is also eligible for bonus or incentive opportunities.
Requisition #: 2026-120056
Job ID: 82985266

Schwab
United States
Schwab is a leader in financial services, helping millions of people make the most of their money. Most Schwab careers are based in one of our two main operating segments, Investor Services or Institutional Services. But across the entire Schwab organization, more than 12,000 employees share a passion for fulfilling our corporate purpose: to help everyone be financially fit.
