Schwab
Southlake, Texas, United States (on-site)
Posted: 1 day ago
Job Type: Full-Time
Cloud Data Platform Engineer
Description
Your Opportunity
Schwab Asset Management (SAM) is a leading asset manager supporting mutual funds, ETFs, and managed account products governed under stringent regulatory and compliance requirements. SAM operates in a multi-cloud, multi-custodian, multi-vendor ecosystem, relying on a diverse set of external platforms such as Vestmark, Aladdin, Eagle, and others to serve its investment, operational, and regulatory functions.
This role sits directly within the SAM Data team, the team responsible for designing, building, operating, and enhancing the shared platform capabilities underpinning the SAM Data platform. As a junior platform engineer, you will help expand and operate the cloud infrastructure, CI/CD tooling, orchestration layers, and Snowflake resources that support all SAMDA tenants.
The SAM Data team supports Schwab Asset Management by building and enhancing cloud-native data solutions that enable high-quality, scalable, and reliable access to SAM datasets. As a Data Engineer (Level 55), you will contribute directly to the development and optimization of data pipelines, data models, and cloud-native processing patterns that power analytics, regulatory reporting, and operational workflows.
This role is well-suited for an early-career engineer who wants to deepen their skills in data engineering on Snowflake and Google Cloud Platform (GCP) while working in a collaborative environment.
Responsibilities:
Data Pipeline Engineering
- Build and enhance cloud-native data pipelines using ETL/ELT and data-integration patterns on GCP and Snowflake.
- Develop reliable and repeatable data workflows using services such as GCS, Dataproc, Cloud Dataflow, Cloud Composer (Airflow), and Cloud Pub/Sub.
- Create scalable, well-structured data transformations leveraging Snowflake and cloud-warehouse patterns.
- Apply modern data-modeling techniques, including RDBMS, NoSQL, star schema, and Kimball-style dimensional modeling.
- Work with distributed processing technologies such as Apache Spark, Apache Beam, and Apache Flink to organize, transform, and prepare data for consumption.
Cloud Data Engineering
- Engineer solutions using key GCP services, including Cloud Storage, Cloud Run, Cloud Functions, Pub/Sub, Composer, and Cloud SQL.
- Leverage cloud-native tools to build scalable, observable data workflows.
- Utilize build and deployment tools such as Git/Bitbucket/Bamboo, Maven, Jenkins, Nexus, and containerization technologies.
- Apply continuous integration and deployment practices using Docker, GitHub, and GitHub Actions.
- Ensure data quality and reliability by implementing validation checks and designing robust data-processing logic.
- Work collaboratively to troubleshoot data issues and continuously improve pipeline stability.
What you have
Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent practical experience.
- 2+ years of experience as a cloud data platform engineer in a data-analytics ecosystem, with progression in technical responsibilities.
- Hands-on experience with Snowflake and GCP services such as Cloud Storage, Cloud Run, Cloud Functions, Pub/Sub, Composer, and Cloud SQL.
- Proficiency with Infrastructure-as-Code tools such as Terraform or Google Cloud Deployment Manager.
- Strong analytical and problem-solving skills with the ability to work independently and within a team.
- Effective technical communication skills and attention to detail.
Preferred Qualifications:
- Experience with secure data-management practices, including data governance and data-quality management for pipelines.
- Deep understanding of modern data-architecture patterns for data-pipeline and reporting environments.
- Experience supporting cross-functional initiatives and strong documentation skills.
Requisition #: 2026-120190
Job ID: 83030880

Schwab
United States
Schwab is a leader in financial services, helping millions of people make the most of their money. Most Schwab careers are based in one of our two main operating segments, Investor Services or Institutional Services. But across the entire Schwab organization, more than 12,000 employees share a passion for fulfilling our corporate purpose: to help everyone be financially fit.