Schwab
Southlake, Texas, United States (on-site)
Posted: 1 day ago
Job Type: Full-Time
Senior Data Engineer, Client Authentication
Description
Your Opportunity
At Schwab, you're empowered to make an impact on your career. Here, innovative thought meets creative problem solving, helping us challenge the status quo and transform the finance industry together. We succeed as One Schwab, collaborating with trust, integrity, and a shared commitment to doing the right thing for our clients and each other.
We believe in the importance of in-office collaboration and fully intend for the selected candidate for this role to work on site in the specified location(s).
In this role, you'll join Schwab's Global Data and Analytics team to help build and evolve a large-scale data intelligence platform on Google Cloud Platform (GCP). You'll work at the intersection of data architecture and AI, engineering resilient pipelines that enable advanced analytics, fraud detection, and machine-learning models at scale. This is a hands-on, end-to-end engineering role where your work directly supports teams protecting clients and strengthening trust across the firm.
Key Responsibilities
- Data Pipeline Architecture and Development: Design, build, and maintain scalable batch and streaming data pipelines using tools such as Dataflow (Apache Beam), Cloud Composer (Airflow), and Pub/Sub to ingest terabytes of transaction and behavioral data.
- Advanced Coding: Write high-performance, production-grade Python and SQL, optimizing existing codebases for efficiency, latency, and cost.
- Data Modeling: Implement complex data models in BigQuery, utilizing partitioning, clustering, and materialized views for optimal performance.
- System Design: Architect robust backend data services and microservices to power analytics and AI platforms.
- Infrastructure as Code: Write and maintain Terraform scripts to provision and manage GCP resources, ensuring reproducible and secure infrastructure.
- Data Quality Engineering: Implement automated testing frameworks, data contracts, and anomaly detection systems into pipeline code.
- Performance Tuning: Analyze query execution plans and pipeline bottlenecks to actively reduce latency and cloud costs.
- Incident Resolution: Act as the highest level of escalation for critical data engineering issues, debugging complex failures in distributed systems.
- Technical Leadership: Elevate team coding standards through rigorous code reviews and creation of solution architecture documents.
- Mentorship: Mentor senior and junior engineers via pair programming and technical design sessions, helping them grow their skills.
- Strategy: Collaborate with stakeholders to define the technical roadmap, selecting the right tools and patterns for long-term success.
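As a flavor of the data-quality engineering described above, here is a minimal sketch of a row-count anomaly check of the kind a pipeline might run before publishing a table. It is an illustration only, using standard-library Python; the function name, threshold, and sample counts are hypothetical and do not reflect Schwab's actual tooling.

```python
import statistics

def is_anomalous(history, latest, z_threshold=3.0):
    """Flag `latest` if it deviates from the values in `history`
    by more than `z_threshold` standard deviations (a z-score check).
    `history` is a list of prior daily row counts for the same table."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        # Perfectly flat history: any change at all is worth flagging.
        return latest != mean
    return abs(latest - mean) / stdev > z_threshold

# Example: steady daily row counts, then a sudden drop that
# would suggest a broken upstream feed.
counts = [10_200, 10_150, 10_300, 10_250, 10_180, 10_220]
assert not is_anomalous(counts, 10_240)  # within normal range
assert is_anomalous(counts, 2_000)       # flagged for investigation
```

In practice a check like this would run as a pipeline step (e.g., an Airflow task) and fail the run or page an on-call engineer rather than raise an assertion.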
What you have
Required Qualifications
- 8+ years of hands-on software and data engineering experience with a proven track record of shipping complex systems to production.
- 4+ years as a hands-on senior engineer in startups and/or large organizations.
- Bachelor's degree in Computer Science or a related field.
- Strong software engineering foundation, applying best practices (CI/CD, unit testing, modular design) to data pipelines.
- Deep, practical experience with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM.
- Expert-level proficiency in Python and SQL, with the ability to write clean, maintainable, and efficient code.
- Mastery of dimensional modeling, distributed systems, and modern data-stack patterns.
- Extensive experience with workflow orchestration using Apache Airflow or Cloud Composer.
- Strong background in dbt (data build tool) implementation and strategy.
- Proven track record with CI/CD, Terraform (infrastructure as code), and containerization (Docker and Kubernetes).
Preferred Qualifications
- Deep expertise in real-time data processing using Kafka or Pub/Sub.
- Deep understanding of big-data frameworks such as Apache Beam or Spark.
- Experience with modern data stacks such as Snowflake or Databricks, though GCP is our primary platform.
- Demonstrated business-domain knowledge in fraud analytics.
- Strong written and verbal communication skills to clearly convey ideas and feedback.
- Google Professional Data Engineer certification.
- Master's or advanced degree in Computer Science or a related field.
In addition to the salary range, this role is also eligible for bonus or incentive opportunities.
Requisition #: 2026-119604
Job ID: 83031078

Schwab
United States
Schwab is a leader in financial services, helping millions of people make the most of their money. Most Schwab careers are based in one of our two main operating segments, Investor Services or Institutional Services. But across the entire Schwab organization, more than 12,000 employees share a passion for fulfilling our corporate purpose: to help everyone be financially fit.