The Storyboard

Welcome to the Storyboard, a place to explore career adventures at start-ups and companies founded by Claremont alumni and the Claremont community. Choose your next adventure at a company where you’ll have an edge from day one, and leverage our Claremont network to build your career.

Also, make sure to check out our newsletter, StoryHouse Review, to find out more about these companies in the Claremont ecosystem.

Senior Data Engineer - Python, SQL

Strider

Software Engineering, Data Science
Brazil · Mexico · Colombia · Argentina · El Salvador · Peru · Dominican Republic · Paraguay · Puerto Rico · Ecuador · Chile · Costa Rica · Guatemala · Bolivia · Venezuela · Nicaragua · Panama · Honduras · Uruguay · Cuba · Remote
Posted on Nov 28, 2025

Requirements

Must-haves

  • 6+ years of data engineering experience
  • Proficiency in Python for data processing (Pandas, PySpark)
  • Proficiency with AWS (S3, Glue, Redshift, Lambda)
  • Expertise with SQL and relational databases (SQL Server)
  • Experience building ETL processes handling millions of records
  • Deep knowledge of data architecture, performance tuning, and governance practices
  • Strong communication skills in both spoken and written English

Nice-to-haves

  • Startup experience
  • Experience with Azure (Data Factory, Synapse, Blob Storage)
  • Experience in enterprise-scale environments with diverse data sources
  • Familiarity with big data tools (Spark) and distributed processing
  • Bachelor's Degree in Computer Engineering, Computer Science, or equivalent

What you will work on

  • This is a full-time role (40 hours/week) for a 6-month contract
  • Design, develop, and maintain scalable ETL pipelines for processing large datasets
  • Build data infrastructure on AWS and Azure, ensuring performance, scalability, and cost-effectiveness
  • Develop and optimize data models and schemas for analytics and reporting
  • Collaborate with analysts and stakeholders to understand data needs and deliver solutions
  • Ensure data quality, integrity, and security throughout the lifecycle
  • Monitor pipeline performance, troubleshoot bottlenecks, and resolve data issues
  • Document processes and contribute to continuous improvement efforts across the team
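For candidates gauging fit, the extract-transform-load work described above can be sketched as a minimal Pandas pipeline. All names, the schema, and the data here are illustrative assumptions, not details from the posting; in practice the sources and sinks would be systems like S3, SQL Server, or Redshift rather than in-memory records.

```python
import pandas as pd

def extract(records):
    # Extract: load raw records into a DataFrame
    # (in production, this would read from S3, SQL Server, etc.)
    return pd.DataFrame(records)

def transform(df):
    # Transform: drop rows missing required fields, normalize types,
    # and derive an integer-cents column for safe aggregation
    df = df.dropna(subset=["user_id", "amount"]).copy()
    df["amount"] = df["amount"].astype(float)
    df["amount_usd_cents"] = (df["amount"] * 100).round().astype(int)
    return df

def load(df):
    # Load: aggregate into the reporting shape
    # (in production, this would write to a warehouse such as Redshift)
    return df.groupby("user_id", as_index=False)["amount_usd_cents"].sum()

raw = [
    {"user_id": "a", "amount": "10.50"},
    {"user_id": "a", "amount": "2.00"},
    {"user_id": None, "amount": "99.00"},  # invalid row, dropped in transform
    {"user_id": "b", "amount": "5.25"},
]
result = load(transform(extract(raw)))
```

At the scale the role describes (millions of records), the same shape typically moves to chunked reads or PySpark, but the extract/transform/load separation stays the same.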