Senior Data Engineer - Internal Measurement
Samba TV
What You'll Do
Design, build, and maintain scalable data pipelines that power internal metrics, dashboards, and reporting across the Technology and Product organizations.
Analyze and improve the efficiency, scalability, and stability of data collection, storage, and retrieval processes for our core systems.
Collaborate with data scientists and data analysts to develop new and improved algorithms that best capture the value of our data.
Own and operate Airflow DAGs and other orchestration workflows, ensuring reliable and timely delivery of internal data products.
Participate in code reviews, contribute to documentation, and help raise engineering standards within the team.
Who You Are
4+ years of professional experience building high-performance, large-scale applications or data pipelines.
Strong proficiency in Python/PySpark and SQL. We work primarily in Python/PySpark, but engineers with sound fundamentals can pick up new languages quickly.
Hands-on experience with Airflow or a similar orchestration tool.
Experience with Databricks or similar data lakehouse platforms is desirable.
Strong analytical skills and a genuine curiosity about data - you should be comfortable digging deep into datasets to identify problems and opportunities.
Strong communication skills, especially in written form, as we are a distributed team with members in both Warsaw and San Francisco.
Solid computer science fundamentals - data structures, algorithms, and software design principles.
Familiarity with FinOps concepts or cloud cost management is a plus, but not required.
250,000 - 350,000 PLN a year