Senior Data Engineer
Whistle
About us
Tractive is the most trusted name for keeping cats and dogs safe and healthy. The secret to our success? A team of truly unique individuals who care about each other just as much as going the extra mile for pet parents. Want to be part of our story? Check out this role and see if you’re a good match!
Your territory
As part of our Data team, you will:
Design and evolve Tractive’s analytical data platform, which generates insights from position, health, business, and device telemetry data. Working with billions of GPS coordinates and large-scale device signals, you will enable data-driven product and business decisions at scale
Build and maintain self-service tooling (e.g. data catalog, lineage) to improve data discoverability, trust, and transparency
Lead data governance and quality initiatives, implementing data quality frameworks, metadata management tools, and best practices to ensure data is high-quality, discoverable, reliable, and trustworthy
Take end-to-end ownership of our data infrastructure on AWS, working with technologies such as Lambda, Glue, Airflow, Redshift, and dbt
Ship reliable and observable data products through established coding standards, automated testing, and CI/CD pipelines that make data changes fast, safe, and developer-friendly
Implement monitoring, alerting, and cost-optimization strategies so we can confidently and proactively manage our infrastructure
Partner across engineering teams to ensure the data platform continues to meet evolving needs, performance expectations, and development workflows
Bring in your fresh ideas to make Tractive better – you’ll never hear the phrase “that’s how we’ve always done it.”
Continuously grow personally and professionally, take ownership of areas that show your potential, and attend workshops that help you get to the next level
Your profile
Key requirements:
Proven experience building and operating data platforms, including data pipelines, storage, and orchestration at scale
Strong software engineering fundamentals, including clean code, automated testing, Git workflows, CI/CD (e.g. GitHub Actions, Jenkins)
Solid understanding of networking and cloud security, including subnets, routing, IAM, and key and secret management, with the ability to apply these concepts to build secure and reliable cloud infrastructure
Experience in data quality and metadata management using OpenMetadata and AWS Glue, with strong expertise in governance and security best practices, including data classification, access control, and regulatory compliance
Proficient in Python, with hands-on experience using PySpark for large-scale data processing
Hands-on experience with cloud data analytics stacks (AWS, Azure, GCP), covering storage, processing, and analytics services. Experience with Infrastructure as Code (e.g. CloudFormation, CDK) is a big plus
Very good English skills
Valid Austrian work permit
Does this sound like you?
Passionate about big data and its potential to drive impact
A proactive self-learner who stays ahead of the curve – you don’t just follow industry trends, you translate them into initiatives and keep our stakeholders inspired by what’s next
Meticulous about quality, security, and cost efficiency
A relentless problem solver who thrives on autonomy and clear ownership
Excited to work in Austria and collaborate face‑to‑face with an exceptional international team