Senior Data Engineer
About Acast
Since 2014, Acast has been building the world’s most valuable podcast marketplace, creating the technology that connects podcast creators, advertisers, and listeners. Its marketplace spans more than 140,000 podcasts, 3,300 advertisers, and one billion quarterly listens. Crucially, those listens are monetized wherever they happen—across any podcast app or listening platform.
About the role
We are looking for a Senior Data Engineer to drive the evolution of our Insight Data Platforms. This role sits at the intersection of high-volume event processing, multi-cloud architecture, and business intelligence. You will build and refine the pipelines that turn billions of raw events into standardized metrics—covering both audio listens and video views—to provide actionable growth insights for creators.

A central part of this role is acting as a technical bridge across the organization. You will navigate multiple product teams, aligning disparate data streams and ensuring that insights are consistent regardless of which squad generates the underlying events. You won't just move data; you will design the standardized processes that ensure data quality and speed, empowering our internal teams and creators with world-class performance signals across all media formats.
What you'll do
Cross-Product Data Bridging: Collaborate across multiple product squads to unify data strategies, ensuring that the Insight Platform serves as the cohesive "connective tissue" for data generated across different domains.
Architect Multi-Cloud Pipelines: Drive the end-to-end design of scalable ELT/ETL pipelines, managing complex data flows across AWS and GCP (BigQuery) to ensure a unified "source of truth."
Standardized Measurement Processes: Implement and maintain rigorous data quality frameworks to normalize and standardize performance metrics for audio (listens) and video (views), ensuring consistency across all creator-facing dashboards.
Analyst Enablement & Best Practices: Partner closely with Data Analysts to establish and champion best practices in data modeling, SQL optimization, and version control, ensuring the analytics layer is robust and scalable.
Semantic Layer Strategy: Modernize and maintain our data modeling layer using dbt, transitioning from legacy batch processes to fluid, reliable insights delivered via Omni.
Stakeholder Alignment: Act as a key technical partner for Product Managers and internal business units, translating complex, cross-functional business requirements into sustainable data architectures.
Infrastructure as Code: Manage and scale your own infrastructure using CDK within our cloud environments.
Mentorship: Set the bar for code quality, documentation, and testing, mentoring mid-level engineers and helping developers across product teams understand the downstream data implications of their services.
Who you are
The Technical Core: You are a master of SQL and Python, with a deep understanding of modern data modeling (dbt) and the ability to write clean, maintainable code.
AI-Augmented Engineer: You are comfortable leveraging AI coding tools to accelerate your workflows and increase productivity. You possess the professional judgment to weigh the benefits of speed in low-risk scenarios against the need for absolute rigor in high-stakes areas like core data processing or security. You maintain a "safety-first" mindset, proactively implementing guardrails and thorough peer-review processes to ensure the integrity and reliability of the platform.
The "Dot Connector": You excel at working across boundaries. You can identify how data from one product team impacts another and proactively work to bridge those gaps.
Multi-Cloud Proficient: You have experience navigating and integrating services across both AWS (S3, Lambda, Glue) and GCP (BigQuery).
Process-Driven Consultant: You enjoy helping others. You have the communication skills to explain technical trade-offs to PMs and the patience to help analysts improve their modeling workflows through standardized processes.
Business-Minded Engineer: You don't just build for the sake of building; you understand the "why" behind the data and how standardized audio/video metrics impact Acast’s growth and creator success.
Polyglot Curious: While Python and SQL are your primary tools, you are comfortable (or willing to learn) Node.js and TypeScript for surrounding microservices.
Autonomous & Agile: You thrive in a "remote-first" culture where you are given the autonomy to solve problems but the responsibility to communicate across different time zones and departments.
Our Tech Stack:
AWS (Glue, Athena, Lambda, S3, MWAA/Airflow), GCP (BigQuery), Omni, dbt, Spark, SQL, Python, TypeScript, Node.js
Culture
Acast is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment regardless of race, gender, sexual orientation, religion, ethnicity, national origin or any of the other wonderful characteristics that make us different.
Culture is our number one priority as a business. We believe people come first, and we work every day to enable autonomy, continuous improvement and bring out the best in people. We’re global and have remote teams, so it’s even more important that we strive for an open, inclusive and caring environment where everyone feels visible and welcome.
We consider ourselves a modern organization driven by strong values to create the best, most fulfilling and nurturing culture.
We very much look forward to finding the next great person to join our cause!
- Department: ProdDev Global Engineering
- Location: Stockholm, Sweden (Remote)