40 episodes • Length: 30 min • Irregular schedule
Welcome to The Data Flowcast: Mastering Airflow for Data Engineering & AI — the podcast where we keep you up to date with insights and ideas propelling the Airflow community forward.
Join us each week as we explore the current state, future and potential of Airflow with leading thinkers in the community, and discover how best to leverage this workflow management system to meet the ever-evolving needs of data engineering and AI ecosystems.
Podcast Webpage: https://www.astronomer.io/podcast/
The podcast The Data Flowcast: Mastering Airflow for Data Engineering & AI is created by Astronomer. The podcast and the artwork on this page are embedded using the public podcast feed (RSS).
Transforming bottlenecked finance processes into streamlined, automated systems requires the right tools and a forward-thinking approach. In this episode, Mihir Samant, Senior Data Analyst at Etraveli Group, joins us to share how his team leverages Airflow to revolutionize finance automation. With extensive experience in data workflows and a passion for open-source tools, Mihir provides valuable insights into building efficient, scalable systems. We explore the transformative power of Airflow in automating workflows and enhancing data orchestration within the finance domain.
Key Takeaways:
(02:14) Etraveli Group specializes in selling affordable flight tickets and ancillary services.
(03:56) Mihir’s finance automation team uses Airflow to tackle month-end bottlenecks.
(06:00) Airflow's flexibility enables end-to-end automation for finance workflows.
(07:00) Open-source Airflow tools offer cost-effective solutions for new teams.
(08:46) Sensors and dynamic DAGs are pivotal features for optimizing tasks.
(13:30) GitSync simplifies development by syncing environments seamlessly.
(16:27) Plans include integrating Databricks for more advanced data handling.
(17:58) Airflow and Databricks offer multiple flexible methods to trigger workflows and execute SQL queries seamlessly.
Resources Mentioned:
https://www.linkedin.com/in/misamant/?originalSubdomain=ca
https://www.linkedin.com/company/etraveli-group/
https://airflow.apache.org/
Docker -
https://www.docker.com/
https://www.databricks.com/
Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning
Data engineering is entering a new era, where orchestration and automation are redefining how large-scale projects operate. This episode features Vasantha Kosuri-Marshall, Data and ML Ops Engineer at Ford Motor Company. Vasantha shares her expertise in managing complex data pipelines. She takes us through Ford's transition to cloud platforms, the adoption of Airflow and the intricate challenges of orchestrating data in a diverse environment.
Key Takeaways:
(03:10) Vasantha’s transition to the Advanced Driving Assist Systems team at Ford.
(05:42) Early adoption of Airflow to orchestrate complex data pipelines.
(09:29) Ford's move from on-premises data solutions to Google Cloud Platform.
(12:03) The importance of Airflow's scheduling capabilities for efficient data management.
(16:12) Using Kubernetes to scale Airflow for large-scale data processing.
(19:59) Vasantha’s experience in overcoming challenges with legacy orchestration tools.
(22:22) Integration of data engineering and data science pipelines at Ford.
(28:03) How deferrable operators in Airflow improve performance and save costs.
(32:12) Vasantha’s insights into tuning Airflow properties for thousands of DAGs.
(36:09) The significance of monitoring and observability in managing Airflow instances.
Resources Mentioned:
https://www.linkedin.com/in/vasantha-kosuri-marshall-0b0aab188/
https://airflow.apache.org/
https://cloud.google.com/
Ford Motor Company | LinkedIn -
https://www.linkedin.com/company/ford-motor-company/
Ford Motor Company | Website -
https://www.ford.com/
https://www.astronomer.io/
Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning
Data is the backbone of every modern business, but unlocking its full potential requires the right tools and strategies. In this episode, Ryan Delgado, Director of Engineering at Ramp, joins us to explore how innovative data platforms can transform business operations and fuel growth. He shares insights on integrating Apache Airflow, optimizing data workflows and leveraging analytics to enhance customer experiences.
Key Takeaways:
(01:52) Data is the lifeblood of Ramp, touching every vertical in the company.
(03:18) Ramp’s data platform team enables high-velocity scaling through tailored tools.
(05:27) Airflow powers Ramp’s enterprise data warehouse integrations for advanced analytics.
(07:55) Centralizing data in Snowflake simplifies storage and analytics pipelines.
(12:08) Machine learning models at Ramp integrate seamlessly with Airflow for operational excellence.
(14:11) Leveraging Airflow datasets eliminates inefficiencies in DAG dependencies.
(17:22) Platforms evolve from solving narrow business problems to scaling organizationally.
(18:55) ClickHouse enhances Ramp’s OLAP capabilities with 100x performance improvements.
(19:47) Ramp’s OLAP platform improves performance by reducing joins and leveraging ClickHouse.
(21:46) Ryan envisions a lighter-weight, more Python-native future for Airflow.
Resources Mentioned:
https://www.linkedin.com/in/ryan-delgado-69544568/
Ramp -
https://www.linkedin.com/company/ramp/
https://airflow.apache.org/
https://www.snowflake.com/
https://clickhouse.com/
dbt -
https://www.getdbt.com/
Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning
What does it take to go from fixing a broken link to becoming a committer for one of the world’s leading open-source projects?
Amogh Desai, Senior Software Engineer at Astronomer, takes us through his journey with Apache Airflow. From small contributions to building meaningful connections in the open-source community, Amogh’s story provides actionable insights for anyone on the cusp of their open-source journey.
Key Takeaways:
(02:09) Building data engineering platforms at Cloudera with Kubernetes.
(04:00) Brainstorming led to contributing to Apache Airflow.
(05:17) Starting small with link fixes, progressing to Breeze development.
(07:00) Becoming a committer for Apache Airflow in September 2023.
(09:51) The steep learning curve for contributing to Airflow.
(16:30) Using GitHub’s “good-first-issue” label to get started.
(18:15) Setting up a development environment with Breeze.
(22:00) Open-source contributions enhance your resume and career.
(24:51) Amogh’s advice: Start small and stay consistent.
(28:12) Engage with the community via Slack, email lists and meetups.
Resources Mentioned:
https://www.linkedin.com/in/amogh-desai-385141157/?originalSubdomain=in
https://www.linkedin.com/company/astronomer/
Apache Airflow GitHub Repository -
https://github.com/apache/airflow
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst
https://github.com/apache/airflow/tree/main/dev/breeze
Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning
Data orchestration and machine learning are shaping how organizations handle massive datasets and drive customer-focused strategies. Tools like Apache Airflow are central to this transformation. In this episode, Vasyl Vasyuta, R&D Team Leader at Optimove, joins us to discuss how his team leverages Airflow to optimize data processing, orchestrate machine learning models and create personalized customer experiences.
Key Takeaways:
(01:59) Optimove tailors marketing notifications with personalized customer journeys.
(04:25) Airflow orchestrates Snowflake procedures for massive datasets.
(05:11) DAGs manage workflows with branching and replay plugins.
(05:41) The "Joystick" plugin enables seamless data replays.
(09:33) Airflow supports MLOps for customer data grouping.
(11:15) Machine learning predicts customer behavior for better campaigns.
(13:20) Thousands of DAGs run every five minutes for data processing.
(15:36) Custom versioning allows rollbacks and gradual rollouts.
(18:00) Airflow logs enhance operational observability.
(23:00) DAG versioning in Airflow 3.0 could boost efficiency.
Resources Mentioned:
https://www.linkedin.com/in/vasyl-vasyuta-3270b54a/
Optimove -
https://www.linkedin.com/company/optimove/
https://airflow.apache.org/
https://www.snowflake.com/
Datadog -
https://www.datadoghq.com/
https://astronomer.typeform.com/airflowsurvey24
Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning
Bridging the gap between data teams and business priorities is essential for maximizing impact and building value-driven workflows. Katie Bauer, Senior Director of Data at GlossGenius, joins us to share her principles for creating effective, aligned data teams. In this episode, Katie draws from her experience at GlossGenius, Reddit and Twitter to highlight the common pitfalls data teams face and how to overcome them. She offers practical strategies for aligning team efforts with organizational goals and fostering collaboration with stakeholders.
Key Takeaways:
(02:36) GlossGenius provides an all-in-one platform for beauty professionals.
(03:59) Airflow orchestrates data and MLOps workflows at GlossGenius.
(04:41) Focusing on value helps data teams achieve greater impact.
(06:23) Aligning team priorities with company goals minimizes friction.
(08:44) Building strong stakeholder relationships requires curiosity.
(12:46) Treating roles as flexible fosters team innovation.
(13:21) Adapting to new technologies improves effectiveness.
(18:28) Acting like your time is valuable earns respect.
(23:38) Proactive data initiatives drive strategic value.
(24:20) Usage data offers critical insights into tool effectiveness.
Resources Mentioned:
https://www.linkedin.com/in/mkatiebauer/
https://www.linkedin.com/company/glossgenius/
https://airflow.apache.org/
dbt -
https://www.getdbt.com/
Cosmos -
https://cosmos.apache.org/
https://astronomer.typeform.com/airflowsurvey24
Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning
Scaling deployments for a billion users demands innovation, precision and resilience. In this episode, we dive into how LinkedIn optimizes its continuous deployment process using Apache Airflow. Rahul Gade, Staff Software Engineer at LinkedIn, shares his insights on building scalable systems and democratizing deployments for over 10,000 engineers.
Rahul discusses the challenges of managing large-scale deployments across 6,000 services and how his team leverages Airflow to enhance efficiency, reliability and user accessibility.
Key Takeaways:
(01:36) LinkedIn minimizes human involvement in production to reduce errors.
(02:00) Airflow powers LinkedIn’s Continuous Deployment platform.
(05:43) Continuous deployment adoption grew from 8% to a targeted 80%.
(11:25) Kubernetes ensures scalability and flexibility for deployments.
(12:04) A custom UI offers real-time deployment transparency.
(16:23) No-code YAML workflows simplify deployment tasks.
(17:18) Canaries and metrics ensure safe deployments across fabrics.
(20:45) A gateway service ensures redundancy across Airflow clusters.
(24:22) Abstractions let engineers focus on development, not logistics.
(25:20) Multi-language support in Airflow 3.0 simplifies adoption.
Resources Mentioned:
https://www.linkedin.com/in/rahul-gade-68666818/
LinkedIn -
https://www.linkedin.com/company/linkedin/
https://airflow.apache.org/
https://kubernetes.io/
https://www.openpolicyagent.org/
https://backstage.io/
https://astronomer.typeform.com/airflowsurvey24
Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning
When data orchestration reaches Uber’s scale, innovation becomes a necessity, not a luxury. In this episode, we discuss the innovations behind Uber’s unique Airflow setup. With our guests Shobhit Shah and Sumit Maheshwari, both Staff Software Engineers at Uber, we explore how their team manages one of the largest data workflow systems in the world. Shobhit and Sumit walk us through the evolution of Uber’s Airflow implementation, detailing the custom solutions that support 200,000 daily pipelines. They discuss Uber's approach to tackling complex challenges in data orchestration, disaster recovery and scaling to meet the company’s extensive data needs.
Key Takeaways:
(02:03) Airflow as a service streamlines Uber’s data workflows.
(06:16) Serialization boosts security and reduces errors.
(10:05) Java-based scheduler improves system reliability.
(13:40) Custom recovery model supports emergency pipeline switching.
(15:58) No-code UI allows easy pipeline creation for non-coders.
(18:12) Backfill feature enables historical data processing.
(22:06) Regular updates keep Uber aligned with Airflow advancements.
(26:07) Plans to leverage Airflow’s latest features.
Resources Mentioned:
https://www.linkedin.com/in/shahshobhit/
https://www.linkedin.com/in/maheshwarisumit/
Uber -
https://www.linkedin.com/company/uber-com/
https://airflow.apache.org/
https://airflowsummit.org/
Uber | Website -
https://www.uber.com/tw/en/
https://astronomer.typeform.com/airflowsurvey24
Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning
Efficient data orchestration is the backbone of modern analytics and AI-driven workflows. Without the right tools, even the best data can fall short of its potential. In this episode, Andrea Bombino, Co-Founder and Head of Analytics Engineering at Astrafy, shares insights into his team’s approach to optimizing data transformation and orchestration using tools like datasets and Pub/Sub to drive real-time processing. Andrea explains how they leverage Apache Airflow and Google Cloud to power dynamic data workflows.
Key Takeaways:
(01:55) Astrafy helps companies manage data using Google Cloud.
(04:36) Airflow is central to Astrafy’s data engineering efforts.
(07:17) Datasets and Pub/Sub are used for real-time workflows.
(09:59) Pub/Sub links multiple Airflow environments.
(12:40) Datasets eliminate the need for constant monitoring.
(15:22) Airflow updates have improved large-scale data operations.
(18:03) New Airflow API features make dataset updates easier.
(20:45) Real-time orchestration speeds up data processing for clients.
(23:26) Pub/Sub enhances flexibility across cloud environments.
(26:08) Future Airflow features will offer more control over data workflows.
Resources Mentioned:
https://www.linkedin.com/in/andrea-bombino/
Astrafy -
https://www.linkedin.com/company/astrafy/
https://airflow.apache.org/
https://cloud.google.com/
dbt -
https://www.getdbt.com/
https://astronomer.typeform.com/airflowsurvey24
Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning
Data orchestration is evolving faster than ever, and Apache Airflow 3 is set to revolutionize how enterprises handle complex workflows. In this episode, we dive into the exciting advancements with Vikram Koka, Chief Strategy Officer at Astronomer and PMC Member at The Apache Software Foundation. Vikram shares his insights on the evolution of Airflow and its pivotal role in shaping modern data-driven workflows, particularly with the upcoming release of Airflow 3.
Key Takeaways:
(02:36) Vikram leads Astronomer’s engineering and open-source teams for Airflow.
(05:26) Airflow enables reliable data ingestion and curation.
(08:17) Enterprises use Airflow for mission-critical data pipelines.
(11:08) Airflow 3 introduces major architectural updates.
(13:58) Multi-cloud and edge deployments are supported in Airflow 3.
(16:49) Event-driven scheduling makes Airflow more dynamic.
(19:40) Tasks in Airflow 3 can run in any language.
(22:30) Multilingual task support is crucial for enterprises.
(25:21) Data assets and event-based integration enhance orchestration.
(28:12) Community feedback plays a vital role in Airflow 3.
Resources Mentioned:
https://www.linkedin.com/in/vikramkoka/
https://www.linkedin.com/company/astronomer/
The Apache Software Foundation LinkedIn -
https://www.linkedin.com/company/the-apache-software-foundation/
https://www.linkedin.com/company/apache-airflow/
https://airflow.apache.org/
https://www.astronomer.io/
The Apache Software Foundation -
https://www.apache.org/
Join the Airflow slack and/or Dev list -
https://airflow.apache.org/community/
https://astronomer.typeform.com/airflowsurvey24
Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning
Data and AI are revolutionizing HR, empowering leaders to measure performance and drive strategic decisions like never before.
In this episode, we explore the transformation of HR technology with Guy Dassa, Chief Technology Officer at 15Five, as he shares insights into their evolving data platform. Guy discusses how 15Five equips HR leaders with tools to measure and take action on team performance, engagement and retention. He explains their data-driven approach, highlighting how Apache Airflow supports their data ingestion, transformation, and AI integration.
Key Takeaways:
(01:54) 15Five acts as a command center for HR leaders.
(03:40) Tools like performance reviews, engagement surveys and an insights dashboard guide actionable HR steps.
(05:33) Data visualization, insights and action recommendations enhance HR effectiveness and improve people outcomes.
(07:08) Strict data confidentiality and sanitized AI model training.
(09:21) Airflow is central to data transformation and enrichment.
(11:15) Airflow enrichment DAGs integrate AI models.
(13:33) Integration of Airflow and dbt enables efficient data transformation.
(15:28) Synchronization challenges arise with reverse ETL processes.
(17:10) Future plans include deeper Airflow integration with AI.
(19:31) Emphasizing the need for DAG versioning and improved dependency visibility.
Resources Mentioned:
https://www.linkedin.com/in/guydassa/
15Five -
https://www.linkedin.com/company/15five/
https://airflow.apache.org/
MLflow -
https://mlflow.org/
dbt -
https://www.getdbt.com/
https://kubernetes.io/
Redshift -
https://aws.amazon.com/redshift/
15Five | Website -
https://www.15five.com/
Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning
Unlocking engineering productivity goes beyond coding — it’s about managing knowledge efficiently. In this episode, we explore the innovative ways in which Dosu leverages Airflow for data orchestration and supports the Airflow project.
Devin Stein, Founder of Dosu, shares his insights on how engineering teams can focus on value-added work by automating knowledge management. Devin dives into Dosu’s purpose, the significance of AI in their product, and why they chose Airflow as the backbone for scheduling and data management.
Key Takeaways:
(01:33) Dosu's mission to democratize engineering knowledge.
(05:00) AI is central to Dosu's product for structuring engineering knowledge.
(06:23) The importance of maintaining up-to-date data for AI effectiveness.
(07:55) How Airflow supports Dosu’s data ingestion and automation processes.
(08:45) The reasoning behind choosing Airflow over other orchestrators.
(11:00) Airflow enables Dosu to manage both traditional ETL and dynamic workflows.
(13:04) Dosu assists the Airflow project by auto-labeling issues and discussions.
(14:56) Thoughtful collaboration with the Airflow community to introduce AI tools.
(16:37) The potential of Airflow to handle more dynamic, scheduled workflows in the future.
(18:00) Challenges and custom solutions for implementing dynamic workflows in Airflow.
Resources Mentioned:
Apache Airflow - https://airflow.apache.org/
Dosu Website - https://dosu.dev/
Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
#AI #Automation #Airflow #MachineLearning