Player FM - Internet Radio Done Right
39 subscribers
Checked 4d ago
Added seven years ago
Content provided by The Data Flowcast. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by The Data Flowcast or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ro.player.fm/legal.
From Task Failures to Operational Excellence at GumGum with Brendan Frick
Data failures are inevitable, but how you manage them can define the success of your operations. In this episode, we dive deep into the challenges of data engineering and AI with Brendan Frick, Senior Engineering Manager, Data at GumGum. Brendan shares his unique approach to managing task failures and DAG issues in a high-stakes ad-tech environment. He discusses how GumGum leverages Apache Airflow to streamline data processes, ensuring efficient data movement and orchestration while minimizing disruptions in their operations.

Key Takeaways:
- (02:02) Brendan’s role at GumGum and its approach to ad tech.
- (04:27) How GumGum uses Airflow for daily data orchestration, moving data from S3 to warehouses.
- (07:02) Handling task failures in Airflow using Jira for actionable, developer-friendly responses.
- (09:13) Transitioning from email alerts to a more structured system with Jira and PagerDuty.
- (11:40) Monitoring task retry rates as a key metric to identify potential issues early.
- (14:15) Utilizing Looker dashboards to track and analyze task performance and retry rates.
- (16:39) Transitioning from the Kubernetes operator to a more reliable system for data processing.
- (19:25) The importance of automating stakeholder communication with data lineage tools like Atlan.
- (20:48) Implementing data contracts to ensure SLAs are met across all data processes.
- (22:01) The role of scalable SLAs in Airflow to ensure data reliability and meet business needs.

Resources Mentioned:
- Brendan Frick - https://www.linkedin.com/in/brendan-frick-399345107/
- GumGum - https://www.linkedin.com/company/gumgum/
- Apache Airflow - https://airflow.apache.org/
- Jira - https://www.atlassian.com/software/jira
- Atlan - https://atlan.com/
- Kubernetes - https://kubernetes.io/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
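The Jira-based failure handling described in this episode can be sketched in miniature. This is a hedged illustration, not GumGum's actual code: the function and payload field names are assumptions, and a real setup would wire a callback like this into a DAG's `on_failure_callback` and POST the payload to the Jira REST API.

```python
# Illustrative sketch: turn an Airflow-style failure-callback context into
# an actionable, developer-friendly ticket payload. All names here
# (project key, context fields) are hypothetical.

def build_failure_ticket(context):
    """Map a failure-callback context dict to a Jira-like issue payload."""
    ti = context["task_instance"]
    return {
        "project": "DATA",  # hypothetical Jira project key
        "summary": f"Airflow failure: {ti['dag_id']}.{ti['task_id']}",
        "description": (
            f"Run {context['run_id']} failed on try {ti['try_number']}. "
            f"Log: {ti['log_url']}"
        ),
        "labels": ["airflow", ti["dag_id"]],
    }

# Simulated callback context, shaped like the fields Airflow exposes.
context = {
    "run_id": "scheduled__2024-09-01",
    "task_instance": {
        "dag_id": "s3_to_warehouse",
        "task_id": "load_events",
        "try_number": 3,
        "log_url": "https://airflow.example.com/log/123",
    },
}
ticket = build_failure_ticket(context)
print(ticket["summary"])  # Airflow failure: s3_to_warehouse.load_events
```

In a real DAG this would be registered as `DAG(..., on_failure_callback=...)`, with the callback posting the payload to Jira instead of returning it.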
42 episodes
All episodes
The Data Flowcast: Mastering Airflow for Data Engineering & AI

Overcoming Airflow Scaling Challenges at Apollo GraphQL with Jonathan Rainer (43:39)
Scaling a data orchestration platform to manage thousands of tasks daily demands innovative solutions and strategic problem-solving. In this episode, we explore the complexities of scaling Airflow and the challenges of orchestrating thousands of tasks in dynamic data environments. Jonathan Rainer, Staff Software Engineer at Apollo GraphQL, joins us to share his journey optimizing data pipelines, overcoming UI limitations and ensuring DAG consistency in high-stakes scenarios.

Key Takeaways:
- (03:11) Using Airflow to schedule computation in BigQuery.
- (07:02) How DAGs with 8,000+ tasks were managed nightly.
- (08:18) Ensuring accuracy in regulatory reporting for banking.
- (11:35) Handling task inconsistency and DAG failures with automation.
- (16:09) Building a service to resolve DAG consistency issues in Airflow.
- (25:05) Challenges with scaling the Airflow UI for thousands of tasks.
- (27:03) The role of upstream and downstream task management in Airflow.
- (37:33) The importance of operational metrics for monitoring Airflow health.
- (39:19) Balancing new tools with root cause analysis to address scaling issues.
- (41:35) Why scaling solutions require both technical and leadership buy-in.

Resources Mentioned:
- Jonathan Rainer - https://www.linkedin.com/in/jonathan-rainer/
- Apollo GraphQL - https://www.linkedin.com/company/apollo-graphql/
- Apache Airflow - https://airflow.apache.org/
- BigQuery - https://airflow.apache.org/docs/apache-airflow-providers-google/stable/operators/cloud/bigquery.html
- Kubernetes - https://kubernetes.io/

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
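The DAG consistency service discussed here can be illustrated with a small sketch. This is an assumption-laden stand-in, not Apollo GraphQL's service: it simply diffs the tasks a run should contain against the task instances actually recorded, which is the core check such a service would automate.

```python
# Illustrative consistency check: compare declared tasks against the task
# instances observed for a DAG run, and flag the difference.

def diff_dag_run(expected_tasks, observed_tasks):
    """Return which tasks are missing or unexpected for one DAG run."""
    expected, observed = set(expected_tasks), set(observed_tasks)
    return {
        "missing": sorted(expected - observed),     # declared but never scheduled
        "unexpected": sorted(observed - expected),  # scheduled but not declared
        "consistent": expected == observed,
    }

report = diff_dag_run(
    expected_tasks=["extract", "transform", "load", "notify"],
    observed_tasks=["extract", "transform", "notify", "debug_probe"],
)
print(report["missing"])     # ['load']
print(report["unexpected"])  # ['debug_probe']
```

At 8,000+ tasks per DAG, a report like this is what lets automation (rather than a human scanning the UI) decide which runs need repair.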
Orchestrating Analytics and AI Workflows at Telia with Arjun Anandkumar (26:00)
The future of data engineering lies in seamless orchestration and automation. In this episode, Arjun Anandkumar, Data Engineer at Telia, shares how his team uses Airflow to drive analytics and AI workflows. He highlights the challenges of scaling data platforms and how adopting best practices can simplify complex processes for teams across the organization. Arjun also discusses the transformative role of tools like Cosmos and Terraform in enhancing efficiency and collaboration.

Key Takeaways:
- (02:16) Telia operates across the Nordics and Baltics, focusing on telecom and energy services.
- (03:45) Airflow runs dbt models seamlessly with Cosmos on AWS MWAA.
- (05:47) Cosmos improves visibility and orchestration in Airflow.
- (07:00) Medallion Architecture organizes data into bronze, silver and gold layers.
- (08:34) Task group challenges highlight the need for adaptable workflows.
- (15:04) Scaling managed services requires trial, error and tailored tweaks.
- (19:46) Terraform scales infrastructure, while YAML templates manage DAGs efficiently.
- (20:00) Templated DAGs and robust testing enhance platform management.
- (24:15) Open-source resources drive innovation in Airflow practices.

Resources Mentioned:
- Arjun Anandkumar - https://www.linkedin.com/in/arjunanand1/?originalSubdomain=dk
- Telia - https://www.linkedin.com/company/teliacompany/
- Apache Airflow - https://airflow.apache.org/
- Cosmos by Astronomer - https://www.astronomer.io/cosmos/
- Terraform - https://www.terraform.io/
- Medallion Architecture by Databricks - https://www.databricks.com/glossary/medallion-architecture

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
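The bronze/silver/gold layering mentioned above can be shown in a few lines. This is a stdlib-only sketch of the medallion idea with invented example data; in Telia's setup each layer would be a dbt model orchestrated by Airflow via Cosmos, not Python functions.

```python
# Medallion architecture in miniature: raw -> bronze (landed as-is with
# provenance) -> silver (cleaned/filtered) -> gold (business aggregate).

def bronze(raw_rows):
    # Bronze: keep every raw record, tagging where it came from.
    return [{**r, "_source": "events_api"} for r in raw_rows]

def silver(bronze_rows):
    # Silver: cleaned layer — drop rows with no user id.
    return [r for r in bronze_rows if r.get("user_id") is not None]

def gold(silver_rows):
    # Gold: business-level aggregate — events per user.
    counts = {}
    for r in silver_rows:
        counts[r["user_id"]] = counts.get(r["user_id"], 0) + 1
    return counts

raw = [{"user_id": 1}, {"user_id": None}, {"user_id": 1}, {"user_id": 2}]
print(gold(silver(bronze(raw))))  # {1: 2, 2: 1}
```

The point of the layering is that each stage is independently rerunnable, which is exactly what makes it a natural fit for DAG orchestration.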
The Role of Airflow in Finance Transformation at Etraveli Group with Mihir Samant (21:19)
Transforming bottlenecked finance processes into streamlined, automated systems requires the right tools and a forward-thinking approach. In this episode, Mihir Samant, Senior Data Analyst at Etraveli Group, joins us to share how his team leverages Airflow to revolutionize finance automation. With extensive experience in data workflows and a passion for open-source tools, Mihir provides valuable insights into building efficient, scalable systems. We explore the transformative power of Airflow in automating workflows and enhancing data orchestration within the finance domain.

Key Takeaways:
- (02:14) Etraveli Group specializes in selling affordable flight tickets and ancillary services.
- (03:56) Mihir’s finance automation team uses Airflow to tackle month-end bottlenecks.
- (06:00) Airflow's flexibility enables end-to-end automation for finance workflows.
- (07:00) Open-source Airflow tools offer cost-effective solutions for new teams.
- (08:46) Sensors and dynamic DAGs are pivotal features for optimizing tasks.
- (13:30) GitSync simplifies development by syncing environments seamlessly.
- (16:27) Plans include integrating Databricks for more advanced data handling.
- (17:58) Airflow and Databricks offer multiple flexible methods to trigger workflows and execute SQL queries seamlessly.

Resources Mentioned:
- Mihir Samant - https://www.linkedin.com/in/misamant/?originalSubdomain=ca
- Etraveli Group - https://www.linkedin.com/company/etraveli-group/
- Apache Airflow - https://airflow.apache.org/
- Docker - https://www.docker.com/
- Databricks - https://www.databricks.com/

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
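The sensor pattern called out as pivotal here boils down to "poll a condition until it holds or a timeout expires". A minimal stdlib sketch of that idea, with invented example state; Airflow's real sensors add poke/reschedule modes so a waiting task can release its worker slot.

```python
# Sensor pattern in miniature: repeatedly poke a predicate until it is
# true or the deadline passes.
import time

def wait_for(predicate, timeout_s=5.0, poke_interval_s=0.01):
    """Return True once predicate() holds, False if the timeout expires."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(poke_interval_s)
    return False  # timed out — a real sensor would fail or reschedule here

# Simulate an upstream file that "arrives" after a few polls.
state = {"polls": 0}
def file_landed():
    state["polls"] += 1
    return state["polls"] >= 3

print(wait_for(file_landed))  # True
```

Month-end finance workflows are a natural fit: a sensor gates the downstream close process on the arrival of upstream data instead of a fixed clock time.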
Inside Ford’s Data Transformation: Advanced Orchestration Strategies with Vasantha Kosuri-Marshall (38:54)
Data engineering is entering a new era, where orchestration and automation are redefining how large-scale projects operate. This episode features Vasantha Kosuri-Marshall, Data and ML Ops Engineer at Ford Motor Company. Vasantha shares her expertise in managing complex data pipelines. She takes us through Ford's transition to cloud platforms, the adoption of Airflow and the intricate challenges of orchestrating data in a diverse environment.

Key Takeaways:
- (03:10) Vasantha’s transition to the Advanced Driving Assist Systems team at Ford.
- (05:42) Early adoption of Airflow to orchestrate complex data pipelines.
- (09:29) Ford's move from on-premise data solutions to Google Cloud Platform.
- (12:03) The importance of Airflow's scheduling capabilities for efficient data management.
- (16:12) Using Kubernetes to scale Airflow for large-scale data processing.
- (19:59) Vasantha’s experience in overcoming challenges with legacy orchestration tools.
- (22:22) Integration of data engineering and data science pipelines at Ford.
- (28:03) How deferrable operators in Airflow improve performance and save costs.
- (32:12) Vasantha’s insights into tuning Airflow properties for thousands of DAGs.
- (36:09) The significance of monitoring and observability in managing Airflow instances.

Resources Mentioned:
- Vasantha Kosuri-Marshall - https://www.linkedin.com/in/vasantha-kosuri-marshall-0b0aab188/
- Apache Airflow - https://airflow.apache.org/
- Google Cloud Platform (GCP) - https://cloud.google.com/
- Ford Motor Company | LinkedIn - https://www.linkedin.com/company/ford-motor-company/
- Ford Motor Company | Website - https://www.ford.com/
- Astronomer - https://www.astronomer.io/

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
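Why do deferrable operators save cost? While a task waits on an external event it should not occupy a worker slot. The asyncio sketch below is an analogy for that idea, not Airflow's triggerer internals: three "waits" share one thread and overlap, instead of blocking three workers. The task names are invented.

```python
# Analogy for deferral: three waits of 0.1s each complete in ~0.1s total
# because control is yielded while waiting, rather than 0.3s of blocked
# worker time.
import asyncio
import time

async def deferred_wait(name, delay_s):
    await asyncio.sleep(delay_s)  # yields control while "waiting"
    return name

async def main():
    start = time.monotonic()
    done = await asyncio.gather(
        deferred_wait("s3_partition", 0.1),
        deferred_wait("bigquery_job", 0.1),
        deferred_wait("api_export", 0.1),
    )
    return done, time.monotonic() - start

results, elapsed = asyncio.run(main())
print(results)  # ['s3_partition', 'bigquery_job', 'api_export']
```

In Airflow terms, the triggerer process plays the role of this event loop, and a deferred task resumes on a worker only once its trigger fires.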
Powering Finance With Advanced Data Solutions at Ramp with Ryan Delgado (24:35)
Data is the backbone of every modern business, but unlocking its full potential requires the right tools and strategies. In this episode, Ryan Delgado, Director of Engineering at Ramp, joins us to explore how innovative data platforms can transform business operations and fuel growth. He shares insights on integrating Apache Airflow, optimizing data workflows and leveraging analytics to enhance customer experiences.

Key Takeaways:
- (01:52) Data is the lifeblood of Ramp, touching every vertical in the company.
- (03:18) Ramp’s data platform team enables high-velocity scaling through tailored tools.
- (05:27) Airflow powers Ramp’s enterprise data warehouse integrations for advanced analytics.
- (07:55) Centralizing data in Snowflake simplifies storage and analytics pipelines.
- (12:08) Machine learning models at Ramp integrate seamlessly with Airflow for operational excellence.
- (14:11) Leveraging Airflow datasets eliminates inefficiencies in DAG dependencies.
- (17:22) Platforms evolve from solving narrow business problems to scaling organizationally.
- (18:55) ClickHouse enhances Ramp’s OLAP capabilities with 100x performance improvements.
- (19:47) Ramp’s OLAP platform improves performance by reducing joins and leveraging ClickHouse.
- (21:46) Ryan envisions a lighter-weight, more Python-native future for Airflow.

Resources Mentioned:
- Ryan Delgado - https://www.linkedin.com/in/ryan-delgado-69544568/
- Ramp - https://www.linkedin.com/company/ramp/
- Apache Airflow - https://airflow.apache.org/
- Snowflake - https://www.snowflake.com/
- ClickHouse - https://clickhouse.com/
- dbt - https://www.getdbt.com/

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
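The dataset mechanism mentioned at (14:11) replaces cross-DAG timing guesswork with data-aware triggering: a consumer DAG runs once all of the datasets it declares have been refreshed. A stdlib sketch of that semantics with hypothetical dataset URIs and DAG names; in Airflow itself this is `Dataset` objects plus `schedule=[...]` on the consumer.

```python
# Dataset-driven scheduling in miniature: a consumer fires only once all
# of its required datasets have been updated since we started tracking.

required = {
    "revenue_dag": {"s3://events/clean", "s3://billing/raw"},
    "ml_features_dag": {"s3://events/clean"},
}
fresh = set()

def mark_updated(dataset):
    """Record a dataset refresh; return consumers whose inputs are all fresh."""
    fresh.add(dataset)
    return sorted(dag for dag, needs in required.items() if needs <= fresh)

print(mark_updated("s3://events/clean"))  # ['ml_features_dag']
print(mark_updated("s3://billing/raw"))   # ['ml_features_dag', 'revenue_dag']
```

Note the first update triggers only the consumer whose full input set is ready, which is the inefficiency-removal the episode describes: no polling, no hard-coded schedules chained by wall-clock offsets.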
Exploring the Power of Airflow 3 at Astronomer with Amogh Desai (30:24)
What does it take to go from fixing a broken link to becoming a committer for one of the world’s leading open-source projects? Amogh Desai, Senior Software Engineer at Astronomer, takes us through his journey with Apache Airflow. From small contributions to building meaningful connections in the open-source community, Amogh’s story provides actionable insights for anyone on the cusp of their open-source journey.

Key Takeaways:
- (02:09) Building data engineering platforms at Cloudera with Kubernetes.
- (04:00) Brainstorming led to contributing to Apache Airflow.
- (05:17) Starting small with link fixes, progressing to Breeze development.
- (07:00) Becoming a committer for Apache Airflow in September 2023.
- (09:51) The steep learning curve for contributing to Airflow.
- (16:30) Using GitHub’s “good-first-issue” label to get started.
- (18:15) Setting up a development environment with Breeze.
- (22:00) Open-source contributions enhance your resume and career.
- (24:51) Amogh’s advice: Start small and stay consistent.
- (28:12) Engage with the community via Slack, email lists and meetups.

Resources Mentioned:
- Amogh Desai - https://www.linkedin.com/in/amogh-desai-385141157/
- Astronomer - https://www.linkedin.com/company/astronomer/
- Apache Airflow GitHub Repository - https://github.com/apache/airflow
- Contributors Quick Guide - https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst
- Breeze Development Tool - https://github.com/apache/airflow/tree/main/dev/breeze

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
Using Airflow To Power Machine Learning Pipelines at Optimove with Vasyl Vasyuta (24:11)
Data orchestration and machine learning are shaping how organizations handle massive datasets and drive customer-focused strategies. Tools like Apache Airflow are central to this transformation. In this episode, Vasyl Vasyuta, R&D Team Leader at Optimove, joins us to discuss how his team leverages Airflow to optimize data processing, orchestrate machine learning models and create personalized customer experiences.

Key Takeaways:
- (01:59) Optimove tailors marketing notifications with personalized customer journeys.
- (04:25) Airflow orchestrates Snowflake procedures for massive datasets.
- (05:11) DAGs manage workflows with branching and replay plugins.
- (05:41) The "Joystick" plugin enables seamless data replays.
- (09:33) Airflow supports MLOps for customer data grouping.
- (11:15) Machine learning predicts customer behavior for better campaigns.
- (13:20) Thousands of DAGs run every five minutes for data processing.
- (15:36) Custom versioning allows rollbacks and gradual rollouts.
- (18:00) Airflow logs enhance operational observability.
- (23:00) DAG versioning in Airflow 3.0 could boost efficiency.

Resources Mentioned:
- Vasyl Vasyuta - https://www.linkedin.com/in/vasyl-vasyuta-3270b54a/
- Optimove - https://www.linkedin.com/company/optimove/
- Apache Airflow - https://airflow.apache.org/
- Snowflake - https://www.snowflake.com/
- Datadog - https://www.datadoghq.com/
- Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
Maximizing Business Impact Through Data at GlossGenius with Katie Bauer (25:49)
Bridging the gap between data teams and business priorities is essential for maximizing impact and building value-driven workflows. Katie Bauer, Senior Director of Data at GlossGenius, joins us to share her principles for creating effective, aligned data teams. In this episode, Katie draws from her experience at GlossGenius, Reddit and Twitter to highlight the common pitfalls data teams face and how to overcome them. She offers practical strategies for aligning team efforts with organizational goals and fostering collaboration with stakeholders.

Key Takeaways:
- (02:36) GlossGenius provides an all-in-one platform for beauty professionals.
- (03:59) Airflow orchestrates data and MLOps workflows at GlossGenius.
- (04:41) Focusing on value helps data teams achieve greater impact.
- (06:23) Aligning team priorities with company goals minimizes friction.
- (08:44) Building strong stakeholder relationships requires curiosity.
- (12:46) Treating roles as flexible fosters team innovation.
- (13:21) Adapting to new technologies improves effectiveness.
- (18:28) Acting like your time is valuable earns respect.
- (23:38) Proactive data initiatives drive strategic value.
- (24:20) Usage data offers critical insights into tool effectiveness.

Resources Mentioned:
- Katie Bauer - https://www.linkedin.com/in/mkatiebauer/
- GlossGenius - https://www.linkedin.com/company/glossgenius/
- Apache Airflow - https://airflow.apache.org/
- dbt - https://www.getdbt.com/
- Cosmos - https://cosmos.apache.org/
- Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
Optimizing Large-Scale Deployments at LinkedIn with Rahul Gade (27:47)
Scaling deployments for a billion users demands innovation, precision and resilience. In this episode, we dive into how LinkedIn optimizes its continuous deployment process using Apache Airflow. Rahul Gade, Staff Software Engineer at LinkedIn, shares his insights on building scalable systems and democratizing deployments for over 10,000 engineers. Rahul discusses the challenges of managing large-scale deployments across 6,000 services and how his team leverages Airflow to enhance efficiency, reliability and user accessibility.

Key Takeaways:
- (01:36) LinkedIn minimizes human involvement in production to reduce errors.
- (02:00) Airflow powers LinkedIn’s Continuous Deployment platform.
- (05:43) Continuous deployment adoption grew from 8% to a targeted 80%.
- (11:25) Kubernetes ensures scalability and flexibility for deployments.
- (12:04) A custom UI offers real-time deployment transparency.
- (16:23) No-code YAML workflows simplify deployment tasks.
- (17:18) Canaries and metrics ensure safe deployments across fabrics.
- (20:45) A gateway service ensures redundancy across Airflow clusters.
- (24:22) Abstractions let engineers focus on development, not logistics.
- (25:20) Multi-language support in Airflow 3.0 simplifies adoption.

Resources Mentioned:
- Rahul Gade - https://www.linkedin.com/in/rahul-gade-68666818/
- LinkedIn - https://www.linkedin.com/company/linkedin/
- Apache Airflow - https://airflow.apache.org/
- Kubernetes - https://kubernetes.io/
- Open Policy Agent (OPA) - https://www.openpolicyagent.org/
- Backstage - https://backstage.io/
- Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
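The "no-code YAML workflow" idea can be sketched briefly: engineers describe a deployment in a small declarative spec, and the platform expands it into the ordered steps a DAG executes. The spec below is shown as the dict a YAML file would parse into, and every field and fabric name is invented for illustration, not LinkedIn's actual schema.

```python
# Expand a declarative deployment spec into ordered canary-then-promote
# steps per fabric. Field names and fabric names are hypothetical.

spec = {
    "service": "profile-api",
    "fabrics": ["ei", "prod-a", "prod-b"],
    "canary_percent": 5,
}

def expand(spec):
    """Turn a no-code spec into the concrete step list a DAG would run."""
    steps = []
    for fabric in spec["fabrics"]:
        steps.append(
            f"deploy {spec['service']} canary {spec['canary_percent']}% to {fabric}"
        )
        steps.append(f"validate metrics in {fabric}")
        steps.append(f"promote {spec['service']} to 100% in {fabric}")
    return steps

plan = expand(spec)
print(len(plan))  # 9
print(plan[0])    # deploy profile-api canary 5% to ei
```

The appeal is that the 10,000 engineers mentioned above only ever touch the spec; canarying, metric gates and promotion order are owned by the platform.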
How Uber Manages 1 Million Daily Tasks Using Airflow, with Shobhit Shah and Sumit Maheshwari (28:44)
When data orchestration reaches Uber’s scale, innovation becomes a necessity, not a luxury. In this episode, we discuss the innovations behind Uber’s unique Airflow setup. With our guests Shobhit Shah and Sumit Maheshwari, both Staff Software Engineers at Uber, we explore how their team manages one of the largest data workflow systems in the world. Shobhit and Sumit walk us through the evolution of Uber’s Airflow implementation, detailing the custom solutions that support 200,000 daily pipelines. They discuss Uber's approach to tackling complex challenges in data orchestration, disaster recovery and scaling to meet the company’s extensive data needs.

Key Takeaways:
- (02:03) Airflow as a service streamlines Uber’s data workflows.
- (06:16) Serialization boosts security and reduces errors.
- (10:05) Java-based scheduler improves system reliability.
- (13:40) Custom recovery model supports emergency pipeline switching.
- (15:58) No-code UI allows easy pipeline creation for non-coders.
- (18:12) Backfill feature enables historical data processing.
- (22:06) Regular updates keep Uber aligned with Airflow advancements.
- (26:07) Plans to leverage Airflow’s latest features.

Resources Mentioned:
- Shobhit Shah - https://www.linkedin.com/in/shahshobhit/
- Sumit Maheshwari - https://www.linkedin.com/in/maheshwarisumit/
- Uber | LinkedIn - https://www.linkedin.com/company/uber-com/
- Apache Airflow - https://airflow.apache.org/
- Airflow Summit - https://airflowsummit.org/
- Uber | Website - https://www.uber.com/tw/en/
- Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
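The backfill feature mentioned at (18:12) is, at its core, enumerating the historical run dates between a start and an end so each interval can be (re)processed. A stdlib sketch of that enumeration, assuming a daily schedule; Airflow derives the actual intervals from each DAG's schedule rather than hard-coding days.

```python
# Backfill in miniature: list every daily run date in [start, end],
# inclusive, so each historical interval can be reprocessed.
from datetime import date, timedelta

def backfill_dates(start, end):
    """Return each date from start to end inclusive (daily schedule assumed)."""
    days = (end - start).days
    return [start + timedelta(days=i) for i in range(days + 1)]

runs = backfill_dates(date(2024, 1, 1), date(2024, 1, 5))
print(len(runs))           # 5
print(runs[0], runs[-1])   # 2024-01-01 2024-01-05
```

At 200,000 daily pipelines, the hard part is not generating these dates but throttling and ordering the resulting runs, which is what a managed backfill feature adds on top.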
Building Resilient Data Systems for Modern Enterprises at Astrafy with Andrea Bombino (28:29)
Efficient data orchestration is the backbone of modern analytics and AI-driven workflows. Without the right tools, even the best data can fall short of its potential. In this episode, Andrea Bombino, Co-Founder and Head of Analytics Engineering at Astrafy, shares insights into his team’s approach to optimizing data transformation and orchestration using tools like datasets and Pub/Sub to drive real-time processing. Andrea explains how they leverage Apache Airflow and Google Cloud to power dynamic data workflows.

Key Takeaways:
- (01:55) Astrafy helps companies manage data using Google Cloud.
- (04:36) Airflow is central to Astrafy’s data engineering efforts.
- (07:17) Datasets and Pub/Sub are used for real-time workflows.
- (09:59) Pub/Sub links multiple Airflow environments.
- (12:40) Datasets eliminate the need for constant monitoring.
- (15:22) Airflow updates have improved large-scale data operations.
- (18:03) New Airflow API features make dataset updates easier.
- (20:45) Real-time orchestration speeds up data processing for clients.
- (23:26) Pub/Sub enhances flexibility across cloud environments.
- (26:08) Future Airflow features will offer more control over data workflows.

Resources Mentioned:
- Andrea Bombino - https://www.linkedin.com/in/andrea-bombino/
- Astrafy - https://www.linkedin.com/company/astrafy/
- Apache Airflow - https://airflow.apache.org/
- Google Cloud - https://cloud.google.com/
- dbt - https://www.getdbt.com/
- Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
The Data Flowcast: Mastering Airflow for Data Engineering & AI
Inside Airflow 3: Redefining Data Engineering with Vikram Koka 30:08
Data orchestration is evolving faster than ever, and Apache Airflow 3 is set to revolutionize how enterprises handle complex workflows. In this episode, we dive into the exciting advancements with Vikram Koka, Chief Strategy Officer at Astronomer and PMC Member at The Apache Software Foundation. Vikram shares his insights on the evolution of Airflow and its pivotal role in shaping modern data-driven workflows, particularly with the upcoming release of Airflow 3.

Key Takeaways:
(02:36) Vikram leads Astronomer’s engineering and open-source teams for Airflow.
(05:26) Airflow enables reliable data ingestion and curation.
(08:17) Enterprises use Airflow for mission-critical data pipelines.
(11:08) Airflow 3 introduces major architectural updates.
(13:58) Multi-cloud and edge deployments are supported in Airflow 3.
(16:49) Event-driven scheduling makes Airflow more dynamic.
(19:40) Tasks in Airflow 3 can run in any language.
(22:30) Multilingual task support is crucial for enterprises.
(25:21) Data assets and event-based integration enhance orchestration.
(28:12) Community feedback plays a vital role in Airflow 3.

Resources Mentioned:
Vikram Koka - https://www.linkedin.com/in/vikramkoka/
Astronomer LinkedIn - https://www.linkedin.com/company/astronomer/
The Apache Software Foundation LinkedIn - https://www.linkedin.com/company/the-apache-software-foundation/
Apache Airflow LinkedIn - https://www.linkedin.com/company/apache-airflow/
Apache Airflow - https://airflow.apache.org/
Astronomer - https://www.astronomer.io/
The Apache Software Foundation - https://www.apache.org/
Join the Airflow Slack and/or dev list - https://airflow.apache.org/community/
Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
Building a Data-Driven HR Platform at 15Five with Guy Dassa 20:25
Data and AI are revolutionizing HR, empowering leaders to measure performance and drive strategic decisions like never before. In this episode, we explore the transformation of HR technology with Guy Dassa, Chief Technology Officer at 15Five, as he shares insights into their evolving data platform. Guy discusses how 15Five equips HR leaders with tools to measure and act on team performance, engagement and retention. He explains their data-driven approach, highlighting how Apache Airflow supports their data ingestion, transformation and AI integration.

Key Takeaways:
(01:54) 15Five acts as a command center for HR leaders.
(03:40) Tools like performance reviews, engagement surveys and an insights dashboard guide actionable HR steps.
(05:33) Data visualization, insights and action recommendations enhance HR effectiveness and improve people outcomes.
(07:08) Strict data confidentiality and sanitized AI model training protect employee data.
(09:21) Airflow is central to data transformation and enrichment.
(11:15) Airflow enrichment DAGs integrate AI models.
(13:33) Integration of Airflow and dbt enables efficient data transformation.
(15:28) Synchronization challenges arise with reverse ETL processes.
(17:10) Future plans include deeper Airflow integration with AI.
(19:31) Emphasizing the need for DAG versioning and improved dependency visibility.

Resources Mentioned:
Guy Dassa - https://www.linkedin.com/in/guydassa/
15Five LinkedIn - https://www.linkedin.com/company/15five/
Apache Airflow - https://airflow.apache.org/
MLflow - https://mlflow.org/
dbt - https://www.getdbt.com/
Kubernetes - https://kubernetes.io/
Amazon Redshift - https://aws.amazon.com/redshift/
15Five - https://www.15five.com/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
The Intersection of AI and Data Management at Dosu with Devin Stein 20:18
Unlocking engineering productivity goes beyond coding — it’s about managing knowledge efficiently. In this episode, we explore the innovative ways in which Dosu leverages Airflow for data orchestration and supports the Airflow project. Devin Stein, Founder of Dosu, shares his insights on how engineering teams can focus on value-added work by automating knowledge management. Devin dives into Dosu’s purpose, the significance of AI in their product, and why they chose Airflow as the backbone for scheduling and data management.

Key Takeaways:
(01:33) Dosu’s mission to democratize engineering knowledge.
(05:00) AI is central to Dosu’s product for structuring engineering knowledge.
(06:23) The importance of maintaining up-to-date data for AI effectiveness.
(07:55) How Airflow supports Dosu’s data ingestion and automation processes.
(08:45) The reasoning behind choosing Airflow over other orchestrators.
(11:00) Airflow enables Dosu to manage both traditional ETL and dynamic workflows.
(13:04) Dosu assists the Airflow project by auto-labeling issues and discussions.
(14:56) Thoughtful collaboration with the Airflow community to introduce AI tools.
(16:37) The potential of Airflow to handle more dynamic, scheduled workflows in the future.
(18:00) Challenges and custom solutions for implementing dynamic workflows in Airflow.

Resources Mentioned:
Apache Airflow - https://airflow.apache.org/
Dosu Website - https://dosu.dev/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
AI-Powered Vehicle Automation at Ford Motor Company with Serjesh Sharma 26:11
Harnessing data at scale is the key to driving innovation in autonomous vehicle technology. In this episode, we uncover how advanced orchestration tools are transforming machine learning operations in the automotive industry. Serjesh Sharma, Supervisor of ADAS Machine Learning Operations (MLOps) at Ford Motor Company, joins us to discuss the challenges and innovations his team faces working to enhance vehicle safety and automation. Serjesh shares insights into the intricate data processes that support Ford’s Advanced Driver Assistance Systems (ADAS) and how his team leverages Apache Airflow to manage massive data loads efficiently.

Key Takeaways:
(01:44) ADAS involves advanced features like pre-collision assist and self-driving capabilities.
(04:47) Ensuring sensor accuracy and vehicle safety requires extensive data processing.
(05:08) The combination of on-prem and cloud infrastructure optimizes data handling.
(09:27) Ford processes around one petabyte of data per week, using both CPUs and GPUs.
(10:33) Implementing software engineering best practices to improve scalability and reliability.
(15:18) GitHub Issues streamline onboarding and infrastructure provisioning.
(17:00) Airflow’s modular design allows Ford to manage complex data pipelines.
(19:00) Kubernetes pod operators help optimize resource usage for CPU-intensive tasks.
(20:35) Ford’s scale challenges led to customized Airflow configurations for high concurrency.
(21:02) Advanced orchestration tools are pivotal in managing vast data landscapes in automotive innovation.

Resources Mentioned:
Serjesh Sharma - www.linkedin.com/in/serjeshsharma/
Ford Motor Company - www.linkedin.com/company/ford-motor-company/
Apache Airflow - airflow.apache.org/
Kubernetes - kubernetes.io/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
From Task Failures to Operational Excellence at GumGum with Brendan Frick 24:06
Data failures are inevitable, but how you manage them can define the success of your operations. In this episode, we dive deep into the challenges of data engineering and AI with Brendan Frick, Senior Engineering Manager, Data at GumGum. Brendan shares his unique approach to managing task failures and DAG issues in a high-stakes ad-tech environment. Brendan discusses how GumGum leverages Apache Airflow to streamline data processes, ensuring efficient data movement and orchestration while minimizing disruptions in their operations.

Key Takeaways:
(02:02) Brendan’s role at GumGum and its approach to ad tech.
(04:27) How GumGum uses Airflow for daily data orchestration, moving data from S3 to warehouses.
(07:02) Handling task failures in Airflow using Jira for actionable, developer-friendly responses.
(09:13) Transitioning from email alerts to a more structured system with Jira and PagerDuty.
(11:40) Monitoring task retry rates as a key metric to identify potential issues early.
(14:15) Utilizing Looker dashboards to track and analyze task performance and retry rates.
(16:39) Transitioning from the Kubernetes operator to a more reliable system for data processing.
(19:25) The importance of automating stakeholder communication with data lineage tools like Atlan.
(20:48) Implementing data contracts to ensure SLAs are met across all data processes.
(22:01) The role of scalable SLAs in Airflow to ensure data reliability and meet business needs.

Resources Mentioned:
Brendan Frick - https://www.linkedin.com/in/brendan-frick-399345107/
GumGum - https://www.linkedin.com/company/gumgum/
Apache Airflow - https://airflow.apache.org/
Jira - https://www.atlassian.com/software/jira
Atlan - https://atlan.com/
Kubernetes - https://kubernetes.io/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
From Sensors to Datasets: Enhancing Airflow at Astronomer with Maggie Stark and Marion Azoulai 22:25
A 13-point reduction in failure rates — this is how two data scientists at Astronomer revolutionized their data pipelines using Apache Airflow. In this episode, we enter the world of data orchestration and AI with Maggie Stark and Marion Azoulai, both Senior Data Scientists at Astronomer. Maggie and Marion discuss how their team re-architected their use of Airflow to improve scalability, reliability and efficiency in data processing. They share insights on overcoming challenges with sensors and how moving to datasets transformed their workflows.

Key Takeaways:
(02:23) The data team’s role as a centralized hub within Astronomer.
(05:11) Airflow is the backbone of all data processes, running 60,000 tasks daily.
(07:13) Custom task groups enable efficient code reuse and adherence to best practices.
(11:33) Sensor-heavy architectures can lead to cascading failures and resource issues.
(12:09) Switching to datasets has improved reliability and scalability.
(14:19) Building a control DAG provides end-to-end visibility of pipelines.
(16:42) Breaking down DAGs into smaller units minimizes failures and improves management.
(19:02) Failure rates improved from 16% to 3% with the new architecture.

Resources Mentioned:
Maggie Stark - https://www.linkedin.com/in/margaretstark/
Marion Azoulai - https://www.linkedin.com/in/marionazoulai/
Astronomer | LinkedIn - https://www.linkedin.com/company/astronomer/
Apache Airflow - https://airflow.apache.org/
Astronomer | Website - https://www.astronomer.io/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
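The sensors-to-datasets shift described above can be caricatured in a few lines: instead of each consumer polling for upstream completion, a run is released once every dataset it depends on has reported an update. This is an illustrative sketch of the scheduling idea, not Astronomer’s internal code; the DAG IDs and dataset names are invented.

```python
def ready_to_run(required: set[str], updated: set[str]) -> bool:
    """Dataset-style scheduling: a DAG is ready once every upstream
    dataset it depends on has been updated (no polling sensors)."""
    return required <= updated

def release_runs(dag_deps: dict[str, set[str]], updated: set[str]) -> list[str]:
    """Return the DAG IDs whose dataset dependencies are fully satisfied."""
    return [dag_id for dag_id, deps in dag_deps.items()
            if ready_to_run(deps, updated)]
```

A consumer waiting on two datasets stays idle until both have landed, which is why cascading sensor failures disappear: there is nothing running to fail while upstreams are late.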
Mastering Data Orchestration with Airflow at M Science with Ben Tallman 24:36
Mastering the flow of data is essential for driving innovation and efficiency in today’s competitive landscape. In this episode, we explore the evolution of data orchestration and the pivotal role of Apache Airflow in modern data workflows. Ben Tallman, Chief Technology Officer at M Science, joins us and shares his extensive experience with Airflow, detailing its early adoption, evolution and the profound impact it has had on data engineering practices. His insights reveal how leveraging Airflow can streamline complex data processes, enhance observability and ultimately drive business success.

Key Takeaways:
(02:31) Ben’s journey with Airflow and its early adoption.
(05:36) The transition from legacy schedulers to Airflow at Apigee and later Google.
(08:52) The challenges and benefits of running production-grade Airflow instances.
(10:46) How Airflow facilitates the management of large-scale data at M Science.
(11:56) The importance of reducing time to value for customers using data products.
(13:32) Airflow’s role in ensuring observability and reliability in data workflows.
(17:00) Managing petabytes of data and billions of records efficiently.
(19:08) Integration of various data sources and ensuring data product quality.
(20:04) Leveraging Airflow for data observability and reducing time to value.
(22:04) Ben’s vision for the future development of Airflow, including audit trails for variables.

Resources Mentioned:
Ben Tallman - https://www.linkedin.com/in/btallman/
M Science - https://www.linkedin.com/company/m-science-llc/
Apache Airflow - https://airflow.apache.org/
Astronomer - https://www.astronomer.io/
Databricks - https://databricks.com/
Snowflake - https://www.snowflake.com/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
Welcome to The Data Flowcast: Mastering Airflow for Data Engineering & AI — the podcast where we keep you up to date with insights and ideas propelling the Airflow community forward. Join us each week as we explore the current state, future and potential of Airflow with leading thinkers in the community, and discover how best to leverage this workflow management system to meet the ever-evolving needs of data engineering and AI ecosystems. #AI #Automation #Airflow #MachineLearning…
Enhancing Business Metrics With Airflow at Artlist with Hannan Kravitz 23:51
Data orchestration is revolutionizing the way companies manage and process data. In this episode, we explore the critical role of data orchestration in modern data workflows and how Apache Airflow is used to enhance data processing and AI model deployment. Hannan Kravitz, Data Engineering Team Leader at Artlist, joins us to share his insights on leveraging Airflow for data engineering and its impact on their business operations.

Key Takeaways:
(01:00) Hannan introduces Artlist and its mission to empower content creators.
(04:27) The importance of collecting and modeling data to support business insights.
(06:40) Using Airflow to connect multiple data sources and create dashboards.
(09:40) Implementing a monitoring DAG for proactive alerts within Airflow.
(12:31) Customizing Airflow for business-metric KPI monitoring and setting thresholds.
(15:00) Addressing decreases in purchases due to technical issues with proactive alerts.
(17:45) Customizing data quality checks with dynamic task mapping in Airflow.
(20:00) Desired improvements in Airflow UI and logging capabilities.
(21:00) Enabling business stakeholders to change thresholds using Streamlit.
(22:26) Future improvements desired in the Airflow project.

Resources Mentioned:
Hannan Kravitz - https://www.linkedin.com/in/hannan-kravitz-60563112/
Artlist - https://www.linkedin.com/company/art-list/
Apache Airflow - https://airflow.apache.org/
Snowflake - https://www.snowflake.com/
Streamlit - https://streamlit.io/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
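The kind of business-metric KPI check a monitoring DAG fans out (for example via Airflow’s dynamic task mapping) reduces to comparing observed values against stakeholder-set thresholds. A minimal sketch, assuming invented KPI names and threshold values — not Artlist’s actual metrics:

```python
# Illustrative thresholds; in the setup described, stakeholders could
# edit these through a Streamlit app backed by a shared store.
THRESHOLDS = {"purchases": 1000, "signups": 200}

def check_kpis(observed: dict[str, int]) -> list[str]:
    """Return one alert message per KPI that fell below its threshold.

    Each (kpi, floor) pair could be one mapped task instance in a
    monitoring DAG, so failures and retries are isolated per metric.
    """
    alerts = []
    for kpi, floor in THRESHOLDS.items():
        value = observed.get(kpi, 0)
        if value < floor:
            alerts.append(f"{kpi} below threshold: {value} < {floor}")
    return alerts
```

An empty list means no proactive alert is raised for that run.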
Cutting-Edge Data Engineering at Teya with Alexandre Magno Lima Martins 23:46
Data engineering is constantly evolving, and staying ahead means mastering tools like Apache Airflow. In this episode, we explore the world of data engineering with Alexandre Magno Lima Martins, Senior Data Engineer at Teya. Alexandre talks about optimizing data workflows and the smart solutions they’ve created at Teya to make data processing easier and more efficient.

Key Takeaways:
(02:01) Alexandre explains his role at Teya and the responsibilities of a data platform engineer.
(02:40) The primary use cases of Airflow at Teya, especially with dbt and machine learning projects.
(04:14) How Teya creates self-service DAGs for dbt models.
(05:58) Automating DAG creation with CI/CD pipelines.
(09:04) Switching to a multi-file method for better Airflow performance.
(12:48) Challenges faced with the Kubernetes Executor vs. the Celery Executor.
(16:13) Using the Celery Executor to handle fast tasks efficiently.
(17:02) Implementing the KEDA autoscaler for better scaling of Celery workers.
(19:05) Reasons for not using Cosmos for DAG generation and cross-DAG dependencies.
(21:16) Alexandre’s wish list for future Airflow features, focusing on multi-tenancy.

Resources Mentioned:
Alexandre Magno Lima Martins - https://www.linkedin.com/in/alex-magno/
Teya - https://www.linkedin.com/company/teya-global/
Apache Airflow - https://airflow.apache.org/
dbt - https://www.getdbt.com/
Kubernetes - https://kubernetes.io/
KEDA - https://keda.sh/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
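Autoscaling Celery workers with KEDA, as mentioned at (17:02), comes down to sizing the worker fleet from queue depth. The rule below is a toy version of that idea — the parameter names and numbers are illustrative, not Teya’s configuration, and in practice KEDA evaluates a queue-length query and adjusts a Kubernetes deployment for you:

```python
import math

def desired_workers(queued_tasks: int, tasks_per_worker: int,
                    max_workers: int) -> int:
    """KEDA-style scaling rule: enough workers to cover the queue,
    capped at a maximum, scaled to zero when the queue is empty."""
    if queued_tasks <= 0:
        return 0
    return min(max_workers, math.ceil(queued_tasks / tasks_per_worker))
```

With a target of 16 tasks per worker, 100 queued tasks would call for 7 workers; a burst of 1,000 tasks hits the cap instead of scaling unboundedly.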
Airflow Strategies for Business Efficiency at Campbell with Larry Komenda 26:10
Managing data workflows well can change the game for any company. In this episode, we talk about how Airflow makes this possible. Larry Komenda, Chief Technology Officer at Campbell, shares how Airflow supports their operations and improves efficiency. Larry discusses his role at Campbell, their switch to Airflow and its impact. We look at their strategies for testing and maintaining reliable workflows and how these help their business.

Key Takeaways:
(02:26) Strong technology and data systems are crucial for Campbell’s investment process.
(05:03) Airflow manages data pipelines efficiently in the market data team.
(07:39) Airflow supports various departments, including trading and operations.
(09:22) Machine learning models run on dedicated Airflow instances.
(11:12) Reliable workflows are ensured through thorough testing and development.
(13:45) Business tasks are organized separately from Airflow for easier testing.
(15:30) Non-technical teams have access to Airflow for better efficiency.
(17:20) Thorough testing before deploying to Airflow is essential.
(19:10) Non-technical users can interact with Airflow DAGs to solve their issues.
(21:55) Airflow improves efficiency and reliability in trading and operations.
(24:40) Enhancing the Airflow UI for non-technical users is important for accessibility.

Resources Mentioned:
Larry Komenda - https://www.linkedin.com/in/larrykomenda/
Campbell - https://www.linkedin.com/company/campbell-and-company/
30% off Airflow Summit Ticket - https://ti.to/airflowsummit/2024/discount/30DISC_ASTRONOMER
Apache Airflow - https://airflow.apache.org/
NumPy - https://numpy.org/
Python - https://www.python.org/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
How Laurel Uses Airflow To Enhance Machine Learning Pipelines with Vincent La and Jim Howard 23:58
The world of timekeeping for knowledge workers is transforming through the use of AI and machine learning. Understanding how to leverage these technologies is crucial for improving efficiency and productivity. In this episode, we’re joined by Vincent La, Principal Data Scientist at Laurel, and Jim Howard, Principal Machine Learning Engineer at Laurel, to explore the implementation of AI in automating timekeeping and its impact on legal and accounting firms.

Key Takeaways:
(01:54) Laurel’s mission in time automation.
(03:39) Solving clustering, prediction and summarization with AI.
(06:30) Daily batch jobs for user time generation.
(08:22) Knowledge workers touch 300 items daily.
(09:01) Mapping 300 activities to seven billable items.
(11:38) Retraining models for better performance.
(14:00) Using Airflow for retraining and backfills.
(17:06) RAG-based summarization for user-specific tone.
(18:58) Testing Airflow DAGs for cost-effective summarization.
(22:00) Enhancing Airflow for long-running DAGs.

Resources Mentioned:
Vincent La - https://www.linkedin.com/in/vincentla/
Jim Howard - https://www.linkedin.com/in/jameswhowardml/
Laurel - https://www.linkedin.com/company/laurel-ai/
Apache Airflow - https://airflow.apache.org/
Ernst & Young - https://www.ey.com/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
How Vibrant Planet’s Self-Healing Pipelines Revolutionize Data Processing 23:51
Discover the cutting-edge methods Vibrant Planet uses to revolutionize geospatial data processing and resource management. In this episode, we delve into the intricacies of scaling geospatial data processing and resource allocation with experts from Vibrant Planet. Joining us are Cyrus Dukart, Engineering Lead, and David Sacerdote, Staff Software Engineer, who share their innovative approaches to handling large datasets and optimizing resource use in Airflow.

Key Takeaways:
(00:00) Inefficiencies in resource allocation.
(03:00) Scientific validity of sharded results.
(05:53) Tech-based solutions for resource management.
(06:11) Retry callback process for resource allocation.
(08:00) Running database queries for resource needs.
(10:05) Importance of remembering resource usage.
(13:51) Generating resource predictions.
(14:44) Custom task decorator for resource management.
(20:28) Massive resource usage gap in sharded data.
(21:14) Fail-fast model for long-running tasks.

Resources Mentioned:
Cyrus Dukart - https://www.linkedin.com/in/cyrus-dukart-6561482/
David Sacerdote - https://www.linkedin.com/in/davidsacerdote/
Vibrant Planet LinkedIn - https://www.linkedin.com/company/vibrant-planet/
Apache Airflow - https://airflow.apache.org/
Kubernetes - https://kubernetes.io/
Vibrant Planet - https://vibrantplanet.net/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
The Future of AI in Data Engineering With Astronomer’s Julian LaNeve and David Xue 23:36
The world of data orchestration and machine learning is rapidly evolving, and tools like Apache Airflow are at the forefront of these changes. Understanding how to effectively utilize these tools can significantly enhance data processing and AI model deployment. This episode features Julian LaNeve, CTO at Astronomer, and David Xue, Machine Learning Engineer at Astronomer. They delve into the intricacies of data orchestration, generative AI and the practical applications of these technologies in modern data workflows.

Key Takeaways:
(01:51) The pressure to engage in the generative AI space.
(02:02) Generative AI can elevate data utilization to the next level.
(02:43) The transparency issues with commercial AI models.
(04:27) High-quality data is crucial to model performance.
(06:40) Running new models on smaller devices, like phones.
(12:19) Fine-tuning LLMs to handle millions of task failures.
(16:54) Teaching AI to understand specific logs, not general passages, is a goal.
(21:56) Using Airflow as a general-purpose orchestration tool.
(22:00) Airflow is adaptable for various use cases, including ETL and ML systems.

Resources Mentioned:
Julian LaNeve - https://www.linkedin.com/in/julianlaneve/
Astronomer - https://www.linkedin.com/company/astronomer/
David Xue - https://www.linkedin.com/in/david-xue-uva/
Apache Airflow - https://airflow.apache.org/
Meta’s open-source Llama 3 model - https://ai.meta.com/blog/meta-llama-3/
Microsoft’s Phi-3 model - https://www.microsoft.com/en-us/research/publication/phi-3-technical-report-a-highly-capable-language-model-locally-on-your-phone/
GPT-4 - https://www.openai.com/research/gpt-4

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
The Power of Airflow in Modern Data Environments at Wynn Las Vegas with Siva Krishna Yetukuri 24:31
Understanding the critical role of data integration and management is essential for driving business success, particularly in a dynamic environment like a luxury casino resort. In this episode, we sit down with Siva Krishna Yetukuri, Cloud Data Architect at Wynn Las Vegas, to explore how Airflow and other tools are transforming data workflows and customer experiences at Wynn Las Vegas.

Key Takeaways:
(02:00) Siva designs and builds cutting-edge data pipelines and architectures.
(02:54) Wynn is building a data platform to drive surveys and marketing strategies.
(05:00) Airflow is the backbone of data ingestion, curation and integration.
(07:00) Custom operators in Airflow enhance monitoring and reporting.
(08:32) A metadata database drives Airflow workflows and captures metrics.
(09:00) Excitement surrounds the use of Airflow 2.9 and its new features.
(12:31) Understanding Airflow fundamentals in layman’s terms simplifies complexity.
(16:33) Transitioning from Control-M to Airflow eases building complex workflows.
(20:15) DAGs are often auto-generated, simplifying the process for engineers.
(24:06) ML models for volume and freshness anomalies improve data quality.

Resources Mentioned:
Apache Airflow - https://airflow.apache.org/
Snowflake - https://www.snowflake.com/
Databricks - https://databricks.com/
Great Expectations - https://greatexpectations.io/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning…
The Data Flowcast: Mastering Airflow for Data Engineering & AI
Powering the Texas Rangers World Series Win With AI on Airflow with Alexander Booth (23:38)
The integration of data and AI in sports is transforming how teams strategize and perform. Understanding how to harness this technology is key to staying competitive in the rapidly evolving landscape of baseball. In this episode, we sit down with Alexander Booth, Assistant Director of Research and Development at Texas Rangers Baseball Club, to explore the intersection of big data, AI and baseball strategy.

Key Takeaways:
(03:00) Alexander Booth's role and responsibilities at the Texas Rangers.
(03:33) The implementation of multiple cameras and pose tracking in stadiums.
(06:16) The importance of Airflow in organizing data orchestrations.
(06:22) The demand for faster data among modern baseball players.
(11:01) The necessity of scalable solutions for handling large data sets.
(15:00) How weather data influences game strategy.
(15:46) The impact of advanced technology on decision-making in baseball.
(18:00) The role of AI and machine learning in player and game analysis.
(22:26) The use of dynamic tasks in Airflow for better data management.

Resources Mentioned:
Apache Airflow - https://airflow.apache.org/
Statcast - https://www.mlb.com/statcast
Google BigQuery - https://cloud.google.com/bigquery/
Databricks - https://databricks.com/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#ai #automation #airflow #machinelearning
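The dynamic-tasks takeaway refers to Airflow's dynamic task mapping, where one task fans out over an upstream list discovered at runtime. A dependency-free sketch of that fan-out shape — the file names are invented, and plain functions stand in for `@task`-decorated callables so the snippet runs without an Airflow install:

```python
# Hypothetical sketch of the fan-out behind Airflow's dynamic task
# mapping (task.expand() in Airflow 2.3+), modeled with plain functions.

def list_game_files():
    # Stand-in for an upstream task that discovers per-game tracking
    # files; in a real deployment this might list a cloud bucket.
    return ["game_001.parquet", "game_002.parquet", "game_003.parquet"]

def process(file_name):
    # Stand-in for the mapped task body that runs once per file.
    return f"processed:{file_name}"

# Airflow would create one task instance per element at runtime;
# the equivalent eager computation is a simple comprehension.
results = [process(f) for f in list_game_files()]
print(results)
```

The point of the pattern is that the number of parallel task instances tracks the data (e.g., games played that day) instead of being fixed in the DAG file.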
Expanding the Data Engineering Toolkit at Reddit (45:48)
Welcome back to the Airflow Podcast. This week, we met up with Ben Wisegarver, a staff data scientist at Reddit who runs their data warehousing and data engineering functions. Reddit users generate petabytes of data every day that need to be processed, stored and analyzed by a wide breadth of backend services. Our conversation with Ben touches on everything from Airflow as a tool for career mobility across the data stack to scaling out a self-service data architecture across many teams.

For folks interested, our team at Astronomer is growing rapidly and we're on the hunt for new folks to join in a variety of different roles. If you're passionate about Airflow and interested in building the future of data engineering, please get in touch. You can check our current job postings at careers.astronomer.io, but we're constantly updating our listings to accommodate new hiring needs. Please feel free to email me directly at pete@astronomer.io if you think you'd be a good addition to the team.

Mentioned Resources:
Careers: https://careers.astronomer.io

Guest Profile:
Ben Wisegarver: https://www.linkedin.com/in/ben-wisegarver-54566576
GDPR, Self-Service Data, and Infrastructure Automation with Typeform (31:26)
Welcome back to the Airflow Podcast. This week, we met up with Albert Franzi and Carlos Escura from Typeform. Typeform is a tool that allows you to build beautiful interactive forms for a wide variety of use cases, including customer surveys, employee engagement, product feedback, and market research, to name a few. In our conversation, we discussed Airflow as a tool for GDPR compliance, the concept of self-service data and how it allows your data operations team to function as a data platform team, and some of the more specialized infrastructure tooling that the Typeform team has built out to support their internal teams.

For folks interested, our team at Astronomer is growing rapidly and we're on the hunt for new folks to join in a variety of different roles. If you're passionate about Airflow and interested in building the future of data engineering, please get in touch. You can check our current job postings at careers.astronomer.io, but we're constantly updating our listings to accommodate new hiring needs. Please feel free to email me directly at pete@astronomer.io if you think you'd be a good addition to the team.

Mentioned Resources:
Dag Factory: https://github.com/ajbosco/dag-factory
Astronomer Careers: https://careers.astronomer.io

Guest Profiles:
Albert Franzi: https://www.linkedin.com/in/albertfranzi/?originalSubdomain=es
Carlos Escura: https://www.linkedin.com/in/carlosescura/en-us/
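dag-factory, linked above, builds Airflow DAGs from YAML config rather than hand-written Python. A minimal, hypothetical sketch of the same config-driven idea using only the standard library — the task names and the GDPR scrubbing step are illustrative, and a plain dict stands in for the YAML file:

```python
# Config-driven pipeline definition in miniature: a dict maps each
# task id to its upstream dependencies, the same shape a dag-factory
# YAML file encodes. graphlib (stdlib, Python 3.9+) gives us a valid
# execution order.
from graphlib import TopologicalSorter

CONFIG = {  # stand-in for a dag-factory style YAML config
    "extract": [],
    "anonymize": ["extract"],  # illustrative GDPR scrubbing step
    "load": ["anonymize"],
}

def task_order(config):
    """Return the tasks in a dependency-respecting execution order."""
    return list(TopologicalSorter(config).static_order())

print(task_order(CONFIG))
```

The design choice discussed in the episode follows from this shape: when pipelines are declared as config, internal teams can self-serve new workflows without touching orchestration code.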
After a bit of a break, we're back with the third official episode bundle of The Airflow Podcast. In this batch, we'll go a little deeper with current Airflow users and maintainers on core fundamental concepts in data engineering, architectures for operating modern data platforms at scale, and the process of maintaining and operating Airflow, specifically as we go through the release process of Airflow 2.0.

This week, we met up with Brian de la Motte and Florian Hines at Netlify. Netlify provides an extremely popular toolset for building and deploying JAMstack sites. They provide hosting services, CI, DNS, authentication, and managed backend tools that help users run and operate static sites at scale. The team there recently adopted Airflow to help decouple orchestration logic from a complex collection of Spark jobs and is currently expanding its Airflow footprint to accommodate a broader set of interesting use cases.

Disclaimer: we get a bit of a surprise about halfway through the episode when Brian tells us that they had recently signed up for Astronomer; we promise it wasn't a planted ad :). Please contact pete@astronomer.io if you'd like to get in touch regarding future episodes. Hope you enjoy!

Guest Profiles:
Brian de la Motte: https://www.linkedin.com/in/brian-de-la-motte/
Florian Hines: https://www.linkedin.com/in/florianhines/