Astronomer's The Data Flowcast

How Vibrant Planet's Self-Healing Pipelines Revolutionize Data Processing

Discover the cutting-edge methods Vibrant Planet uses to revolutionize geospatial data processing and resource management.

In this episode, we delve into the intricacies of scaling geospatial data processing and resource allocation with experts from Vibrant Planet. Joining us are Cyrus Dukart, Engineering Lead, and David Sacerdote, Staff Software Engineer, who share their innovative approaches to handling large datasets and optimizing resource use in Airflow.

 

Key Takeaways:

(00:00) Inefficiencies in resource allocation.

(03:00) Scientific validity of sharded results.

(05:53) Tech-based solutions for resource management.

(06:11) Retry callback process for resource allocation.

(08:00) Running database queries for resource needs.

(10:05) Importance of remembering resource usage.

(13:51) Generating resource predictions.

(14:44) Custom task decorator for resource management (see the sketch after this list).

(20:28) Massive resource usage gap in sharded data.

(21:14) Fail-fast model for long-running tasks.
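
For listeners who want a concrete picture of the pattern Cyrus and David discuss, the sketch below shows one way to assemble it with standard Airflow and Kubernetes client APIs: a custom task decorator that sets a predicted memory request, a retry callback that records under-allocation, and an execution timeout for fail-fast behavior. This is a minimal illustration under assumed names and values (`predict_memory_request`, `record_failed_attempt`, the "4Gi" default, the two-hour timeout), not Vibrant Planet's actual implementation.

```python
from datetime import timedelta

from airflow.decorators import task
from kubernetes.client import models as k8s


def predict_memory_request(task_name: str) -> str:
    """Hypothetical stand-in for a lookup against recorded peak usage.

    A real implementation would query a database of past runs and return
    a request sized to the task's historical footprint plus headroom.
    """
    return "4Gi"


def record_failed_attempt(context) -> None:
    """on_retry_callback: note that this try likely ran out of resources.

    Here we only log; a production version would write the attempt and its
    observed usage back to the prediction store.
    """
    ti = context["task_instance"]
    ti.log.warning(
        "Task %s failed on try %s; flagging it for a larger request.",
        ti.task_id,
        ti.try_number,
    )


def pod_override(memory: str) -> dict:
    """Build an executor_config asking KubernetesExecutor for `memory`."""
    return {
        "pod_override": k8s.V1Pod(
            spec=k8s.V1PodSpec(
                containers=[
                    k8s.V1Container(
                        name="base",
                        resources=k8s.V1ResourceRequirements(
                            requests={"memory": memory},
                            limits={"memory": memory},
                        ),
                    )
                ]
            )
        )
    }


def resource_managed_task(task_name: str, **task_kwargs):
    """Custom task decorator: Airflow's @task plus a predicted resource
    request, a retry callback that records under-allocation, and a
    fail-fast timeout."""
    return task(
        task_id=task_name,
        retries=2,
        executor_config=pod_override(predict_memory_request(task_name)),
        on_retry_callback=record_failed_attempt,
        execution_timeout=timedelta(hours=2),  # fail fast instead of churning
        **task_kwargs,
    )


# Usage inside a DAG definition:
# @resource_managed_task("process_shard")
# def process_shard(shard_id: str) -> str:
#     ...
```

In the approach described in the episode, the prediction comes from remembered resource usage in a database; the stub above simply returns a fixed value to keep the sketch self-contained.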

 

Resources Mentioned:

Cyrus Dukart

David Sacerdote

Vibrant Planet

Apache Airflow®

Kubernetes

 

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning


