In recent months, Large Language Models (LLMs) have changed the data science landscape and transformed how machine learning technologies are applied. Nearly every data-driven organization is exploring ways to leverage LLMs to augment user-facing applications, create new products, and supercharge internal analytics.
Apache Airflow® has the world’s most active open-source community and is the leading workflow and data orchestration framework. As such, Airflow plays an important role in machine learning operations, and when it comes to LLMs, many organizations are looking to Airflow to operationalize their Retrieval Augmented Generation (RAG) pipelines.
At Astronomer, we have built templates and reference architectures for operationalizing LLM workflows. In this webinar, we showed how to use the Ask Astro LLMOps reference architecture to create your own retrieval augmented LLM application, feeding state-of-the-art models with domain-specific knowledge. You can find the demo repository here, and a write-up of the code and use case here.
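To make the ingestion side of such a pipeline concrete, here is a minimal sketch of what a RAG ingestion DAG can look like with Airflow's TaskFlow API. The helper functions (fetch_documents, split_into_chunks, embed_chunks, upsert_vectors) are hypothetical stand-ins for your document source, embedding model, and vector database; the structure, not the specific tooling, is the point.

```python
# A minimal sketch of a RAG ingestion pipeline as an Airflow DAG.
# fetch_documents, split_into_chunks, embed_chunks, and upsert_vectors are
# hypothetical helpers that would wrap your own document source, embedding
# model, and vector database.
from pendulum import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2023, 10, 1), catchup=False)
def rag_ingestion():
    @task
    def extract() -> list[str]:
        # Pull raw documents (docs pages, forum threads, GitHub issues, ...).
        return fetch_documents()  # hypothetical helper

    @task
    def split(documents: list[str]) -> list[str]:
        # Break documents into retrieval-sized chunks.
        return split_into_chunks(documents, chunk_size=512)  # hypothetical helper

    @task
    def embed_and_load(chunks: list[str]) -> None:
        # Compute embeddings and upsert them into the vector database that
        # the retrieval-augmented application queries at answer time.
        vectors = embed_chunks(chunks)  # hypothetical helper
        upsert_vectors(vectors)  # hypothetical helper

    # Chain the tasks: extract -> split -> embed_and_load.
    embed_and_load(split(extract()))


rag_ingestion()
```

Running this on a schedule keeps the vector database in sync with the underlying knowledge sources, so the LLM application always retrieves from reasonably fresh, domain-specific context.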