Today's Workshop Zoom Invite + Details for Thursday's Workshop

Hi everyone,
Here is the Zoom Info for today's Hackathon Welcome workshop at 2pm PT:
Join Zoom Meeting: https://docker.zoom.us/j/92285401100 (Find your local number: https://docker.zoom.us/u/abB7zCnV7H)
We'll be hosting two workshops each week! Join us Thursday at 11am PT for a one-hour workshop:
Building a Petabyte-Scale Vector Store: Powering Future AGI, hosted by DataStax
A laptop proof of concept won’t cut it for the impending era of generative AI. Let’s dig into the mechanics of building and using a petabyte-scale vector store and the future of handling data in generative AI models. This talk will focus on work in the Apache Cassandra® project to develop a vector store capable of handling petabytes of data, and why that capacity is critical for future AI applications. We’ll also connect this to the exciting new generation of AI techniques, such as large language models (LLMs), Retrieval-Augmented Generation (RAG), and Forward-Looking Active Retrieval Augmented Generation (FLARE), all of which contribute to the growing need for scalable solutions. Finally, we’ll discuss the importance of planning for future scalability and how to effectively manage AI agents in this new age of data.
Key Takeaways:
- Understand the future of generative AI and why laptop-scale proofs of concept will soon be obsolete.
- Learn how Apache Cassandra® enables a petabyte-scale vector store for AI applications.
- Take a deep dive into vector-powered AI techniques such as LLMs, RAG, and FLARE.
- See how AI agents can leverage scalable vector stores for better decision-making.
- Plan for and manage future growth in AI applications, avoiding painful migrations later.
- Explore use cases with frameworks like LangChain, LlamaIndex, and CassIO.
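To give a rough sense of what the retrieval step in a RAG pipeline does under the hood, here is a minimal, illustrative sketch of similarity search over a toy in-memory vector store. This is not DataStax, Cassandra, or CassIO code; the document IDs, vectors, and the `retrieve` helper are hypothetical, and a real petabyte-scale store would use approximate nearest-neighbor indexes rather than a brute-force scan:

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, store, k=2):
    # Score every stored (doc_id, vector) pair against the query,
    # then return the IDs of the k most similar documents.
    scored = [(doc_id, cosine_similarity(query_vec, vec))
              for doc_id, vec in store.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

# Toy store: in practice these would be embeddings from an LLM embedding model.
store = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.0, 1.0, 0.0],
    "doc_c": [0.9, 0.1, 0.0],
}

print(retrieve([1.0, 0.0, 0.0], store))  # ['doc_a', 'doc_c']
```

In a RAG application, the retrieved documents are then stuffed into the LLM's prompt as context, which is why retrieval quality and store scalability directly shape answer quality.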
This session provides you with the insights necessary to plan and scale your AI applications effectively, helping you start on the right foot today to reap the benefits tomorrow.
