Talk

Ever thought about building your very own question-answering system? Like the ones that power Siri, Alexa, or Google Assistant? Well, we've got something awesome lined up for you!
In our hands-on workshop, we'll guide you through the ins and outs of creating a question-answering system. We prefer Python for the workshop and have prepared a GUI that works with Python. If you prefer another language, you can still do the workshop, but you'll miss out on the GUI for testing your application. You'll get your hands dirty with vector stores and Large Language Models, and we'll help you combine the two in a way you've never done before.
You've probably used search engines for keyword-based searches, right? Well, prepare to have your mind blown. We'll dive into something called semantic search, the next big step beyond traditional keyword search. It’s like moving from typing "best pizza places" into Google to asking "Where can I find a pizza place that my gluten-intolerant, vegan friend would love?" – you get the idea, right?
We’ll be teaching you how to build an entire pipeline: collecting data from various sources, converting it into vectors (yeah, it’s more math, but it’s cool, we promise), and storing those vectors so you can use them to answer all sorts of queries. It's like building your own mini Google!
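To give you a taste of what you'll be building, here is a rough sketch of such a pipeline. It assumes one possible combination of building blocks (LangChain with its 2023-era 0.0.x imports, OpenAI for embeddings and the LLM, and OpenSearch as the vector store); the file path, URL, and index name are placeholders, and this is not the exact code from our workshop repository.
```python
# A minimal sketch of a question-answering pipeline, assuming LangChain (0.0.x imports),
# OpenAI for embeddings and the LLM, and OpenSearch as the vector store.
# Paths, URLs, and index names are placeholders, not the workshop repository code.
from langchain.document_loaders import TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import OpenSearchVectorSearch
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA

# 1. Collect data and cut it into chunks small enough to embed.
documents = TextLoader("data/my_source.txt").load()
splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
chunks = splitter.split_documents(documents)

# 2. Convert the chunks into vectors and store them in the vector store.
store = OpenSearchVectorSearch.from_documents(
    chunks,
    OpenAIEmbeddings(),  # reads OPENAI_API_KEY from the environment
    opensearch_url="http://localhost:9200",
    index_name="workshop-qa",
)

# 3. Answer a question: retrieve the most relevant chunks and let the LLM write the answer.
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    chain_type="stuff",  # stuff the retrieved chunks straight into the prompt
    retriever=store.as_retriever(),
)
print(qa.run("Where can I find a pizza place my gluten-intolerant, vegan friend would love?"))
```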
We've got a repository ready to help you set up everything you need on your laptop. By the end of the workshop, you'll have your own question-answering system up and running.
So, why wait? Grab your laptop, bring your coding hat, and let's start building something fantastic together. Trust us, it’s going to be a blast!
Some of the highlights of the workshop:
  • Use a vector store (OpenSearch, Elasticsearch, Weaviate)
  • Use a Large Language Model (OpenAI, HuggingFace, Cohere, PaLM, Bedrock)
  • Use a tool for content extraction (Unstructured, Llama)
  • Create your pipeline (Langchain, Custom)
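To show how interchangeable these building blocks are, here is a second rough sketch of the same pipeline wired up with a different mix: Unstructured for content extraction, a HuggingFace sentence-transformer for embeddings, Elasticsearch as the vector store, and Cohere as the LLM. Again, it assumes LangChain's 2023-era integrations, and the paths, URLs, and model names are just placeholders.
```python
# The same pipeline shape with different building blocks: Unstructured for content
# extraction, a HuggingFace sentence-transformer for embeddings, Elasticsearch as the
# vector store, and Cohere as the LLM. Again assuming LangChain's 0.0.x integrations;
# paths, URLs, and model names are placeholders.
from langchain.document_loaders import UnstructuredFileLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import ElasticVectorSearch
from langchain.llms import Cohere
from langchain.chains import RetrievalQA

# Content extraction: pull the text out of a PDF (or Word, HTML, ...) with Unstructured.
documents = UnstructuredFileLoader("data/annual_report.pdf").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(documents)

# Embed locally with sentence-transformers and store the vectors in Elasticsearch.
store = ElasticVectorSearch.from_documents(
    chunks,
    HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2"),
    elasticsearch_url="http://localhost:9200",
    index_name="workshop-qa",
)

# Ask questions against the same retrieve-then-generate setup, now with Cohere.
qa = RetrievalQA.from_chain_type(
    llm=Cohere(),  # reads COHERE_API_KEY from the environment
    retriever=store.as_retriever(),
)
print(qa.run("Which topics does the annual report cover?"))
```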
Jettro Coenradie
Luminis
Jettro is a software architect, search relevance geek, and data enthusiast who loves to talk about his job, hobbies, and other things that inspire people. Jettro truly believes in the Luminis mantra that the only thing that grows by sharing is knowledge. After more than ten years of creating the best search engines for multiple customers, Jettro has been drawn to Machine Learning and Natural Language Processing. Learning and talking about NLP is what drives him to keep improving the user experience of search engines.
Daniël Spee
Luminis
Daniël is a seasoned Search Engineer at Luminis, where his passion for technology is manifested in creating and enhancing search solutions. He possesses a unique understanding of the field, believing that an excellent search experience transcends the pure technology underpinning it. His focus on the user experience, along with his innovative approach to problem-solving, enables him to develop search systems that are functional, efficient, and exceptionally user-friendly.
His proficiency in search engineering is matched by his excellent communication skills and team spirit. Daniël excels in translating complex concepts into easily understandable language, enabling effective collaboration with diverse teams and clients. His dedication to fostering a positive, productive work environment is as instrumental to his success as his technical expertise.
Staying ahead of the curve in the rapidly evolving field of search technology, Daniël consistently adopts the latest techniques to provide superior search solutions. His dedication, skills, and unique approach make him a crucial member of the Luminis team, contributing significantly to the field of search engineering.