H2O GenAI World Conference, San Francisco

Building Custom GenAI Apps at H2O | Michelle Tanco


Speaker Bio

As Head of Product at H2O.ai, Michelle Tanco focuses primarily on delivering a seamless user experience across machine learning applications. With a strong dedication to math and computer science, she is enthusiastic about leveraging these disciplines to address real-world challenges. Before joining H2O, she served as a Senior Data Science Consultant at Teradata, working on high-impact analytics projects to tackle business issues across various industries.

Michelle holds a B.A. in Mathematics and Computer Science from Ursinus College. In her downtime, she enjoys spending quality time with her family and expressing her creativity by playing the bass and ukulele.

Read the Full Transcript


Michelle Tanco


All right, chairs are gone. Hi, everyone. My name is Michelle Tanco. I'm going to talk to you today about our newly released AI app store. But first, I want to set the stage on why we're talking about this and also how it's not really that new. 



We've been doing the app store for a while. We're really excited to help you build and own your own GenAI apps, and to tell you why. So everyone right now is using generative AI, both at work for business use cases and at home in their personal lives.



So I want to set the stage and just talk about why we might want custom GenAI apps. So here's something that I have been doing for months. I don't personally like meal planning very much. It is hard to make sure we eat enough vegetables. 



It's hard not to spend a lot of money on takeout, because you have to make dinner. So you can write a little prompt to your favorite LLM. Here is H2O's open-source h2oGPT on top of Llama 2 70B Chat. And it will write you a nice meal plan.



And so then what I could do is take the time to find the right prompt for me and test and see what works and what doesn't. And I'm pretty technical, but anyone could do this. But it does take a lot of time. 



And so what we found is a lot of people, you might have seen this from OpenAI yesterday, too, they maybe have a series of prompts on their computers and they're copying and pasting them in, and they have these custom use cases. 



And why do that, when you could have a custom UI built for your specific need? It's built in Python, which a lot of people in the room, and a lot of our data scientists, know. And I'm sorry if you're on the webinar and can't see very well, but basically, instead of writing free text here, I have a custom UI built just with Python where I can say how many people are in my family, whether I want cooking instructions or not, and whether we have any dietary restrictions, and then I can have it generate a really good prompt for me.
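The idea described here, collecting structured inputs instead of free text and assembling a known-good prompt from them, can be sketched in a few lines of plain Python. This is a minimal illustration, not H2O's actual app code; all the names and the prompt wording are hypothetical.

```python
# Assemble a tested prompt from structured UI inputs rather than
# asking the user to write free text. Names here are illustrative.

def build_meal_prompt(
    family_size: int,
    dietary_restrictions: list[str],
    include_instructions: bool = True,
) -> str:
    if family_size < 1:
        raise ValueError("family_size must be at least 1")
    parts = [
        f"Write a 7-day dinner meal plan for a family of {family_size}.",
        "Make sure each meal includes at least one vegetable.",
    ]
    if dietary_restrictions:
        parts.append(
            "Respect these dietary restrictions: "
            + ", ".join(dietary_restrictions) + "."
        )
    if include_instructions:
        parts.append("Include brief cooking instructions for each meal.")
    return " ".join(parts)

prompt = build_meal_prompt(4, ["vegetarian"], include_instructions=False)
print(prompt)
```

The generated prompt can still be shown to the user for editing, as the talk describes with the Costa Rican food example.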



I've done some practicing, and you'll hear about prompt tuning later today, to write a good prompt that's customized with what I said. And then I can still edit it if I want. So one of the folks on our marketing team used this and put in that she wanted Costa Rican food, so she could just edit that herself really easily.



But a nice prompt that we know works is there for us automatically. Then we click the button and we get our meal plan. This is kind of a fun example. But if this were a business use case, instead of just printing it to the screen, maybe it exports your content into your Workday system, because your use case wasn't meal planning, it was creating some sort of internal content.



So this is sort of where we're going. We want to be able to create bespoke applications that abstract away some of the more complicated stuff, the stuff that's really exciting and that you'll hear a lot of people talk about today.



But not everyone needs to get super in the weeds. So we have the smart people getting in the weeds, and then we want to wrap it up in a way that everyone can use a little more easily without necessarily having to get so deep. 



All right. And so, a little bit more on why GenAI apps before I give you the links so you can try the store if you want to. There's a lot of custom things that go into it. So, custom inputs. In this use case, one of our engineers is an avid cyclist.



So he quickly built a custom application that will help you make a cycling training plan. Shout out to Ladislav. So you can enter information like whether you're a beginner, your experience, how often you want to ride, and it will take that, along with fine-tuned custom prompts in the background, and write you a plan and tell you how you can get better at your cycling goals.



But what does it take to do this? Well, custom inputs: the UI for what I want the end user to give me. Do I want to constrain what they can enter? Here you see your weight in pounds, so we could have some checks in place to make sure that if they put in a negative number, it's flagged as unreasonable.



So if someone was just writing the prompt themselves, we wouldn't really have these checks in place to make sure that they were only entering valid inputs. So that's one thing that's custom. We have custom prompts.
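The input-check idea above can be sketched as a small validator that rejects impossible values before they ever reach a prompt. The accepted range and the function name below are illustrative assumptions, not the cycling app's actual code.

```python
# Guard a structured input before it is used to build a prompt.
# A free-text chat box has no equivalent safety net.

def validate_weight_lbs(value: float) -> float:
    """Reject weights that cannot be real human body weights."""
    if not 50 <= value <= 1000:
        raise ValueError(f"a weight of {value} lbs is not a valid input")
    return value

try:
    validate_weight_lbs(-150)  # a raw prompt would happily accept this
except ValueError as err:
    print(err)
```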



So for each of these apps, we've looked at building out a prompt that works well for the use case. We also might have custom LLMs. So you'll hear about LLM Studio today, where you can actually fine-tune LLMs for your specific use case.



So some of these apps might use fine-tuned LLMs. Others are going to be customized with the UI or the prompts, and they might just be using Llama 2 70B, which for a lot of use cases is pretty good.



We'll hear a lot about RAG, retrieval-augmented generation. So some of these apps will have data in the background. This one doesn't; it's pure LLM. But we'll look at some today that do, for example, Ask H2O.



We have provided the LLM with essentially all of our H2O product documentation. So you ask a question, it will do a semantic search to find context in our documentation related to it and then send that to the LLM, and it will write you an answer in a nice, friendly tone.
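The RAG flow just described, retrieve relevant documentation chunks, then pass them to the LLM as context, can be sketched as follows. Real systems use semantic (embedding) search; plain word overlap stands in for it here, and the documentation snippets are made-up examples, so treat this as a toy illustration of the pattern, not Ask H2O's implementation.

```python
# Toy RAG pipeline: rank chunks against the question, then build
# a context-grounded prompt for the LLM.

def retrieve(question: str, chunks: list[str], k: int = 2) -> list[str]:
    # Score each chunk by word overlap with the question; a real
    # system would compare embeddings instead.
    q_words = set(question.lower().split())
    return sorted(
        chunks,
        key=lambda c: len(q_words & set(c.lower().split())),
        reverse=True,
    )[:k]

def build_rag_prompt(question: str, chunks: list[str]) -> str:
    context = "\n".join(retrieve(question, chunks))
    return (
        "Answer in a friendly tone using only this documentation:\n"
        f"{context}\n\nQuestion: {question}"
    )

docs = [
    "Driverless AI automates feature engineering and model tuning.",
    "h2oGPT is an open-source generative AI stack.",
    "Wave lets you build web apps in pure Python.",
]
print(build_rag_prompt("How do I build web apps in Python?", docs))
```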



So some of our apps need custom data. And if you want to do all of these things, you know, you could open up a GPT and type it yourself. But it's a lot of management. It's a lot of work. And just like we see AI governance for models, I think we'll see some of that for use cases too.



So having an application where it's all contained and can be evaluated is something we see a lot of organizations going towards. So yes, anyway, with machine learning models, we did see that business users were often using ML models.



Maybe they didn't know it. It was behind the scenes. But it's happening even more with GenAI. So wrapping in custom apps is a way to make it a little more straightforward for the end users. All right, so now we will announce the public H2O GenAI app store. 



So yeah, woo! So here at H2O, we have been building and helping customers build custom app stores for several years now. So some of you in the room might have seen this UI before. You might even have it privately at your organization. 



But this is our public instance, where anyone can go to genai.h2o.ai. And if you're on a laptop, the experience will be real nice. And if you're on your phone, please go to your laptop instead. And you will see the ability to log in.



So really quick: if you are an H2O employee, do not click Google. Click the H2O AI Maker Login. If you click Google, you'll have to get help from someone to fix your account. So if you're an employee, H2O Maker Login.



Anyone else, you can log in with your Google. You can log in with your GitHub. But you do not have to. You can start by going to the link. And it will take you to the App Store page. You can explore the apps. 



We're going to be adding more. And these are custom applications to help you get started in thinking about how your organization could have your own app store with your own apps. So if you want to just see what an app does, these little buttons here will take you to a View More link. 



And you'll see screenshots. And we'll do a live demo in a minute. It will tell you about the app and about the use case. But also, these apps are all open source.



So for any of my Python users or people that want to be Python users in the room, you can look at some of these apps, look at the templates from them, and maybe even write your own. We are totally open to pull requests if you want to write your own app and publish it here. 



You can just do that in GitHub. And we'll talk about it more. All right. So instead of reading to you what some of the apps are, we're going to go ahead and walk through some of them now. All right. So here, we'll just go to the home page. 



That's where anyone can come visit. You'll see that I'm not signed in right now, so this is the experience you will see as well. And we can explore some of the apps. So we can go ahead and click Details to learn about an app, see screenshots, and see what it does for us.



Here, we have H2O Study Partner from one of our customer data scientists, Ratima. And what this will do is generate custom questions for you. So you can say, I want to know about data science, specifically unsupervised learning, and it automatically generates topics on that content.



And then it will write you a question. So here, it's asking which of the following is an unsupervised learning method used for exploration, and the user can answer, and then it will give you feedback.



So, this is a multiple-choice question, but it can also come up with open-ended questions and review your answer for you. So in this specific app store, this is an app running on top of H2O documents, but you can imagine running this application based on your own RAG, based on your own context.



I will do a fun one too. So, and I'll go back really quick, if you click on one of the applications and you are not logged in, that is when you will see this nice guy, and we are going to ask you to log in to actually use the applications. But again, you can make an account just using your GitHub or your personal Gmail.



All right. So, another example. We will do some business use cases too, but I think the fun examples you might use in your day-to-day life are kind of interesting. So here we have Tomato AI from one of our Kaggle Grandmasters, Laura Fink.



You can choose the area that you live in. So I'm in Seattle, a somewhat temperate environment. But I don't actually remember which climate subzone. If I wasn't holding a mic, I would type a chat and ask.



But that would be complicated right now. So we'll just say that I live in a subpolar oceanic area, which is probably not true. But I do want to eat more vegetables. You saw that with the meal planning app. 



So we can say that I want to grow tomatoes and bell peppers. And I'm open to growing other things. But let's see. I would like to know more about composting. And here it will take a little bit to think. 



And then it will generate an answer for us. So if you kind of squint really hard, you might be able to see on the screen that, based on what I wrote, it generated a question for me: tell me more about composting.



How can I start with it easily? And how does it work? And it also knows that I'm trying to grow tomatoes and bell peppers. So we'll end up getting a customized answer here, based on the region we live in and the foods we said we're interested in.



And it also has the ability to free chat, so we could just ask any sort of question if we wanted to. Now, in this specific case, we are just using Llama 2 70B Chat. It's quite good.



But we could instead have a fine-tuned model, so we could point this to another model. And maybe the downside of a 70-billion-parameter model, and of telling all of you this link, is that it's working really hard right now, generating a lot of things for all of us.



But that's OK. It's not a live demo if something doesn't go a little slow. So we'll look at some of the other apps and maybe come back to see the answer here. Thank you. One of the things I wanted to share, about the templates, is for anyone in the room who does Python. Actually, I'll make it interactive.



Who knows or wants to know Python? What's our audience? Okay, cool, cool, cool, cool. So one thing I'll show you is the open-source apps, and I'll tell you where the source code is. You can see this application, which asks us some questions about a home that maybe we're trying to sell on the market right now.



So: how many bedrooms, how many bathrooms, what attributes it has. And then we can generate both a prompt and then a listing. And then we can also go back to our app store, and maybe you remember the meal planning app from a couple slides ago.



And you might notice that these two apps look pretty similar. So we have a template that is for a specific use case, but that use case is not actually home listing or weekly meal planning. The use case is that I want to ask my end users some very specific information. 



I want them to be able to use dropdowns or toggles to tell me what they need in their prompt. I want them to be able to see the prompt, and then I want GenAI to generate the content for them. So in the background, these two apps are using the same template code, where you can customize the model it's pointing to.



You can customize the system prompt that you're using. You can customize what this UI looks like. But the overall process of the application and what it takes to deploy it and make sure that it will scale for users is handled for you with both the template and with the app store. 
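The "same template, different app" idea above can be sketched as a shared engine driven by a per-app configuration: the home-listing and meal-planning apps would differ only in model, system prompt, and UI fields. The class and field names below are illustrative assumptions, not H2O's actual template code.

```python
# One template, many apps: each app is just a configuration of the
# same prompt-building engine.
from dataclasses import dataclass, field

@dataclass
class PromptAppConfig:
    title: str
    model: str          # which LLM endpoint the app points at
    system_prompt: str  # the tuned prompt for this use case
    fields: list[str] = field(default_factory=list)  # UI inputs to render

meal_planner = PromptAppConfig(
    title="Weekly Meal Planner",
    model="llama-2-70b-chat",
    system_prompt="You write healthy, budget-friendly meal plans.",
    fields=["family_size", "dietary_restrictions", "cooking_instructions"],
)

home_lister = PromptAppConfig(
    title="Home Listing Writer",
    model="llama-2-70b-chat",
    system_prompt="You write compelling real-estate listings.",
    fields=["bedrooms", "bathrooms", "attributes"],
)
```

The deployment and scaling concerns the talk mentions would live in the shared engine, so each new use case is a new config rather than a new codebase.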



Awesome. OK, cool. So Tomato AI came back to us. I won't read the whole thing, of course, and I know that it's a little hard to read. But it has been told to be cute, to respond nicely that we're trying to grow a garden and maybe we're new to it.



So it says, well, well, well, it sounds like you're trying to get your green thumb on and start composting like a pro. So we can customize the language and how we want it to talk to us. And this app was told to be playful, because that's what we want when we're gardening.



We maybe don't want someone to be really strict and robotic with us. Or maybe you do, I don't know your life. But anyway. So I will take us now to the open-source repository. These slides will maybe be available, but you can take a picture of the link if you care.



And here we have the 12 apps that you see in the store today. We'll be adding more apps to the store, some of them open source, but some of them won't be. And we'll also add more code here. If you go into a given app, like Tomato AI, which we just looked at, you can see a nice little description that tells you how the app works and runs.



And then one thing I wanted to point out, in case anyone is not familiar, H2O has an open source product called Wave, which allows you to build front ends using only Python. So I wrote some of the apps that you see in the app store. 



And I'm maybe not an engineer, but I can write Python code. I don't have to know HTML and CSS and JavaScript and all that stuff. I just have to be able to read enough Python to know that if I want a header card at the top, I say ui.header_card.



And then I give it the title and the subtitle. It handles everything for me. I can add components and interactivity and buttons, and I don't have to know anything outside of Python, which as a data scientist is a skill I'm pretty comfortable with.



So please feel free to check out the source code of these apps. And you can also submit a pull request of your own app. And we would love to review it if you want to host it in the public cloud with us. 



So we're about at time. Yeah, we won't dig into this one. There will be a lot of people talking about the details and the weeds of how we make these things and what we're building at H2O. So I will wrap us up with that link in case you care about it, and let the next person go.