Ep 141 Building Smarter Apps with AI

This is a podcast episode titled "Ep 141 Building Smarter Apps with AI." The summary for this episode is: Amit Ben, CEO and Co-founder at OneAI, discusses his background in technology and his work building products that understand human language, including chatbots and human language commands. He discusses his previous company, Nanorep, which provided customer service and sales solutions using language AI, and his role leading global AI for LogMeIn's product portfolio. Amit then discusses the founding of One AI and the decision to build the company on top of established platforms like MongoDB and cloud providers rather than building everything from scratch. The goal of One AI is to make it easy for developers to integrate natural language processing into their products with just one line of code.
Welcome
00:21 MIN
Amit Ben shares his background
04:41 MIN
Difference between AI and ML
01:38 MIN
Overview of OneAI
01:02 MIN
The customers OneAI is servicing today.
01:06 MIN
How is One AI Using MongoDB?
00:54 MIN

Today's Hosts

Shane McAllister

|Lead, Developer Advocacy

Michael Lynn

|Principal Developer Advocate

Today's Guests

Amit Ben

|Co-founder and CEO at OneAI

Michael Lynn: Welcome to the show. My name is Michael Lynn and this is the MongoDB Podcast. Today on the show, Amit Ben, co-founder and CEO of One AI. One AI is a platform, leveraging MongoDB, that helps developers build smarter apps using AI with as little as a single line of code. Stay tuned to learn more.

Amit Ben: Hey everyone, my name is Amit Ben, and I'm co-founder and CEO of One AI. It's great to be on the MongoDB podcast. Today we'll be talking about making your app and database understand human language with one line of code.

Michael Lynn: Welcome to the show. It's great to have you on the podcast. Tell me a little bit about your background.

Amit Ben: Oh boy, that's a long story. I think it goes all the way back to my core essence of being a builder and an engineer. I started writing code around the time I started writing anything at all, and I've been writing and building products ever since, going through all different iterations of roles and positions in tech companies, from writing telecommunication servers in C++ for Linux systems all the way to writing apps in Python. Over the journey of my career, I started trying to make my products understand language, whether that was chatbots or human language commands. And whenever we tried to do that, it didn't work, and that started troubling me. Ever since then, and I think it was somewhere close to 20 years ago, it bothered me. So I reluctantly had to become a data scientist, although in my essence I'm an engineer. I started learning that field, and that led to founding my first startup, which was called Nanorep. Nanorep was a company providing customer service and sales solutions using language AI. We provided autonomous chatbots that were generative and very intelligent for telecom companies and large enterprises. We also gave conversational insights and automation of operations inside organizations, all based on language AI. And the technology that I built was the basis for that product line. I sold the company in 2017 and then transitioned to LogMeIn to lead global AI for all of the products in the LogMeIn portfolio. That includes products like GoToMeeting, LastPass, GoToConnect, Jive, GoToWebinar, Grasshopper and so on, amazing products. My organization, which I formed, had a research center in Israel for that purpose, and there were also research teams in San Francisco and in Europe. Our mandate was to create all of the language AI and machine learning capabilities that went into the entire product portfolio. And we built a lot of amazing capabilities. We had an amazing team over there, covering everything from speech analysis and speech recognition to language processing, automated note taking, and smart chatbots. I was there for about four years, until LogMeIn was acquired by private equity for $4.3 billion. Then I decided it was time to start something new. I reconnected with my co-founders and friends from the past, and we tried to figure out what the next thing we were going to build would be. What's going to be the next company and the next product? And as we went through the process of ideation, vetting all of those ideas, talking to potential customers, and trying to vet all of these opportunities, we realized that the undertaking of starting a new product company while building, from the bottom up, the full technological stack of machine learning and natural language processing was going to be very hard, especially as a new startup. You need to raise money, you need to split your resources across all of these efforts: you need to do the research, but you also need to build the product and the backend and the frontend and the engineering stack. That's a huge undertaking in and of itself. And then we realized we're taking everything else from established platforms, everything that we need: databases, of course. We use MongoDB. We run on cloud providers. When we use voice over IP, we use communication APIs. We use billing platforms. We use analytics platforms. Why do we need to build our own language AI? And that's the problem that we decided we need to solve.
Language AI should be a fundamental, platform-based solution like any other underlying technology, so product companies and developers can focus on building their app, their product, and their business value, not worry about the fundamentals of how it works, the underlying models, and all of that machine learning headache that you normally don't want to mess with but have no choice about unless you have a platform that gives it to you as a service. And I'm not talking about project-based delivery; I'm talking about a service that you can plug into and, in two minutes, get results from an API, the results you expected, and it's auto-scalable and highly performant, and you can just go.
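
To make that concrete, here's a minimal sketch of what "plug into an API and get results in two minutes" can look like from a developer's seat. The endpoint URL, header names, and skill name below are assumptions for illustration only; check the One AI documentation for the actual API.

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder credential

response = requests.post(
    "https://api.oneai.com/api/v0/pipeline",  # assumed endpoint
    headers={"api-key": API_KEY, "Content-Type": "application/json"},
    json={
        "input": "Thanks for calling. I'd like to upgrade to the annual plan.",
        "steps": [{"skill": "summarize"}],  # assumed skill name
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())  # structured output instead of raw language
```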

Michael Lynn: I want to back up just a little bit before we get into the product and service that One AI provides, because I want the listeners to understand the subtle nuance between machine learning and artificial intelligence. Can you help us understand that subtle difference?

Amit Ben: Yeah, of course. That's a very good question. So artificial intelligence is a very broad umbrella term that describes any type of technology that exhibits intelligence or human-like behavior. That could include anything from a program that plays chess to an algorithm that optimizes elevators to self-driving cars. Machine learning is a field of research about one way to create those technologies. Artificial intelligence can be rule-based, it can be code-based, it can be heuristics-based, or it can be machine learning based. In machine learning, we provide a model, which is basically the bedrock the machine learns on, and the machine learns to make its own decisions based on data. So whenever you dictate the logic, the rules, the heuristics, that's not machine learning. Whenever you give the machine data and capabilities emerge from that data, that's machine learning. And inside machine learning there's also deep learning, which is the training of machine learning capabilities based on neural networks. Usually, when you use neural networks with multiple layers and a lot of parameters, that's what we call deep learning.
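
To make the distinction concrete, here's a toy sketch (not from the episode) contrasting the two approaches: the first classifier is hand-written rules, the second learns its decision boundary from a handful of labeled examples using scikit-learn.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Rule-based "AI": the developer dictates the logic explicitly.
def rule_based_sentiment(text: str) -> str:
    negative_words = {"broken", "slow", "refund", "angry"}
    return "negative" if any(word in text.lower() for word in negative_words) else "positive"

# Machine learning: the decision logic emerges from labeled data instead.
texts = ["love this product", "works great", "totally broken", "want a refund"]
labels = ["positive", "positive", "negative", "negative"]

vectorizer = CountVectorizer()
features = vectorizer.fit_transform(texts)
model = LogisticRegression().fit(features, labels)

sample = "the app is broken again"
print(rule_based_sentiment(sample))                      # rules decide
print(model.predict(vectorizer.transform([sample]))[0])  # learned model decides
```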

Michael Lynn: Great explanation. Thank you for that. Now let's talk about One AI. Who's going to be interested in the services that One AI provides and maybe talk about what those services are?

Amit Ben: One AI is an NLP platform for developers that provides product-ready APIs you can plug into your product to convert natural language, whether speech, audio, or text, into structured data you can then feed into your product and your database and use to create value for your customers. So we completely transition you from the realm of language to the realm of data. For example, you can take a conversation and convert it to metadata that describes the people, the concepts, the topics, the highlights, the numerical values, the dates, and so on. Then you can put it into your MongoDB and make it all queryable, or you can put it in your product and allow [inaudible] in your data source to find something specific. Or you can find all the sad scenes that Luke and Leia appear in, in Star Wars.
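
As a rough sketch of that language-to-data flow, the following stores hypothetical scene metadata in MongoDB and then runs the "sad scenes with Luke and Leia" query. The connection string, collection, and field names are illustrative assumptions, not a defined schema.

```python
from pymongo import MongoClient

# Hypothetical connection string and collection names.
client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
scenes = client["media"]["scenes"]

# Metadata a language-AI pipeline might emit for one scene (illustrative values).
scenes.insert_one({
    "title": "Star Wars",
    "scene_id": 42,
    "people": ["Luke", "Leia"],
    "emotions": ["sadness"],
    "topics": ["family", "loss"],
})

# The language is now queryable data: "all the sad scenes Luke and Leia appear in".
query = {"people": {"$all": ["Luke", "Leia"]}, "emotions": "sadness"}
for scene in scenes.find(query):
    print(scene["scene_id"], scene["topics"])
```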

Michael Lynn: So I can process movies, I can process audio, I can process blog text and extract the data from that.

Amit Ben: Exactly. So what we provide is a set of skills, language skills we call them. Language skills are packaged NLP models that provide value for different domains and use cases. For example, there's an emotion skill. You can plug in any type of input, whether it's textual or spoken word or blog posts, and get all of the emotional metadata on all of that text. Or you can apply the highlights skill to take the most important sound bites from that input, if you want to create notes or an extractive summarization of the conversation, or maybe give highlights of the blog post or the podcast that you've appeared in. And we have all sorts of skills. Some of them are generative, like summarizations and topics, and some of them are descriptive, like highlights, keywords, names, people, prices, quantities, and so on. And the idea is that all of these skills are stackable and composable, so you can mix and match them together to create the value that you need as a developer in your product. So maybe you have emails in your product input, or conversations, and maybe you're in sales tech, right? You have a sales conversation. So you can apply the topic split skill to split the conversation into different topics, pick the topic that talks about pricing, then apply the emotion skill and see how the customer feels about the pricing information or about specific competitors that come up. And the core part is that all of that is already pre-deployed and pre-trained in production, so you can start getting value from the first minute you're integrating with us. You don't have to train anything.
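
Here is a hedged sketch of stacking skills in a single request, along the lines of that pricing example. The endpoint and skill names are assumptions for illustration, not the exact One AI identifiers.

```python
import requests

sales_conversation = (
    "Agent: Happy to walk you through our plans.\n"
    "Customer: Honestly, the pricing feels too high compared to what I pay today."
)

result = requests.post(
    "https://api.oneai.com/api/v0/pipeline",  # assumed endpoint
    headers={"api-key": "YOUR_API_KEY", "Content-Type": "application/json"},
    json={
        "input": sales_conversation,
        "steps": [
            {"skill": "splitting"},  # assumed skill name: split the conversation by topic
            {"skill": "emotions"},   # assumed skill name: label emotions on each segment
        ],
    },
    timeout=30,
).json()

print(result)  # segments plus emotion labels, ready to store or filter (e.g. the pricing topic)
```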

Michael Lynn: That's great. And I mean, what a way to fast track software development and an amazing set of skills. How was One AI built? What does the stack look like?

Amit Ben: So there are multiple high-level components in the stack. One of them is the deployment environment that holds all of the machine learning capabilities. It's a Kubernetes-based environment that has pods designed for each specific skill, because in machine learning every type of model might need a different set of resources, so the performance characteristics will be different. Some are very heavy on GPU processing. Others require a lot of memory. Others can easily run on CPU in low-memory environments. Some are more latency sensitive. So for every skill that we have, and behind every skill there's a set of models, the platform engineering team prescribes the right combination of resources and the pod required to run it. Then there is another level of logic in the Kubernetes deployment that makes sure all of these pods are deployed on the right set of nodes, so we have good utilization of resources but, more importantly, very high performance for our users. And all of the autoscaling that might be trivial for CPU-based loads, which are relatively easy to measure and easy to deploy, we had to build a lot of that logic into our platform so we can easily and quickly autoscale as traffic comes in and utilize GPUs very efficiently. So that's basically the machine learning delivery platform. Next to it, there's all of the business logic, including all of the user management, access credential management, tracking and optimization, API usage, and API protocol management. That's a different deployment environment that manages all of the business logic of what we provide. It also manages all of the flow between the different skills, because, as I said, everything is composable and stackable and it's all done with one API. So when you provide, for example, a podcast as an input and you want to apply five or six or ten different skills to it, maybe you want to summarize it, create some titles, split it into different segments, find the highlights, the emotional parts, the mentions of different companies, you just list all of those requirements in the right sequence in the API call, and the platform will then do a sort of map-reduce: it maps all of the skills that are not dependent on one another, because you can create dependencies, to the different pods and then reduces all of the results. So with a single API call, we can run it as fast as possible, concurrently, on all of the sub-services and get the result back. And all of that runs on the business logic platform that controls the data flow around the system.
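
That fan-out and reduce behavior can be pictured with a small toy sketch (this is not One AI's actual scheduler): independent skills run concurrently, a dependent skill waits for its input, and everything comes back as one combined result.

```python
import asyncio

async def run_skill(name: str, text: str) -> dict:
    await asyncio.sleep(0.1)  # stand-in for a call to a model pod
    return {"skill": name, "output": f"{name} result over {len(text)} characters"}

async def process(text: str) -> list:
    # "summary", "highlights", and "emotions" don't depend on each other: map them in parallel.
    independent = await asyncio.gather(
        run_skill("summary", text),
        run_skill("highlights", text),
        run_skill("emotions", text),
    )
    # A dependent step (e.g. titles derived from the summary) has to wait for its input.
    titles = await run_skill("titles", independent[0]["output"])
    # Reduce: combine everything into a single response for the one API call.
    return [*independent, titles]

print(asyncio.run(process("full podcast transcript goes here")))
```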

Michael Lynn: How many customers are you servicing today?

Amit Ben: We're very excited to have a ton of developers registered on our platform. I think we have over 50,000 developers registered on the One AI platform using the Language Studio, which is a low-code environment that developers can use to experiment, discover different capabilities, render samples, and get auto-generated code snippets for any platform. So we've had 50,000 developers coming into the platform, trying it out, testing it, registering, which was mind-blowing. I think we didn't expect anything at that scale so quickly. We really hope to get even more developers on the platform as fast as we can, because for me, my passion is to get all of these language AI capabilities into the hands of builders who can create amazing things that we have not thought of yet. And just seeing the ideas that people bring to the platform and the way they use their product and their data with our skills combined to create something new, that's the most exciting part.

Michael Lynn: Love the passion, and I love the UI. I'm looking at it now, and if you're listening to this and you want to check it out, you can head over to studio.oneai.com. It looks a lot like the aggregation pipeline builder, where you're stacking in operators, and I guess in this case the operators would be skills. Is that correct?

Amit Ben: That's completely accurate, yeah.

Michael Lynn: Yeah, so it's an interesting UI and it seems to be a perfect tool for someone running a podcast. I like the example you gave so I'll definitely be checking that out. Who are some of your customers and what are some of the common use cases?

Amit Ben: So we have a very broad range of use cases, and I think that speaks to the power of the platform. I'll give you a few examples. We have customers doing data curation platforms and content feeds, and they use our platform to identify content, curate it, filter it, and also create summaries of it, think of quick TLDRs for that content, plus topic labeling, so they can do the personalization and tagging of the content, so I can get content that I'm interested in and you can get content that you're interested in. That's a very powerful type of deployment that we have. A customer that we have in this field is daily.dev, which is a very popular content provider for developers. daily.dev are a great customer. We have a great relationship with them. They're an amazing content platform. So you can see, for example, the TLDR in daily.dev is a One AI skill. It's a summary. And the labels are One AI topic extractions, and so on. We have completely different use cases, like companies doing video editing tools. We have customers that have built a complete video editing solution which is AI based. You put your video in, and then it gives you all of the different segments, and for each segment a subheading, the topics, the people mentioned, and whether there are emotions or sentiments in it. It'll also give you the highlights for the entire video. Then, with a few clicks, you can very easily pick the segments or the highlights that you want to put in, in different orders, and very easily create your video. So there's Click Maker AI, there's Peach AI, there are multiple companies building amazing products around that. I mean, it blew our minds what is possible. We never imagined this kind of use case. And then you have companies building social network research tools using our analytics capabilities, all of the processing of sentiment and summaries and topic detection. One of the things we have not discussed is the One AI analytics skill, which allows you to move beyond processing a single object like a document or a podcast or a blog or a tweet. We discussed how you can take one object and get all of the metadata that's in there, but what if you have a million tweets or a hundred thousand Reddit posts or a hundred thousand customer service emails? That's where the analytics aggregation skill comes into play: you take all of those data points, pass them through all of the skills to create the metadata, and then feed them into the analytics engine. That creates clusters of all of those objects based on meaning. So you can find all of the tweets that refer to a specific topic, along with the metadata, and then push that to your MongoDB, and then it's all queryable. You can find all of the tweets about MongoDB that have talked about performance and have also mentioned One AI, for example. So you can very easily create these types of capabilities and build your analytics dashboards on top of it.
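
Once that aggregated metadata lands in MongoDB, the query Amit describes might look roughly like this; the collection and field names are hypothetical.

```python
from pymongo import MongoClient

# Hypothetical connection string, collection, and field names.
tweets = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")["social"]["tweets"]

# "All the tweets about MongoDB that talk about performance and also mention One AI."
query = {
    "topics": "MongoDB",        # cluster / topic labels produced by the analytics skill
    "keywords": "performance",
    "mentions": "One AI",
}
for tweet in tweets.find(query):
    print(tweet["text"])
```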

Michael Lynn: So it seems like a slam dunk to use this in conjunction with MongoDB if you're a developer and you want to extract and leverage metadata associated with your media. But how is One AI using MongoDB? What's MongoDB doing on the backend?

Amit Ben: As I mentioned, all of the business logic part of our platform needs to make a lot of decisions very fast. We need to manage all of the identities and credentials and API keys, and track quotas and usage and billing. So all of that business logic is running on Atlas, and that allows us to move much faster, as we didn't have to spend time on deploying these databases and worrying about scaling and deployments. It's all running on Atlas, we've seen amazing performance, and it's basically something we don't need to think about. And that's the best sign, I think, of a good technology choice: we don't think about it anymore. Whenever we need to add additional fields, more queries, more steps in the business logic, it just works.
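
As an illustration of that kind of business logic, here is a hedged sketch of tracking per-key usage quotas with an atomic update in Atlas; the database, collection, and field names are hypothetical.

```python
from pymongo import MongoClient, ReturnDocument

# Hypothetical database, collection, and field names.
accounts = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")["platform"]["accounts"]

def charge_usage(api_key: str, words_processed: int) -> bool:
    """Atomically add usage for an API key and report whether it is still under quota."""
    account = accounts.find_one_and_update(
        {"api_key": api_key},
        {"$inc": {"usage.words_this_month": words_processed}},
        return_document=ReturnDocument.AFTER,
    )
    if account is None:
        return False  # unknown API key
    return account["usage"]["words_this_month"] <= account["quota"]["words_per_month"]
```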

Michael Lynn: And it sounds like a great relationship. Your goal is to relieve the developer of having to worry about the AI and ML, and MongoDB has a similar goal in that we want to relieve you of the responsibility of running and managing your databases. So that's a great relationship.

Amit Ben: Yeah. And in many cases, when I need to describe what One AI is to developers, I say: the same way you don't build your own MongoDB, there's no reason you should build your own language AI processing. These are managed services that solve part of the value chain.

Michael Lynn: Okay, so NLP, natural language processing, is a huge domain, and I have very little knowledge of NLP. But I do know that there are specializations where one model can be attuned to processing language from an article, for example, but that's very different from the conversations that occur in a movie, say. How does One AI tackle that?

Amit Ben: That's a very good observation, and I think that's one of the core problems that One AI is solving for our customers. Because different models, as you said, are specialized for specific use cases, domains, and types of language usage. A model that's been trained to summarize blog posts is going to be useless at summarizing sales conversations or finding highlights in podcasts. So a core tenet of our platform is that behind each skill there's not one model; there's an assorted collection of models trained for different domains. The summarization skill has five or six different models trained for different use cases and domains of data. When you input your data, our platform classifies your input automatically and chooses the right model that will give you the best results. Our task as a managed service provider is to solve that problem for you. It doesn't make sense for you to choose the model, because you're not a model expert. And models change every day, right? Every day there's a new technology; we improve the models, we find better ones and improve them for you. I think it's very much akin to Twilio, for example. If you use Twilio for voice over IP and you connect from your mobile, you'll get a different encryption and encoding algorithm than you get on your PC, according to your bandwidth and device capabilities. You don't choose it. It's automatically chosen for you by the platform. In MongoDB, in the next version, you might change the engine behind it to make it faster, and that's part of the problem you're solving for your customers. So all of our skills are trained on multiple domains, and they automatically choose the right flavor of machine learning model that will provide the best results for each input.
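
A toy sketch of that routing idea (not One AI's actual logic): classify the input's domain, then dispatch to the model flavor trained for it. The domain heuristic and model names are made up for illustration.

```python
def classify_domain(text: str) -> str:
    # Stand-in heuristic; a real router would use a trained classifier.
    if "Agent:" in text or "Customer:" in text:
        return "sales_conversation"
    if len(text.split()) > 2000:
        return "long_form_article"
    return "general"

# Hypothetical model names, one flavor per domain.
SUMMARIZERS = {
    "sales_conversation": "summarizer-conversations-v3",
    "long_form_article": "summarizer-articles-v2",
    "general": "summarizer-general-v1",
}

def summarize(text: str) -> str:
    model_name = SUMMARIZERS[classify_domain(text)]
    # Placeholder for actually invoking the chosen model.
    return f"[{model_name}] summary of a {len(text)}-character input"

print(summarize("Agent: Thanks for joining.\nCustomer: Let's talk pricing."))
```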

Michael Lynn: So once again, removing the responsibility on the customer's part of having to understand the specifics of NLP. That's a great capability. Is there anything else you'd like to share with the listeners about One AI?

Amit Ben: I think I would want developers to know that One AI is very developer-centric. Our goal is to wrap machine learning in a way that developers can just use. One of our core values in the company is that it should just work, and it's one of the guiding principles in everything that we build. We have a very large free tier right now that is also usable in commercial products. You can go live with a product using One AI and get all of the capabilities and functionality with a free tier of one million words a month right now, which is probably going to be reduced. So go ahead and sign up for your free account at oneai.com, get your free tier of one million words a month, and run any type of test that you want. I welcome you to go ahead, test the platform, use your inputs, try to get value, and give us the feedback. There are feedback tools integrated into the platform. We'd love to hear from you: what worked for you, what didn't work for you, and how we can improve and make it better for developers.

Michael Lynn: Terrific. Well, I'll make sure that we include links in the show notes to One AI and some of the things we talked about. Check the show notes. Amit, it's been a great conversation. I want to thank you for your time today.

Amit Ben: Thank you so much. I appreciate you having me.

Michael Lynn: Thanks so much to Amit for joining us today. And thanks to you, the listeners. If you want to learn more about One AI, visit studio.oneai.com. Thanks everybody. Have a great day.