Anastasia Khomyakova

Nate Slater: What problems should enterprises actually trust an LLM to solve?


In the late spring of 1997, Nate packed up what few possessions he had kept upon graduating from college on the East Coast and took a one-way flight to SFO on a Saturday morning. The following Monday, he started a job working on Solaris servers for an investment bank in San Francisco. Proficient in C, Perl, and shell scripting, he soon found that these skills were in high demand for building web applications. For the next five years, Nate worked for several Bay Area companies developing server-side web application code using nearly all the major programming languages and web platforms in use at that time. In the early 2000s he started working at the Lawrence Berkeley National Lab, developing systems for analyzing the first fully sequenced genome produced by the Human Genome Project. It was during this time that Nate developed a passion for distributed systems and big data. While still working in biotech during the summer of 2012, Nate built a fully automated development and test environment provisioning system using Amazon Web Services, and he was hooked. He began work at AWS in early 2014, where he spent the next several years working with key San Francisco-based startups such as Stripe, Airbnb, Slack, Okta, and Zendesk. It was during this time at AWS that Nate met two of his Flip AI co-founders, Deap Ubhi and Sunil Mallya. In 2018, Nate started leading a global ML solutions team at AWS; his team spearheaded strategic partnerships for building AI-powered solutions such as the NFL Player Health and Safety initiative. After nearly a decade at AWS, Nate left last year to join the founding team of Flip AI as the Head of Solutions and Architecture.


What problems should enterprises actually trust an LLM to solve?


Generative AI is the most discussed topic today, from developers all the way to boardrooms. The technology, while exciting, hasn't seen true enterprise adoption. Three main factors hinder adoption: a) domain specificity, b) privacy, and c) cost. Enterprises want to use general-purpose LLMs to automate processes that are highly specific to their business needs, hosted within their own data perimeters, while remaining cost efficient. No public LLM provider satisfies all of these conditions; building on top of open-source technologies is the way to satisfy them. Through the lens of how Flip AI deployed its LLM into enterprises, we'll dive into the types of problems that are prime for automation and how startups can build trust with the enterprise. This talk will cover workflow automation, data pipelines, training methods, infrastructure, and inference pipelines, through to on-premises and VPC deployments for LLMs. Flip AI has built a domain-specific DevOps LLM from the ground up to automate root cause analysis of production incidents by connecting to existing observability (MELT) data systems. With Flip, LLMs can now take your on-call page.
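To make the kind of automation described above a little more concrete, here is a minimal, hypothetical sketch of an automated root cause analysis workflow: it gathers a slice of MELT data (metrics and logs) for a service and asks a privately hosted LLM to propose a likely root cause. The function names, prompt, and inference endpoint are illustrative assumptions for this sketch, not Flip AI's actual API.

```python
# Hypothetical sketch of an RCA workflow over MELT data.
# All names here (fetch_error_metrics, fetch_recent_logs, INFERENCE_URL)
# are placeholders, not Flip AI's API.
import json
import requests

# An LLM endpoint hosted inside the enterprise VPC (hypothetical URL).
INFERENCE_URL = "https://llm.internal.example.com/v1/generate"

RCA_PROMPT = """You are a DevOps assistant. Given the observability data below,
identify the most likely root cause of the incident and suggest a remediation.

Metrics:
{metrics}

Logs:
{logs}
"""

def fetch_error_metrics(service: str) -> str:
    """Placeholder: pull recent error-rate metrics from your metrics store."""
    return json.dumps({"service": service, "error_rate_5m": 0.42, "p99_latency_ms": 2100})

def fetch_recent_logs(service: str) -> str:
    """Placeholder: pull recent error logs from your log aggregator."""
    return "2024-01-01T12:00:03Z ERROR db-pool exhausted: timeout acquiring connection"

def analyze_incident(service: str) -> str:
    """Assemble MELT context and ask the in-VPC LLM for a root cause summary."""
    prompt = RCA_PROMPT.format(
        metrics=fetch_error_metrics(service),
        logs=fetch_recent_logs(service),
    )
    resp = requests.post(INFERENCE_URL, json={"prompt": prompt, "max_tokens": 512}, timeout=60)
    resp.raise_for_status()
    return resp.json()["text"]

if __name__ == "__main__":
    print(analyze_incident("checkout-service"))
```

Keeping the inference endpoint inside the VPC is what lets the observability data stay within the enterprise's data perimeter, which speaks to the privacy constraint mentioned above.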



