A Missouri native who grew up on a farm raising cows, ticks, corn, and soybeans, Brett Koonce is the CTO of QuarkWorks, a software engineering and design firm based out of Columbia, MO and San Francisco, CA. QuarkWorks partners with startups and enterprise companies working on new products and features, helping them with iOS and Android apps, websites, backend APIs, e-commerce, and Machine Learning.
At Scale By the Bay, Brett will show how to use Swift for TensorFlow to build and train a convolutional neural network from scratch. In advance of his talk, we spoke to Brett about deploying Machine Learning on mobile devices, and the trends that are shaping the future of Machine Learning.
How did you get interested in Machine Learning and what attracts you most in the space?
Using computers and algorithms to find patterns in numbers beyond human comprehension has the potential to fundamentally change what we consider reason. What is interesting to me, in particular, is the idea that as computing power continues to grow, if we can simply maintain the pace of progress, then in a few years we will be able to perform calculations an order of magnitude beyond what is possible today.
What's your current role at QuarkWorks and what exciting things are you working on at the moment?
As CTO, I spend most of my time herding cats. :) We've been working with video a lot the past few years and have recently been working on building tools to analyze live streams.
What's the biggest challenge that you face in your work and how are you addressing the challenge? What's the biggest challenge of implementing Machine Learning on mobile?
Scaling up our team, be it hiring or training up new hires, is our current top priority. There's no secret to working on it, just putting in the time.
Fundamentally, mobile is underpowered compared to the desktop and the cloud, so we have to be an order of magnitude more efficient with resources than other approaches.
What's the biggest thing that is misunderstood about Machine Learning?
It's just math. Artificial General Intelligence (AGI) has been getting a lot of ink as of late, but math has been with us since the dawn of time. Whenever we attempt to let our imagination run away with the possibility of thinking machines, I think we ignore the thousands of years of concerted effort required to bring us here. Also, people talk a lot about models, but that is usually just the tip of the ML project iceberg.
What are the three trends that will shape the future of Machine Learning in general and on mobile specifically?
Memory: The past decade of progress in Deep Learning has been made mostly on devices with less than 8GB of GPU memory. In the past year or two, devices with more memory have become commonplace. There's a whole generation of researchers who now have access to resources that were previously limited to a select handful of people. My hope is that this will unlock new approaches.
Genericism: There's a lot of focus on hardware right now, but I feel like the larger trend will be towards universal software. Tools like LLVM make targeting specific architectures much simpler and allow you to target the cloud and edge simultaneously. Flexible approaches will gradually assimilate whatever innovations are brought to market.
Optimization: Figuring out how to fully utilize the resources available is still very much an open problem. Currently, a lot of interesting research is in finding ways to break problems apart into smaller pieces, which can be solved separately.
What will you talk about at Scale By the Bay and why did you choose to cover this subject?
My plan is to demonstrate how to use Swift for TensorFlow (s4tf) to build and train a convolutional neural network from scratch and deploy it to an edge device. Last year, I gave a talk about s4tf and why I am betting on it; this year I would love to demonstrate the power of this approach.
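To give a flavor of what that looks like, here is a minimal sketch of a small convolutional network defined and trained with Swift for TensorFlow. The `SimpleCNN` name, the LeNet-style layer sizes, and the 28x28 grayscale input are assumptions made for illustration; they are not the exact network from the talk.

```swift
import TensorFlow

// A small LeNet-style network for 28x28 grayscale images (hypothetical example).
struct SimpleCNN: Layer {
    var conv1 = Conv2D<Float>(filterShape: (5, 5, 1, 6), activation: relu)
    var pool1 = MaxPool2D<Float>(poolSize: (2, 2), strides: (2, 2))
    var conv2 = Conv2D<Float>(filterShape: (5, 5, 6, 16), activation: relu)
    var pool2 = MaxPool2D<Float>(poolSize: (2, 2), strides: (2, 2))
    var flatten = Flatten<Float>()
    var dense1 = Dense<Float>(inputSize: 16 * 4 * 4, outputSize: 120, activation: relu)
    var dense2 = Dense<Float>(inputSize: 120, outputSize: 10)

    @differentiable
    func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
        let features = input.sequenced(through: conv1, pool1, conv2, pool2)
        return features.sequenced(through: flatten, dense1, dense2)
    }
}

var model = SimpleCNN()
let optimizer = SGD(for: model, learningRate: 0.01)

// One gradient-descent step on a batch of images and integer class labels.
func trainingStep(images: Tensor<Float>, labels: Tensor<Int32>) {
    let (loss, gradients) = valueWithGradient(at: model) { model -> Tensor<Float> in
        softmaxCrossEntropy(logits: model(images), labels: labels)
    }
    optimizer.update(&model, along: gradients)
    print("loss: \(loss)")
}
```

Feeding batches into `trainingStep` in a loop is the core of the training pipeline; converting the trained model for an edge runtime is the separate deployment step the talk plans to cover.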
Who should attend your talk and what will they learn?
Anybody who is interested in Machine Learning in general and Functional Programming in particular, and in combining these tools to unlock new capabilities in the future. My hope is that you will come away understanding the tradeoffs you can make when deploying a model on a device with limited resources.
Anything else you'd like to add?
I enjoyed talking last year and am looking forward to doing it again! Feel free to say hi!
Join Brett Koonce for his talk "Machine Learning and mobile" at Scale By the Bay in November. Book your ticket now.