When we think about machine learning today, we often think in terms of immense scale: large language models that demand huge amounts of computational power, for example. But one of the most interesting innovations in machine learning right now is actually happening at a much smaller scale.
Thanks to TinyML, models can now be run on small devices at the edge of a network. This has significant implications for the future of many different fields, from automated vehicles to security and privacy.
In this episode of the Technology Podcast, hosts Scott Shaw and Rebecca Parsons are joined by Andy Nolan, Director of Emerging Technology at Thoughtworks Australia, and Matt Kelcey of Edge Impulse, to discuss what TinyML means for our understanding of machine learning as a discipline and how it could help drive innovation in the years to come.