In this Intel Conversations in the Cloud audio podcast: Nir Shavit, co-founder of Neural Magic, joins host Jake Smith to talk about enabling convolutional neural networks to run on commodity CPUs. Nir explains how his research as a professor, mapping the connectivity of brain tissue and tracking neurons, led him to develop machine learning algorithms that run on multicore processors instead of GPUs. The two discuss the relationship between performance and CPU cache and memory, compression techniques like model pruning and quantization, and the future of sparsification. Nir also expands on the similarities and differences between the human brain and modern processors, and why software optimization could matter more than specialized hardware.
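For readers unfamiliar with the compression techniques mentioned above, here is a minimal, self-contained Python sketch (not Neural Magic's implementation) of two of them: magnitude pruning, which zeroes out the smallest weights so a CPU can skip them, and simple symmetric int8 quantization, which shrinks the memory footprint of the remaining weights. The array sizes, sparsity level, and helper names are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: magnitude pruning and symmetric int8 quantization
# applied to a random weight matrix standing in for a convolutional layer.

rng = np.random.default_rng(0)
weights = rng.normal(size=(64, 64)).astype(np.float32)  # toy layer weights

def magnitude_prune(w: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the fraction `sparsity` of weights with the smallest magnitude."""
    threshold = np.quantile(np.abs(w), sparsity)
    return np.where(np.abs(w) < threshold, 0.0, w)

def quantize_int8(w: np.ndarray) -> tuple[np.ndarray, float]:
    """Symmetric linear quantization: map float32 weights to int8 plus a scale."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

pruned = magnitude_prune(weights, sparsity=0.9)   # keep ~10% of the weights
q, scale = quantize_int8(pruned)

print("nonzero fraction after pruning:", np.count_nonzero(pruned) / pruned.size)
print("max dequantization error:", np.abs(q.astype(np.float32) * scale - pruned).max())
```

In a real workflow these steps are applied gradually during or after training so accuracy can recover, which is part of what the episode's discussion of sparsification is about.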
Follow Neural Magic on Twitter at:
twitter.com/neuralmagic
Follow Jake on Twitter at:
twitter.com/jakesmithintel