Abstract:
Following qplum's recent report on their experience with deep learning & auto-encoders, Petar will present a few alternatives from Computational Learning Theory. These techniques are philosophically quite different from Neural Networks but have had a powerful impact in biology-related fields. He will introduce the PAC learning framework & boosting, & possibly discuss a hybrid approach with NNs. He would like to keep the session highly interactive & devote a large part of it to answering questions.
Petar's bio:
I studied Mathematics at Harvard (BA), Computer Science at NYU's Courant Institute (MSc) & Theoretical Computer Science at MIT (PhD). In 2001, with David Mazières, I co-authored the Kademlia DHT (Distributed Hash Table), which underlies most file-sharing, blockchain & Internet botnet technologies. Kademlia is now included in the Symantec Encyclopedia of Cybersecurity & is taught at many universities. In 2013 I was awarded a DARPA XDATA research grant (under the Obama administration), through which I developed the first infrastructure for creating maintenance-free distributed applications. In 2015 I joined Google's Search product, where I continued the line of research from my DARPA work & developed a Google-specific language for specifying business logic (ML/AI/NN) independently of the underlying technology. I am currently an independent scientist working on releasing a "universal language" for programming connected systems of any kind.