Convex relaxations for weakly supervised information extraction
Thu, Dec 18, 2014 @ 07:00 PM   FREE   Pivotal Labs, 625 Ave of Americas, 2nd Fl
EVENT DETAILS

Happy holidays, everyone! We're going to close out the year with a fantastic talk with Edouard Grave presenting "Convex relaxations for weakly supervised information extraction".

As usual, we'll try to film the event and the waiting list is automatic, so please don't ask about either.

Abstract:

In this talk, I will present convex formulations for two weakly supervised information extraction tasks: relation extraction and named entity classification.


I will first introduce distant supervision for relation extraction. Distant supervision is a recent paradigm for learning to extract information by using an existing knowledge base, instead of labeled data, as a form of supervision. The corresponding problem is an instance of multiple-label, multiple-instance learning. I will show how to obtain a convex formulation of this problem, inspired by the discriminative clustering framework.
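
To make the setup concrete, here is a minimal sketch (not code from the talk; the knowledge base, relations, and sentences below are invented for illustration) of how distant supervision groups sentences into one bag per entity pair and attaches the knowledge-base relations as weak labels:

# Hypothetical sketch of distant supervision for relation extraction:
# entity pairs found in a knowledge base weakly label every sentence
# that mentions both entities, producing bags of instances
# (a multiple-instance, multiple-label setup). All data is made up.

from collections import defaultdict

# Toy knowledge base: (subject, relation, object) triples.
knowledge_base = {
    ("Edouard Grave", "affiliated_with", "Columbia University"),
    ("Francis Bach", "affiliated_with", "INRIA"),
}

# Unlabeled corpus of sentences, each with the entity pair it mentions.
corpus = [
    ("Edouard Grave is a postdoc at Columbia University.",
     ("Edouard Grave", "Columbia University")),
    ("Edouard Grave met researchers from Columbia University.",
     ("Edouard Grave", "Columbia University")),
    ("Francis Bach leads a machine learning team at INRIA.",
     ("Francis Bach", "INRIA")),
]

# Group sentences into one bag per entity pair, then attach every
# relation the knowledge base asserts for that pair as a weak label.
bags = defaultdict(lambda: {"sentences": [], "relations": set()})
for sentence, pair in corpus:
    bags[pair]["sentences"].append(sentence)
for subj, rel, obj in knowledge_base:
    if (subj, obj) in bags:
        bags[(subj, obj)]["relations"].add(rel)

for pair, bag in bags.items():
    print(pair, bag["relations"], len(bag["sentences"]), "sentence(s)")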


Second, I will present a method to learn to extract named entities from a seed list of such entities. This problem can be formulated as PU learning (learning from positive and unlabeled examples only), and I will describe a convex formulation for it.
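
As a rough illustration rather than the formulation from the talk, one common convex baseline for PU learning fits a weighted logistic loss that treats the seed entities as positives and all other candidates as down-weighted, noisy negatives; the features, weights, and data below are made up:

# Minimal PU-learning baseline sketch (assumed, not from the talk):
# seed entities are positives, everything else is unlabeled and treated
# as a down-weighted noisy negative. The objective stays convex because
# it is just a weighted logistic loss with an L2 penalty.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy feature vectors for candidate entity mentions.
n_pos, n_unlabeled, dim = 20, 200, 10
X_pos = rng.normal(loc=1.0, size=(n_pos, dim))        # seed entities
X_unl = rng.normal(loc=0.0, size=(n_unlabeled, dim))  # unlabeled mentions

X = np.vstack([X_pos, X_unl])
y = np.concatenate([np.ones(n_pos), np.zeros(n_unlabeled)])

# Down-weight unlabeled examples, since some of them are in fact positive.
weights = np.concatenate([np.ones(n_pos), 0.3 * np.ones(n_unlabeled)])

clf = LogisticRegression(C=1.0)
clf.fit(X, y, sample_weight=weights)

# Score the unlabeled candidates; high scores suggest named entities.
scores = clf.decision_function(X_unl)
print("top unlabeled candidates:", np.argsort(scores)[-5:])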

Bio:

Edouard Grave graduated from Ecole polytechnique with an MSc in machine learning and computer vision in 2010. He obtained his PhD in computer science in 2014, working under the supervision of Francis Bach and Guillaume Obozinski. He did a postdoc at UC Berkeley, working with Laurent El Ghaoui, and is now a postdoc at Columbia University, working with Noemie Elhadad and Chris Wiggins. His research interests revolve around natural language processing and machine learning.

 
 
 
 