"Signal to Symbols (via Skills)"

Thursday, Jan. 25th @ 11am

FAH 3002 and Zoom: https://ucsd.zoom.us/j/95959557868

Speaker: George Konidaris

Seminar Abstract

While AI has achieved expert-level performance on many 
individual tasks, progress remains stalled on designing a single agent 
capable of reaching adequate performance across a wide range of tasks. A 
major obstacle is that general-purpose agents (most generally, robots) 
must operate using sensorimotor spaces complex enough to support 
solutions to all possible tasks they may be given—spaces that, by the 
same token, drastically hinder their effectiveness on any one specific task.

I propose that a key, and understudied, requirement for general 
intelligence is the ability of an agent to autonomously formulate 
streamlined, task-specific representations of the sort that single-task 
agents are typically assumed to be given. I will describe my research on 
this question, which has established a formal link between the skills 
(abstract actions) available to a robot and the symbols (abstract 
representations) it should use to plan with them. I will present an 
example of a robot autonomously learning a (sound and complete) abstract 
representation directly from sensorimotor data, and then using it to 
plan. I will also discuss ongoing work on making the resulting 
abstractions practical and portable across tasks.


George Konidaris is an Associate Professor of Computer Science at 
Brown and the Chief Roboticist of Realtime Robotics, a startup 
commercializing his work on hardware-accelerated motion planning. He 
holds a BScHons from the University of the Witwatersrand, an MSc from 
the University of Edinburgh, and a PhD from the University of 
Massachusetts Amherst. Prior to joining Brown, he held a faculty 
position at Duke and was a postdoctoral researcher at MIT. George is a 
recent recipient of an NSF CAREER award, young faculty awards from DARPA 
and the AFOSR, and the IJCAI-JAIR Best Paper Prize.