BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Date iCal//NONSGML kigkonsult.se iCalcreator 2.20.2//
METHOD:PUBLISH
X-WR-CALNAME;VALUE=TEXT:Eventi DIAG
BEGIN:VTIMEZONE
TZID:Europe/Paris
BEGIN:STANDARD
DTSTART:20151025T030000
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20160327T020000
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:calendar.7213.field_data.0@www.ugovricerca.uniroma1.it
DTSTAMP:20260404T182637Z
CREATED:20151217T164927Z
DESCRIPTION:In the last few years\, the rise of new forms of computing
 devices has dramatically revolutionized the way users create\, consume
 and interact with data. Portable and ubiquitous computing devices\, 3D
 printers and micro aerial vehicles are just a few examples of this
 growing trend: their rapid reach and market penetration have created an
 abundance of data that users now have literally at their fingertips. It
 is clear that the old desktop interaction paradigm (i.e.\, mouse and
 keyboard) is rapidly becoming obsolete\, and new forms of interaction
 are required. Likewise\, the computational power at our hands has never
 been greater\, and this opens up exciting research possibilities that
 only a few years ago seemed unreachable. In my research\, I am
 interested in understanding how people will interact with data in the
 future\, and how machines can help users accomplish highly
 specialized\, and often critical\, tasks.\n\nIn the first part of the
 talk\, I will give an overview of the research group I am currently
 part of\, the Advanced Interactive Technologies (AIT) lab at ETH
 Zurich\, led by Prof. Otmar Hilliges. I will introduce some of the
 projects conducted within the group\, including work on gesture
 recognition for mobile devices\, computational design and micro aerial
 vehicle control.\n\nI will then introduce a project in which we
 explored the possibility of detecting in-air gestures around unmodified
 portable devices\, to complement the well-known touch-based interaction
 paradigm [1\, 2]. The project focuses on developing a novel
 machine-learning gesture recognition pipeline capable of detecting a
 variety of gestures\, as well as the rough distance of the hand from
 the camera\, from a single RGB-camera input.\n\nFinally\, I will
 conclude my talk with an overview of a project focused on an interface
 that presents large\, unstructured collections of videos\, arranged in
 their original context and sorted in time [3]. With our tool\, we
 extended the focus+context paradigm to create a
 video-collections+context interface by embedding videos captured with
 mobile devices into a panorama. We built a spatio-temporal index and
 tools for fast exploration of the space and time of the video
 collection\, helping users navigate otherwise unstructured collections
 of videos.\n\nBio\nDr. Fabrizio Pece is currently a postdoctoral
 researcher (Marie Curie Fellow) at ETH Zurich\, working with Prof.
 Otmar Hilliges in the Advanced Interactive Technologies lab at the
 Institute of Pervasive Computing. Prior to joining ETH\, he was a PhD
 student at University College London (2010-2014)\, where he earned his
 doctoral degree in the Virtual Environments and Computer Graphics group
 under the supervision of Prof. Jan Kautz. Dr. Pece earned his BSc in
 Computer Science from Università degli Studi di Roma Tor Vergata\,
 Italy (2008) and his MSc in Vision and Virtual Environments from
 University College London\, UK (Distinction\, 2009). Between June and
 October 2010\, he completed an internship at Disney Research Zurich
 under the supervision of Prof. Jan Kautz and Prof. Wojciech Matusik\,
 working in the area of digital fabrication.
DTSTART;TZID=Europe/Paris:20151221T120000
DTEND;TZID=Europe/Paris:20151221T120000
LAST-MODIFIED:20190805T155749Z
LOCATION:Diag\, Via Ariosto 25\, Roma\, Aula A5
SUMMARY:Making Everything Interacting: New ways of interaction in the Ubiqu
 itous Computing Age - Fabrizio Pece
URL;VALUE=URI:http://www.ugovricerca.uniroma1.it/node/7213
END:VEVENT
END:VCALENDAR
