Lecturer (Assistant Professor) and Chief Investigator
Queensland University of Technology
I am passionate about my overarching research goal: providing robots with perceptual abilities that allow safe, intelligent interactions with humans in real-world environments. To develop these perceptual abilities, I believe it is useful to study the principles of the human, animal and insect visual systems. I use these principles to develop new computer vision and machine learning algorithms and validate their effectiveness in intelligent robotic systems. I am enthusiastic about this forward/reverse engineering approach, which combines concepts from computer science and engineering with those from biology and the social sciences, as it offers the dual benefit of uncovering principles inherent in the animal visual system and of applying these principles to its artificial counterpart. I am a recipient of two best thesis awards and two best poster awards and have secured grants in excess of 400,000 AUD.
I wonder what all the layoffs at @Meta, @Google (+ @twitter) etc. will mean for the research output of those companies at conferences such as @ieee_ras_icra, @CVPR, @ICCVConference and @siggraph. Will it drop over the next few years? Any thoughts?
Getting busy on a Wednesday morning on our @overleaf ahead of the @CVPR deadline in ~80 hours. Good luck to all the #computervision folks out there! PS: After moving to @openreviewnet I wonder if @overleaf will be the next bottleneck? Tip: enable your @overleaf - @Dropbox sync.
Hi #ScienceTwitter, does anyone have experience with @github #ClassRoom? I am starting teaching next year (yay!) and one of my first aims is to move away from @MATLAB with its #AutoGrader to another platform that supports @ThePSF #Python. Courses are in Robotics (+ some vision).
Our results show that our approach achieves competitive performance compared with several baseline methods (EventVLAD, Event-VPR, Ensemble-Event-VPR) and is particularly well suited to compute- and energy-constrained platforms.
We contribute a new indoor QCR-Event-VPR dataset captured with a DAVIS346 camera mounted on a mobile @clearpathrobots Jackal #robot (@petercorke59 photo-bombing in the background).
We explore how distinctive the event streams from a small subset of pixels (in the tens or hundreds) are for the #VPR task. Selecting pixels that are sparse over image coordinates but vary strongly over time (high variance in the number of #events per pixel) enables frequent and computationally cheap location estimates.
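For anyone curious how such a sparse-pixel pipeline might look in code, here is a minimal NumPy sketch of the idea as described above; the array shapes, function names and the simple nearest-neighbour matching are my own illustrative assumptions, not the actual method from the paper.

import numpy as np

def select_salient_pixels(event_counts, num_pixels=100):
    # event_counts: (T, H, W) event counts per time bin and pixel (assumed input format).
    # Keep the few pixels whose event counts vary the most across time bins.
    per_pixel_variance = event_counts.reshape(event_counts.shape[0], -1).var(axis=0)
    return np.argsort(per_pixel_variance)[-num_pixels:]

def place_descriptors(event_counts, pixel_idx):
    # One compact descriptor per time bin: the event counts at the selected sparse pixels.
    flat = event_counts.reshape(event_counts.shape[0], -1)
    return flat[:, pixel_idx]

def match_place(query_desc, reference_descs):
    # Nearest-neighbour match of a query descriptor against the reference traverse.
    distances = np.linalg.norm(reference_descs - query_desc, axis=1)
    return int(np.argmin(distances)), float(distances.min())

Because each descriptor is only tens to hundreds of numbers, matching stays cheap enough to run at high rates on constrained hardware, which is the point of the sparse-pixel idea.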
Anyone from the #casino industry here? This suggests you could use robots to persuade humans to gamble. Just in case profits are not high enough yet ;)
The four ingredients for humane #robots: @alefreedot kicking off a great talk at @2022Iros, suggesting robots need to be sensitive, adaptive, considerate and consistent.
One of my favourite topics, active sensing, discussed by @mpopovic514 in the context of fully autonomous #robots and #drones. The slides so far still use GPs; I wonder if the next slides will reveal implicit representations. @2022Iros