Immersive & Creative Technologies Lab

Strategic Incorporation of Synthetic Data for Performance Enhancement in Deep Learning: A Case Study on Object Tracking Tasks

Charalambos (Charis) Poullis

Sep 5, 2023

Our paper “Strategic Incorporation of Synthetic Data for Performance Enhancement in Deep Learning: A Case Study on Object Tracking Tasks” has been published as a conference paper at the 18th International Symposium on Visual Computing (ISVC), 2023. The work is co-authored by Jatin Katyal and Charalambos Poullis.
