I am a Canadian citizen, US TN Visa eligible (no sponsorship required), and a Graphics / XR Software Engineer with a strong foundation in computer science and a passion for real-time, interactive systems. I specialize in C# and C/C++, with hands-on experience in GLSL/HLSL, WebGL/OpenGL, Unreal Engine, Unity, and Android Studio, which I use to build games, simulations, immersive XR applications, and real-time 3D systems.
My background includes graphics engineering and 3D geometry workflows, enabling me to create performant, visually rich applications. I also have a solid foundation in algorithm analysis, object-oriented design, parallel computing, data structures, and networking, allowing me to approach engineering challenges with both efficiency and architectural rigour.
I am particularly passionate about game development and spatial computing for AR/VR/MR applications. I thrive on projects that blend real-time engines, graphics programming, and immersive technologies to deliver high-impact user experiences.
I welcome connections with teams seeking a Graphics / XR Software Engineer with expertise in C#, C/C++, GLSL/HLSL, WebGL/OpenGL, Unreal, and Unity for real-time 3D and XR applications.
A fully immersive VR environment built with Unity, showcased in the video above.
In this experience I implemented:
🔹 Interactive VR environment with responsive objects
🔹 Locomotion and player movement mechanics for intuitive navigation
🔹 Rich user interactions such as grabbing, throwing, or using tools
🔹 Spatial sound and visual feedback to enhance presence
🔹 Optimized VR play area and comfort considerations
This project pushed me deeper into VR interaction design, user-centric locomotion systems, and immersive experience polish. I learned a lot about performance optimization, comfort best practices, and building intuitive mechanics in XR.
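The grab-and-throw interaction described above can be sketched with Unity's physics API. A minimal version (the XR Interaction Toolkit's XRGrabInteractable provides this out of the box, with velocity smoothing) parents the object to the hand while held and transfers the controller's velocity to the Rigidbody on release. The component and method names here are illustrative, not the project's actual code:

```csharp
using UnityEngine;

// Minimal sketch of a physics-based grab/throw. Assumes the XR rig
// supplies the hand transform and its tracked velocities.
public class Throwable : MonoBehaviour
{
    Rigidbody body;

    void Awake() => body = GetComponent<Rigidbody>();

    public void OnGrab(Transform hand)
    {
        body.isKinematic = true;          // follow the hand directly
        transform.SetParent(hand, true);  // keep current world pose
    }

    public void OnRelease(Vector3 handVelocity, Vector3 handAngularVelocity)
    {
        transform.SetParent(null, true);
        body.isKinematic = false;
        body.velocity = handVelocity;            // inherit the throw
        body.angularVelocity = handAngularVelocity;
    }
}
```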
I developed an interactive VR room experience using Unity, focused on realistic object interaction and player movement within a defined play space.
Key features:
VR locomotion system for smooth player movement
Defined locomotion/play area for comfortable and controlled navigation
Physics-based tennis ball interaction
Realistic grabbing, throwing, and collision behaviour
Optimized for immersive and comfortable VR gameplay
This project strengthened my experience with VR interaction design, physics systems, locomotion comfort, and real-time performance considerations. It was a hands-on exploration of how users move and interact naturally in virtual environments.
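A thumbstick locomotion system confined to a defined play area, as described above, can be sketched as follows. The movement direction comes from the headset's forward vector projected onto the ground plane, and the result is clamped to the play-area bounds; the field names and rectangular bounds are illustrative assumptions:

```csharp
using UnityEngine;

// Continuous thumbstick locomotion clamped to a rectangular play area.
public class ClampedLocomotion : MonoBehaviour
{
    public Transform head;                    // HMD camera, sets move direction
    public float speed = 2f;                  // metres per second
    public Vector2 playAreaHalfExtents = new Vector2(4f, 4f);

    public void Move(Vector2 stick)           // thumbstick axis input
    {
        // Project the headset's forward onto the ground plane so the
        // player moves where they look without pitching into the floor.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right = Vector3.Cross(Vector3.up, forward);
        Vector3 delta = (forward * stick.y + right * stick.x) * speed * Time.deltaTime;

        Vector3 pos = transform.position + delta;
        pos.x = Mathf.Clamp(pos.x, -playAreaHalfExtents.x, playAreaHalfExtents.x);
        pos.z = Mathf.Clamp(pos.z, -playAreaHalfExtents.y, playAreaHalfExtents.y);
        transform.position = pos;
    }
}
```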
This AR project explores spatial layering by placing an interactive portal window onto detected vertical surfaces, revealing a secondary virtual world within the real environment. By separating physical space and digital space through depth, occlusion, and perspective, the experience creates a seamless and immersive layered reality.
Key Features
Vertical Plane Detection: Anchors the portal naturally on real-world walls.
Portal Creation: Tap to place a window revealing a virtual world beyond the surface.
Layered Reality: Clear separation between real space, portal frame, and virtual depth.
Depth & Occlusion: Accurate depth cues enhance realism behind the portal.
Perspective Interaction: Viewpoint-based response strengthens the window illusion.
Immersive World: A distinct virtual environment with its own lighting and atmosphere.
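The tap-to-place step on a vertical plane can be sketched with AR Foundation's raycast and plane managers (the prefab reference and component wiring are illustrative assumptions; the hidden world behind the portal is typically rendered with stencil-buffer masking):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Tap-to-place a portal on a detected vertical plane (AR Foundation).
public class PortalPlacer : MonoBehaviour
{
    public ARRaycastManager raycaster;
    public ARPlaneManager planes;
    public GameObject portalPrefab;

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        if (raycaster.Raycast(Input.GetTouch(0).position, hits,
                              TrackableType.PlaneWithinPolygon))
        {
            var plane = planes.GetPlane(hits[0].trackableId);
            if (plane != null && plane.alignment == PlaneAlignment.Vertical)
            {
                // The hit pose orients the portal flush with the wall.
                Instantiate(portalPrefab, hits[0].pose.position, hits[0].pose.rotation);
            }
        }
    }
}
```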
I recently built an Augmented Reality face-tracking experience using Unity, where virtual elements dynamically interact with the user in real time.
Key features:
Real-time face tracking
Virtual accessories (a hat) accurately aligned to head movement
Particle effects: a dynamic cloud with animated rain
Smooth tracking and interaction optimized for AR experiences
This project helped me deepen my understanding of AR pipelines, face anchors, coordinate systems, and real-time rendering in Unity. It was a great hands-on experience combining graphics, interaction design, and immersive tech.
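Anchoring an accessory to a tracked face, as in the hat example above, can be sketched with AR Foundation's face manager; parenting the prefab to the ARFace transform keeps it aligned with head movement every frame. The prefab reference and the offset value are illustrative assumptions:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Keeps a hat prefab anchored to the tracked face (AR Foundation).
public class HatAnchor : MonoBehaviour
{
    public ARFaceManager faceManager;
    public GameObject hatPrefab;
    public Vector3 localOffset = new Vector3(0f, 0.12f, 0f); // above the forehead

    void OnEnable() => faceManager.facesChanged += OnFacesChanged;
    void OnDisable() => faceManager.facesChanged -= OnFacesChanged;

    void OnFacesChanged(ARFacesChangedEventArgs args)
    {
        foreach (ARFace face in args.added)
        {
            // Parenting to the ARFace transform means the face anchor
            // drives the hat's pose automatically as the head moves.
            GameObject hat = Instantiate(hatPrefab, face.transform);
            hat.transform.localPosition = localOffset;
        }
    }
}
```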
AR Project UFO is an augmented reality version of a desktop game that transforms traditional gameplay into an immersive, real-world interactive experience.
Key Features
Ported from desktop to interactive AR
Virtual UFO elements integrated into real environments
Spatial, perspective-based interaction
Intuitive AR-optimized controls
Enhanced immersion and player presence
Dynamic gameplay influenced by surroundings
This augmented reality project was developed in Unity and demonstrates marker-based AR interaction using barcode scanning. When a barcode printed on paper and placed on a table is scanned through the device camera, a 3D fuse box model is accurately anchored and displayed directly over the barcode in real time. The project focuses on spatial alignment, tracking stability, and seamless integration of virtual content into a real-world environment. It showcases the practical use of AR for visualization, inspection, or educational purposes by overlaying interactive 3D objects onto physical markers.
Key Features
Marker-based AR using barcode scanning
Real-time detection and tracking of printed barcodes
Accurate placement and alignment of a 3D fuse box model over the physical marker
Stable AR tracking on flat surfaces such as tables
Built entirely in Unity for cross-platform AR development
Clean and responsive interaction between real-world input and virtual content
Suitable for educational, training, or demonstration use cases
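The anchor-over-marker behaviour can be sketched with AR Foundation's image tracking, which works analogously to the barcode tracking the project uses (this is an illustrative stand-in, not the project's actual pipeline): the model is parented to the tracked marker so it stays registered as the camera moves, and is hidden when tracking is lost.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Marker-anchored 3D overlay via AR Foundation image tracking (sketch).
public class FuseBoxOverlay : MonoBehaviour
{
    public ARTrackedImageManager imageManager;
    public GameObject fuseBoxPrefab;

    GameObject instance;

    void OnEnable() => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
            // Parent to the tracked image so the model stays registered
            // to the physical marker as the camera moves.
            instance = Instantiate(fuseBoxPrefab, image.transform);

        foreach (ARTrackedImage image in args.updated)
            if (instance != null)
                instance.SetActive(image.trackingState == TrackingState.Tracking);
    }
}
```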
One of the biggest challenges I faced while creating Galagsis was getting the collider feature to work properly. The player object wasn't consistently registering collisions with obstacles or enemies, which broke gameplay mechanics. After some frustrating trial and error, I fixed it by adjusting the collider settings and refining the detection logic. Another challenge was implementing player tilting without affecting horizontal movement. It took some experimentation, but I eventually found a way to achieve the desired effect while keeping the controls responsive.
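Two common fixes for missed collisions in a fast shooter, sketched below, are continuous collision detection on the moving body (so it can't tunnel through thin colliders between physics steps) and trigger-based hit detection. The tag names and hit handler are illustrative, not the game's actual code:

```csharp
using UnityEngine;

// Sketch: reliable hit detection for a fast-moving player ship.
public class PlayerHitDetection : MonoBehaviour
{
    void Awake()
    {
        // Discrete detection can skip past thin colliders at high speed;
        // continuous sweeping catches those contacts.
        GetComponent<Rigidbody>().collisionDetectionMode =
            CollisionDetectionMode.ContinuousDynamic;
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Enemy") || other.CompareTag("Obstacle"))
            TakeHit();
    }

    void TakeHit()
    {
        Debug.Log("Player hit");  // placeholder for damage / game-over logic
    }
}
```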
Playable Demo / Links:
Solo Developer | UI Designer | Gameplay Programmer
Tools: Unity, C#
Description:
Galagsis is a fast-paced arcade space shooter inspired by retro classics, built as a solo project to deliver a complete and polished gameplay experience. Players pilot a sleek ship, dodging and destroying waves of incoming enemies in dynamic vertical-scrolling levels.
My Role:
As the sole developer, I designed the user interface, programmed all gameplay mechanics, and handled visual polish and deployment.
Key Technical Features:
Overcame complex collision detection issues by fine-tuning collider logic to ensure reliable interaction with enemies and obstacles.
Implemented responsive player tilting mechanics that enhance visual feedback without compromising horizontal movement or control precision.
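One way to bank the ship visually without disturbing its movement, as described above, is to translate the root transform but roll only a child "model" transform, so the tilt is purely cosmetic. This is a sketch under that assumption, with illustrative field names:

```csharp
using UnityEngine;

// Visual tilt decoupled from movement: the root moves, a child rolls.
public class PlayerTilt : MonoBehaviour
{
    public Transform model;          // child mesh; the root handles movement
    public float speed = 8f;
    public float maxTiltDegrees = 25f;
    public float tiltLerp = 10f;

    void Update()
    {
        float input = Input.GetAxis("Horizontal");
        transform.Translate(Vector3.right * input * speed * Time.deltaTime);

        // Roll against the travel direction, smoothed so it eases in and out.
        Quaternion target = Quaternion.Euler(0f, 0f, -input * maxTiltDegrees);
        model.localRotation = Quaternion.Slerp(model.localRotation, target,
                                               tiltLerp * Time.deltaTime);
    }
}
```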
Bowling
Solo Developer | UI Designer | Gameplay Programmer
Tools: Unity, C#
A physics-based arcade bowling game focused on intuitive ball control, realistic hook mechanics, and a polished user experience.
Role:
Handled all aspects of development, including gameplay programming, UI design, camera setup, and physics tuning.
Highlights:
Implemented a natural ball hook system through refined input controls.
Solved camera–player alignment and gutter physics using precise positioning and collision-based force adjustments.
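A hook system like the one described can be sketched as a continuous lateral force applied while the ball travels down the lane, scaled by the player's hook input at release. The thresholds and field names are illustrative assumptions, not the game's actual tuning:

```csharp
using UnityEngine;

// Sketch: hook as a sustained sideways force, curving the ball's path
// the way spin would.
public class BallHook : MonoBehaviour
{
    public float hookStrength = 4f;
    Rigidbody body;
    float hookInput;                 // -1..1, captured at release

    void Awake() => body = GetComponent<Rigidbody>();

    public void Release(Vector3 launchVelocity, float hook)
    {
        hookInput = hook;
        body.velocity = launchVelocity;
    }

    void FixedUpdate()
    {
        // Only curve while the ball is actually rolling down the lane.
        if (body.velocity.magnitude > 0.5f)
            body.AddForce(Vector3.right * hookInput * hookStrength,
                          ForceMode.Acceleration);
    }
}
```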
While creating Clicky Crates, one of the main challenges I faced was detecting when crates fell out of the scene to trigger a game over. At first, the game didn’t properly recognize when a crate was off-screen, which caused issues with ending the game at the right time. After some troubleshooting, I solved it by setting up boundary colliders and using triggers to detect when a crate left the play area, ensuring the game over condition activated correctly.
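The boundary-trigger fix above amounts to an invisible trigger collider placed below the visible play area; anything that falls through it fires the game-over condition. A minimal sketch, with an illustrative tag name and a placeholder for the actual game-over handling:

```csharp
using UnityEngine;

// Attach to an off-screen BoxCollider with "Is Trigger" enabled,
// positioned below the play area.
public class LowerBoundary : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Crate"))
        {
            Destroy(other.gameObject);
            Debug.Log("Game Over");   // placeholder for game-over handling
        }
    }
}
```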
While creating Ball Mechanic, one of my main challenges was setting up a rounded fixed camera that followed the player’s movement smoothly. I had to fine-tune the camera behavior to maintain a consistent view without abrupt shifts. Another challenge was linking the camera movement naturally to the player's motion, ensuring it felt responsive and intuitive. Additionally, implementing power-up activation was tricky at first, but after refining the logic and timing, I was able to get it working as intended.
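A smooth follow camera of the kind described can be sketched with a fixed offset eased toward the player via SmoothDamp, which avoids abrupt shifts; running it in LateUpdate ensures the player has already moved that frame. The offset and smoothing values are illustrative:

```csharp
using UnityEngine;

// Smoothed follow camera: fixed offset, eased with SmoothDamp.
public class FollowCamera : MonoBehaviour
{
    public Transform player;
    public Vector3 offset = new Vector3(0f, 6f, -8f);
    public float smoothTime = 0.25f;

    Vector3 velocity;                // SmoothDamp's internal state

    void LateUpdate()                // after the player has moved this frame
    {
        Vector3 target = player.position + offset;
        transform.position = Vector3.SmoothDamp(transform.position, target,
                                                ref velocity, smoothTime);
        transform.LookAt(player);
    }
}
```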