
HUNTING VR - Gameplay Developer, AI Designer
- Unity, Git, Jira -
VR hunting game - Prototyping
Introduction to My Experience at ICONIK - VR Gameplay Developer
I joined ICONIK as a Gameplay Developer intern and quickly found my place within a technically ambitious project: the Meta Quest version of Hunting Simulator 2, titled Hunting VR. This virtual reality hunting simulation game, published by Nacon and available on the Meta Quest Store, involved extensive collaboration with the publisher throughout the various milestones and development stages.
When I arrived, the project had just reached the "vertical slice" phase. Specializing in artificial intelligence development, I was immediately tasked with integrating various animal behaviors as defined in the Game Design Document (GDD).
The animal AIs were created using a node-based approach with the Unity plugin NodeCanvas, with some behaviors driven by a timeline to enable cinematic actions. Later, I expanded on these features by developing the AI for a companion, creating custom nodes in C# to extend the plugin's capabilities.
In parallel, I took on the responsibility of updating and improving the GDD while actively participating in the project's creative discussions and decision-making.
Finally, I contributed to the development of several VR game prototypes through creative meetings, brainstorming sessions, and the rapid prototyping of game features and levels.
The Main Challenge: Developing the Companion Dog AI
The primary challenge of this project was to develop the AI for a dog companion, intended to accompany the player during their hunting sessions. When I joined, the dog's AI was limited to just two functions:
- Pointing towards prey
- Requesting pets
However, the team was not satisfied with this initial version. They wanted to strengthen the bond between the player and their companion, creating a more immersive and engaging relationship. My mission was to prototype new interactions with the dog to enrich this relationship and enhance the overall gameplay experience.

Implementing Interactions: The Dog Recall
I began by implementing a recall function, where a whistle from the player would bring the dog back to their side or send it back to work. Since the recall already occupied one of the limited controller inputs, dedicating additional buttons to further dog interactions was not feasible. This constraint led me to explore solutions that enrich the interaction between the player and their canine companion while making the most of the available controls.
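The core idea of the recall is that one input does double duty: the same whistle either calls the dog back or sends it out, depending on its current mode. Here is a minimal sketch of that toggle in Python; the class and mode names are illustrative, not the project's actual C# code.

```python
from enum import Enum, auto

class DogMode(Enum):
    WORKING = auto()  # roaming ahead, searching for prey
    HEEL = auto()     # staying at the player's side

class CompanionDog:
    def __init__(self):
        # The dog starts out working the field.
        self.mode = DogMode.WORKING

    def on_whistle(self) -> DogMode:
        # A single whistle toggles between the two modes, so one
        # input covers both "come back" and "go back to work".
        self.mode = (DogMode.HEEL if self.mode is DogMode.WORKING
                     else DogMode.WORKING)
        return self.mode
```

Folding both commands into one toggle is what freed the remaining inputs for the gesture-based interactions described next.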
Next Phase: Gesture Interactions and the Flushing Mechanic
To enhance interactions with the dog without adding new commands to the controller, the next phase of prototyping involved combining the recall whistle with hand gestures enabled by the VR controllers. This approach allowed for a variety of interactions while adhering to the control limitations.
I developed and tested several dog behaviors triggered by hand gestures, such as lying down, running to a pointed location, and more. Among these interactions, the most compelling was the command that allowed the player to send the dog to a designated location. This concept proved particularly effective and led to the prototyping of a new gameplay mechanic: "flushing."
The flushing mechanic involves sending the dog to bushes where small animals are hiding, causing them to flee. This action gives the player a clear line of sight on their targets, adding a new strategic dimension to the virtual reality hunting experience.
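The gesture system can be thought of as the whistle "arming" a command, with the hand pose held at that moment selecting which behavior the dog performs. A minimal sketch of that dispatch, with hypothetical gesture and command names standing in for the project's real ones:

```python
# Whistle + gesture -> dog command. The gesture held when the player
# whistles selects the behavior; no extra buttons are needed.
GESTURE_COMMANDS = {
    "none": "recall",        # whistle alone: come back / go work
    "palm_down": "lie_down", # flat hand: lie down and wait
    "point": "flush",        # pointing: run to the bush and flush prey
}

def whistle_command(gesture: str) -> str:
    # Unrecognized gestures fall back to the plain recall, so the
    # whistle always does something sensible.
    return GESTURE_COMMANDS.get(gesture, "recall")
```

In this scheme, adding a new dog behavior is just a new gesture entry, which is what made the approach well suited to rapid prototyping.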


Development of the Dog's AI: Modular State Machines with NodeCanvas
The dog's AI was designed using nested State Machines with the Unity plugin NodeCanvas. Each state within these State Machines includes entry, exit, and update callbacks, which allowed me to easily create a variety of behaviors.
The nested State Machines provided a modular approach, facilitating the numerous iterations needed to integrate and improve the dog's AI. Parallel State Machines continuously monitor specific game events, such as the death of a target or a missed shot. This structure enables the dog to dynamically respond to these events, providing visual and auditory feedback to the player through expressive barking.
Thanks to this modular and responsive architecture, the dog's AI can adapt and enhance the gameplay experience while remaining flexible enough to accommodate new features and interactions.
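The structure described above, states with entry/update/exit callbacks plus a parallel monitor watching global events, can be sketched in a few lines of Python. This is a generic illustration of the pattern, not NodeCanvas's actual API.

```python
class State:
    """One state with the entry / update / exit callbacks described above."""
    def __init__(self, name, on_enter=None, on_update=None, on_exit=None):
        noop = lambda: None
        self.name = name
        self.on_enter = on_enter or noop
        self.on_update = on_update or noop
        self.on_exit = on_exit or noop

class StateMachine:
    def __init__(self, states, initial):
        self.states = {s.name: s for s in states}
        self.current = self.states[initial]
        self.current.on_enter()

    def transition(self, name):
        # Exit the old state, enter the new one.
        self.current.on_exit()
        self.current = self.states[name]
        self.current.on_enter()

    def update(self):
        # Called every frame on whichever state is active.
        self.current.on_update()

class BarkMonitor:
    """Parallel monitor: reacts to global game events regardless of the
    dog's current state, here by queuing bark feedback for the player."""
    def __init__(self):
        self.barks = []

    def on_event(self, event):
        if event in ("target_killed", "shot_missed"):
            self.barks.append(event)
```

Nesting follows naturally: a state's callbacks can drive a child `StateMachine` of its own, which is what keeps each behavior small and easy to iterate on.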

Shooting Range VR Prototype — Genesis of Varsat
During my time at Iconik, I had the opportunity to design a prototype of a shooting range in virtual reality, marking the beginnings of what would later become the Varsat company. This project aimed to create an immersive VR shooting experience, integrating ProTubeVR's haptic accessories to simulate the recoil and weight of weapons, providing a realistic sensation to users.
The prototype combined an interactive virtual environment with advanced haptic devices, allowing users to train in near-real-life conditions. This innovative approach laid the foundation for Varsat, a company co-founded by Iconik after my departure, which specializes in developing virtual reality tactical training solutions for security forces and industry professionals.
This short project was an enriching experience and allowed me to concretely explore the potential of virtual reality for training purposes, by combining immersion and physical interaction.
