Door: Thijs Zumbrink
I recently implemented "virtual displays" in my first-person game. When the player gets close to one of these, the crosshair is removed and the player instead controls the mouse on the virtual display. It enables immersive consoles that can provide all kinds of utility.
I first saw this used in Doom 3 and was intrigued.
In the game Stratagem, it's used to purchase new structures and items for your base. The concept is simple:
1. Walk to your main building
2. Look at the virtual console
3. Order the construction of a new item by using the interface
4. When completed, pick up the item in the drop zone
If danger is near, you can quickly react, since you are in the game world, not in some sort of menu overlay.
In-game, it looks like this:
How it's done in Unity
The central element is a Canvas GameObject rendered in world space. I chose to regard the virtual screen as 800x600 for convenience, then used the scaling controls to make it a proper size. It is placed on a scaled cube that forms the screen hardware.
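Done from script for clarity (in practice this can all be configured in the editor), the setup might look like the sketch below. The 0.002 scale factor is an example value, chosen so the 800x600 reference resolution maps onto a screen 1.6 world units wide:

```csharp
using UnityEngine;

public class VirtualScreenSetup : MonoBehaviour
{
    void Start()
    {
        // Adding a Canvas converts the GameObject's Transform to a RectTransform.
        var screen = new GameObject("VirtualScreen");
        Canvas canvas = screen.AddComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        RectTransform rt = canvas.GetComponent<RectTransform>();
        rt.sizeDelta = new Vector2(800, 600);  // design resolution of the screen
        rt.localScale = Vector3.one * 0.002f;  // 800 * 0.002 = 1.6 world units wide
    }
}
```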
The contents on the screen are simply Unity UI elements such as Buttons. Place them in the editor or create them via a script, depending on the level of interactivity.
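For the scripted route, a hedged sketch could look like this. It assumes a button prefab assigned in the inspector; `buttonPrefab` and `screenCanvas` are illustrative names, and building a Button from raw components is more verbose:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ConsoleMenu : MonoBehaviour
{
    [SerializeField] private Button buttonPrefab;        // hypothetical prefab
    [SerializeField] private RectTransform screenCanvas; // the world-space canvas

    void Start()
    {
        // Instantiate a button as a child of the virtual screen's canvas.
        Button buy = Instantiate(buttonPrefab, screenCanvas);
        buy.GetComponentInChildren<Text>().text = "Order construction";
        buy.onClick.AddListener(() => Debug.Log("Construction ordered"));
    }
}
```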
A virtual mouse cursor to replace the crosshair is a nice touch. It's implemented as a Raw Image placed so that the tip of the cursor sits at local (0, 0) of its GameObject. It lives in a separate canvas, as a sibling of the regular canvas, so we can control the depth and ensure it is always drawn on top of the other elements. Additionally, by placing a RectMask2D component on that canvas, the cursor won't draw outside of the screen.
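The pivot trick can be sketched like so. This assumes an arrow cursor whose tip sits in the top-left corner of the texture; adjust the pivot for other sprites:

```csharp
using UnityEngine;

public class CursorPivot : MonoBehaviour
{
    void Awake()
    {
        // Move the pivot to the cursor tip so local (0, 0) coincides with it.
        RectTransform rt = (RectTransform)transform;
        rt.pivot = new Vector2(0f, 1f); // top-left corner of the sprite
    }
}
```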
Now let's make it interactive. Two things are needed and they work a bit differently: we need to move the virtual mouse cursor and we need to make sure that the UI elements fire their events properly. This should only happen when the player is close enough and looking directly at the screen.
To limit the range of interaction, we need a custom implementation of a Raycaster. The GraphicRaycaster will do fine as a basis, since it lets us interact with the UI elements, even when placed in world space. There is no maximum range setting though, so that is where the custom implementation comes in. The code is as follows:
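A minimal version might look like the sketch below. The `maxRange` field and the distance check are my reading of the approach, not the project's exact code:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class VirtualConsoleRaycaster : GraphicRaycaster
{
    [SerializeField] private float maxRange = 2f; // interaction distance in world units

    public override void Raycast(PointerEventData eventData, List<RaycastResult> resultAppendList)
    {
        // World-space UI raycasts need the camera that renders the canvas.
        Camera cam = eventCamera;
        if (cam == null)
            return;

        // Skip the graphic raycast entirely when the player is too far away.
        float distance = Vector3.Distance(cam.transform.position, transform.position);
        if (distance > maxRange)
            return;

        base.Raycast(eventData, resultAppendList);
    }
}
```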
If the editor added a GraphicRaycaster to the canvas by default, make sure to remove or disable it, or use its LayerMask so the virtual console is not seen by it.
Moving the virtual cursor
Unfortunately Unity doesn't provide a "mouse moved" event. If it did, we could let our custom Raycaster interact with the rest of the event system and solve this cleanly. Instead, we must manually shoot a ray (performing the same computation that the VirtualConsoleRaycaster does) and see whether it hits the canvas. Upon a hit, we move the cursor to that (2D) position.
There is one challenge though: since the VirtualConsoleRaycaster is based on the GraphicRaycaster, it does not give us the world location of the hit, only the screen position. To remedy this, we first use it to see if there is a hit, then follow up with a physics raycast to get the actual world position.
The code looks something like this:
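Here is a hedged sketch of that two-step process; the field names (`raycaster`, `cursor`, `screenLayer`) and the use of the view center as the pointer position are illustrative assumptions:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

public class VirtualCursor : MonoBehaviour
{
    [SerializeField] private VirtualConsoleRaycaster raycaster;
    [SerializeField] private RectTransform cursor;  // the Raw Image cursor
    [SerializeField] private LayerMask screenLayer; // layer of the screen cube

    private readonly List<RaycastResult> results = new List<RaycastResult>();

    void Update()
    {
        // Build pointer data for the center of the real screen (the crosshair).
        var pointerData = new PointerEventData(EventSystem.current)
        {
            position = new Vector2(Screen.width / 2f, Screen.height / 2f)
        };

        results.Clear();
        raycaster.Raycast(pointerData, results);
        if (results.Count == 0)
            return; // not looking at the virtual screen

        // The graphic raycast only tells us *that* we hit; follow up with a
        // physics raycast to find *where* in world space.
        Ray ray = Camera.main.ScreenPointToRay(pointerData.position);
        if (Physics.Raycast(ray, out RaycastHit hit, 5f, screenLayer))
        {
            // Snap the cursor to the hit point, then flatten it onto the canvas plane.
            cursor.position = hit.point;
            cursor.localPosition = new Vector3(
                cursor.localPosition.x, cursor.localPosition.y, 0f);
        }
    }
}
```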