Placing an Anchor in the Real World
Anchor points ensure that objects maintain the same position and orientation in space, helping you maintain the illusion that virtual objects are placed in the real world.
Use anchors to make virtual objects appear to stay in place in an AR scene.

Why use anchors?

As AR-MOD's environmental understanding updates throughout an AR experience, virtual objects can appear to drift away from where they were placed. This can impact your app's realism and user experience.
Anchors ensure that objects appear to stay at the same position and orientation in space, helping you maintain the illusion of virtual objects placed in the real world.

How anchors work

If you are new to using anchors, it is helpful to review world space and poses.
  • World space
    • Coordinate space in which the camera and objects are positioned
    • Camera and object positions are updated in world space from frame to frame
  • Pose
    • Represents an object’s position and orientation in world space
When you create an anchor, you use a pose that describes a position and orientation relative to the world space estimate for the current frame.
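For example, in Unity a pose can be represented with the built-in Pose struct, which bundles a world-space position and rotation. The following is a minimal sketch of that idea only; the class and method names are illustrative and not part of the AR-MOD API, and how the pose is handed to an anchor depends on the SDK.

using UnityEngine;

public static class PoseExample
{
    // Builds a pose one meter in front of the world origin, facing forward.
    // A pose is just a position plus an orientation in world space.
    public static Pose MakeExamplePose()
    {
        var position = new Vector3(0f, 0f, 1f); // one meter along +Z in world space
        var rotation = Quaternion.identity;     // aligned with the world axes
        return new Pose(position, rotation);
    }
}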

Placing your Anchor with AR-MOD

Now that we know what an anchor is and how it works, we need to create an AR-MOD AR Experience through PackageTools.
You must have the Unity Editor and the AR-MOD development kit installed.

Create Project

Open PackageTools via Tools -> AR-MOD -> PackageTools Editor. Next, right-click the Project list view and select the BLANK template to create a new AR-Experience project.
Create AR-MOD Project
Wait for Unity compilation to complete. You will see the result shown below.
New AR-MOD Project

Modify Properties

PackageTools has created the AR-MOD AR-Experience project for us. The Anchor project is not finished yet, though; we need to use the PackageTools Editor to set it up further.

What are Properties?

Properties set attributes of the current AR-MOD AR Experience project, such as the AR algorithm used, the script entry point, image quality, and so on.
  • Change the AR algorithm to Anchor.
  • Add a new properties block: Programmable Block.
  • Add a new properties block: Visualizer -> Plane, and enter the plane visualizer prefab name. Here we will use the built-in plane visualizer.
  • Set the ARWorld Scale to 10. This prevents virtual objects from blocking our view.

Visualizer

The visualizer tracks planes in the real world, which are used to place virtual objects into reality.

Why 10?

Because one unit in Unity represents one meter: a Unity unit in an AR project also corresponds to one meter in the real world.
If you are using Visual Scripting, you must change the Programmable Type to Visual Scripting.
Properties

Make Art

In this tutorial we will use the Unity Editor's primitive objects (e.g. Cube, Sphere).
Now we need to create a new scene in which to build our Anchor project's art assets.

How to create a new Scene?

You can open File -> New Scene -> Basic (Built-in) to create a new scene in the Unity Editor.
Create a new primitive object - Sphere.
Create Primitive Object
Make it a prefab and save it to our Anchor project.

Add virtual objects to our AR Experience package

Now go back to the PackageTools view and click the Contents tab to show the contents list view. Drag-and-drop our virtual object (Sphere) into the Contents view.

Add logic

Write Pure C# code

Open AnchorDemo -> Scripts -> Runtime -> AnchorDemoMainEntry.cs to write our logic.
Script
The AnchorDemoMainEntry.cs file is the main entry point of our Anchor Demo project's logic, much like the Main function in C/C++/C#.

Analyze the logic we are going to write

  1. Because the Anchor project will be packaged, we need to load our virtual object from the Anchor demo project package.
  2. Send a command to place our virtual object in the real world when the user taps the screen.
In the AnchorDemoMainEntry script we can find the ARMODAPI field. The ARMODAPI field helps us load the package and send the command to place our virtual object.
The entry script provides the following methods:
  • OnLoad: called before the first frame update, only if the script instance is enabled.
  • OnEvent: receives AR algorithm events from the AR-MOD SDK.
  • OnUpdate: called once per frame; the main workhorse function for frame updates.
  • ReleaseMemory: releases loaded objects.
Script
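Putting these together, the entry script has roughly the shape sketched below. This is only a skeleton based on the method list above; the exact signatures (in particular OnEvent's parameter) depend on the AR-MOD SDK version, so OnEvent is only indicated by a comment.

using com.Phantoms.ARMODAPI.Runtime;

namespace AnchorDemo
{
    public class AnchorDemoMainEntry
    {
        //Entry point into the AR-MOD API for this experience package
        internal static API ARMODAPI = new API(nameof(AnchorDemo));

        public void OnLoad()
        {
            //Called before the first frame update, only if the script instance is enabled
        }

        //OnEvent: receives AR algorithm events from the AR-MOD SDK (signature omitted here)

        public void OnUpdate()
        {
            //Called once per frame; the main workhorse for frame updates
        }

        public void ReleaseMemory()
        {
            //Release loaded objects when the experience ends
        }
    }
}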

Coding for Anchor Demo

Add a new field to store our virtual object. To load the virtual object from the package, we can call the ARMODAPI.LoadAssetAsync method.
LoadAssetAsync only loads the asset from the package into memory; it does not instantiate it.
using System;
using UnityEngine;
using System.Collections;
using com.Phantoms.ARMODAPI.Runtime;
using com.Phantoms.ActionNotification.Runtime;

namespace AnchorDemo
{
    public class AnchorDemoMainEntry
    {
        //ARMOD API
        internal static API ARMODAPI = new API(nameof(AnchorDemo));

        private GameObject virtualObject;

        public async void OnLoad()
        {
            //Use this for initialization. Please delete the function if it is not used
            virtualObject = await ARMODAPI.LoadAssetAsync<GameObject>("Sphere");
        }
    }
}
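As noted above, LoadAssetAsync only brings the prefab into memory. If you wanted to spawn it without anchoring it, you could instantiate it yourself with Unity's Object.Instantiate, as in the hypothetical helper below. This demo does not do that; placement is handled by StickObject in the next step.

using UnityEngine;

namespace AnchorDemo
{
    public static class SpawnHelper
    {
        //Hypothetical helper, not used in this tutorial: instantiates a loaded
        //prefab at the world origin. In the Anchor demo, StickObject performs
        //the placement instead.
        public static GameObject SpawnAtOrigin(GameObject loadedPrefab)
        {
            return Object.Instantiate(loadedPrefab, Vector3.zero, Quaternion.identity);
        }
    }
}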
Now we have the virtual object from the package. Next we need to detect when the user taps the screen and send the command to place the virtual object in the real world. We can use ARMODAPI.StickObject to place it.
Let's write some new code to do this.
using System;
using UnityEngine;
using System.Collections;
using com.Phantoms.ARMODAPI.Runtime;
using com.Phantoms.ActionNotification.Runtime;

namespace AnchorDemo
{
    public class AnchorDemoMainEntry
    {
        public void OnUpdate()
        {
            //Detect whether the user tapped the screen
            if (!Input.GetMouseButtonDown(0)) return;

            //Create the data we need to send
            var tmp_AnchorData = new AnchorNotificationData
            {
                Position = Input.mousePosition,
                TrackableType = AnchorNotificationData.TrackableTypeEnum.PlaneWithinBounds,
                ControllerTargetNode = virtualObject
            };

            //Send the command to place the object
            ARMODAPI.StickObject(tmp_AnchorData);
        }
    }
}

Full Code

using System;
using UnityEngine;
using System.Collections;
using com.Phantoms.ARMODAPI.Runtime;
using com.Phantoms.ActionNotification.Runtime;

namespace AnchorDemo
{
    public class AnchorDemoMainEntry
    {
        //ARMOD API
        internal static API ARMODAPI = new API(nameof(AnchorDemo));

        private GameObject virtualObject;

        public async void OnLoad()
        {
            //Use this for initialization. Please delete the function if it is not used
            virtualObject = await ARMODAPI.LoadAssetAsync<GameObject>("Sphere");
        }

        public void OnUpdate()
        {
            //Like Unity's Update method. Please delete the function if it is not used
            if (!Input.GetMouseButtonDown(0)) return;
            var tmp_AnchorData = new AnchorNotificationData
            {
                Position = Input.mousePosition,
                TrackableType = AnchorNotificationData.TrackableTypeEnum.PlaneWithinBounds,
                ControllerTargetNode = virtualObject
            };
            ARMODAPI.StickObject(tmp_AnchorData);
        }
    }
}

Write with Visual Scripting

Go to the Scripts folder, create a new empty folder, and rename it to Macros. Create a new Bolt flow state in this folder.
Create a new GameObject and rename it to MainFlow. Add a Flow Machine component to this GameObject. Drag-and-drop the flow we just created into the Flow Machine's Macro field. Make this GameObject a prefab.
Now we need to change the Programmable Type to Visual Scripting, and assign MainFlow to the main Visual Scripting field.
Write the logic in the AnchorDemo flow macro file.
Visual scripting
The Load Asset Async unit must be called from a coroutine. Here we use the Start method with the Coroutine option enabled.
Drag-and-drop the MainFlow prefab into the PackageTools Contents view.

Packing AR-Experience

Go back to PackageTools and click the Build tab. Select the corresponding platform for packaging.
Packing

Upload AR-Experience

Go to https://phantomsxr.com/dashboard/ to upload this project. For more information about the dashboard, see this article.
AnchorDemo.zip (19KB, binary): AnchorDemo AR-MOD source