Augmented Reality (AR), a technology that blends digital information with real-world scenes, has developed rapidly in recent years and proven its value across a range of applications. In education, for example, teachers can use AR scenes to help students grasp abstract concepts vividly and intuitively; in tourism, AR can recreate historical and cultural scenes and provide virtual navigation, giving visitors a more immersive, interactive experience.
However, building AR features into an application is never easy: it demands high development costs and specialized technical talent. For this reason, the HarmonyOS SDK's AR Engine service (AR Engine) puts advanced AR technology in the hands of application developers, lowering both the cost and the technical threshold.
After integrating the AR Engine capability, developers need only six development steps to fuse the virtual with the real by placing virtual objects on planes detected in the real world. This capability suits scenarios such as virtual furniture placement and digital showrooms, offering users a new experience that combines the virtual and the real.
Business process
The business process of AR placement consists of three main parts: opening the application, recognizing and displaying planes, and placing the virtual object.
In the first part, the user opens the app and the app requests camera permission from the user. If the user declines the authorization, the feature cannot be used.
In the second part, AR Engine recognizes planes and displays them. This covers initializing AR Engine, updating the ARFrame object, obtaining the planes, drawing them, and displaying the preview image.
In the third part, the user taps the screen, hit detection finds a point of interest in the real environment, an anchor is created at that point, and finally a virtual object is drawn at the anchor's position and displayed on the preview image.
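The per-frame half of this flow can be sketched with the AR Engine C API calls covered in the development steps below. This is an illustrative outline only: error handling and the drawing code are omitted, and the function name is this overview's own.

#include "ar/ar_engine_core.h"

// Illustrative per-frame outline of parts two and three (details in the steps below).
void OnDrawFrame(AREngine_ARSession *arSession, AREngine_ARFrame *arFrame)
{
    // Part 2: update the frame, then enumerate and draw the detected planes.
    HMS_AREngine_ARSession_Update(arSession, arFrame);
    AREngine_ARTrackableList *planeList = nullptr;
    HMS_AREngine_ARTrackableList_Create(arSession, &planeList);
    HMS_AREngine_ARSession_GetAllTrackables(arSession, ARENGINE_TRACKABLE_PLANE, planeList);
    // ... draw each plane that is in the tracking state (step 4) ...
    HMS_AREngine_ARTrackableList_Destroy(planeList);

    // Part 3: on a tap, hit-test the screen point, anchor the hit, and draw the model there.
    // ... HMS_AREngine_ARFrame_HitTest (step 5), HMS_AREngine_ARHitResult_AcquireNewAnchor (step 6) ...
}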
Development steps
Before starting the specific development steps for AR object placement, the developer needs to create a Native C++ project, declare the ArkTS interfaces, and request the required permissions (camera permission in particular).
1. Create the UI
After the preparations, a UI needs to be created to display the camera preview and to trigger the drawing of each frame at a fixed interval. In the reconstructed snippet below, the arEngineDemo methods (init, start, stop, show, hide, update) are this sample's own native (NAPI) exports from the entry library, and elided values such as colors and the gesture direction are restored with representative defaults.
import { Logger } from '../utils/Logger';
// The module name must match the native library loaded by the XComponent below (libraryname: 'entry').
import arEngineDemo from 'libentry.so';
import { resourceManager } from '@kit.LocalizationKit';
import { display } from '@kit.ArkUI';

@Entry
@Component
struct ArWorld {
  private xcomponentId = 'ArWorld';
  private panOption: PanGestureOptions = new PanGestureOptions({ direction: PanDirection.All });
  private resMgr: resourceManager.ResourceManager = getContext(this).resourceManager;
  private interval: number = -1;
  private isUpdate: boolean = true;

  aboutToAppear() {
    Logger.info('aboutToAppear ' + this.xcomponentId);
    arEngineDemo.init(this.resMgr);
    arEngineDemo.show(this.xcomponentId);
    display.on('foldStatusChange', (foldStatus: display.FoldStatus) => {
      Logger.info('foldStatusChange display on ' + foldStatus);
      if (foldStatus === display.FoldStatus.FOLD_STATUS_EXPANDED
        || foldStatus === display.FoldStatus.FOLD_STATUS_FOLDED) {
        arEngineDemo.stop(this.xcomponentId);
        arEngineDemo.hide(this.xcomponentId);
        // Call the native start interface to create the ARSession.
        arEngineDemo.start(this.xcomponentId);
        arEngineDemo.show(this.xcomponentId);
      }
    })
  }

  aboutToDisappear() {
    Logger.info('aboutToDisappear ' + this.xcomponentId);
    arEngineDemo.stop(this.xcomponentId);
  }

  onPageShow() {
    this.isUpdate = true;
    Logger.info('onPageShow ' + this.xcomponentId);
    arEngineDemo.show(this.xcomponentId);
  }

  onPageHide() {
    Logger.info('onPageHide ' + this.xcomponentId);
    this.isUpdate = false;
    arEngineDemo.hide(this.xcomponentId);
  }

  build() {
    Column() {
      XComponent({ id: this.xcomponentId, type: 'surface', libraryname: 'entry' })
        .onLoad(() => {
          Logger.info('XComponent onLoad ' + this.xcomponentId);
          this.interval = setInterval(() => {
            if (this.isUpdate) {
              // Call the native update interface to refresh AR Engine's computation result for each frame.
              arEngineDemo.update(this.xcomponentId);
            }
          }, 33); // Cap the frame rate at about 30 fps (one frame every 33 ms).
        })
        .onDestroy(() => {
          Logger.info('XComponent onDestroy ' + this.xcomponentId);
          clearInterval(this.interval);
        })
        .width('100%')
        .height('100%')
        .backgroundColor(Color.Black)
    }
    .justifyContent(FlexAlign.Center)
    .alignItems(HorizontalAlign.Center)
    .backgroundColor(Color.White)
    .borderRadius(24)
    .width('100%')
    .height('100%')
  }
}
2. Introduce AR Engine
After creating the UI, include the AR Engine header file in the native source code.
#include "ar/ar_engine_core.h"
Then, in CMakeLists.txt, locate and link the AR Engine NDK library (the library file name libarengine_ndk.z.so below is an assumption based on the current SDK; verify it against your SDK version).
find_library(
    # Sets the name of the path variable.
    arengine-lib
    # Specifies the name of the NDK library that
    # you want CMake to locate.
    libarengine_ndk.z.so
)
target_link_libraries(entry PUBLIC
    ${arengine-lib}
)
3. Create the AR scene
First, configure the AR session and preview size.
// arSession is the AREngine_ARSession object created during session initialization.
// [Optional] Create a configuration object with a reasonable default configuration.
AREngine_ARConfig *arConfig = nullptr;
HMS_AREngine_ARConfig_Create(arSession, &arConfig);
// [Optional] Configure the AREngine_ARSession session.
HMS_AREngine_ARSession_Configure(arSession, arConfig);
// [Optional] Free the memory of the specified configuration object.
HMS_AREngine_ARConfig_Destroy(arConfig);
// Create a new AREngine_ARFrame object.
HMS_AREngine_ARFrame_Create(arSession, &arFrame);
// The width and height of the preview area; when the preview is displayed with an XComponent,
// these are the width and height of the XComponent. Any inconsistency results in an incorrect camera preview.
int32_t width = 1440;
int32_t height = 1080;
// Set the width and height of the display (in pixels); displayRotation is the current screen rotation.
HMS_AREngine_ARSession_SetDisplayGeometry(arSession, displayRotation, width, height);
Get the texture ID through the OpenGL interface.
// Generate a texture ID through the OpenGL interface.
GLuint textureId = 0;
glGenTextures(1, &textureId);
Set the OpenGL texture used to store the camera preview stream data.
// Set the OpenGL texture that stores the camera preview stream data.
HMS_AREngine_ARSession_SetCameraGLTexture(arSession, textureId);
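Camera preview streams are usually consumed through an external OES texture. Below is a minimal sketch of creating one, assuming the GL_TEXTURE_EXTERNAL_OES target commonly used for camera images; check the AR Engine sample for the exact target and parameters.

#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

// Create a texture suitable for receiving the camera preview stream.
GLuint CreateCameraTexture()
{
    GLuint textureId = 0;
    glGenTextures(1, &textureId);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, textureId);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, 0);
    return textureId;
}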
4. Get the plane
Call the HMS_AREngine_ARSession_Update function to update the current AREngine_ARFrame object.
// Get the frame data AREngine_ARFrame.
HMS_AREngine_ARSession_Update(arSession, arFrame);
Get the camera's view matrix and the camera's projection matrix for subsequent drawing.
// The camera object AREngine_ARCamera can be obtained from the AREngine_ARFrame object.
AREngine_ARCamera *arCamera = nullptr;
HMS_AREngine_ARFrame_AcquireCamera(arSession, arFrame, &arCamera);
// Get the view matrix for the camera in the latest frame.
HMS_AREngine_ARCamera_GetViewMatrix(arSession, arCamera, glm::value_ptr(*viewMat), 16);
// Get the projection matrix used to render virtual content on top of the camera image; it can be used
// to convert from the camera coordinate system to the clip coordinate system. Near plane 0.1, far plane 100.
HMS_AREngine_ARCamera_GetProjectionMatrix(arSession, arCamera, 0.1f, 100.0f, glm::value_ptr(*projectionMat), 16);
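The view and projection matrices obtained here are typically combined with each object's model matrix into a single MVP matrix for the vertex shader. A minimal glm sketch follows; the function and variable names are this example's own.

#include <glm/glm.hpp>

// Compose the matrix that takes model-space points to clip space: clip = P * V * M.
// projectionMat and viewMat are filled by the AR Engine calls above; modelMat comes
// from a plane or anchor pose (see the later steps). glm matrices are column-major,
// matching the column-major arrays returned by the AR Engine API.
glm::mat4 ComputeMvp(const glm::mat4 &projectionMat, const glm::mat4 &viewMat, const glm::mat4 &modelMat)
{
    return projectionMat * viewMat * modelMat;
}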
Call the HMS_AREngine_ARSession_GetAllTrackables function to get the list of planes.
// Get the list of currently detected planes.
AREngine_ARTrackableList *planeList = nullptr;
// Create a trackable object list.
HMS_AREngine_ARTrackableList_Create(arSession, &planeList);
// Get all trackable objects of the specified type ARENGINE_TRACKABLE_PLANE.
AREngine_ARTrackableType planeTrackedType = ARENGINE_TRACKABLE_PLANE;
HMS_AREngine_ARSession_GetAllTrackables(arSession, planeTrackedType, planeList);
int32_t planeListSize = 0;
// Get the number of trackable objects in this list.
HMS_AREngine_ARTrackableList_GetSize(arSession, planeList, &planeListSize);
mPlaneCount = planeListSize;
for (int32_t i = 0; i < planeListSize; ++i) {
    AREngine_ARTrackable *arTrackable = nullptr;
    // Get the object with the specified index from the trackable list.
    HMS_AREngine_ARTrackableList_AcquireItem(arSession, planeList, i, &arTrackable);
    AREngine_ARPlane *arPlane = reinterpret_cast<AREngine_ARPlane*>(arTrackable);
    // Get the tracking state of the current trackable object.
    AREngine_ARTrackingState outTrackingState;
    HMS_AREngine_ARTrackable_GetTrackingState(arSession, arTrackable, &outTrackingState);
    AREngine_ARPlane *subsumePlane = nullptr;
    // Get the plane's parent plane (created when one plane is merged into another); NULL if there is none.
    HMS_AREngine_ARPlane_AcquireSubsumedBy(arSession, arPlane, &subsumePlane);
    if (subsumePlane != nullptr) {
        HMS_AREngine_ARTrackable_Release(reinterpret_cast<AREngine_ARTrackable*>(subsumePlane));
        // If the current plane has a parent plane, do not display it; otherwise the plane would be drawn twice.
        continue;
    }
    // Draw only when the tracking state is ARENGINE_TRACKING_STATE_TRACKING.
    if (outTrackingState != AREngine_ARTrackingState::ARENGINE_TRACKING_STATE_TRACKING) {
        continue;
    }
    // Perform the plane drawing here.
}
HMS_AREngine_ARTrackableList_Destroy(planeList);
planeList = nullptr;
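Each trackable acquired with HMS_AREngine_ARTrackableList_AcquireItem should be released once it is no longer needed, otherwise the handles leak across frames. A short sketch using the release call already shown above:

// After drawing (or skipping) a plane, release the acquired trackable handle.
HMS_AREngine_ARTrackable_Release(arTrackable);
arTrackable = nullptr;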
Call the HMS_AREngine_ARPlane_GetPolygon function to get an array of 2D vertex coordinates of the plane for drawing the plane boundary.
// Get the size of the 2D vertex array of the detected plane.
int32_t polygonLength = 0;
HMS_AREngine_ARPlane_GetPolygonSize(session, plane, &polygonLength);
// Get the 2D vertex array of the detected plane, in the format [x1, z1, x2, z2, ...].
const int32_t verticesSize = polygonLength / 2;
std::vector<glm::vec2> raw_vertices(verticesSize);
HMS_AREngine_ARPlane_GetPolygon(session, plane, glm::value_ptr(raw_vertices.front()), polygonLength);
// Vertex coordinates in the plane's local coordinate system.
for (int32_t i = 0; i < verticesSize; ++i) {
    vertices.emplace_back(raw_vertices[i].x, raw_vertices[i].y, 0.75f);
}
Converts the 2D vertex coordinates of the plane to the world coordinate system and draws the plane.
// Get the pose information for the translation from the plane's local coordinate system to the world coordinate system.
AREngine_ARPose *scopedArPose = nullptr;
HMS_AREngine_ARPose_Create(session, nullptr, 0, &scopedArPose);
HMS_AREngine_ARPlane_GetCenterPose(session, plane, scopedArPose);
// Convert the pose data into a 4x4 matrix; outMatrixColMajor4x4 is the output array, stored in column-major order.
// Multiplying this matrix with points in the local coordinate system converts them from local to world coordinates.
HMS_AREngine_ARPose_GetMatrix(session, scopedArPose, glm::value_ptr(modelMat), 16);
HMS_AREngine_ARPose_Destroy(scopedArPose);
// Construct the data needed to draw the rendering plane.
// Generate triangles.
for (int i = 1; i < verticesSize - 1; ++i) {
triangles.push_back(0);
triangles.push_back(i);
triangles.push_back(i + 1);
}
// Generate the plane's boundary line indices.
for (int i = 0; i < verticesSize; ++i) {
lines.push_back(i);
}
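The triangles array indexes a triangle fan over the polygon vertices, and lines indexes the boundary. Below is a minimal OpenGL ES sketch of drawing both from client-side arrays, assuming a bound shader program whose position attribute is at location 0; the function name and index types are illustrative.

#include <GLES2/gl2.h>
#include <vector>
#include <glm/glm.hpp>

// Draw the plane interior and outline from the vertex and index arrays built above.
void DrawPlane(const std::vector<glm::vec3> &vertices,
               const std::vector<GLushort> &triangles,
               const std::vector<GLushort> &lines)
{
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(glm::vec3), vertices.data());
    // Interior: a triangle fan encoded as index triples {0, i, i + 1}.
    glDrawElements(GL_TRIANGLES, static_cast<GLsizei>(triangles.size()), GL_UNSIGNED_SHORT, triangles.data());
    // Boundary: one closed loop over all polygon vertices.
    glDrawElements(GL_LINE_LOOP, static_cast<GLsizei>(lines.size()), GL_UNSIGNED_SHORT, lines.data());
    glDisableVertexAttribArray(0);
}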
5. Tap the screen
After the user taps the screen, the screen coordinates are obtained from the touch event.
// Required header: native_interface_xcomponent.h
#include <ace/xcomponent/native_interface_xcomponent.h>
float pixelX = 0.0f;
float pixelY = 0.0f;
int32_t ret = OH_NativeXComponent_GetTouchEvent(component, window, &mTouchEvent);
if (ret == OH_NATIVEXCOMPONENT_RESULT_SUCCESS) {
    if (mTouchEvent.type == OH_NATIVEXCOMPONENT_DOWN) {
        pixelX = mTouchEvent.touchPoints[0].x;
        pixelY = mTouchEvent.touchPoints[0].y;
    } else {
        return;
    }
}
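The component and window handles in this snippet are provided by ArkUI inside the XComponent's touch callback. Below is a minimal sketch of registering such a callback with the XComponent NDK; the OnDispatchTouchEvent, g_touchEvent, and HandleTap names are this example's own.

#include <ace/xcomponent/native_interface_xcomponent.h>

static OH_NativeXComponent_Callback g_xcomponentCallback;
static OH_NativeXComponent_TouchEvent g_touchEvent;

// Invoked by ArkUI whenever a touch event arrives on the XComponent surface.
static void OnDispatchTouchEvent(OH_NativeXComponent *component, void *window)
{
    if (OH_NativeXComponent_GetTouchEvent(component, window, &g_touchEvent) == OH_NATIVEXCOMPONENT_RESULT_SUCCESS
        && g_touchEvent.type == OH_NATIVEXCOMPONENT_DOWN) {
        // Forward the tapped coordinates to the hit-test logic in the next step.
        // HandleTap(g_touchEvent.touchPoints[0].x, g_touchEvent.touchPoints[0].y); // hypothetical handler
    }
}

// Register the touch callback once the native XComponent instance is available.
void RegisterTouchCallback(OH_NativeXComponent *component)
{
    g_xcomponentCallback.DispatchTouchEvent = OnDispatchTouchEvent;
    OH_NativeXComponent_RegisterCallback(component, &g_xcomponentCallback);
}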
Call the HMS_AREngine_ARFrame_HitTest function to perform hit detection; the results are stored in a hit detection result list.
// Create a list of hit detection result objects. arSession is the session object created in the Create AR Scene step.
AREngine_ARHitResultList *hitResultList = nullptr;
HMS_AREngine_ARHitResultList_Create(arSession, &hitResultList);
// Get the list of hit detection result objects. arFrame is the frame object created in the "Create the AR scene" step; pixelX/pixelY are the screen point coordinates.
HMS_AREngine_ARFrame_HitTest(arSession, arFrame, pixelX, pixelY, hitResultList);
6. Place the virtual object
Call the HMS_AREngine_ARHitResultList_GetItem function to traverse the hit detection result list and obtain the trackable object that was hit.
// Create the hit detection result object.
AREngine_ARHitResult *arHit = nullptr;
HMS_AREngine_ARHitResult_Create(arSession, &arHit);
// Get the first hit detection result object.
HMS_AREngine_ARHitResultList_GetItem(arSession, hitResultList, 0, arHit);
// Get the trackable object that was hit.
AREngine_ARTrackable *arHitTrackable = nullptr;
HMS_AREngine_ARHitResult_AcquireTrackable(arSession, arHit, &arHitTrackable);
Determine whether the hit result lies inside the plane's polygon.
AREngine_ARTrackableType ar_trackable_type = ARENGINE_TRACKABLE_INVALID;
HMS_AREngine_ARTrackable_GetType(arSession, arHitTrackable, &ar_trackable_type);
if (ARENGINE_TRACKABLE_PLANE == ar_trackable_type) {
    AREngine_ARPose *arPose = nullptr;
    HMS_AREngine_ARPose_Create(arSession, nullptr, 0, &arPose);
    HMS_AREngine_ARHitResult_GetHitPose(arSession, arHit, arPose);
    // Determine whether the pose is within the plane's polygon. 0 means out of range, non-0 means in range.
    AREngine_ARPlane *arPlane = reinterpret_cast<AREngine_ARPlane*>(arHitTrackable);
    int32_t inPolygon = 0;
    HMS_AREngine_ARPlane_IsPoseInPolygon(arSession, arPlane, arPose, &inPolygon);
    HMS_AREngine_ARPose_Destroy(arPose);
    if (!inPolygon) {
        // Skip the current hit if its pose is not inside the plane.
        continue;
    }
}
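Putting the two checks together, a common pattern is to walk the whole hit result list and keep the first plane hit whose pose lies inside the polygon. A sketch of that loop follows; the list-size query is assumed to mirror HMS_AREngine_ARTrackableList_GetSize, so verify the name HMS_AREngine_ARHitResultList_GetSize against the AR Engine API reference.

// Assumption: HMS_AREngine_ARHitResultList_GetSize parallels the trackable-list API above.
int32_t hitCount = 0;
HMS_AREngine_ARHitResultList_GetSize(arSession, hitResultList, &hitCount);
for (int32_t i = 0; i < hitCount; ++i) {
    HMS_AREngine_ARHitResultList_GetItem(arSession, hitResultList, i, arHit);
    HMS_AREngine_ARHitResult_AcquireTrackable(arSession, arHit, &arHitTrackable);
    // Apply the ARENGINE_TRACKABLE_PLANE type check and the in-polygon check shown above;
    // stop at the first hit that passes and continue with anchor creation below.
}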
Create a new anchor at the hit result location and place the virtual model based on this anchor.
// Create a new anchor at the hit location.
AREngine_ARAnchor *anchor = nullptr;
HMS_AREngine_ARHitResult_AcquireNewAnchor(arSession, arHit, &anchor);
// Check the tracking state of the anchor.
AREngine_ARTrackingState trackingState = ARENGINE_TRACKING_STATE_STOPPED;
HMS_AREngine_ARAnchor_GetTrackingState(arSession, anchor, &trackingState);
if (trackingState != ARENGINE_TRACKING_STATE_TRACKING) {
    HMS_AREngine_ARAnchor_Release(anchor);
    return;
}
Call the HMS_AREngine_ARAnchor_GetPose function to get the anchor's pose and draw the virtual model based on that pose.
// Get the pose of the anchor point.
AREngine_ARPose *pose = nullptr;
HMS_AREngine_ARPose_Create(arSession, nullptr, 0, &pose);
HMS_AREngine_ARAnchor_GetPose(arSession, anchor, pose);
// Convert the pose data into a 4x4 matrix modelMat.
HMS_AREngine_ARPose_GetMatrix(arSession, pose, glm::value_ptr(modelMat), 16);
HMS_AREngine_ARPose_Destroy(pose);
// Draw the virtual model using modelMat together with the view and projection matrices from step 4.
Learn more >>
Visit the AR Engine official website
Get the AR Engine development guide