Viewing a LAZ-encoded LiDAR scan using LASLoader


How to load a LiDAR scan from a LAZ file (the compressed LAS format) directly into a xeokit web viewer.
Click on the preview below to run the example. Scroll down to learn how it's made.




HTML


Listed below is the HTML for this example.


<!doctype html>
<html>
<head>
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>Viewing a LAZ-encoded LiDAR scan using LASLoader</title>
    <style>
        body {
            background-color: white;
            overflow: hidden;
            margin: 0;
            user-select: none;
        }

        #demoCanvas {
            width: 100%;
            height: 100%;
            position: absolute;
            background: white;
            border: 0;
        }
    </style>
</head>
<body>
<canvas id="demoCanvas"></canvas>
<script type="module" src="./index.js"></script>
</body>
</html>


JavaScript


Listed below is the JavaScript for this example, which we'll break down into steps.

1. Import the SDK from a bundle built for these examples

import * as xeokit from "../../js/xeokit-demo-bundle.js";
import {DemoHelper} from "../../js/DemoHelper.js";
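
This example imports everything through a pre-built demo bundle. In your own project you would import the equivalent classes from the SDK's npm packages instead. The package names in the sketch below are only an assumption based on the namespaces used in this example (xeokit.scene, xeokit.data and so on); check the SDK documentation for the actual import paths.

// Hypothetical npm-style imports; package names are assumed from the
// namespaces used in this example and may differ in the actual SDK.
import {Scene} from "@xeokit/scene";
import {Data} from "@xeokit/data";
import {WebGLRenderer} from "@xeokit/webglrenderer";
import {Viewer} from "@xeokit/viewer";
import {CameraControl} from "@xeokit/cameracontrol";
import {LASLoader} from "@xeokit/las";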

2. Create a LASLoader to load LAS/LAZ files

const lasLoader = new xeokit.las.LASLoader();

3. Create a Scene to hold geometry and materials

const scene = new xeokit.scene.Scene();

4. Create a Data to hold semantic data

const data = new xeokit.data.Data();

5. Create a WebGLRenderer to use the browser's WebGL graphics API for rendering

const renderer = new xeokit.webglrenderer.WebGLRenderer({});

6. Create a Viewer that will use the WebGLRenderer to draw the Scene

const viewer = new xeokit.viewer.Viewer({
    id: "demoViewer",
    scene,
    renderer
});

7. Give the Viewer a single View to render the Scene in our HTML canvas element

const view = viewer.createView({
    id: "demoView",
    elementId: "demoCanvas"
});
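
Further down in this example we guard against scene.createModel returning an SDKError. Assuming viewer.createView follows the same convention when something goes wrong (for example, if the canvas element can't be found), a similar defensive check is a reasonable sketch:

// Assumption: viewer.createView() returns an SDKError rather than throwing,
// mirroring the scene.createModel() check used later in this example.
if (view instanceof xeokit.core.SDKError) {
    console.error(`Error creating View: ${view.message}`);
}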

8. Arrange the View's Camera

view.camera.eye=[-3.2316780510204524, 14.79759350180506, -40.64618522195356];
view.camera.look=[0.840821626591669, 0.5704883070260383, -26.168275027069072];
view.camera.up=[0.18608313688175518, 0.7264616310194785, 0.6615334948623277];

9. It's often a good idea to set a large distance from the eye to the far clipping plane of the Camera's PerspectiveProjection, so that all the points fit inside the view volume instead of being abruptly clipped in the distance.

view.camera.perspectiveProjection.far = 10000000;

10. Configure the View's PointsMaterial, which controls the appearance of our LAZ model

view.pointsMaterial.pointSize = 2;
view.pointsMaterial.roundPoints = false;
view.pointsMaterial.perspectivePoints = true;
view.pointsMaterial.minPerspectivePointSize = 2;
view.pointsMaterial.maxPerspectivePointSize = 4;
view.pointsMaterial.filterIntensity = true;
view.pointsMaterial.minIntensity = 0;
view.pointsMaterial.maxIntensity = 100;
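
These values suit the scan used in this example. If a different point cloud renders sparsely, or not at all, a quick experiment is to relax or switch off the intensity filter using the same properties, for example:

// Optional tweak (not part of the listing above): switch off intensity
// filtering, or widen the accepted range, if points appear to be missing.
view.pointsMaterial.filterIntensity = false;
// ...or keep filtering but accept a wider intensity range:
// view.pointsMaterial.minIntensity = 0;
// view.pointsMaterial.maxIntensity = 255;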

11. Add a CameraControl to interactively control the View's Camera with keyboard, mouse and touch input

new xeokit.cameracontrol.CameraControl(view, {});

12. Create a SceneModel to hold our model's geometry and materials

const sceneModel = scene.createModel({
    id: "demoModel"
});

13. Ignore the DemoHelper; it's just a utility used by these example pages

const demoHelper = new DemoHelper({
    viewer,
    data
});
demoHelper.init()
    .then(() => {

14. Create a DataModel to hold semantic data for our model

        const dataModel = data.createModel({
            id: "demoModel"
        });
        if (sceneModel instanceof xeokit.core.SDKError) {
            console.error(`Error creating SceneModel: ${sceneModel.message}`);
        } else {

15. Use LASLoader to load a LAZ model into our SceneModel and DataModel

            fetch("../../models/Nalls-Pumpkin-Hill/laz/model.laz").then(response => {
                response
                    .arrayBuffer()
                    .then(fileData => {
                    lasLoader.load({
                        fileData,
                        sceneModel,
                        dataModel
                    }).then(() => {

16. Build the SceneModel and DataModel. The Scene and SceneModel will now contain a SceneObject to represent the LAS/LAZ point cloud, and the Data and DataModel will contain a corresponding DataObject.

                            dataModel.build();
                            sceneModel.build();
                            demoHelper.finished();
                        }).catch(message => {
                            console.error(`Error loading LAS: ${message}`);
                        });
                    });
            });
        }
    });
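
For reference, here is the load sequence from steps 15 and 16 rewritten with async/await instead of nested then() callbacks. It's just a sketch that uses the same calls as above, with minimal error handling:

// The load sequence from steps 15-16, rewritten with async/await.
async function loadLAZ() {
    const response = await fetch("../../models/Nalls-Pumpkin-Hill/laz/model.laz");
    const fileData = await response.arrayBuffer();
    try {
        await lasLoader.load({fileData, sceneModel, dataModel});

        // Build the SceneModel and DataModel, as in step 16
        dataModel.build();
        sceneModel.build();
        demoHelper.finished();
    } catch (message) {
        console.error(`Error loading LAS: ${message}`);
    }
}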