Examples & Concepts

A loosely connected collection of examples and concepts useful for working with the Ouster SDK. If you are just starting, please see Quick Start with Ouster Python SDK.

For convenience, throughout the examples and concepts we will use pcap_path and metadata_path to refer to the paths to an Ouster pcap file and its associated metadata file. The pictures below are taken from the OS1 sample data.

Obtaining Sensor Metadata

Ouster sensors require metadata to interpret the readings of the sensor. Represented by the object SensorInfo, its fields include configuration parameters such as lidar_mode and sensor intrinsics like beam_azimuth_angles.

When you work with a sensor, the client will automatically fetch the metadata. Recorded pcaps, however, must always be accompanied by a json file containing the metadata of the sensor as it was running when the data was recorded.
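For reference, loading a saved metadata file back into a SensorInfo takes just a couple of lines; a minimal sketch, assuming metadata_path points at such a json file and the usual `from ouster import client` import:

from ouster import client

# load a previously saved metadata file into a SensorInfo
with open(metadata_path, 'r') as f:
    info = client.SensorInfo(f.read())

# configuration parameters and sensor intrinsics are available as attributes
print(f"lidar mode:          {info.mode}")
print(f"beam azimuth angles: {info.beam_azimuth_angles[:4]} ...")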

Since it’s crucial to save the correct metadata file, let’s see how we can get that from a sensor. Try running the following example:

$ python3 -m ouster.sdk.examples.client $SENSOR_HOSTNAME fetch-metadata

And now let’s look inside the example we just ran:

with closing(client.Sensor(hostname)) as source:
    # print some useful info from the sensor metadata
    print("Retrieved metadata:")
    print(f"  serial no:        {source.metadata.sn}")
    print(f"  firmware version: {source.metadata.fw_rev}")
    print(f"  product line:     {source.metadata.prod_line}")
    print(f"  lidar mode:       {source.metadata.mode}")
    print(f"Writing to: {hostname}.json")

    # write metadata to disk
    source.write_metadata(f"{hostname}.json")

Seems simple enough!

Working Directly with Packets

The PacketSource is the basic interface for packets from the sensor. Working with packets directly can be advantageous when latency is a concern, or when you wish to examine the data packet by packet, e.g., to inspect packet timestamps.

Let’s make a PacketSource from our sample data using pcap.Pcap:

with open(metadata_path, 'r') as f:
    metadata = client.SensorInfo(f.read())

source = pcap.Pcap(pcap_path, metadata)

Now we can read packets from source with the following code:

for packet in source:
    if isinstance(packet, client.LidarPacket):
        # Now we can process the LidarPacket. In this case, we access
        # the encoder_counts, timestamps, and ranges
        encoder_counts = packet.header(client.ColHeader.ENCODER_COUNT)
        timestamps = packet.header(client.ColHeader.TIMESTAMP)
        ranges = packet.field(client.ChanField.RANGE)
        print(f'  encoder counts = {encoder_counts.shape}')
        print(f'  timestamps = {timestamps.shape}')
        print(f'  ranges = {ranges.shape}')

    elif isinstance(packet, client.ImuPacket):
        # and access ImuPacket content
        print(f'  acceleration = {packet.accel}')
        print(f'  angular_velocity = {packet.angular_vel}')
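Since one reason to work at the packet level is to inspect timing, here is a small sketch that gathers the per-column timestamps from the lidar packets and prints the span of time they cover. It re-creates `source` first, since the loop above consumed it, and assumes the timestamps are reported in nanoseconds:

import numpy as np

source = pcap.Pcap(pcap_path, metadata)

lidar_ts = []
for packet in source:
    if isinstance(packet, client.LidarPacket):
        # per-column timestamps of each measurement block
        lidar_ts.append(packet.header(client.ColHeader.TIMESTAMP))

ts = np.concatenate(lidar_ts)
# assuming nanosecond timestamps
print(f"lidar packets span {(ts.max() - ts.min()) / 1e9:.3f} seconds")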

Staggered vs Destaggered 2D Representations

The default staggered representation of a LidarScan has columns which pertain to measurements taken at a single timestamp. For a more natural 2D image, we destagger the field with the client.destagger() function.

Let’s take a look at a typical staggered representation:

[Figure: lidar_scan_staggered.png] LidarScan RANGE field visualized with matplotlib.pyplot.imshow() and a simple gray colormap.

This staggered representation definitely doesn’t look like a normal image, which shouldn’t surprise us since its columns pertain to timestamps instead of azimuth angles.

Let’s destagger the image, changing the columns to represent the azimuth angles:

import matplotlib.pyplot as plt
from more_itertools import nth

# ... `metadata` and `source` variables created as in the previous examples

scans = client.Scans(source)

# iterate `scans` and get the 84th LidarScan
scan = nth(scans, 84)
ranges = scan.field(client.ChanField.RANGE)

# destagger the range field; `metadata` is needed to get the sensor
# intrinsics required to correctly transform the data
ranges_destaggered = client.destagger(source.metadata, ranges)

plt.imshow(ranges_destaggered, cmap='gray', resample=False)

This should give the scene below, of which we have magnified two patches for better visibility.

[Figure: lidar_scan_destaggered.png] Destaggered LidarScan RANGE field.

After destaggering, we can see the scene contains a man on a bicycle, a few cars, and many trees. This image now makes visual sense, and we can easily use this data in common visual task pipelines.
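As a tiny illustration of dropping this data into an ordinary image pipeline, the sketch below scales the destaggered range image to [0, 1] and writes it out as a grayscale PNG; it assumes ranges_destaggered from the snippet above:

import numpy as np
import matplotlib.pyplot as plt

# normalize range to [0, 1] and save as an ordinary grayscale image
img = ranges_destaggered.astype(np.float64)
img /= max(img.max(), 1)  # guard against division by zero
plt.imsave('range_image.png', img, cmap='gray')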

Note

By the way, you can view this particular scene in both 2D and 3D at Ouster’s Web Slam! Use your mouse to click and move the 3D scene, and the listed controls to rotate between different destaggered image views. The video at the bottom shows the registered point clouds of our internal SLAM algorithm.

Projecting into Cartesian Coordinates

To facilitate working with 3D points, you can call client.XYZLut() to create a function which will project a LidarScan into Cartesian coordinates using a precomputed lookup table. The result of calling this function will be a point cloud represented as a numpy array. See the API documentation for client.XYZLut() for more details.

import matplotlib.pyplot as plt
import numpy as np

# transform data to 3d points and graph
xyzlut = client.XYZLut(metadata)
xyz = xyzlut(scan)

fig = plt.figure()
ax = plt.axes(projection='3d')
[x, y, z] = [c.flatten() for c in np.dsplit(xyz, 3)]
ax.scatter(x, y, z, c=z / max(z), s=0.2)
plt.show()

If you have a sensor, you can run this code with one of our examples:

$ python3 -m ouster.sdk.examples.client $SENSOR_HOSTNAME plot-xyz-points

That should open a 3D plot of a single scan of your location taken just now by your sensor. You should be able to recognize the contours of the scene around you.

If you don’t have a sensor, you can run this code with our pcap examples:

$ python3 -m ouster.sdk.examples.pcap OS1_128.pcap OS1_2048x10_128.json plot-xyz-points --scan-num 84

[Figure: lidar_scan_xyz_84.png] Point cloud from OS1 sample data (scan 84). Points colored by SIGNAL value.

For details, check the source code of the example: examples.pcap.pcap_display_xyz_points().

Also check out a more powerful way of visualizing XYZ 3D points in Visualization with Open3d below.

Working with 2D and 3D Representations Simultaneously

The direct correlation between 2D and 3D representations in an Ouster sensor provides a powerful framework for working with the data. As an easy example, you might decide you want to look at only the 3D points within a certain range and from certain azimuth angles.

# obtain destaggered range
range_destaggered = client.destagger(metadata,
                                     scan.field(client.ChanField.RANGE))

# obtain destaggered xyz representation
xyzlut = client.XYZLut(metadata)
xyz_destaggered = client.destagger(metadata, xyzlut(scan))

# select only points with more than min range using the range data
xyz_filtered = xyz_destaggered * (range_destaggered[:, :, np.newaxis] >
                                  (range_min * 1000))

# get first 3/4 of scan
to_col = math.floor(metadata.mode.cols * 3 / 4)
xyz_filtered = xyz_filtered[:, 0:to_col, :]

Since we’d like to filter on azimuth angles, we first destagger both the 2D and 3D points, so that the columns of the HxW representation correspond to azimuth angle, not timestamp. (See Staggered vs Destaggered 2D Representations for an explanation of destaggering.)

Then we filter the 3D points xyz_destaggered by comparing the range measurement to range_min, which we can do because there is a 1:1 correspondence between the rows and columns of the destaggered representations xyz_destaggered and range_destaggered. (Similarly, there would be a 1:1 correspondence between the staggered representations xyz and range, where the columns correspond to timestamps.)
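To make that correspondence concrete, the same (row, column) index addresses both representations; a minimal sketch using the variables above (the RANGE field is in millimeters, XYZ points in meters):

# an arbitrary pixel, chosen purely for illustration
row, col = 60, 1000
print(f"range at pixel: {range_destaggered[row, col]} mm")
print(f"xyz at pixel:   {xyz_destaggered[row, col]} m")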

Finally, we select only the azimuth columns we’re interested in. In this case, we’ve arbitrarily chosen the first 270 degrees of rotation.
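If you would rather specify the cut in degrees than as a fraction of the scan, the column bound follows directly from the azimuth span; a small sketch with a hypothetical azimuth_deg value:

import math

# the 3/4 cut above, expressed as an azimuth span in degrees
azimuth_deg = 270
to_col = math.floor(metadata.mode.cols * azimuth_deg / 360)
xyz_filtered = xyz_filtered[:, 0:to_col, :]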

If you have a sensor, you can run this code with an example:

$ python3 -m ouster.sdk.examples.client $SENSOR_HOSTNAME filter-3d-by-range-and-azimuth

Streaming Live Data

Instead of working with a recorded dataset or a few captured frames of data, let’s see if we can get a live feed from the sensor:

$ python3 -m ouster.sdk.examples.client $SENSOR_HOSTNAME live-plot-signal

This should give you a live feed from your sensor that looks like a black and white moving image. Try waving your hand or moving around to find yourself within the image!

So how did we do that?

# establish sensor connection
with closing(client.Scans.stream(hostname, lidar_port,
                                 complete=False)) as stream:
    show = True
    while show:
        for scan in stream:
            # uncomment if you'd like to see frame id printed
            # print("frame id: {} ".format(scan.frame_id))
            signal = client.destagger(stream.metadata,
                                      scan.field(client.ChanField.SIGNAL))
            signal = (signal / np.max(signal) * 255).astype(np.uint8)
            cv2.imshow("scaled signal", signal)
            key = cv2.waitKey(1) & 0xFF
            # exit on ESC
            if key == 27:
                show = False
                break

Notice that instead of taking a sample as we did in the previous example, we used Scans.stream(), which allows for a continuous live data stream. We close the stream when we are finished, hence the use of closing() on the first line.

To exit the visualization, you can use ESC.

Recording Sensor Data

It’s easy to record data to a pcap file from a sensor programmatically. Let’s try it first with the following example:

$ python3 -m ouster.sdk.examples.client $SENSOR_HOSTNAME record-pcap

This will capture lidar packets (client.LidarPacket) and IMU packets (client.ImuPacket) for 10 seconds and store the pcap file, along with the metadata json file, in the current directory.

The source code of the example is below:

# connect to sensor and record lidar/imu packets
with closing(client.Sensor(hostname, lidar_port, imu_port,
                           buf_size=640)) as source:

    # make a descriptive filename for metadata/pcap files
    time_part = datetime.now().strftime("%Y%m%d_%H%M%S")
    meta = source.metadata
    fname_base = f"{meta.prod_line}_{meta.sn}_{meta.mode}_{time_part}"

    print(f"Saving sensor metadata to: {fname_base}.json")
    source.write_metadata(f"{fname_base}.json")

    print(f"Writing to: {fname_base}.pcap (Ctrl-C to stop early)")
    source_it = time_limited(n_seconds, source)
    n_packets = pcap.record(source_it, f"{fname_base}.pcap")

    print(f"Captured {n_packets} packets")

Good! The resulting pcap and json files can be used with any examples in the examples.pcap module.
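For example, you can read the files you just recorded back the same way we read the sample data earlier; a sketch, assuming the fname_base produced above:

from ouster import client, pcap

with open(f"{fname_base}.json", 'r') as f:
    metadata = client.SensorInfo(f.read())

source = pcap.Pcap(f"{fname_base}.pcap", metadata)
for scan in client.Scans(source):
    print(f"frame id: {scan.frame_id}")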

PCAP Live Data Preview

We can easily view the data that was recorded in the previous section. Building on the example from Streaming Live Data above, we have a pcap viewer with an additional stagger/destagger toggle on the D key and pause on the SPACE key (source code: examples.pcap.pcap_2d_viewer()). To run it, try the following command:

$ python3 -m ouster.sdk.examples.pcap OS1_128.pcap OS1_2048x10_128.json 2d-viewer

Or substitute the example data with the pcap and json files that you just recorded.

Converting PCAPs to CSV

Sometimes we want to get a point cloud (XYZ + other fields) as a CSV file for further analysis with other tools.

To convert the first 5 scans of our sample data from a pcap file, you can try:

$ python3 -m ouster.sdk.examples.pcap OS1_128.pcap OS1_2048x10_128.json pcap-to-csv --scan-num 5

The source code of the example is below:

# precompute xyzlut to save computation in a loop
xyzlut = client.XYZLut(metadata)

# create an iterator of LidarScans from pcap and bound it if num is specified
scans = iter(client.Scans(source))
if num:
    scans = islice(scans, num)

for idx, scan in enumerate(scans):

    # copy per-column timestamps for each channel
    col_timestamps = scan.header(client.ColHeader.TIMESTAMP)
    timestamps = np.tile(col_timestamps, (scan.h, 1))

    # grab channel data
    fields_values = [scan.field(ch) for ch in client.ChanField]

    # use integer mm to avoid loss of precision casting timestamps
    xyz = (xyzlut(scan) * 1000).astype(np.int64)

    # get all data as one H x W x 8 int64 array for savetxt()
    frame = np.dstack((timestamps, *fields_values, xyz))

    # not necessary, but output points in "image" vs. staggered order
    frame = client.destagger(metadata, frame)

    # write csv out to file
    csv_path = os.path.join(csv_dir, f'{csv_base}_{idx:06d}.{csv_ext}')
    print(f'write frame #{idx}, to file: {csv_path}')

    header = '\n'.join([f'frame num: {idx}', field_names])

    np.savetxt(csv_path,
               frame.reshape(-1, frame.shape[2]),
               fmt=field_fmts,
               delimiter=',',
               header=header)

Because we stored the scan as a structured 2D image, we can easily recover it by loading it back into a numpy.ndarray and continue using it as a 2D image.

import numpy as np

# read array from CSV
frame = np.loadtxt('my_frame_00000.csv', delimiter=',')

# convert back to a "fat" 2D image of [H x W x 8] shape
frame = frame.reshape((128, -1, frame.shape[1]))

We used 128 when restoring the 2D image from the CSV file because that’s the number of channels of our OS1_128.pcap sample data recording.
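If you want the individual channels back, slice the restored frame along its last axis; a sketch, assuming the column order produced by the writer above (timestamps first, then the channel fields, then x, y, z):

# split the "fat" frame back apart (column order assumed from the writer)
timestamps = frame[:, :, 0]
fields = frame[:, :, 1:-3]  # channel fields in client.ChanField order
xyz_mm = frame[:, :, -3:]   # x, y, z in integer millimeters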

Check examples.pcap.pcap_to_csv() documentation for further details.

Visualization with Open3d

The Open3d library contains Python bindings for a variety of tools for working with point cloud data. Loading data into Open3d is just a matter of reshaping the numpy representation of a point cloud, as demonstrated in the examples.pcap.pcap_3d_one_scan() example:

# compute point cloud using client.SensorInfo and client.LidarScan
xyz = client.XYZLut(metadata)(scan)

# create point cloud and coordinate axes geometries
cloud = o3d.geometry.PointCloud(
    o3d.utility.Vector3dVector(xyz.reshape((-1, 3))))
axes = o3d.geometry.TriangleMesh.create_coordinate_frame(1.0)

The examples.open3d module contains a more fully-featured visualizer built using the Open3d library, which can be used to replay pcap files or visualize a running sensor. The bulk of the visualizer is implemented in the examples.open3d.viewer_3d() function.

Note

You’ll have to install the Open3d package from PyPI to run this example. Note that as of version 0.13.0, binaries are not yet provided for Python 3.9 or ARM systems.

As an example, you can view frame 84 from the sample data by running the following command:

$ python3 -m ouster.sdk.examples.open3d \
    --pcap OS1_128.pcap --meta OS1_2048x10_128.json --start 84 --pause

You may also want to try the --sensor option to display the output of a running sensor. Use the -h flag to see a full list of command line options and flags.

Running the example above should open a window displaying a scene from a city intersection, reproduced below:

[Figure: lidar_scan_xyz_84_3d.png] Open3D visualization of OS1 sample data (frame 84). Points colored by SIGNAL field.

You should be able to click and drag the mouse to look around. You can zoom in and out using the mouse wheel, and hold control or shift while dragging to pan and roll, respectively.

Hitting the spacebar will start playing back the rest of the pcap in real time. Note that reasonable performance for realtime playback requires relatively fast hardware, since Open3d runs all rendering and processing in a single thread.

All of the visualizer controls are listed in the table below:

Open3d Visualizer Controls

Key                        What it does
-------------------------  ---------------------------------------------------
Mouse wheel                Zoom in and out
Left click + drag          Tilt and rotate the camera
Ctrl + left click + drag   Pan the camera laterally
Shift + left click + drag  Roll the camera
"+" / "-"                  Increase or decrease point sizes
Spacebar                   Pause or resume playback
"M"                        Cycle through channel fields used for visualization
Right arrow key            When reading a pcap, jump 10 frames forward

Working with IMU data from the Ouster Sensor

IMU data from the Ouster sensor can be read as ImuPackets. Let’s do something easy, like graphing the acceleration in the z direction over time:

# connect to sensor and get imu packets within n_seconds
source = client.Sensor(hostname, lidar_port, imu_port, buf_size=640)
with closing(source):
    ts, z_accel = zip(*[(p.sys_ts, p.accel[2])
                        for p in time_limited(n_seconds, source)
                        if isinstance(p, client.ImuPacket)])

# initialize plot
fig, ax = plt.subplots(figsize=(12.0, 2))
ax.plot(ts, z_accel)

Like other packets, ImuPackets come from a PacketSource. After filtering for ImuPackets, we take the sys_ts and the z component of accel from each and plot them.
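To make the x-axis more readable, you can re-zero the timestamps and convert them to seconds; a small sketch, assuming sys_ts is reported in nanoseconds:

import numpy as np

# seconds elapsed since the first IMU packet (assuming ns timestamps)
t = (np.array(ts) - ts[0]) / 1e9
ax.plot(t, z_accel)
ax.set_xlabel("seconds since start")
ax.set_ylabel("z acceleration")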

If you have a sensor, you can run the code above with the plot-imu-z-accel example:

$ python3 -m ouster.sdk.examples.client $SENSOR_HOSTNAME plot-imu-z-accel