Visualizations in 3D
The Ouster Python SDK provides two visualization utilities for user convenience. These are introduced briefly below.
Ouster’s OpenGL-based visualizer allows for easy visualization from pcaps and sensors on all platforms the Ouster SDK supports.
The default Ouster SDK CLI visualizer view, ouster-cli source <sensor | pcap | osf> viz, includes two 2D range images on top, which can be cycled through the available fields, and a 3D point cloud on the bottom. For dual-return sensors, both returns are displayed by default.
The visualizer can be controlled with mouse and keyboard.

Keyboard controls:

| Key | What it does |
| --- | --- |
| o | Toggle on-screen display |
| p / P | Increase/decrease point size |
| m / M | Cycle point cloud coloring mode |
| f / F | Cycle point cloud color palette |
| b / B | Cycle top 2D image |
| n / N | Cycle bottom 2D image |
| ctrl + r | Set camera to the birds-eye view |
| u | Toggle camera mode FOLLOW/FIXED |
| e / E | Increase/decrease size of displayed 2D images |
| ' / " | Increase/decrease spacing in range markers |
| w | Camera pitch up |
| s | Camera pitch down |
| a | Camera yaw left |
| d | Camera yaw right |
| 1 | Toggle first return point cloud visibility |
| 2 | Toggle second return point cloud visibility |
| 0 | Toggle orthographic camera |
| = / - | Dolly in / out |
| . / , | Step one frame forward/back |
| ctrl + . / , | Step 10 frames forward/back |
| > / < | Increase/decrease playback rate (during replay) |
| shift | Camera translation with mouse drag |
| shift + z | Save a screenshot of the current view |
| shift + x | Toggle a continuous saving of screenshots |
| ? | Print key bindings to standard out |
To run the visualizer with a sensor:
$ ouster-cli source $SENSOR_HOSTNAME viz
This will auto-configure the UDP destination of the sensor while leaving the lidar port as previously set on the sensor.
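The same effect can be sketched with the Python client. This is a configuration sketch only, assuming the client.SensorConfig type and client.set_config helper as found in recent SDK versions, and a reachable sensor on the network:

```python
from ouster.sdk import client

def configure_udp_dest(hostname: str) -> None:
    # An empty config changes nothing else on the sensor; udp_dest_auto
    # asks the sensor to send UDP data back to this machine, leaving the
    # lidar port as previously set.
    config = client.SensorConfig()
    client.set_config(hostname, config, udp_dest_auto=True)
```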
To run the visualizer with a pcap:
$ ouster-cli source $SAMPLE_DATA_PCAP_PATH [--meta $SAMPLE_DATA_JSON_PATH] viz
The Open3d library contains Python bindings for a variety of tools for working with point cloud data. Loading data into Open3d is just a matter of reshaping the numpy representation of a point cloud, as demonstrated below:

```python
# compute point cloud using client.SensorInfo and client.LidarScan
xyz = client.XYZLut(metadata)(scan)

# create point cloud and coordinate axes geometries
cloud = o3d.geometry.PointCloud(
    o3d.utility.Vector3dVector(xyz.reshape((-1, 3))))  # type: ignore
axes = o3d.geometry.TriangleMesh.create_coordinate_frame(
    1.0)  # type: ignore
```

The examples.open3d module contains a more fully-featured visualizer built using the Open3d library, which can be used to replay pcap files or visualize a running sensor.
You’ll have to install the Open3d package from PyPI (python3 -m pip install open3d) to run this example.
As an example, you can view frame
84 from the sample data by running the following command:
$ python3 -m ouster.sdk.examples.open3d \
    --pcap $SAMPLE_DATA_PCAP_PATH --meta $SAMPLE_DATA_JSON_PATH --start 84 --pause
PS > py -3 -m ouster.sdk.examples.open3d ^
    --pcap $SAMPLE_DATA_PCAP_PATH --meta $SAMPLE_DATA_JSON_PATH --start 84 --pause
You may also want to try the
--sensor option to display the output of a running sensor. Use the
-h flag to see a full list of command line options and flags.
Running the example above should open a window displaying a scene from a city intersection.
You should be able to click and drag the mouse to look around. You can zoom in and out using the mouse wheel, and hold control or shift while dragging to pan and roll, respectively.
Hitting the spacebar will start playing back the rest of the pcap in real time. Note that reasonable performance for realtime playback requires relatively fast hardware, since Open3d runs all rendering and processing in a single thread.
All of the visualizer controls are listed in the table below:

| Control | What it does |
| --- | --- |
| Mouse wheel | Zoom in and out |
| Left click + drag | Tilt and rotate the camera |
| Ctrl + left click + drag | Pan the camera laterally |
| Shift + left click + drag | Roll the camera |
| "+" / "-" | Increase or decrease point sizes |
| Spacebar | Pause or resume playback |
| M | Cycle through channel fields used for visualization |
| Right arrow key | When reading a pcap, jump 10 frames forward |
You should have defined source using either a pcap file or UDP data streaming directly from a sensor; refer to the Developer’s Quick Start with the Ouster Python SDK for an introduction. The pictures below were rendered using OS2 128 Rev 05 Bridge sample data.
Let’s read from
source until we get to the 50th frame of data:
```python
from contextlib import closing
from more_itertools import nth

with closing(client.Scans(source)) as scans:
    scan = nth(scans, 50)
```
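more_itertools is a third-party package; its nth helper simply returns the nth item of an iterable. If you'd rather avoid the extra dependency, a stdlib-only equivalent is a one-liner:

```python
from itertools import islice

def nth(iterable, n, default=None):
    """Return the nth item of an iterable, or a default if it is exhausted."""
    return next(islice(iterable, n, None), default)

print(nth(range(100), 50))  # prints 50
```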
If you’re using a sensor and it takes a few seconds, don’t be alarmed! It has to get to the 50th frame of data, which would be 5.0 seconds for a sensor running in 1024x10 mode.
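That 5.0 second figure follows directly from the lidar mode's frame rate; the helper name below is our own, shown just as a sanity check:

```python
def seconds_to_frame(frame_index: int, frames_per_second: int) -> float:
    # In 1024x10 mode the sensor completes 10 rotations (frames) per second,
    # so reaching frame N takes N / 10 seconds.
    return frame_index / frames_per_second

print(seconds_to_frame(50, 10))  # prints 5.0
```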
We can extract the range measurements from the frame of data stored in the client.LidarScan datatype and plot a range image where each column corresponds to a single azimuth angle:
```python
range_field = scan.field(client.ChanField.RANGE)
range_img = client.destagger(info, range_field)
```
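Destaggering shifts each row of the raw image by that beam's per-row pixel offset so columns line up by azimuth angle. A minimal numpy sketch of the idea; the function name and shift values here are illustrative, not the SDK's implementation:

```python
import numpy as np

def destagger_sketch(staggered: np.ndarray, shifts: list) -> np.ndarray:
    """Roll each row of the image by its beam's pixel shift."""
    out = np.empty_like(staggered)
    for row, shift in enumerate(shifts):
        out[row] = np.roll(staggered[row], shift)
    return out

img = np.arange(8).reshape(2, 4)        # toy 2-beam x 4-column image
aligned = destagger_sketch(img, [0, 1]) # second beam shifted by one pixel
```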
We can plot the results using standard Python tools that work with numpy datatypes. Here, we extract a column segment of the range data and display the result:
```python
import matplotlib.pyplot as plt

plt.imshow(range_img[:, 640:1024], resample=False)
plt.axis('off')
plt.show()
```
If plt.show gives you an error about your Matplotlib backend, you will need a GUI backend such as TkAgg or Qt5Agg in order to visualize your data with matplotlib.
In addition to viewing the data in 2D, we can also plot the results in 3D by projecting the range measurements into Cartesian coordinates. To do this, we first create a lookup table, then use it to produce X, Y, Z coordinates from our scan data with shape (H x W x 3):
```python
xyzlut = client.XYZLut(info)
xyz = xyzlut(scan)
```
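Conceptually, the lookup table projects each range measurement along its beam's direction. A simplified numpy sketch of that projection, ignoring the sensor's beam-origin offsets, which the real XYZLut accounts for:

```python
import numpy as np

def spherical_to_xyz(rng: np.ndarray, azimuth_deg: np.ndarray,
                     altitude_deg: np.ndarray) -> np.ndarray:
    """Project ranges along beam directions given by azimuth/altitude angles."""
    az = np.radians(azimuth_deg)
    alt = np.radians(altitude_deg)
    x = rng * np.cos(alt) * np.cos(az)
    y = rng * np.cos(alt) * np.sin(az)
    z = rng * np.sin(alt)
    return np.stack([x, y, z], axis=-1)

# a 2 m return pointing straight up lands at roughly (0, 0, 2)
pt = spherical_to_xyz(np.array(2.0), np.array(0.0), np.array(90.0))
```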
Now we rearrange the resulting numpy array into a shape that’s suitable for plotting:
```python
import numpy as np

[x, y, z] = [c.flatten() for c in np.dsplit(xyz, 3)]
ax = plt.axes(projection='3d')
r = 10
ax.set_xlim3d([-r, r])
ax.set_ylim3d([-r, r])
ax.set_zlim3d([-r/2, r/2])
plt.axis('off')
z_col = np.minimum(np.absolute(z), 5)
ax.scatter(x, y, z, c=z_col, s=0.2)
plt.show()
```
You should be able to rotate the resulting scene to view it from different angles.
To learn more about manipulating lidar data, see: