CARLA

CARLA is an open-source simulator for autonomous driving research. As such, CARLA data is synthetic and can be generated with varying sensor and environmental conditions. The following documentation is largely incomplete and merely describes the provided demo data.
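
As a rough illustration of how such data is generated, the following is a minimal sketch using the standard CARLA Python API: it varies the weather and attaches an RGB camera to an ego vehicle. It assumes a simulator instance reachable on localhost:2000; the blueprint names, resolution, and output path are illustrative and not tied to the demo data.

```python
import carla

# Connect to a running CARLA server (assumed to listen on localhost:2000).
client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# Vary the environmental conditions, e.g. switch to a rainy sunset preset.
world.set_weather(carla.WeatherParameters.HardRainSunset)

# Spawn an ego vehicle at the first predefined spawn point of the map.
blueprint_library = world.get_blueprint_library()
vehicle_bp = blueprint_library.find("vehicle.tesla.model3")
spawn_point = world.get_map().get_spawn_points()[0]
ego_vehicle = world.spawn_actor(vehicle_bp, spawn_point)

# Attach an RGB camera with a custom resolution to the ego vehicle.
camera_bp = blueprint_library.find("sensor.camera.rgb")
camera_bp.set_attribute("image_size_x", "1280")
camera_bp.set_attribute("image_size_y", "720")
camera_transform = carla.Transform(carla.Location(x=1.5, z=2.4))
camera = world.spawn_actor(camera_bp, camera_transform, attach_to=ego_vehicle)

# Write every rendered frame to disk until the script is stopped.
camera.listen(lambda image: image.save_to_disk("out/%06d.png" % image.frame))
```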

Quick Links

Paper: CARLA: An Open Urban Driving Simulator
Website: carla.org/
Code: github.com/carla-simulator/carla
License: MIT License

Available Splits

n/a

Available Modalities

Ego Vehicle: available, depending on the collected dataset. For further information, see EgoStateSE3.
Map: available. We include a conversion method for OpenDRIVE maps (a CARLA export sketch follows this list). For further information, see MapAPI.
Bounding Boxes: available, depending on the collected dataset. For further information, see BoxDetectionWrapper.
Traffic Lights: not available.
Pinhole Cameras: available, depending on the collected dataset. For further information, see PinholeCamera.
Fisheye Cameras: not available.
LiDARs: available, depending on the collected dataset. For further information, see LiDAR.
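
The OpenDRIVE description that the map conversion consumes can be exported directly from the simulator. Below is a minimal sketch using the standard CARLA Python API; the output filename is illustrative, and how MapAPI ingests the file is documented separately.

```python
import carla

# Connect to a running CARLA server and fetch the currently loaded map.
client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
carla_map = client.get_world().get_map()

# to_opendrive() returns the full OpenDRIVE (.xodr) map as an XML string,
# which can then be handed to a map conversion step.
with open("town.xodr", "w") as f:
    f.write(carla_map.to_opendrive())
```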

Download

n/a

Installation

n/a

Dataset Specific

Box Detection Labels
class py123d.conversion.registry.DefaultBoxDetectionLabel

Default box detection labels used in 123D. Common labels across datasets.

EGO = 0
VEHICLE = 1
TRAIN = 2
BICYCLE = 3
PERSON = 4
ANIMAL = 5
TRAFFIC_SIGN = 6
TRAFFIC_CONE = 7
TRAFFIC_LIGHT = 8
BARRIER = 9
GENERIC_OBJECT = 10
to_default()

Inherited, see superclass.

Return type: DefaultBoxDetectionLabel
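
A minimal usage sketch of these labels, assuming the class behaves like a standard Python enum; the CARLA-to-default mapping shown is purely illustrative and is not the mapping used by the converter.

```python
from py123d.conversion.registry import DefaultBoxDetectionLabel

# Illustrative mapping from CARLA actor/blueprint prefixes to default labels;
# the converter's actual mapping may differ.
CARLA_TO_DEFAULT = {
    "vehicle": DefaultBoxDetectionLabel.VEHICLE,
    "walker": DefaultBoxDetectionLabel.PERSON,
    "static.prop.trafficcone01": DefaultBoxDetectionLabel.TRAFFIC_CONE,
}

label = CARLA_TO_DEFAULT.get("walker", DefaultBoxDetectionLabel.GENERIC_OBJECT)
print(label.name, label.value)  # PERSON 4
print(label.to_default())       # returns a DefaultBoxDetectionLabel (see superclass)
```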

LiDAR Index
class py123d.conversion.registry.DefaultLiDARIndex

Default LiDAR indices for XYZ point clouds.

X = 0
Y = 1
Z = 2
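
A minimal sketch of how these indices can be used to address an (N, 3) XYZ point cloud; the NumPy array is an assumption about the point cloud representation, and .value is used in case the class is not an IntEnum.

```python
import numpy as np

from py123d.conversion.registry import DefaultLiDARIndex

# Dummy point cloud with N points and XYZ columns in the default order.
points = np.random.rand(1000, 3).astype(np.float32)

# Address columns through the index enum instead of magic numbers.
z = points[:, DefaultLiDARIndex.Z.value]

# Example: keep only points above z = 0.
above_ground = points[z > 0.0]
```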

Dataset Issues

n/a

Citation

If you use CARLA in your research, please cite:

@inproceedings{Dosovitskiy2017CORL,
  title = {{CARLA}: {An} Open Urban Driving Simulator},
  author = {Alexey Dosovitskiy and German Ros and Felipe Codevilla and Antonio Lopez and Vladlen Koltun},
  booktitle = {Proceedings of the 1st Annual Conference on Robot Learning},
  year = {2017}
}