Remote Access Funded by NSF
With a $5.1 million Mcity 2.0 grant from the U.S. National Science Foundation, Mcity has enhanced the Test Facility by developing digital infrastructure to overlay the physical test facility and create a cloud-based, augmented-reality CAV testbed available to academic and industry researchers nationwide. This gives researchers, many of whom lack testing resources of their own, remote access to the facility and helps create a more equitable playing field in mobility. Remote access is now operational and can be used with the physical test facility and Mcity research vehicles.
Researchers may respond to a Request for Proposals to use Mcity 2.0 remote capabilities.
Enhancing Autonomous Vehicle Testing
Mcity 2.0 is a platform designed to help users remotely test and refine autonomous vehicle (AV) motion planning algorithms. It allows users to conduct tests in both simulated and mixed reality environments without needing a complete AV system or physical testing facility. Mcity 2.0 integrates digital infrastructure with physical facilities to provide real-time visualization of AV status, safety metrics, and testing data, which are archived for ongoing analysis and improvement.
Goals and Objectives
- Create realistic traffic environments for AV testing.
- Enable remote access and control of AV testing facilities.
- Offer comprehensive evaluation metrics and sensor data to enhance AV motion planning algorithms.
Key Features
- TeraSim on the Cloud: A fast, realistic traffic simulation generating critical events like collisions. It uses models calibrated with real-world data to simulate both normal and safety-critical driving scenarios.
- AWS Cloud Hosting: TeraSim is hosted on AWS, allowing global remote access for seamless integration with user algorithms via the TeraSim API.
- Evaluation Metrics: Includes minimum distance, time-to-collision, and speed difference to assess AV safety and performance.
- Remote Control Communication: Uses Redis for data exchange and a low-latency communication pipeline to manage real-time data and control commands.
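The evaluation metrics named above are standard surrogate safety measures. As a minimal sketch of how they might be computed from logged trajectories (the `VehicleState` type and the car-following assumptions are illustrative, not Mcity's actual implementation):

```python
import math
from dataclasses import dataclass

@dataclass
class VehicleState:
    """2-D position (m) and speed along the lane (m/s)."""
    x: float
    y: float
    speed: float

def min_distance(traj_a, traj_b):
    """Minimum center-to-center distance over two time-aligned trajectories."""
    return min(math.hypot(a.x - b.x, a.y - b.y) for a, b in zip(traj_a, traj_b))

def time_to_collision(gap, follower_speed, leader_speed):
    """TTC for a car-following pair; infinite when the follower is not closing."""
    closing = follower_speed - leader_speed
    return gap / closing if closing > 0 else math.inf

def speed_difference(state_a, state_b):
    """Absolute speed difference between two vehicles (m/s)."""
    return abs(state_a.speed - state_b.speed)
```

For example, a follower 20 m behind a leader while traveling 2 m/s faster yields `time_to_collision(20.0, 10.0, 8.0) == 10.0` seconds; a real pipeline would evaluate these measures at every simulation step and archive the results.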
Testing Capabilities
- Simulation-Based Testing: AVs and background traffic are visualized in real-time, with performance metrics plotted. Data is archived for post-analysis.
- Mixed Reality Testing: Combines digital simulations with physical AVs. Remote testers can view real-time AV status and control inputs via Mcity’s HD map.
Testing Procedure
- Simulation Testing: Remote visualization and data recording of AV and virtual traffic.
- Mixed Reality Testing: Real-time visualization of physical AV’s control inputs and performance on Mcity’s map, with remote access to onboard and roadside views.
Outcomes
- Simulation Testing: Identified potential safety issues and improved longitudinal control to reduce conflicts.
- Mixed Reality Testing: Enhanced passenger comfort by smoothing deceleration and reducing harsh braking.
By leveraging Mcity 2.0’s advanced simulation and mixed reality capabilities, remote users can effectively test, validate, and enhance AV motion planning algorithms, driving progress in autonomous vehicle safety and performance.
How to get remote access to Mcity?
- Review use case examples below
- Experiment using Mcity open-source tools
- Contact us to discuss your project
- Or apply to the Mcity 2.0 Request for Proposals
USE CASE 1: Remote Testing of AV Motion Planning Algorithms
The original proof of concept for Mcity 2.0 remote access. For this first use case, the Connected Automated and Resilient Transportation (CART) Lab, led by Dr. Yiheng Feng, Assistant Professor of Civil Engineering at Purdue University, served as the guest remote researchers. Our goals were to establish that this kind of remote collaboration is possible and that it benefits all participants. The CART Lab has no test facility or advanced infrastructure, and it cannot run tests with background/challenge vehicles because of the safety risks of testing on public roads. The Mcity remote access platform successfully overcame these limitations.
USE CASE 2: Multi-Agent Distributed Remote AV Testing
This use case integrated Mcity's advanced simulation and mixed reality system with the VOICES platform. VOICES, an initiative led by the Department of Transportation, is a distributed virtual platform designed to facilitate collaboration among diverse stakeholders, including state and local governments, the private sector, and academic institutions. This integration aligns with the National Science Foundation's objective of fostering collaboration with national laboratories.
Unique capabilities Mcity 2.0 delivered to the VOICES project:
- Ability to run physical test vehicles on a physical test track that is integrated into the VOICES distributed simulation environment.
- Ability to create virtual background traffic in TeraSim with the potential to create challenging test scenarios for the virtual and physical vehicles that are being tested.
- Provision of a high-definition map and a dynamic CARLA environment representing the Mcity test facility.
Objective: To analyze the econometrics of connected/autonomous vehicles within a collaborative environment and compare these metrics against solo drive baselines.
- Participants managed their vehicles within individual simulations with key econometric data being collected for subsequent analysis by the Argonne National Laboratory.
- Shown at right: a string of autonomous vehicles in the distributed simulation traversing the roundabout at Mcity.
Integration and Collaboration: At the heart of this use case is the seamless integration of Mcity’s simulation capabilities and its remote-access technology into the VOICES platform.
- This demonstration showcased Mcity’s adaptability and interoperability.
- Key participants in this event included Argonne National Laboratory, Oak Ridge National Laboratory, Econolite, the University of California Los Angeles, and the Federal Highway Administration.
Features and Capabilities: This use case emphasized the Mcity 2.0 platform’s flexibility. Instead of hosting the simulation entirely in Mcity’s OS Cloud, the simulation map and world simulation were shared with participants. Each participant set up its own simulation and linked it with every other participant’s simulation, a technique called distributed simulation. Mcity’s participation showed that the Mcity 2.0 platform supports a wide range of deployment and collaboration models.
The test plan for use case 2.
Simulation and Real-World Testing: Mcity contributed a high-definition map and a dynamic CARLA environment of its test facility. This formed the basis for each participant’s hosted simulation.
- Real-time vehicle data was shared among all participants. Unique to this project, both Argonne National Laboratory and Mcity linked their simulations to real vehicles.
- In Argonne’s case, an electric vehicle was mounted on a dynamometer, while Mcity employed a real-world vehicle at its Ann Arbor, Michigan test track.
- Data from both simulated and real-world environments was gathered for comprehensive analysis.
Electric vehicle connected to the distributed simulation and a dynamometer at Argonne National Laboratory.
Additional Contributions and Future Applications: Mcity also introduced background vehicles into the distributed simulation. These background vehicles were controlled by the TeraSim system, which is trained on extensive traffic data from Ann Arbor intersections. For this use case, the background vehicles were configured to drive conservatively and avoid interfering with the test vehicles, so that the econometric data being gathered remained unaltered. This unobtrusive implementation served as a proof of concept for the interoperability of TeraSim with distributed simulation systems like VOICES. Future test runs could configure the background vehicles to be more adversarial toward the vehicles under test.
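The contrast between a conservative and an adversarial background-traffic profile can be pictured as a small configuration object. The sketch below is purely illustrative; the field names and values are assumptions, not TeraSim's real API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BackgroundTrafficConfig:
    """Hypothetical knobs for background-vehicle behavior (not the real TeraSim API)."""
    min_headway_s: float       # time gap kept behind the vehicle under test
    cut_in_probability: float  # chance per opportunity of a cut-in maneuver
    yield_to_test_vehicle: bool

# Conservative profile, as in this use case, so econometric data stays unaltered.
SAFE = BackgroundTrafficConfig(
    min_headway_s=3.0, cut_in_probability=0.0, yield_to_test_vehicle=True)

# A more adversarial profile for future safety-focused runs.
ADVERSARIAL = BackgroundTrafficConfig(
    min_headway_s=1.0, cut_in_probability=0.2, yield_to_test_vehicle=False)
```

Switching profiles between runs would let the same distributed-simulation setup serve both econometric studies and safety stress-testing.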
USE CASE 3: Teleoperation of AVs
Mcity 2.0 capabilities offer a safe way to experiment with potentially risky scenarios, such as employing a remote teleoperator to help an AV get unstuck. This demonstration uses a state-of-the-art driving simulator with a sit-in cockpit and projection screen, supplied by VI-grade. It allows the user to attempt to safely navigate a real AV around a virtual obstacle in the Mcity Test Facility.
Image courtesy of VI-grade.
USE CASE 4: Remote Testing of AV Perception Systems
This use of Mcity 2.0 capabilities illustrates the benefits of having a digital twin of the Mcity Test Facility and the simulations research teams can run with it. The demonstration, presented by Dr. Gaurav Pandey, associate professor of Engineering Technology & Industrial Distribution at Texas A&M University, uses a full simulation of the Mcity Test Facility environment, including virtual background vehicles.
The simulation renders a synthetic front-facing camera feed from a real autonomous vehicle inside the Mcity Test Facility and streams it to Texas, where Dr. Pandey’s team carries out real-time depth perception on the mono-camera feed. Mcity’s TeraSim, a traffic simulator trained on Ann Arbor driving behaviors, controls the vehicle as the synthetic feed is generated. Monocular depth perception using neural networks remains very challenging, making it an ideal candidate for improvement through digital twin simulations. Once the algorithm is refined, testing can be conducted in a mixed reality environment using both real and synthetic data feeds.
USE CASE 5: Remote Testing of Joint Control of AVs and Traffic Signals
Dr. Jeff Ban, professor of Civil and Environmental Engineering at the University of Washington, and his team in the intelligent Urban Transportation Systems (iUTS) Lab have been working on an algorithm to control traffic signals more intelligently based on data from connected vehicles. This scenario is difficult, if not dangerous, to test in the real world, and the team needed a safe way to experiment with it. Collaborating with Mcity on the Mcity 2.0 platform, the iUTS Lab tested scenarios of 25%, 50%, 75%, and 100% connected and automated vehicle (CAV) penetration, both in pure simulation and in a mixed reality environment that combined a physical vehicle with simulated background vehicles.
Objectives:
- To test the performance of their signal control framework, Multiscale Signal-Vehicle Coupled Control (SVCC), in real-world and mixed traffic environments.
- To explore how they might extend this platform to a large urban environment and different penetration levels.
The iUTS Lab faces constraints similar to those encountered by the Purdue team featured in use case 1: UW has no test facility, no access to advanced infrastructure, and no safe way to test with background vehicles on public roads.
These constraints were easily overcome, as Mcity had already vetted the capability in use case 1. The iUTS Lab tested all of its planned scenarios and honed its algorithm, showing that the algorithm performs well across all tested penetration levels. The performance metrics evaluated were fuel consumption, waiting time, time loss, queue length, and number of vehicles through the intersection.
Integration & Collaboration: The team from the University of Washington was able to participate in real-time by viewing various feeds made accessible by the Mcity 2.0 platform. They were also able to control how many background vehicles were “connected” and therefore visible to their algorithm, thus simulating different levels of penetration.
Features & Capabilities: A prototype was created to allow a researcher to immediately change traffic signal states via an API. The prototype is a Raspberry Pi connected to the Mcity OS network and wired directly to the traffic signals via their standard A, B, and C connectors. The local Malfunction Monitoring Unit (MMU) was configured to allow all states, so that an immediate change from the research team would not trigger a fault. A simple REST API was written and installed on the Raspberry Pi so that the research team need only connect to the Mcity OS network to control signal states.
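A prototype along these lines can be sketched with Python's standard library alone. The handler below is an illustrative stand-in, not Mcity's actual code: the signal names, state vocabulary, and endpoint shape are assumptions, and a real deployment would drive the signal heads through their hardware connectors rather than a dictionary:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# In the real prototype the Pi drives the signal heads through their
# A/B/C connectors; here a dict stands in for that hardware layer.
SIGNALS = {"main_st": "red"}
VALID_STATES = {"red", "yellow", "green", "flash"}

class SignalHandler(BaseHTTPRequestHandler):
    def _reply(self, code, body):
        payload = json.dumps(body).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def do_GET(self):
        """Report the current state of a signal, e.g. GET /main_st."""
        signal = self.path.strip("/")
        if signal in SIGNALS:
            self._reply(200, {"signal": signal, "state": SIGNALS[signal]})
        else:
            self._reply(404, {"error": "unknown signal"})

    def do_PUT(self):
        """Set a signal's state, e.g. PUT /main_st with {"state": "green"}."""
        signal = self.path.strip("/")
        length = int(self.headers.get("Content-Length", 0))
        state = json.loads(self.rfile.read(length)).get("state")
        if signal not in SIGNALS:
            self._reply(404, {"error": "unknown signal"})
        elif state not in VALID_STATES:
            self._reply(400, {"error": "invalid state"})
        else:
            SIGNALS[signal] = state
            self._reply(200, {"signal": signal, "state": state})

# To serve on the Pi: HTTPServer(("0.0.0.0", 8000), SignalHandler).serve_forever()
```

A researcher on the Mcity OS network could then send `PUT /main_st` with body `{"state": "green"}` to change that signal immediately.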