GET MCITY 2.0 REMOTE ACCESS

Progress in developing remote access capabilities for the Mcity Test Facility has been measured by milestones marking the completion of several sample use cases. Each use case highlights a different set of goals and features.

USE CASE 1: Remote Testing of AV Motion Planning Algorithms

The original proof of concept for Mcity 2.0 remote access. For this first use case, the Connected Automated and Resilient Transportation (CART) Lab, led by Dr. Yiheng Feng, Assistant Professor of Civil Engineering at Purdue University, served as the guest remote research team. Our goals were to establish that this kind of remote collaboration is possible, and that it benefits all participants. The CART Lab has no access to a test facility or advanced infrastructure, and it cannot run tests with background/challenge vehicles because of the safety concerns of testing on public roads. The Mcity remote access platform successfully overcame these limitations.

USE CASE 2: Multi-Agent Distributed Remote AV Testing

An initiative to integrate Mcity’s advanced simulation and mixed reality system with the VOICES platform. VOICES, an initiative led by the U.S. Department of Transportation, is a distributed virtual platform designed to facilitate collaboration among diverse stakeholders, including state and local governments, the private sector, and academic institutions. This integration aligns with the National Science Foundation’s objectives of fostering collaboration with national laboratories.

Unique capabilities Mcity 2.0 delivered to the VOICES project:

  • Ability to run physical test vehicles on a physical test track that is integrated into the VOICES distributed simulation environment.
  • Ability to create virtual background traffic in TeraSim with the potential to create challenging test scenarios for the virtual and physical vehicles that are being tested.
  • Provision of a high-definition map and a dynamic CARLA environment representing the Mcity test facility.

Objective: To analyze the econometrics of connected/autonomous vehicles within a collaborative environment and compare these metrics against solo drive baselines.

  • Participants managed their vehicles within individual simulations, with key econometric data collected for subsequent analysis by Argonne National Laboratory.
  • A string of autonomous vehicles in the distributed simulation, traversing the roundabout at Mcity, is shown at right.

Integration and Collaboration: At the heart of this use case is the seamless integration of Mcity’s simulation capabilities and its remote-access technology into the VOICES platform.

Features and Capabilities: This use case emphasized the Mcity 2.0 platform’s flexibility. Instead of hosting the simulation entirely in Mcity’s OS Cloud, the simulation map and world simulation were shared with participants. Each participant set up their own simulation and linked it with every other participant’s simulation, a technique called distributed simulation. Mcity’s participation showcased that the Mcity 2.0 platform is capable of almost any kind of deployment and collaboration.
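In a distributed simulation, each participant advances its own local world and mirrors the vehicles published by every other participant each tick. The sketch below illustrates that exchange pattern only; the class names, fields, and toy constant-velocity motion model are hypothetical, not the actual VOICES or Mcity OS interfaces.

```python
import dataclasses
from typing import Dict, List

@dataclasses.dataclass
class VehicleState:
    """Pose of one vehicle, as a participant might publish it each tick."""
    vehicle_id: str
    x: float
    y: float
    heading: float

class Participant:
    """One node in the distributed simulation: it owns some vehicles,
    advances them locally, and mirrors every other node's vehicles."""

    def __init__(self, name: str):
        self.name = name
        self.local: Dict[str, VehicleState] = {}   # vehicles this node simulates
        self.remote: Dict[str, VehicleState] = {}  # mirrored from other nodes

    def step(self, dt: float) -> List[VehicleState]:
        # Advance only locally owned vehicles (toy 1 m/s constant velocity).
        for v in self.local.values():
            v.x += 1.0 * dt
        return list(self.local.values())  # states to broadcast this tick

    def receive(self, states: List[VehicleState]) -> None:
        # Mirror states published by other participants.
        for s in states:
            self.remote[s.vehicle_id] = s

# Two participants, e.g. Mcity and a partner lab, linked tick by tick.
mcity = Participant("mcity")
partner = Participant("partner")
mcity.local["av-1"] = VehicleState("av-1", 0.0, 0.0, 0.0)
partner.local["ev-1"] = VehicleState("ev-1", 10.0, 5.0, 0.0)

for _ in range(10):  # ten 0.1 s ticks
    partner.receive(mcity.step(0.1))
    mcity.receive(partner.step(0.1))
```

In the real deployment the broadcast would cross the network between sites; here both nodes run in one process purely to show the topology.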

The test plan for use case 2.

Simulation and Real-World Testing: Mcity contributed a high-definition map and a dynamic CARLA environment of its test facility. This formed the basis for each participant’s hosted simulation.

  • Real-time vehicle data was shared among all participants. Unique to this project, both Argonne National Laboratory and Mcity linked their simulations to real vehicles.
  • In Argonne’s case, an electric vehicle was mounted on a dynamometer, while Mcity employed a real-world vehicle at its Ann Arbor, Michigan test track.
  • Data from both simulated and real-world environments was gathered for comprehensive analysis.
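Sharing real-time vehicle data among participants implies an agreed message format for each vehicle's state. A minimal JSON sketch is shown below; the field names and units are illustrative assumptions, not the actual VOICES wire format.

```python
import json
import time

def encode_vehicle_state(vehicle_id, x, y, speed_mps, heading_deg):
    """Pack one vehicle's state into a JSON message for the other nodes.
    Field names are illustrative, not the actual VOICES schema."""
    return json.dumps({
        "id": vehicle_id,
        "t": time.time(),          # wall-clock timestamp for ordering
        "pos": [x, y],             # position in the shared map frame
        "speed_mps": speed_mps,
        "heading_deg": heading_deg,
    })

def decode_vehicle_state(message):
    """Unpack a message received from another participant."""
    return json.loads(message)

# A state update as Mcity's physical test vehicle might publish it.
msg = encode_vehicle_state("mcity-av-1", 120.4, 87.2, 8.9, 270.0)
state = decode_vehicle_state(msg)
```

Because both real vehicles (Argonne's dynamometer EV, Mcity's track vehicle) and simulated ones publish the same message shape, downstream analysis can treat all sources uniformly.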

Electric vehicle connected to the distributed simulation and a dynamometer at Argonne National Laboratory.

Additional Contributions and Future Applications: Mcity also introduced background vehicles into the distributed simulation. These background vehicles were controlled by the TeraSim system, which is trained on extensive traffic data from Ann Arbor intersections. For this use case, the background vehicles were configured to be very safe and to avoid interfering with the test vehicles, so that the econometric data being gathered remained unaltered. This unobtrusive implementation served as a proof of concept of the interoperability of TeraSim with distributed simulation systems like VOICES. Future test runs could configure the background vehicles to be more adversarial toward the vehicles under test.
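The contrast between the safe configuration used here and a future adversarial one can be pictured as a small set of behavior knobs. The parameters below are hypothetical stand-ins; TeraSim's actual configuration surface is not described in this document.

```python
import dataclasses

@dataclasses.dataclass
class BackgroundTrafficConfig:
    """Illustrative knobs for background-vehicle behavior; the real
    TeraSim options may differ in name and granularity."""
    aggressiveness: float = 0.0          # 0.0 = fully yielding, 1.0 = adversarial
    min_gap_m: float = 30.0              # keep-clear distance from vehicles under test
    interact_with_test_vehicles: bool = False

# Use case 2: unobtrusive traffic, so econometric data stays unaltered.
econometrics_run = BackgroundTrafficConfig()

# A future run could tighten gaps and raise aggressiveness to create
# challenging scenarios for the vehicles under test.
adversarial_run = BackgroundTrafficConfig(
    aggressiveness=0.8, min_gap_m=5.0, interact_with_test_vehicles=True)
```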

USE CASE 3: Teleoperation of AVs

Mcity 2.0 capabilities offer a safe way to experiment with potentially risky scenarios, such as employing a remote teleoperator to help an AV get unstuck. This demonstration uses a state-of-the-art driving simulator with a sit-in cockpit and projection screen, supplied by VI-grade. It allows the user to attempt to safely navigate a real AV around a virtual obstacle in the Mcity Test Facility.

Image courtesy of VI-grade.

USE CASE 4: Remote Testing of AV Perception Systems

This use of Mcity 2.0 capabilities illustrates the benefits of having a digital twin of the Mcity Test Facility and the simulations research teams can run with it. The demonstration, presented by Dr. Gaurav Pandey, associate professor of Engineering Technology & Industrial Distribution at Texas A&M University, uses a full simulation of the Mcity Test Facility environment, including virtual background vehicles.

The simulation renders a synthetic front-facing camera feed from a real autonomous vehicle operating in the Mcity Test Facility; the feed is sent to Texas, where Dr. Pandey’s team carries out real-time depth perception on the mono-camera feed. Mcity’s TeraSim, a traffic simulator trained on Ann Arbor driving behaviors, controls the vehicle while the synthetic feed is generated. Camera depth perception using neural networks is currently very challenging, making it an ideal candidate for improvement through digital twin simulations. Once the algorithm is refined, testing can be conducted in a mixed reality environment, using both real and synthetic data feeds.
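The render-stream-estimate loop can be sketched in miniature. Both functions below are placeholders: the real frames come from the digital twin's renderer and the real estimator is the team's neural network, neither of which is specified in this document.

```python
import random

def synthetic_camera_frame(width=8, height=6):
    """Stand-in for the synthetic front-camera feed rendered from the
    digital twin: a tiny grayscale frame as a list of pixel rows."""
    return [[random.random() for _ in range(width)] for _ in range(height)]

def estimate_depth(frame):
    """Stand-in for the remote team's neural depth estimator: maps each
    pixel intensity to a positive depth value. A real monocular depth
    network would replace this placeholder."""
    return [[1.0 / (0.1 + px) for px in row] for row in frame]

def remote_perception_loop(n_frames=3):
    """Mcity renders frames; the remote team returns depth maps.
    In the real setup both directions stream over the network."""
    results = []
    for _ in range(n_frames):
        frame = synthetic_camera_frame()
        depth = estimate_depth(frame)   # runs remotely at Texas A&M
        results.append(depth)
    return results

depth_maps = remote_perception_loop()
```

The same loop shape supports the planned mixed reality stage: the synthetic frame source would simply be swapped for, or blended with, the vehicle's real camera.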

USE CASE 5: Remote Testing of Joint Control of AVs and Traffic Signals

Dr. Jeff Ban, professor of Civil and Environmental Engineering at the University of Washington, and his team in the intelligent Urban Transportation Systems (iUTS) Lab have been working on an algorithm to control traffic signals more intelligently based on data from connected vehicles. This scenario is fairly difficult, if not dangerous, to test in the real world, and the team needed a safe way to experiment with it. Collaborating with Mcity and using the Mcity 2.0 platform, the iUTS Lab tested scenarios of 25%, 50%, 75%, and 100% connected and automated vehicle (CAV) penetration, both in pure simulation and in a mixed reality environment that combined a physical vehicle with simulated background vehicles.

Objectives:

  • To test the performance of their signal-controlling framework, the Multiscale Signal-Vehicle Coupled Control (SVCC), in real-world and mixed traffic environments.
  • To explore how they might extend this platform to a large urban environment and different penetration levels.

The iUTS Lab faces encumbrances similar to those encountered by the Purdue team featured in use case 1: UW has no test facility, no access to advanced infrastructure, and, because it is unsafe to test on public roads, no way to test with background vehicles.

These encumbrances were easily overcome, as Mcity had already vetted that capability in use case 1. The iUTS Lab was able to test all of its planned scenarios and hone its algorithm, proving that the algorithm performs well at almost any penetration level. The performance metrics tested were fuel consumption, waiting time, time loss, queue length, and the number of vehicles through the intersection.

Integration & Collaboration: The team from the University of Washington was able to participate in real-time by viewing various feeds made accessible by the Mcity 2.0 platform. They were also able to control how many background vehicles were “connected” and therefore visible to their algorithm, thus simulating different levels of penetration.
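Varying the penetration level amounts to flagging a chosen fraction of background vehicles as "connected", i.e. visible to the signal-control algorithm. A minimal sketch, with a hypothetical helper (the platform's actual mechanism is not described here):

```python
import random

def mark_connected(vehicle_ids, penetration, seed=0):
    """Randomly flag a fraction of background vehicles as connected
    (visible to the signal controller). `penetration` is the CAV
    penetration rate, 0.0-1.0, from the test matrix. A fixed seed
    keeps runs reproducible."""
    rng = random.Random(seed)
    n_connected = round(len(vehicle_ids) * penetration)
    connected = set(rng.sample(vehicle_ids, n_connected))
    return {vid: (vid in connected) for vid in vehicle_ids}

# Sweep the four penetration levels tested in this use case.
vehicles = [f"bg-{i}" for i in range(100)]
for rate in (0.25, 0.50, 0.75, 1.00):
    visibility = mark_connected(vehicles, rate)
```

Unconnected vehicles still drive in the simulation; they are simply absent from the data stream the SVCC algorithm sees.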

Features & Capabilities: A prototype was created to allow a researcher to immediately change the states of traffic signals via an API. The prototype is a Raspberry Pi connected to the Mcity OS network and wired directly to the traffic signals via typical A, B, and C connectors. The local Malfunction Management Unit (MMU) was configured to allow all states, so that an immediate change from the research team would not trigger a fault. A simple REST API was written and installed on the Raspberry Pi, so the research team need only connect to the Mcity OS network to control the signal states.
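A prototype of this kind can be sketched with Python's standard library alone. The routes, payloads, and state names below are illustrative assumptions; the actual API on the Mcity Raspberry Pi is not documented here.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Phases the MMU was configured to permit; names are illustrative.
ALLOWED_STATES = {"red", "yellow", "green", "flash"}

class SignalController:
    """Holds the commanded state for each signal head; in the real
    prototype this layer would drive the A/B/C connector outputs."""
    def __init__(self):
        self.states = {}

    def set_state(self, signal_id, state):
        if state not in ALLOWED_STATES:
            raise ValueError(f"disallowed state: {state}")
        self.states[signal_id] = state
        return self.states[signal_id]

controller = SignalController()

class SignalAPI(BaseHTTPRequestHandler):
    """Minimal REST handler: PUT /signals/<id> with {"state": "green"}.
    The real API's routes and payloads may differ."""
    def do_PUT(self):
        signal_id = self.path.rsplit("/", 1)[-1]
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        try:
            state = controller.set_state(signal_id, body["state"])
            self.send_response(200)
        except ValueError:
            self.send_response(400)
            state = None
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps({"signal": signal_id, "state": state}).encode())

# To serve on the Mcity OS network:
# HTTPServer(("0.0.0.0", 8000), SignalAPI).serve_forever()
```

Keeping the validation in `SignalController` mirrors the MMU arrangement described above: the API accepts any request, but only states the safety layer permits are ever applied.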