Independent Submission L. Dong
Internet-Draft R. Li
Intended status: Informational Futurewei Technologies Inc.
Expires: 29 December 2022 J. Hong
ETRI
27 June 2022
Use Case of Remote Driving and its Network Requirements
draft-dong-remote-driving-usecase-00
Abstract
This document illustrates the use case of remote driving, which
leverages a human driver's advanced perceptual and cognitive skills
to enhance autonomous driving when it is absent or falls short.
Specifically, the document analyzes the end-to-end latency that the
network must provide to support collision avoidance in remote
driving. The document also summarizes the other requirements that
the networking services shall support.
Status of This Memo
This Internet-Draft is submitted in full conformance with the
provisions of BCP 78 and BCP 79.
Internet-Drafts are working documents of the Internet Engineering
Task Force (IETF). Note that other groups may also distribute
working documents as Internet-Drafts. The list of current Internet-
Drafts is at https://datatracker.ietf.org/drafts/current/.
Internet-Drafts are draft documents valid for a maximum of six months
and may be updated, replaced, or obsoleted by other documents at any
time. It is inappropriate to use Internet-Drafts as reference
material or to cite them other than as "work in progress."
This Internet-Draft will expire on 29 December 2022.
Copyright Notice
Copyright (c) 2022 IETF Trust and the persons identified as the
document authors. All rights reserved.
This document is subject to BCP 78 and the IETF Trust's Legal
Provisions Relating to IETF Documents (https://trustee.ietf.org/
license-info) in effect on the date of publication of this document.
Please review these documents carefully, as they describe your rights
and restrictions with respect to this document. Code Components
extracted from this document must include Revised BSD License text as
described in Section 4.e of the Trust Legal Provisions and are
provided without warranty as described in the Revised BSD License.
Table of Contents
1.  Introduction to Autonomous Vehicles
2.  Terms and Abbreviations
3.  Remote Driving
  3.1.  Collision Avoidance in Remote Driving
4.  Network Requirements
5.  IANA Considerations
6.  Security Considerations
7.  Acknowledgements
8.  Informative References
Authors' Addresses
1. Introduction to Autonomous Vehicles
Autonomous vehicles (AVs) have made great progress in recent years.
They rely on numerous well-placed sensors that continuously detect
and observe the location and movement of surrounding vehicles,
conditions on the road, pedestrians, traffic lights, etc. An
autonomous vehicle can be controlled by its own central computer,
which manipulates the steering, accelerator, and brake to achieve
different levels of self-driving.
SAE International's standard "J3016: Taxonomy and Definitions for
Terms Related to Driving Automation Systems for On-Road Motor
Vehicles" defines six Levels of Automation (LoA) [SAEJ3016]: full
automation (Level 5), high automation (Level 4), conditional
automation (Level 3), partial automation (Level 2), driver
assistance (Level 1), and no automation (Level 0).
Although every vehicle manufacturer is making its best effort to
increase the level of automation, current automated vehicles by
themselves only reach SAE Level 2 or 3. AVs may fall short in
unexpected situations. In such cases, it is desirable that a human
can operate the vehicle manually, through remote driving, to recover
from the failure. Until autonomous technology matures enough to
reach Level 5, experts suggest that AVs be backed up by
tele-operation.
2. Terms and Abbreviations
The terms and abbreviations used in this document are listed below.
* AI: Artificial Intelligence
* AV: Autonomous Vehicle
* BE: Best-Effort
* GPS: Global Positioning System
* Lidar: Light Detection and Ranging
* LoA: Level of Automation
The above terminology is defined in greater detail in the remainder
of this document.
3. Remote Driving
Remote driving is a mechanism in which a human driver operates a
vehicle from a distance through communication networks. Remote
driving leverages the human driver's advanced perceptual and
cognitive skills to assist autonomous driving when it falls short,
and it can resolve many complex situations that computer vision or
artificial intelligence cannot foresee or apprehend. Such situations
and possible failures of autonomous driving include:
* Perception failure at night or under challenging weather
conditions, e.g., low visibility due to fog or lane markers covered
by snow.
* Confusing or malfunctioning traffic lights, or traffic signs made
unrecognizable by corrosion or graffiti.
* Confusing detour signs or complex instructions given temporarily
by police officers, which require extra knowledge about the local
traffic and an understanding of the local construction works.
* Complex or confusing parking signs, which might be handwritten and
hard for a computer to understand; parking might only be allowed on
certain dates during the week, or a parking lot might be reserved
for certain types of vehicles.
With remote driving added to the AV control loop, passengers can
have greater confidence in the safety of the ride.
Remotely operated vehicles may also be of interest to personal
transportation services. Vay, a Berlin-based startup [Vay], plans to
debut a fleet of taxis controlled by remote teledrivers by 2022.
The concept behind Vay is that when you order a vehicle, one of the
teledrivers navigates it to your pickup location. You then take
control of the vehicle yourself. After you reach your destination,
the teledriver takes control again and delivers the vehicle to the
next nearby customer. In the whole transaction, remote driving is
used only to deliver the vehicle. This is how the service is
advertised for the initial roll-out stage; in later stages, when the
technologies are mature enough, teledrivers might remotely drive the
customers around. Vay promises a system that is safer than
conventional driving because it controls the top four causes of
fatal urban accidents: driving under the influence, speeding,
distraction, and fatigue.
Remotely operated trucks could eliminate the threats to road safety
and to driver/passenger safety that are caused by fleet driver
fatigue during long drives. Remotely operated vehicles are also
particularly useful, compared to autonomous trucking [Tusimple], in
situations where it would be hazardous or impossible for humans to
operate, for example, construction vehicles at remote sites or
emergency service vehicles in areas affected by chemical spills,
active wildfires, or hurricane conditions.
A remotely controlled vehicle needs to transmit the necessary data
in high volumes to the remote operation center, which might be
located in an edge cloud or a central cloud. The data includes all
the sensory feeds that the autonomous vehicle itself can collect.
Signals from GPS (Global Positioning System) satellites can be
combined with readings from tachometers, altimeters, and gyroscopes
to provide more accurate positioning of the vehicle. Radar sensors
monitor the positions of other vehicles nearby. Lidar (Light
Detection and Ranging) sensors bounce pulses of light off the
surroundings to identify lane markings and road boundaries.
Ultrasonic sensors measure the position of objects that are very
close to the vehicle. Video cameras continuously take pictures of
the surroundings from different angles. This volumetric data is
sent from the vehicle to the remote driving center to provide the
remote driver with adequate perception of the environment; the
remote driver can then issue appropriate instructions to help the
autonomous vehicle resolve the issue.
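To make the shape of this uplink traffic concrete, the following
sketch shows one possible structure for a single sensor-feed sample
sent from the vehicle to the remote operation center. It is purely
illustrative and not specified by this document; all names, field
choices, and units are assumptions.

   # Illustrative structure of one uplink sample from the vehicle
   # to the remote operation center.  Field names, types, and
   # units are assumptions for illustration only.

   from dataclasses import dataclass
   from typing import List, Tuple

   @dataclass
   class SensorFeedSample:
       timestamp_us: int            # capture time, microseconds
       gps_lat: float               # degrees; fused with gyroscope,
       gps_lon: float               # tachometer, and altimeter data
       radar_tracks: List[Tuple[float, float, float]]
                                    # (bearing_deg, range_m, speed_mps)
       lidar_points: bytes          # compressed point cloud
       camera_frame: bytes          # encoded video frame
       ultrasonic_m: List[float]    # near-field distances per sensor

A continuous, high-rate stream of such samples constitutes the
volumetric data that the remote driver relies on.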
3.1. Collision Avoidance in Remote Driving
In this section, we use the specific collision avoidance scenario in
remote driving shown in Figure 1 to illustrate the support that the
network and its protocols need to provide. Many similar use cases
have already been specified in [TR22.885] and [TR22.886].
______ [ ]
/|_||_\`.__ [ ]
( _ _ _\ <----Collision Avoidance Distance--->[ ]
=`-(_)--(_)-' [ P ]
.-~~~-.
.- ~ ~-( )_ _
/ ~ -.
| Networks \
\ .'
~- . _____________ . -~
+------+
+Remote+
+driver+
+------+
Figure 1: Collision Avoidance in Remote Driving
Given the current technologies in sensing, encoding, and decoding,
together with the Best-Effort (BE) service provided by the current
Internet, the total round-trip delay between the time when the
roadside camera captures the picture of a pedestrian at the
crossroad and the time when the self-driving car receives the signal
to brake is around 250-400 ms. Note that this total already includes
the remote driver's reaction time, and every component of it adds to
the distance the vehicle travels before it can come to a stop. The
detailed breakdown of the total latency is shown below, followed by
a short sketch of the arithmetic:
* Image capture, encoding, decoding and display: 100 ms [Nuvation]
[Sensoray];
* Remote driver's reaction time: 100 ms;
* Total transmission time in the network: 50-200 ms, which includes
the time for the image data to reach the remote driver as well as
the time for the command to reach the vehicle [VerizonNetwork]
[Candela2020]. The image data could be encapsulated in multiple
packets, depending on the image resolution and size, so the total
transmission time in the network might involve the transmission of
two or more packets. Given the best-effort nature of the current
Internet, the total transmission time is not deterministic and
varies on a per-packet basis; it might, for example, range from
50 ms to 200 ms.
* Total: 250-400 ms.
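A minimal Python sketch of this budget follows; the constant names
are ours, and the component values are simply those listed above.

   # End-to-end latency budget, using the component values above.
   # Constant names are illustrative, not from any cited system.

   CAPTURE_ENCODE_DECODE_DISPLAY_MS = 100  # pipeline [Nuvation][Sensoray]
   DRIVER_REACTION_MS = 100                # remote driver's reaction time
   NETWORK_MS_MIN, NETWORK_MS_MAX = 50, 200  # best-effort Internet

   def total_latency_ms(network_ms):
       """Delay from camera capture to brake command reaching the car."""
       return (CAPTURE_ENCODE_DECODE_DISPLAY_MS
               + DRIVER_REACTION_MS + network_ms)

   print(total_latency_ms(NETWORK_MS_MIN),
         total_latency_ms(NETWORK_MS_MAX))  # -> 250 400 (i.e., 250-400 ms)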
The collision avoidance distance is proportional to the vehicle
speed. For example, if the car is driving at 60 km/hour, the
collision avoidance distance must be longer than about 7 meters; in
other words, the self-driving car must start to brake roughly 7
meters or more away from the pedestrian. Table 1 shows the
calculation of the collision avoidance distance based on the
vehicle's speed and the current total latency of 0.4 seconds.
If the vehicle is driving at a higher speed (e.g., 80 km/hour) and
needs to start braking at a shorter distance from the pedestrian
(e.g., 4 meters), the total round-trip delay must be much shorter:
4/(80/3.6) = 0.18 s, i.e., 180 ms. Assuming that, with advances in
technology, the total time needed for sensory image capture,
framing, encoding, decoding, and display is reduced to 60 ms, and
the remote driver's reaction time remains 100 ms, the total
transmission time in the network cannot be longer than
180 - 60 - 100 = 20 ms. Within these 20 ms, the captured image or
video data and other sensory data need to arrive at the remote
server, and the command from the remote driver needs to reach the
vehicle as well.
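The same derivation can be written as a small helper function; this
is a sketch under the assumptions above, and the function name and
signature are ours.

   # How much of the end-to-end deadline remains for the network
   # once the processing pipeline and the driver's reaction time
   # are subtracted.

   def network_budget_ms(distance_m, speed_kmh,
                         processing_ms, reaction_ms):
       speed_mps = speed_kmh / 3.6                  # km/h -> m/s
       deadline_ms = distance_m / speed_mps * 1000  # time to cover gap
       return round(deadline_ms - processing_ms - reaction_ms, 1)

   # 80 km/h, brake 4 m before the pedestrian, 60 ms pipeline,
   # 100 ms reaction time:
   print(network_budget_ms(4, 80, 60, 100))         # -> 20.0 (ms)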
+==========================+==============================+
| Speed                    | Collision Avoidance Distance |
+==========================+==============================+
| 5 km/hour = 1.4 m/sec    | 1.4*0.4 = 0.56 m             |
+--------------------------+------------------------------+
| 30 km/hour = 8.3 m/sec   | 8.3*0.4 = 3.32 m             |
+--------------------------+------------------------------+
| 60 km/hour = 16.7 m/sec  | 16.7*0.4 = 6.68 m            |
+--------------------------+------------------------------+
| 80 km/hour = 22.2 m/sec  | 22.2*0.4 = 8.88 m            |
+--------------------------+------------------------------+
| 120 km/hour = 33.3 m/sec | 33.3*0.4 = 13.32 m           |
+--------------------------+------------------------------+
Table 1: Collision avoidance distance based on the
vehicle's speed and a 0.4 s total latency
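The table can be reproduced with a few lines of Python; the list of
speeds and the one-decimal rounding of the converted speed are our
own presentation choices.

   # Reproduce Table 1: distance traveled during the 0.4 s
   # end-to-end latency at each vehicle speed.

   TOTAL_LATENCY_S = 0.4

   for speed_kmh in (5, 30, 60, 80, 120):
       speed_mps = round(speed_kmh / 3.6, 1)            # km/h -> m/s
       distance_m = round(speed_mps * TOTAL_LATENCY_S, 2)
       print(f"{speed_kmh} km/hour = {speed_mps} m/sec"
             f" -> {distance_m} m")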
4. Network Requirements
The following requirements need to be supported by the networks:
* The networking services shall support multiple concurrent flows at
high data rates, including volumetric data transmission from
vehicles with high mobility.
* The networks shall deliver services with service level objectives,
specifically latency objectives. The latency objectives must be
precisely guaranteed and highly reliable: not merely "optimized",
but quantifiable.
* The networks shall be able to identify the packets that carry
urgent information and treat them in a differentiated manner with
the highest priority, as sketched after this list.
* The networking services shall reduce, and where possible avoid,
the dropping and retransmission of packets of high significance.
Loss of certain urgent packets is not permissible in the network.
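As one possible illustration of the third requirement (an assumption
of this sketch, not a mechanism mandated by this document), a sender
can mark urgent packets with a Diffserv code point such as EF
(Expedited Forwarding) through the standard socket API:

   # Mark a UDP socket that carries urgent brake commands with the
   # Expedited Forwarding (EF) DSCP so that Diffserv-aware routers
   # can prioritize it.  Address and port are documentation examples.

   import socket

   DSCP_EF = 46                # EF per RFC 3246
   TOS_EF = DSCP_EF << 2       # DSCP occupies the upper 6 bits of TOS

   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
   sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_EF)
   sock.sendto(b"BRAKE", ("192.0.2.10", 5005))  # TEST-NET-1 address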
5. IANA Considerations
This document requires no actions from IANA.
6. Security Considerations
This document introduces no new security issues.
7. Acknowledgements
8. Informative References
[Candela2020]
Candela, M., Luconi, V., and A. Vecchio, "Impact of the
COVID-19 pandemic on the Internet latency: A large-scale
study", Computer Networks, vol. 182, no. 11, 2020,
<https://doi.org/10.1016/j.comnet.2020.107495>.
[Nuvation] "Video Capture and Display", 2022,
<https://www.nuvation.com/industrial-video-capture-
display-system>.
[SAEJ3016] "Taxonomy and Definitions for Terms Related to Driving
Automation Systems for On-Road Motor Vehicles, SAE
J3016_202104", 2021, <sae.org/standards/content/
j3016_202104/>.
[Sensoray] Eberlein, P., "Video Latency, What It Is and Why It's
Important", 2015, <https://www.sensoray.com/>.
[TR22.885] "Study on LTE support for Vehicle to Everything (V2X)
services, 3GPP TR 22.885", 2015,
<https://www.3gpp.org/ftp/Specs/
archive/22_series/22.885/>.
[TR22.886] "Study on enhancement of 3GPP Support for 5G V2X Services,
3GPP TR 22.886", 2018, <https://www.3gpp.org/ftp/Specs/
archive/22_series/22.886/>.
[Tusimple] "TuSimple Autonomous Trucking", 2022,
<https://www.tusimple.com/>.
[Vay] "A New Approach to Driverless mobility", 2022,
<https://vay.io/>.
[VerizonNetwork]
"Verizon Network Latency Statistics", 2022,
<https://www.verizon.com/business/solutions/business-
continuity/weekly-latency-statistics/>.
Authors' Addresses
Lijun Dong
Futurewei Technologies Inc.
Email: lijun.dong@futurewei.com
Richard Li
Futurewei Technologies Inc.
Email: richard.li@futurewei.com
Jungha Hong
ETRI
Email: jhong@etri.re.kr