Internet Engineering Task Force (IETF)                       R. Krishna
Request for Comments: 9699
Category: Informational                                       A. Rahman
ISSN: 2070-1721                                                 Ericsson
                                                           December 2024


  Media Operations Use Case for an Extended Reality Application on Edge
                        Computing Infrastructure
Abstract

   This document explores the issues involved in the use of Edge
   Computing resources to operationalize media use cases that involve
   Extended Reality (XR) applications.  In particular, this document
   discusses XR applications that run on devices having different form
   factors (such as different physical sizes and shapes) and need Edge
   computing resources to mitigate the effect of problems such as the
   need to support interactive communication requiring low latency,
   limited battery power, and heat dissipation from those devices.
   Network operators who are interested in providing edge computing
   resources to operationalize the requirements of such applications
   are the intended audience for this document.  This document also
   discusses the expected behavior of XR applications, which can be
   used to manage traffic, and the service requirements for XR
   applications to be able to run on the network.
Status of This Memo

   This document is not an Internet Standards Track specification; it
   is published for informational purposes.

   This document is a product of the Internet Engineering Task Force
   (IETF).  It represents the consensus of the IETF community.  It has
   received public review and has been approved for publication by the
   Internet Engineering Steering Group (IESG).  Not all documents
   approved by the IESG are candidates for any level of Internet
   Standard; see Section 2 of RFC 7841.

   Information about the current status of this document, any errata,
   and how to provide feedback on it may be obtained at
   https://www.rfc-editor.org/info/rfc9699.
Copyright Notice

   Copyright (c) 2024 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (https://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with
   respect to this document.  Code Components extracted from this
   document must include Revised BSD License text as described in
   Section 4.e of the Trust Legal Provisions and are provided without
   warranty as described in the Revised BSD License.
Table of Contents

   1.  Introduction
   2.  Use Case
     2.1.  Processing of Scenes
     2.2.  Generation of Images
   3.  Technical Challenges and Solutions
   4.  XR Network Traffic
     4.1.  Traffic Workload
     4.2.  Traffic Performance Metrics
   5.  Conclusion
   6.  IANA Considerations
   7.  Security Considerations
   8.  Informative References
   Acknowledgements
   Authors' Addresses
1.  Introduction

   Extended Reality (XR) is a term that includes Augmented Reality
   (AR), Virtual Reality (VR), and Mixed Reality (MR) [XR].  AR
   combines the real and virtual, is interactive, and is aligned to the
   physical world of the user [AUGMENTED_2].  On the other hand, VR
   places the user inside a virtual environment generated by a computer
   [AUGMENTED].  MR merges the real and virtual along a continuum that
   connects a completely real environment at one end to a completely
   virtual environment at the other end.  In this continuum, all
   combinations of the real and virtual are captured [AUGMENTED].

   XR applications have several requirements for the network and the
   mobile devices running these applications.  Some XR applications
   such as AR require real-time processing of video streams to
   recognize specific objects.  This is then used to overlay
   information on the video being displayed to the user.  In addition,
   XR applications such as AR and VR will also require generation of
   new video frames to be played to the user.  Both the real-time
   processing of video streams and the generation of overlay
   information are computationally intensive tasks that generate heat
   [DEV_HEAT_1] [DEV_HEAT_2] and drain battery power [BATT_DRAIN] on
   the mobile device running the XR application.  Consequently, in
   order to run applications with XR characteristics on mobile devices,
   computationally intensive tasks need to be offloaded to resources
   provided by Edge Computing.
   Edge Computing is an emerging paradigm where, for the purpose of
   this document, computing resources and storage are made available in
   close network proximity at the edge of the Internet to mobile
   devices and sensors [EDGE_1] [EDGE_2].  A computing resource or
   storage is in close network proximity to a mobile device or sensor
   if there is a short and high-capacity network path to it such that
   the latency and bandwidth requirements of applications running on
   those mobile devices or sensors can be met.  These edge computing
   devices use cloud technologies that enable them to support offloaded
   XR applications.  In particular, cloud implementation techniques
   [EDGE_3] such as the following can be deployed:

   Disaggregation:  Using Software-Defined Networking (SDN) to break
      vertically integrated systems into independent components.  These
      components can have open interfaces that are standard, well
      documented, and non-proprietary.

   Virtualization:  Being able to run multiple independent copies of
      those components, such as SDN Controller applications and Virtual
      Network Functions, on a common hardware platform.

   Commoditization:  Being able to elastically scale those virtual
      components across commodity hardware as the workload dictates.

   Such techniques enable XR applications that require low latency and
   high bandwidth to be delivered by proximate edge devices.  This is
   because the disaggregated components can run on proximate edge
   devices rather than on a remote cloud several hops away and deliver
   low-latency, high-bandwidth service to offloaded applications
   [EDGE_2].
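   The notion of "close network proximity" above reduces to whether the
   measured path to a candidate edge resource can satisfy an
   application's latency and bandwidth requirements.  The following
   minimal Python sketch (the type names and example thresholds are
   assumptions made for illustration, not values defined by this
   document) expresses that check directly:

   # Illustrative only: decide whether a candidate edge resource is in
   # "close network proximity" for a given application, i.e., whether
   # the measured path to it satisfies the application's latency and
   # bandwidth requirements.  Field names and thresholds are assumed.

   from dataclasses import dataclass

   @dataclass
   class PathMeasurement:
       rtt_ms: float          # measured round-trip time to the resource
       bandwidth_mbps: float  # measured available path capacity

   @dataclass
   class AppRequirements:
       max_rtt_ms: float          # largest RTT the application tolerates
       min_bandwidth_mbps: float  # smallest usable path capacity

   def in_close_network_proximity(path: PathMeasurement,
                                  app: AppRequirements) -> bool:
       """True if the path is short and high-capacity enough."""
       return (path.rtt_ms <= app.max_rtt_ms and
               path.bandwidth_mbps >= app.min_bandwidth_mbps)

   # Example: a candidate edge site 2 ms away with 500 Mbps available.
   print(in_close_network_proximity(PathMeasurement(2.0, 500.0),
                                    AppRequirements(8.0, 200.0)))  # True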
   This document discusses the issues involved when edge computing
   resources are offered by network operators to operationalize the
   requirements of XR applications running on devices with various form
   factors.  For the purpose of this document, a network operator is
   any organization or individual that manages or operates the
   computing resources or storage in close network proximity to a
   mobile device or sensor.  Examples of form factors include head-
   mounted displays (HMDs), such as optical see-through HMDs and video
   see-through HMDs, and hand-held displays.  Smartphones with video
   cameras and location-sensing capabilities using systems such as a
   global navigation satellite system (GNSS) are another example of
   such devices.  These devices have limited battery capacity and
   dissipate heat when running.  Also, as the user of these devices
   moves around as they run the XR application, the wireless latency
   and bandwidth available to the devices fluctuates, and the
   communication link itself might fail.  As a result, algorithms such
   as those based on Adaptive Bitrate (ABR) techniques that base their
   policy on heuristics or models of deployment perform sub-optimally
   in such dynamic environments [ABR_1].  In addition, network
   operators can expect that the parameters that characterize the
   expected behavior of XR applications are heavy-tailed.  Heaviness of
   tails is defined as the difference from the normal distribution in
   the proportion of the values that fall a long way from the mean
   [HEAVY_TAIL_3].  Such workloads require appropriate resource
   management policies to be used on the Edge.  The service
   requirements of XR applications are also challenging when compared
   to current video applications.  In particular, several Quality-of-
   Experience (QoE) factors such as motion sickness are unique to XR
   applications and must be considered when operationalizing a network.
   This document motivates these issues with a use case that is
   presented in the following section.
2.  Use Case

   This use case involves an application with characteristics of an XR
   system.  Consider a group of tourists who are taking a tour around
   the historical site of the Tower of London.  As they move around the
   site and within the historical buildings, they can watch and listen
   to historical scenes in 3D that are generated by the XR application
   and then overlaid by their XR headsets onto their real-world view.
   The headset continuously updates their view as they move around.

   The XR application first processes the scene that the walking
   tourist is watching in real time and identifies objects that will be
   targeted for overlay of high-resolution videos.  It then generates
   high-resolution 3D images of historical scenes related to the
   perspective of the tourist in real time.  These generated video
   images are then overlaid on the view of the real world as seen by
   the tourist.

   This processing of scenes and generation of high-resolution images
   are discussed in greater detail below.
2.1.  Processing of Scenes

   The task of processing a scene can be broken down into a pipeline of
   three consecutive subtasks: tracking, acquisition of a model of the
   real world, and registration [AUGMENTED].  An illustrative sketch of
   this pipeline is given at the end of this section.

   Tracking:  The XR application that runs on the mobile device needs
      to track the six-dimensional pose (translational in the three
      perpendicular axes and rotational about those three axes) of the
      user's head, eyes, and objects that are in view [AUGMENTED].
      This requires tracking natural features (for example, points or
      edges of objects) that are then used in the next stage of the
      pipeline.
   Acquisition of a model of the real world:  The tracked natural
      features are used to develop a model of the real world.  One of
      the ways this is done is to develop a model based on an annotated
      point cloud (a set of points in space that are annotated with
      descriptors) that is then stored in a database.  To ensure that
      this database can be scaled up, techniques such as combining
      client-side simultaneous tracking and mapping with server-side
      localization are used to construct a model of the real world
      [SLAM_1] [SLAM_2] [SLAM_3] [SLAM_4].  Another model that can be
      built is based on a polygon mesh and texture mapping technique.
      The polygon mesh encodes a 3D object's shape, which is expressed
      as a collection of small flat surfaces that are polygons.  In
      texture mapping, color patterns are mapped onto an object's
      surface.  A third modeling technique uses a 2D lightfield that
      describes the intensity or color of the light rays arriving at a
      single point from arbitrary directions.  Such a 2D lightfield is
      stored as a two-dimensional table.  Assuming distant light
      sources, the single point is approximately valid for small
      scenes.  For larger scenes, many 3D positions are additionally
      stored, making the table 5D.  A set of all such points (either a
      2D or 5D lightfield) can then be used to construct a model of the
      real world [AUGMENTED].
   Registration:  The coordinate systems, brightness, and color of
      virtual and real objects need to be aligned with each other; this
      process is called "registration" [REG].  Once the natural
      features are tracked as discussed above, virtual objects are
      geometrically aligned with those features by geometric
      registration.  This is followed by resolving occlusion that can
      occur between virtual and real objects [OCCL_1] [OCCL_2].  The XR
      application also applies photometric registration [PHOTO_REG] by
      aligning brightness and color between the virtual and real
      objects.  Additionally, algorithms that calculate global
      illumination of both the virtual and real objects [GLB_ILLUM_1]
      [GLB_ILLUM_2] are executed.  Various algorithms are also required
      to deal with artifacts generated by lens distortion [LENS_DIST],
      blur [BLUR], noise [NOISE], etc.
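   The three subtasks above form a strictly sequential pipeline in
   which the output of one stage feeds the next.  The following Python
   sketch shows only that structure; the stage bodies are placeholders
   (real implementations would use the SLAM, mesh, or lightfield
   techniques described above), and all function and variable names are
   illustrative assumptions rather than an API defined by this
   document:

   # Minimal structural sketch of the scene-processing pipeline in this
   # section: tracking -> model acquisition -> registration.  The stage
   # bodies are placeholders; only the data flow between stages is shown.

   def track_features(camera_frame, sensor_data):
       """Estimate the 6D pose of head/eyes/objects and extract natural
       features (e.g., points or edges) from the current frame."""
       pose = {"translation": (0.0, 0.0, 0.0), "rotation": (0.0, 0.0, 0.0)}
       features = [{"type": "point", "xyz": (0.0, 0.0, 0.0)}]
       return pose, features

   def acquire_world_model(features, model_db):
       """Update a model of the real world (e.g., an annotated point
       cloud, a polygon mesh with textures, or a lightfield) from the
       tracked features."""
       model_db.setdefault("points", []).extend(features)
       return model_db

   def register(pose, world_model):
       """Align coordinate systems, brightness, and color of virtual and
       real objects; resolve occlusion; apply photometric registration."""
       return {"aligned_pose": pose, "occlusion_resolved": True}

   def process_scene(camera_frame, sensor_data, model_db):
       pose, features = track_features(camera_frame, sensor_data)
       world_model = acquire_world_model(features, model_db)
       return register(pose, world_model)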
2.2.  Generation of Images

   The XR application must generate a high-quality video that has the
   properties described in the previous step and overlay the video on
   the XR device's display.  This step is called "situated
   visualization".  A situated visualization is a visualization in
   which the virtual objects that need to be seen by the XR user are
   overlaid correctly on the real world.  This entails dealing with
   registration errors that may arise, ensuring that there is no visual
   interference [VIS_INTERFERE], and finally maintaining temporal
   coherence by adapting to the movement of the user's eyes and head.
3.  Technical Challenges and Solutions

   As discussed in Section 2, the components of XR applications perform
   tasks that are computationally intensive, such as real-time
   generation and processing of high-quality video content.  This
   section discusses the challenges such applications can face as a
   consequence.

   As a result of performing computationally intensive tasks on XR
   devices such as XR glasses, excessive heat is generated by the
   chipsets that are involved in the computation [DEV_HEAT_1]
   [DEV_HEAT_2].  Additionally, the battery on such devices discharges
   quickly when running such applications [BATT_DRAIN].
   A solution to the problem of heat dissipation and battery drainage
   is to offload the processing and video generation tasks to the
   remote cloud.  However, running such tasks on the cloud is not
   feasible as the end-to-end delays must be within the order of a few
   milliseconds.  Additionally, such applications require high
   bandwidth and low jitter to provide a high QoE to the user.  In
   order to achieve such hard timing constraints, computationally
   intensive tasks can be offloaded to Edge devices.
   Another requirement for our use case and similar applications, such
   as 360-degree streaming (streaming of video that represents a view
   in every direction in 3D space), is that the display on the XR
   device should synchronize the visual input with the way the user is
   moving their head.  This synchronization is necessary to avoid
   motion sickness that results from a time lag between when the user
   moves their head and when the appropriate video scene is rendered.
   This time lag is often called "motion-to-photon delay".  Studies
   have shown that this delay can be at most 20 ms and preferably
   between 7-15 ms in order to avoid motion sickness [PER_SENSE] [XR]
   [OCCL_3].  Out of these 20 ms, display techniques including the
   refresh rate of write displays and pixel switching take 12-13 ms
   [OCCL_3] [CLOUD].  This leaves 7-8 ms for the processing of motion
   sensor inputs, graphic rendering, and round-trip time (RTT) between
   the XR device and the Edge.  The use of predictive techniques to
   mask latencies has been considered as a mitigating strategy to
   reduce motion sickness [PREDICT].  In addition, Edge Devices that
   are proximate to the user might be used to offload these
   computationally intensive tasks.  Towards this end, a 3GPP study
   indicates an Ultra-Reliable Low Latency of 0.1 to 1 ms for
   communication between an Edge server and User Equipment (UE)
   [URLLC].
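   As a rough worked example of this delay budget (the split of the
   remaining 7-8 ms among sensing, rendering, and the RTT below is an
   assumption made for illustration, not a figure from the cited
   studies):

   # Illustrative arithmetic for the motion-to-photon budget described
   # above.  The 20 ms and 12-13 ms figures follow this section; the
   # split of the remaining budget is a hypothetical example.

   MOTION_TO_PHOTON_MAX_MS = 20.0   # upper bound to avoid motion sickness
   DISPLAY_MS = 13.0                # display refresh + pixel switching

   remaining_ms = MOTION_TO_PHOTON_MAX_MS - DISPLAY_MS    # about 7-8 ms

   sensor_processing_ms = 2.0       # assumed share for motion sensing
   rendering_ms = 3.0               # assumed share for graphic rendering
   rtt_budget_ms = remaining_ms - sensor_processing_ms - rendering_ms

   print(f"Budget left for device-to-edge RTT: {rtt_budget_ms:.1f} ms")
   # Prints 2.0 ms, i.e., well within the 0.1-1 ms URLLC target cited
   # above but far tighter than a typical path to a distant cloud.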
   Note that the Edge device providing the computation and storage is
   itself limited in such resources compared to the cloud.  For
   example, a sudden surge in demand from a large group of tourists can
   overwhelm the device.  This will result in a degraded user
   experience as their XR device experiences delays in receiving the
   video frames.  In order to deal with this problem, the client XR
   applications will need to use ABR algorithms that choose bitrate
   policies tailored in a fine-grained manner to the resource demands
   and play back the videos with appropriate QoE metrics as the user
   moves around with the group of tourists.
   However, the heavy-tailed nature of several operational parameters
   makes prediction-based adaptation by ABR algorithms sub-optimal
   [ABR_2].  This is because with such distributions, the law of large
   numbers (how long it takes for the sample mean to stabilize) works
   too slowly [HEAVY_TAIL_2] and the mean of the sample does not equal
   the mean of the distribution [HEAVY_TAIL_2]; as a result, standard
   deviation and variance are unsuitable as metrics for such
   operational parameters [HEAVY_TAIL_1].  Other subtle issues with
   these distributions include the "expectation paradox" [HEAVY_TAIL_1]
   (the longer the wait for an event, the longer a further need to
   wait) and the mismatch between the size and count of events
   [HEAVY_TAIL_1].  This makes designing an algorithm for adaptation
   error-prone and challenging.  Such operational parameters include
   but are not limited to buffer occupancy, throughput, client-server
   latency, and variable transmission times.  In addition, edge devices
   and communication links may fail, and logical communication
   relationships between various software components change frequently
   as the user moves around with their XR device [UBICOMP].
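   A small simulation (an illustration only, not taken from the cited
   references) makes the first of these points concrete: the running
   sample mean that a prediction-based ABR algorithm might rely on
   stabilizes far more slowly for a heavy-tailed (Pareto) parameter
   than for a light-tailed (exponential) one with the same mean:

   # Illustration of why prediction from sample means is fragile for
   # heavy-tailed operational parameters: the running mean of Pareto
   # samples stabilizes far more slowly than that of exponential
   # samples with the same theoretical mean.

   import numpy as np

   rng = np.random.default_rng(0)
   n = 200_000
   alpha = 1.2                                 # heavy tail: finite mean,
                                               # infinite variance
   pareto = rng.pareto(alpha, n) + 1.0         # classical Pareto, mean 6.0
   expo = rng.exponential(scale=6.0, size=n)   # same mean, light tail

   for k in (1_000, 10_000, 100_000, 200_000):
       print(f"n={k:>7}  pareto mean={pareto[:k].mean():6.2f}  "
             f"exponential mean={expo[:k].mean():6.2f}")
   # The exponential running mean settles near 6 almost immediately,
   # while the Pareto running mean keeps jumping as rare, very large
   # samples arrive.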
4.  XR Network Traffic

4.1.  Traffic Workload

   As discussed earlier, the parameters that capture the
   characteristics of XR application behavior are heavy-tailed.
   Examples of such parameters include the distribution of arrival
   times between XR application invocation, the amount of data
   transferred, and the inter-arrival times of packets within a
   session.  As a result, any traffic model based on such parameters is
   also heavy-tailed.  Using these models to predict performance under
   alternative resource allocations by the network operator is
   challenging.  For example, both uplink and downlink traffic to a
   user device has parameters such as volume of XR data, burst time,
   and idle time that are heavy-tailed.
   Table 1 below shows various streaming video applications and their
   associated throughput requirements [METRICS_1].  Since our use case
   envisages a 6 degrees of freedom (6DoF) video or point cloud, the
   table indicates that it will require 200 to 1000 Mbps of bandwidth.
   Also, the table shows that XR applications, such as the one in our
   use case, transmit a larger amount of data per unit time as compared
   to traditional video applications.  As a result, issues arising from
   heavy-tailed parameters, such as long-range dependent traffic
   [METRICS_2] and self-similar traffic [METRICS_3], would be
   experienced at timescales of milliseconds and microseconds rather
   than hours or seconds.  Additionally, burstiness at the timescale of
   tens of milliseconds due to the multi-fractal spectrum of traffic
   will be experienced [METRICS_4].  Long-range dependent traffic can
   have long bursts, and various traffic parameters from widely
   separated times can show correlation [HEAVY_TAIL_1].  Self-similar
   traffic contains bursts at a wide range of timescales
   [HEAVY_TAIL_1].  Multi-fractal spectrum bursts for traffic summarize
   the statistical distribution of local scaling exponents found in a
   traffic trace [HEAVY_TAIL_1].  The operational consequence of XR
   traffic having characteristics such as long-range dependency and
   self-similarity is that the edge servers to which multiple XR
   devices are connected wirelessly could face long bursts of traffic
   [METRICS_2] [METRICS_3].  In addition, multi-fractal spectrum
   burstiness at the scale of milliseconds could induce jitter
   contributing to motion sickness [METRICS_4].  This is because bursty
   traffic combined with variable queueing delays leads to large delay
   jitter [METRICS_4].  The operators of edge servers will need to run
   a "managed edge cloud service" [METRICS_5] to deal with the above
   problems.  Functionalities that such a managed edge cloud service
   could operationally provide include dynamic placement of XR servers,
   mobility support, and energy management [METRICS_6].  Providing Edge
   server support for the techniques being developed at the DETNET
   Working Group in the IETF [RFC8939] [RFC9023] [RFC9450] could
   guarantee performance of XR applications.  For example, these
   techniques could be used for the link between the XR device and the
   edge as well as within the managed edge cloud service.  Another
   option for network operators would be to deploy equipment that
   supports differentiated services [RFC2475] or per-connection
   Quality-of-Service (QoS) guarantees [RFC2210].
   +===============================================+============+
   | Application                                   | Throughput |
   |                                               | Required   |
   +===============================================+============+
   | Real-world objects annotated with text and    | 1 Mbps     |
   | images for workflow assistance (e.g., repair) |            |
   +-----------------------------------------------+------------+
   | Video conferencing                            | 2 Mbps     |
   +-----------------------------------------------+------------+
   | 3D model and data visualization               | 2 to 20    |
   |                                               | Mbps       |
   +-----------------------------------------------+------------+
   | Two-way 3D telepresence                       | 5 to 25    |
   |                                               | Mbps       |
   +-----------------------------------------------+------------+
   | Current-Gen 360-degree video (4K)             | 10 to 50   |
   |                                               | Mbps       |
   +-----------------------------------------------+------------+
   | Next-Gen 360-degree video (8K, 90+ frames per | 50 to 200  |
   | second, high dynamic range, stereoscopic)     | Mbps       |
   +-----------------------------------------------+------------+
   | 6DoF video or point cloud                     | 200 to     |
   |                                               | 1000 Mbps  |
   +-----------------------------------------------+------------+

        Table 1: Throughput Requirements for Streaming Video
                             Applications
   Thus, the provisioning of edge servers (in terms of the number of
   servers, the topology, the placement of servers, the assignment of
   link capacity, CPUs, and Graphics Processing Units (GPUs)) should be
   performed with the above factors in mind.
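   To give a feel for the kind of burstiness such an edge server can
   face, the following toy simulation (an illustration only; the source
   count, Pareto shape, and ON/OFF durations are assumptions rather
   than measured XR parameters) superposes many ON/OFF sources whose ON
   periods are heavy-tailed, a classical construction that produces
   long-range dependent, self-similar aggregate traffic:

   # Toy model of an edge server aggregating many XR flows: each source
   # alternates ON/OFF with heavy-tailed (Pareto) ON durations, so the
   # aggregate keeps showing bursts even at coarse timescales.

   import numpy as np

   rng = np.random.default_rng(1)

   def on_off_source(slots, alpha=1.4, mean_off=10.0, rate_on=1.0):
       """Return an activity series: Pareto ON periods, exponential OFF."""
       series = np.zeros(slots)
       t = 0
       while t < slots:
           on = int(np.ceil(rng.pareto(alpha) + 1.0))   # heavy-tailed ON
           series[t:t + on] = rate_on
           t += on + int(np.ceil(rng.exponential(mean_off)))
       return series[:slots]

   slots, sources = 50_000, 50
   aggregate = sum(on_off_source(slots) for _ in range(sources))

   # Burstiness persists as the averaging window grows:
   for scale in (1, 10, 100, 1000):
       coarse = aggregate[: slots // scale * scale]
       coarse = coarse.reshape(-1, scale).mean(axis=1)
       print(f"averaging window={scale:>4} slots  "
             f"peak/mean={coarse.max() / coarse.mean():.2f}")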
4.2.  Traffic Performance Metrics

   The performance requirements for XR traffic have characteristics
   that need to be considered when operationalizing a network.  These
   characteristics are discussed in this section.

   The bandwidth requirements of XR applications are substantially
   higher than those of video-based applications.

   The latency requirements of XR applications have been studied
   recently [XR_TRAFFIC].  The following characteristics were
   identified:

   *  The uploading of data from an XR device to a remote server for
      processing dominates the end-to-end latency.

   *  A lack of visual features in the grid environment can cause
      increased latencies as the XR device uploads additional visual
      data for processing to the remote server.

   *  XR applications tend to have large bursts that are separated by
      significant time gaps.

   Additionally, XR applications interact with each other on a
   timescale of an RTT propagation, and this must be considered when
   operationalizing a network.
   Table 2 [METRICS_6] shows a taxonomy of applications with their
   associated required response times and bandwidths.  Response times
   can be defined as the time interval between the end of a request
   submission and the end of the corresponding response from a system.
   If the XR device offloads a task to an edge server, the response
   time of the server is the RTT from when a data packet is sent from
   the XR device until a response is received.  Note that the required
   response time provides an upper bound for the sum of the time taken
   by computational tasks (such as processing of scenes and generation
   of images) and the RTT.  This response time depends only on the QoS
   required by an application.  The response time is therefore
   independent of the underlying technology of the network and the time
   taken by the computational tasks.
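   Expressed as a simple feasibility check (the function name and the
   sample timings below are assumptions used only to illustrate the
   bound just described):

   # Illustrative check of the bound above: the required response time
   # must cover the offloaded computation plus the device-to-edge RTT.

   def meets_response_time(required_ms: float,
                           scene_processing_ms: float,
                           image_generation_ms: float,
                           rtt_ms: float) -> bool:
       return (scene_processing_ms + image_generation_ms
               + rtt_ms) <= required_ms

   # Hypothetical timings against the 20 ms and 15 ms targets:
   print(meets_response_time(20.0, scene_processing_ms=8.0,
                             image_generation_ms=7.0, rtt_ms=4.0))  # True
   print(meets_response_time(15.0, scene_processing_ms=8.0,
                             image_generation_ms=7.0, rtt_ms=4.0))  # False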
   Our use case requires a response time of 20 ms at most and
   preferably between 7-15 ms, as discussed earlier.  This requirement
   for response time is similar to the first two entries in Table 2.
   Additionally, the required bandwidth for our use case is 200 to 1000
   Mbps (see Section 4.1).  Since our use case envisages multiple users
   running the XR application on their devices and connecting to the
   edge server that is closest to them, these latency and bandwidth
   connections will grow linearly with the number of users.  The
   operators should match the network provisioning to the maximum
   number of tourists that can be supported by a link to an edge
   server.
   +===================+==============+==========+=====================+
   | Application       | Required     | Expected | Possible            |
   |                   | Response     | Data     | Implementations/    |
   |                   | Time         | Capacity | Examples            |
   +===================+==============+==========+=====================+
   | Mobile XR-based   | Less than 10 | Greater  | Assisting           |
   | remote assistance | milliseconds | than 7.5 | maintenance         |
   | with uncompressed |              | Gbps     | technicians,        |
   | 4K (1920x1080     |              |          | Industry 4.0        |
   | pixels) 120 fps   |              |          | remote              |
   | HDR 10-bit real-  |              |          | maintenance,        |
   | time video stream |              |          | remote assistance   |
   |                   |              |          | in robotics         |
   |                   |              |          | industry            |
   +-------------------+--------------+----------+---------------------+
   | Indoor and        | Less than 20 | 50 to    | Guidance in theme   |
   | localized outdoor | milliseconds | 200 Mbps | parks, shopping     |
   | navigation        |              |          | malls,              |
   |                   |              |          | archaeological      |
   |                   |              |          | sites, and          |
   |                   |              |          | museums             |
   +-------------------+--------------+----------+---------------------+
   | Cloud-based       | Less than 50 | 50 to    | Google Live View,   |
   | mobile XR         | milliseconds | 100 Mbps | XR-enhanced         |
   | applications      |              |          | Google Translate    |
   +-------------------+--------------+----------+---------------------+

    Table 2: Traffic Performance Metrics of Selected XR Applications
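   For example, the linear growth with the number of users discussed
   above translates into a simple provisioning calculation (the per-
   user figure is taken from Table 1; the link capacity and group size
   below are hypothetical):

   # Rough provisioning arithmetic: per-user demand from Table 1 grows
   # linearly with the number of tourists served by one edge link.

   PER_USER_MBPS = 1000          # worst case for 6DoF video / point cloud
   LINK_CAPACITY_MBPS = 25_000   # hypothetical capacity of the edge link

   max_tourists = LINK_CAPACITY_MBPS // PER_USER_MBPS
   print(f"One such link can support at most {max_tourists} users")

   tour_group = 40
   aggregate_mbps = tour_group * PER_USER_MBPS
   fits = aggregate_mbps <= LINK_CAPACITY_MBPS
   print(f"A group of {tour_group} needs about {aggregate_mbps} Mbps: "
         f"{'fits' if fits else 'exceeds the link'}")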
5.  Conclusion

   In order to operationalize a use case such as the one presented in
   this document, a network operator could dimension their network to
   provide a short and high-capacity network path from the edge
   computing resources or storage to the mobile devices running the XR
   application.  This is required to ensure a response time of 20 ms at
   most and preferably between 7-15 ms.  Additionally, a bandwidth of
   200 to 1000 Mbps is required by such applications.  To deal with the
   characteristics of XR traffic as discussed in this document, network
   operators could deploy a managed edge cloud service that
   operationally provides dynamic placement of XR servers, mobility
   support, and energy management.  Although the use case is
   technically feasible, economic viability is an important factor that
   must be considered.
6.  IANA Considerations

   This document has no IANA actions.
7.  Security Considerations

   The security issues for the presented use case are similar to other
   streaming applications [DIST] [NIST1] [CWE] [NIST2].  This document
   does not introduce any new security issues.
Acknowledgements

   Many thanks to Spencer Dawkins, Rohit Abhishek, Jake Holland, Kiran
   Makhijani, Ali Begen, Cullen Jennings, Stephan Wenger, Eric Vyncke,
   Wesley Eddy, Paul Kyzivat, Jim Guichard, Roman Danyliw, Warren
   Kumari, and Zaheduzzaman Sarker for providing very helpful feedback,
   suggestions, and comments.
8.  Informative References

   [ABR_1]    Mao, H., Netravali, R., and M. Alizadeh, "Neural Adaptive
              Video Streaming with Pensieve", SIGCOMM '17: Proceedings
              of the Conference of the ACM Special Interest Group on
              Data Communication, pp. 197-210,
              DOI 10.1145/3098822.3098843, 2017,
              <https://dl.acm.org/doi/10.1145/3098822.3098843>.

   [ABR_2]    Yan, F., Ayers, H., Zhu, C., Fouladi, S., Hong, J.,
              Zhang, K., Levis, P., and K. Winstein, "Learning in situ:
              a randomized experiment in video streaming", 17th USENIX
              Symposium on Networked Systems Design and Implementation
              (NSDI '20), pp. 495-511, February 2020,
              <https://www.usenix.org/conference/nsdi20/presentation/
              yan>.
   [AUGMENTED]
              Schmalstieg, D. and T. Höllerer, "Augmented Reality:
              Principles and Practice", Addison-Wesley Professional,
              2016, <https://www.oreilly.com/library/view/augmented-
              reality-principles/9780133153217/>.

   [AUGMENTED_2]
              Azuma, R.T., "A Survey of Augmented Reality", Presence:
              Teleoperators and Virtual Environments, vol. 6, no. 4,
              pp. 355-385, DOI 10.1162/pres.1997.6.4.355, August 1997,
              <https://direct.mit.edu/pvar/article-
              abstract/6/4/355/18336/A-Survey-of-Augmented-
              Reality?redirectedFrom=fulltext>.

   [BATT_DRAIN]
              Seneviratne, S., Hu, Y., Nguyen, T., Lan, G., Khalifa,
              S., Thilakarathna, K., Hassan, M., and A. Seneviratne,
              "A Survey of Wearable Devices and Challenges", IEEE
              Communication Surveys and Tutorials, vol. 19, no. 4, pp.
              2573-2620, DOI 10.1109/COMST.2017.2731979, 2017,
              <https://ieeexplore.ieee.org/document/7993011>.
[BLUR] Kan, P. and H. Kaufmann, "Physically-Based Depth of Field [BLUR] Kan, P. and H. Kaufmann, "Physically-Based Depth of Field
in Augmented Reality.", In Eurographics (Short Papers), in Augmented Reality", Eurographics 2012 - Short Papers,
pp. 89-92., 2012. pp. 89-92, DOI 10.2312/conf/EG2012/short/089-092, 2012,
<https://diglib.eg.org/items/6954bf7e-5852-44cf-
8155-4ba269dc4cee>.
[CLOUD] Corneo, L., Eder, M., Mohan, N., Zavodovski, A., Bayhan, [CLOUD] Corneo, L., Eder, M., Mohan, N., Zavodovski, A., Bayhan,
S., Wong, W., Gunningberg, P., Kangasharju, J., and J. S., Wong, W., Gunningberg, P., Kangasharju, J., and J.
Ott, "Surrounded by the Clouds: A Comprehensive Cloud Ott, "Surrounded by the Clouds: A Comprehensive Cloud
Reachability Study.", In Proceedings of the Web Conference Reachability Study", WWW '21: Proceedings of the Web
2021, pp. 295-304, 2021. Conference 2021, pp. 295-304, DOI 10.1145/3442381.3449854,
2021, <https://dl.acm.org/doi/10.1145/3442381.3449854>.
[CWE] "CWE/SANS TOP 25 Most Dangerous Software Errorss", Common [CWE] SANS Institute, "CWE/SANS TOP 25 Most Dangerous Software
Weakness Enumeration, SANS Institute, 2012. Errors", <https://www.sans.org/top25-software-errors/>.
[DEV_HEAT_1] [DEV_HEAT_1]
LiKamWa, R., Wang, Z., Carroll, A., Lin, F., and L. Zhong, LiKamWa, R., Wang, Z., Carroll, A., Lin, F., and L. Zhong,
"Draining our Glass: An Energy and Heat characterization "Draining our glass: an energy and heat characterization
of Google Glass", In Proceedings of 5th Asia-Pacific of Google Glass", APSys '14: 5th Asia-Pacific Workshop on
Workshop on Systems pp. 1-7, 2013. Systems, pp. 1-7, DOI 10.1145/2637166.2637230, 2014,
<https://dl.acm.org/doi/10.1145/2637166.2637230>.
[DEV_HEAT_2]
           Matsuhashi, K., Kanamoto, T., and A. Kurokawa, "Thermal
           Model and Countermeasures for Future Smart Glasses",
           Sensors, vol. 20, no. 5, p. 1446, DOI 10.3390/s20051446,
           2020, <https://www.mdpi.com/1424-8220/20/5/1446>.
[DIST]     Coulouris, G., Dollimore, J., Kindberg, T., and G. Blair,
           "Distributed Systems: Concepts and Design", Addison-
           Wesley, 2011, <https://dl.acm.org/doi/10.5555/2029110>.
[EDGE_1]   Satyanarayanan, M., "The Emergence of Edge Computing",
           Computer, vol. 50, no. 1, pp. 30-39, DOI 10.1109/MC.2017.9,
           2017, <https://ieeexplore.ieee.org/document/7807196>.
[EDGE_2]   Satyanarayanan, M., Klas, G., Silva, M., and S. Mangiante,
           "The Seminal Role of Edge-Native Applications", 2019 IEEE
           International Conference on Edge Computing (EDGE),
           pp. 33-40, DOI 10.1109/EDGE.2019.00022, 2019,
           <https://ieeexplore.ieee.org/document/8812200>.
[EDGE_3]   Peterson, L. and O. Sunay, "5G Mobile Networks: A Systems
           Approach", Synthesis Lectures on Network Systems,
           DOI 10.1007/978-3-031-79733-0, 2020,
           <https://link.springer.com/book/10.1007/978-3-031-79733-0>.
[GLB_ILLUM_1]
           Kan, P. and H. Kaufmann, "Differential Irradiance Caching
           for fast high-quality light transport between virtual and
           real worlds", 2013 IEEE International Symposium on Mixed
           and Augmented Reality (ISMAR), pp. 133-141,
           DOI 10.1109/ISMAR.2013.6671773, 2013,
           <https://ieeexplore.ieee.org/document/6671773>.
[GLB_ILLUM_2]
           Franke, T., "Delta Voxel Cone Tracing", 2014 IEEE
           International Symposium on Mixed and Augmented Reality
           (ISMAR), pp. 39-44, DOI 10.1109/ISMAR.2014.6948407, 2014,
           <https://ieeexplore.ieee.org/document/6948407>.
[HEAVY_TAIL_1]
           Crovella, M. and B. Krishnamurthy, "Internet Measurement:
           Infrastructure, Traffic and Applications", John Wiley and
           Sons, 2006, <https://www.wiley.com/en-us/Internet+Measurem
           ent%3A+Infrastructure%2C+Traffic+and+Applications-
           p-9780470014615>.
[HEAVY_TAIL_2]
           Taleb, N., "Statistical Consequences of Fat Tails: Real
           World Preasymptotics, Epistemology, and Applications",
           Revised Edition, STEM Academic Press, 2022,
           <https://arxiv.org/pdf/2001.10488>.
[HEAVY_TAIL_3]
           Ehrenberg, A., "A Primer in Data Reduction: An
           Introductory Statistics Textbook", John Wiley and Sons,
           2007, <https://www.wiley.com/en-us/A+Primer+in+Data+Reduct
           ion%3A+An+Introductory+Statistics+Textbook-
           p-9780471101352>.
[LENS_DIST]
           Fuhrmann, A., Schmalstieg, D., and W. Purgathofer,
           "Practical Calibration Procedures for Augmented Reality",
           Virtual Environments 2000, pp. 3-12,
           DOI 10.1007/978-3-7091-6785-4_2, 2000,
           <https://link.springer.com/
           chapter/10.1007/978-3-7091-6785-4_2>.
[METRICS_1]
           ABI Research, "Augmented and Virtual Reality: The first
           Wave of Killer Apps: Qualcomm - ABI Research", April 2017,
           <https://gsacom.com/paper/augmented-virtual-reality-first-
           wave-5g-killer-apps-qualcomm-abi-research/>.
[METRICS_2]
           Paxson, V. and S. Floyd, "Wide area traffic: the failure
           of Poisson modeling", IEEE/ACM Transactions on Networking,
           vol. 3, no. 3, pp. 226-244, DOI 10.1109/90.392383, June
           1995, <https://ieeexplore.ieee.org/document/392383>.
[METRICS_3]
           Willinger, W., Taqqu, M.S., Sherman, R., and D.V. Wilson,
           "Self-similarity through high variability: statistical
           analysis and Ethernet LAN traffic at source level",
           IEEE/ACM Transactions on Networking, vol. 5, no. 1,
           pp. 71-86, DOI 10.1109/90.554723, February 1997,
           <https://ieeexplore.ieee.org/abstract/document/554723>.
[METRICS_4]
           Gilbert, A.C., "Multiscale Analysis and Data Networks",
           Applied and Computational Harmonic Analysis, vol. 10,
           no. 3, pp. 185-202, DOI 10.1006/acha.2000.0342, May 2001,
           <https://www.sciencedirect.com/science/article/pii/
           S1063520300903427>.
[METRICS_5]
           Beyer, B., Ed., Jones, C., Ed., Petoff, J., Ed., and N.R.
           Murphy, Ed., "Site Reliability Engineering: How Google
           Runs Production Systems", O'Reilly Media, Inc., 2016,
           <https://research.google/pubs/site-reliability-
           engineering-how-google-runs-production-systems/>.
[METRICS_6]
           Siriwardhana, Y., Porambage, P., Liyanage, M., and M.
           Ylianttila, "A Survey on Mobile Augmented Reality With 5G
           Mobile Edge Computing: Architectures, Applications, and
           Technical Aspects", IEEE Communications Surveys and
           Tutorials, vol. 23, no. 2, pp. 1160-1192,
           DOI 10.1109/COMST.2021.3061981, 2021,
           <https://ieeexplore.ieee.org/document/9363323>.
[NIST1]    NIST, "Cloud Computing Synopsis and Recommendations",
           NIST SP 800-146, DOI 10.6028/NIST.SP.800-146, May 2012,
           <https://csrc.nist.gov/pubs/sp/800/146/final>.
[NIST2]    NIST, "Guide to General Server Security", NIST SP 800-123,
           DOI 10.6028/NIST.SP.800-123, July 2008,
           <https://csrc.nist.gov/pubs/sp/800/123/final>.
[NOISE]    Fischer, J., Bartz, D., and W. Strasser, "Enhanced visual
           realism by incorporating camera image effects", 2006
           IEEE/ACM International Symposium on Mixed and Augmented
           Reality, pp. 205-208, DOI 10.1109/ISMAR.2006.297815, 2006,
           <https://ieeexplore.ieee.org/document/4079277>.
[OCCL_1]   Breen, D.E., Whitaker, R.T., Rose, E., and M. Tuceryan,
           "Interactive Occlusion and Automatic Object Placement for
           Augmented Reality", Computer Graphics Forum, vol. 15,
           no. 3, pp. 11-22, DOI 10.1111/1467-8659.1530011, August
           1996, <https://onlinelibrary.wiley.com/
           doi/10.1111/1467-8659.1530011>.
[OCCL_2]   Zheng, F., Schmalstieg, D., and G. Welch, "Pixel-wise
           closed-loop registration in video-based augmented
           reality", 2014 IEEE International Symposium on Mixed and
           Augmented Reality (ISMAR), pp. 135-143,
           DOI 10.1109/ISMAR.2014.6948419, 2014,
           <https://ieeexplore.ieee.org/document/6948419>.
[OCCL_3]   Lang, B., "Oculus Shares 5 Key Ingredients for Presence in
           Virtual Reality", Road to VR, 24 September 2014,
           <https://www.roadtovr.com/oculus-shares-5-key-ingredients-
           for-presence-in-virtual-reality/>.
[PER_SENSE]
           Mania, K., Adelstein, B.D., Ellis, S.R., and M.I. Hill,
           "Perceptual sensitivity to head tracking latency in
           virtual environments with varying degrees of scene
           complexity", APGV '04: Proceedings of the 1st Symposium on
           Applied Perception in Graphics and Visualization,
           pp. 39-47, DOI 10.1145/1012551.1012559, 2004,
           <https://dl.acm.org/doi/10.1145/1012551.1012559>.
[PHOTO_REG]
           Liu, Y. and X. Granier, "Online Tracking of Outdoor
           Lighting Variations for Augmented Reality with Moving
           Cameras", IEEE Transactions on Visualization and Computer
           Graphics, vol. 18, no. 4, pp. 573-580,
           DOI 10.1109/TVCG.2012.53, 2012,
           <https://ieeexplore.ieee.org/document/6165138>.
[PREDICT]  Buker, T.J., Vincenzi, D.A., and J.E. Deaton, "The effect
           of apparent latency on simulator sickness while using a
           see-through helmet-mounted display: reducing apparent
           latency with predictive compensation", Human Factors,
           vol. 54, no. 2, pp. 235-249, DOI 10.1177/0018720811428734,
           April 2012, <https://pubmed.ncbi.nlm.nih.gov/22624290/>.
[REG]      Holloway, R.L., "Registration Error Analysis for Augmented
           Reality", Presence: Teleoperators and Virtual
           Environments, vol. 6, no. 4, pp. 413-432,
           DOI 10.1162/pres.1997.6.4.413, August 1997,
           <https://direct.mit.edu/pvar/article-
           abstract/6/4/413/18334/Registration-Error-Analysis-for-
           Augmented-Reality?redirectedFrom=fulltext>.
[RFC2210]  Wroclawski, J., "The Use of RSVP with IETF Integrated
           Services", RFC 2210, DOI 10.17487/RFC2210, September 1997,
           <https://www.rfc-editor.org/info/rfc2210>.
[RFC2475]  Blake, S., Black, D., Carlson, M., Davies, E., Wang, Z.,
           and W. Weiss, "An Architecture for Differentiated
           Services", RFC 2475, DOI 10.17487/RFC2475, December 1998,
           <https://www.rfc-editor.org/info/rfc2475>.
[RFC9023]  Varga, B., Ed., Farkas, J., Malis, A., and S. Bryant,
           "Deterministic Networking (DetNet) Data Plane: IP over
           IEEE 802.1 Time-Sensitive Networking (TSN)", RFC 9023,
           DOI 10.17487/RFC9023, June 2021,
           <https://www.rfc-editor.org/info/rfc9023>.
[RFC9450]  Bernardos, CJ., Ed., Papadopoulos, G., Thubert, P., and F.
           Theoleyre, "Reliable and Available Wireless (RAW) Use
           Cases", RFC 9450, DOI 10.17487/RFC9450, August 2023,
           <https://www.rfc-editor.org/info/rfc9450>.
[SLAM_1]   Ventura, J., Arth, C., Reitmayr, G., and D. Schmalstieg,
           "A Minimal Solution to the Generalized Pose-and-Scale
           Problem", 2014 IEEE Conference on Computer Vision and
           Pattern Recognition, pp. 422-429, DOI 10.1109/CVPR.2014.61,
           2014, <https://ieeexplore.ieee.org/document/6909455>.
[SLAM_2]   Sweeney, C., Fragoso, V., Höllerer, T., and M. Turk,
           "gDLS: A Scalable Solution to the Generalized Pose and
           Scale Problem", Computer Vision - ECCV 2014, pp. 16-31,
           DOI 10.1007/978-3-319-10593-2_2, 2014,
           <https://link.springer.com/
           chapter/10.1007/978-3-319-10593-2_2>.
[SLAM_3]   Gauglitz, S., Sweeney, C., Ventura, J., Turk, M., and T.
           Höllerer, "Model Estimation and Selection towards
           Unconstrained Real-Time Tracking and Mapping", IEEE
           Transactions on Visualization and Computer Graphics,
           vol. 20, no. 6, pp. 825-838, DOI 10.1109/TVCG.2013.243,
           2014, <https://ieeexplore.ieee.org/document/6636302>.
[SLAM_4]   Pirchheim, C., Schmalstieg, D., and G. Reitmayr, "Handling
           pure camera rotation in keyframe-based SLAM", 2013 IEEE
           International Symposium on Mixed and Augmented Reality
           (ISMAR), pp. 229-238, DOI 10.1109/ISMAR.2013.6671783,
           2013, <https://ieeexplore.ieee.org/document/6671783>.
[UBICOMP]  Bardram, J. and A. Friday, "Ubiquitous Computing Systems",
           Ubiquitous Computing Fundamentals, 1st Edition, Chapman
           and Hall/CRC Press, pp. 37-94, 2009,
           <https://www.taylorfrancis.com/chapters/
           edit/10.1201/9781420093612-6/ubiquitous-computing-systems-
           jakob-bardram-adrian-friday>.
[URLLC]    3GPP, "Study on enhancement of Ultra-Reliable Low-Latency
           Communication (URLLC) support in the 5G Core network
           (5GC)", 3GPP TR 23.725, 2019,
           <https://portal.3gpp.org/desktopmodules/Specifications/
           SpecificationDetails.aspx?specificationId=3453>.
[VIS_INTERFERE]
           Kalkofen, D., Mendez, E., and D. Schmalstieg, "Interactive
           Focus and Context Visualization for Augmented Reality",
           2007 6th IEEE and ACM International Symposium on Mixed and
           Augmented Reality, pp. 191-201,
           DOI 10.1109/ISMAR.2007.4538846, 2007,
           <https://ieeexplore.ieee.org/document/4538846>.
[XR]       3GPP, "Extended Reality (XR) in 5G", 3GPP TR 26.928, 2020,
           <https://portal.3gpp.org/desktopmodules/Specifications/
           SpecificationDetails.aspx?specificationId=3534>.
[XR_TRAFFIC]
           Apicharttrisorn, K., Balasubramanian, B., Chen, J.,
           Sivaraj, R., Tsai, Y., Jana, R., Krishnamurthy, S., Tran,
           T., and Y. Zhou, "Characterization of Multi-User Augmented
           Reality over Cellular Networks", 2020 17th Annual IEEE
           International Conference on Sensing, Communication, and
           Networking (SECON), pp. 1-9,
           DOI 10.1109/SECON48991.2020.9158434, 2020,
           <https://ieeexplore.ieee.org/document/9158434>.
Acknowledgements
Many thanks to Spencer Dawkins, Rohit Abhishek, Jake Holland, Kiran
Makhijani, Ali Begen, Cullen Jennings, Stephan Wenger, Eric Vyncke,
Wesley Eddy, Paul Kyzivat, Jim Guichard, Roman Danyliw, Warren
Kumari, and Zaheduzzaman Sarker for providing helpful feedback,
suggestions, and comments.
Authors' Addresses

Renan Krishna
United Kingdom
Email: renan.krishna@gmail.com

Akbar Rahman
Ericsson
349 Terry Fox Drive
Ottawa Ontario K2K 2V6
Canada
Email: Akbar.Rahman@ericsson.com