<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc version 1.7.30 (Ruby 3.2.3) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-ietf-bmwg-savnet-sav-benchmarking-00" category="info" submissionType="IETF" xml:lang="en" version="3">
  <!-- xml2rfc v2v3 conversion 3.31.0 -->
  <front>
    <title abbrev="SAVBench">Benchmarking Methodology for Intra-domain and Inter-domain Source Address Validation</title>
    <seriesInfo name="Internet-Draft" value="draft-ietf-bmwg-savnet-sav-benchmarking-00"/>
    <author initials="L." surname="Chen" fullname="Li Chen">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>lichen@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="D." surname="Li" fullname="Dan Li">
      <organization>Tsinghua University</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>tolidan@tsinghua.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Liu" fullname="Libin Liu">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>liulb@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Qin" fullname="Lancheng Qin">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>qinlc@zgclab.edu.cn</email>
      </address>
    </author>
    <date year="2026" month="February" day="07"/>
    <area>Operations and Management</area>
    <workgroup>Benchmarking Methodology</workgroup>
    <abstract>
      <?line 65?>

<t>This document defines methodologies for benchmarking the performance of intra-domain and inter-domain source address validation (SAV) mechanisms. SAV mechanisms generate SAV rules that prevent source address spoofing, and they have been implemented with a variety of designs suited to their respective deployment scenarios. This document treats a SAV device as a black box and defines the methodology in a manner that is agnostic to the particular mechanism. It provides methods for measuring the performance of both existing and new SAV implementations.</t>
    </abstract>
  </front>
  <middle>
    <?line 69?>

<section anchor="introduction">
      <name>Introduction</name>
      <t>Source address validation (SAV) is essential for preventing source address spoofing. Operators are encouraged to deploy different SAV mechanisms <xref target="RFC3704"/> <xref target="RFC8704"/> depending on their network environments. However, existing intra-domain (intra-AS) and inter-domain (inter-AS) SAV mechanisms exhibit problems with operational overhead and SAV accuracy in various scenarios <xref target="intra-domain-ps"/> <xref target="inter-domain-ps"/>. Intra-domain and inter-domain SAVNET architectures <xref target="intra-domain-arch"/> <xref target="inter-domain-arch"/> have been proposed to guide the design of new intra-domain and inter-domain SAV mechanisms that address these problems. The benchmarking methodology defined in this document will help operators accurately assess SAV performance when their deployed devices enable SAV, and will also help vendors test the performance of the SAV implementations in their devices.</t>
      <t>This document provides generic methodologies for benchmarking SAV mechanism performance. To achieve the desired functionality, a SAV device may support multiple SAV mechanisms, allowing operators to enable those most suitable for their specific network environments. This document considers a SAV device to be a black box, regardless of the design and implementation. The tests defined in this document can be used to benchmark a SAV device for SAV accuracy (i.e., false positive and false negative rates), SAV protocol convergence performance, and control plane and data plane forwarding performance. These tests can be performed on a hardware router, a software router, a virtual machine (VM) instance, or a container instance, which runs as a SAV device. This document outlines methodologies for assessing SAV device performance and comparing various SAV mechanisms and implementations.</t>
      <section anchor="goal-and-scope">
        <name>Goal and Scope</name>
        <t>The benchmarking methodology outlined in this document focuses on two objectives:</t>
        <ul spacing="normal">
          <li>
            <t>Assessing “which SAV mechanism performs best” over a set of well-defined scenarios.</t>
          </li>
          <li>
            <t>Measuring the contribution of sub-systems to the overall SAV systems' performance (also known as “micro-benchmark”).</t>
          </li>
        </ul>
        <t>This benchmark evaluates the SAV performance of individual devices (e.g., hardware/software routers) by comparing different SAV mechanisms under specific network scenarios. The results help determine the appropriate SAV deployment for real-world network scenarios.</t>
      </section>
      <section anchor="requirements-language">
        <name>Requirements Language</name>
        <t>The key words "<bcp14>MUST</bcp14>", "<bcp14>MUST NOT</bcp14>", "<bcp14>REQUIRED</bcp14>", "<bcp14>SHALL</bcp14>", "<bcp14>SHALL
NOT</bcp14>", "<bcp14>SHOULD</bcp14>", "<bcp14>SHOULD NOT</bcp14>", "<bcp14>RECOMMENDED</bcp14>", "<bcp14>NOT RECOMMENDED</bcp14>",
"<bcp14>MAY</bcp14>", and "<bcp14>OPTIONAL</bcp14>" in this document are to be interpreted as
described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/> when, and only when, they
appear in all capitals, as shown here.</t>
        <?line -18?>

</section>
    </section>
    <section anchor="terminology">
      <name>Terminology</name>
      <t>SAV Control Plane: The SAV control plane consists of the processes that gather and communicate SAV-related information.</t>
      <t>SAV Data Plane: The SAV data plane stores the SAV rules within a specific data structure and validates each incoming packet to determine whether to permit or discard it.</t>
      <t>Host-facing Router: An edge router directly connected to a layer-2 host network.</t>
      <t>Customer-facing Router: An edge router connected to a non-BGP customer network that contains routers and runs a routing protocol.</t>
      <t>AS Border Router: An intra-domain router facing an external AS.</t>
    </section>
    <section anchor="test-methodology">
      <name>Test Methodology</name>
      <section anchor="test-setup">
        <name>Test Setup</name>
        <t>The test setup in general is compliant with <xref target="RFC2544"/>. The Device Under Test (DUT) is connected to a Tester and other network devices to construct the network topology introduced in <xref target="testcase-sec"/>. The Tester is a traffic generator that generates network traffic with various source and destination addresses in order to emulate spoofed or legitimate traffic. Various proportions of spoofed and legitimate traffic <bcp14>MAY</bcp14> be used.</t>
        <figure anchor="testsetup">
          <name>Test Setup.</name>
          <artwork><![CDATA[
    +~~~~~~~~~~~~~~~~~~~~~~~~~~+
    | Test Network Environment |
    |     +--------------+     |
    |     |              |     |
+-->|     |      DUT     |     |---+
|   |     |              |     |   |
|   |     +--------------+     |   |
|   +~~~~~~~~~~~~~~~~~~~~~~~~~~+   |
|                                  |
|         +--------------+         |
|         |              |         |
+---------|    Tester    |<--------+
          |              |
          +--------------+
]]></artwork>
        </figure>
        <t><xref target="testsetup"/> illustrates the test configuration for the Device Under Test (DUT). Within the test network environment, the DUT can be interconnected with other devices to create a variety of test scenarios. The Tester may establish a direct connection with the DUT or link through intermediary devices. The nature of the connection between them is dictated by the benchmarking tests outlined in <xref target="testcase-sec"/>. Furthermore, the Tester has the capability to produce both spoofed and legitimate traffic to evaluate the SAV accuracy of the DUT in relevant scenarios, and it can also generate traffic at line rate to assess the data plane forwarding performance of the DUT. Additionally, the DUT is required to support logging functionalities to document all test outcomes.</t>
      </section>
      <section anchor="network-topology-and-device-configuration">
        <name>Network Topology and Device Configuration</name>
        <t>The positioning of the DUT within the network topology has an impact on SAV performance. Therefore, the benchmarking process <bcp14>MUST</bcp14> include evaluating the DUT at multiple locations across the network to ensure a comprehensive assessment.</t>
        <t>The routing configurations of network devices may differ, and the resulting SAV rules depend on these settings. It is essential to clearly document the specific device configurations used during testing.</t>
        <t>Furthermore, the role of each device, such as host-facing router, customer-facing router, or AS border router in an intra-domain network, <bcp14>SHOULD</bcp14> be clearly identified. In an inter-domain context, the business relationships between ASes <bcp14>MUST</bcp14> also be specified.</t>
        <t>When evaluating data plane forwarding performance, the traffic generated by the Tester must be characterized by defined traffic rates, the ratio of spoofed to legitimate traffic, and the distribution of source addresses, as all of these factors can influence test results.</t>
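        <t>As a concrete illustration of the traffic characterization above, the following sketch builds a list of source addresses with a configurable spoofed-to-legitimate ratio and a uniform source address distribution. The prefixes and the 10% spoofed ratio are arbitrary assumptions for the example, not part of the methodology.</t>
        <sourcecode type="python"><![CDATA[
# Sketch: source addresses for a Tester traffic mix.  Assumptions
# (illustrative only): legitimate sources are drawn uniformly from
# 10.0.0.0/15 and spoofed sources from 10.2.0.0/15.
import ipaddress
import random

def traffic_mix(legit_prefix, spoof_prefix, spoof_ratio, count, seed=0):
    rng = random.Random(seed)
    legit = ipaddress.ip_network(legit_prefix)
    spoof = ipaddress.ip_network(spoof_prefix)
    sources = []
    for _ in range(count):
        net = spoof if rng.random() < spoof_ratio else legit
        addr = net.network_address + rng.randrange(net.num_addresses)
        sources.append((str(addr), net is spoof))
    return sources

# 10% spoofed traffic, 1000 packets
mix = traffic_mix("10.0.0.0/15", "10.2.0.0/15", 0.1, 1000)
spoofed = sum(1 for _, is_spoof in mix if is_spoof)
]]></sourcecode>
        <t>Varying the spoofed ratio (e.g., from 0.1 to 0.9) reproduces the 1:9 to 9:1 ratios used in the test cases of <xref target="testcase-sec"/>.</t>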
      </section>
    </section>
    <section anchor="sav-performance-indicators">
      <name>SAV Performance Indicators</name>
      <t>This section lists key performance indicators (KPIs) of SAV for the overall benchmarking tests. All KPIs <bcp14>SHOULD</bcp14> be measured in the benchmarking scenarios described in <xref target="testcase-sec"/>. Also, the KPIs <bcp14>SHOULD</bcp14> be measured from the result output of the DUT.
The standard deviation of the KPI test results <bcp14>SHOULD</bcp14> be analyzed for each fixed test setup, which helps characterize the stability of the DUT's performance. The data plane SAV table refreshing rate and the data plane forwarding rate below <bcp14>SHOULD</bcp14> be tested using varying SAV table sizes for each fixed test setup, which helps measure the DUT's sensitivity to the SAV table size for these two KPIs.</t>
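      <t>For example, the suggested stability analysis for one fixed test setup can be sketched as follows; the run values are placeholders rather than measured results.</t>
      <sourcecode type="python"><![CDATA[
# Sketch: stability analysis of repeated KPI measurements for one
# fixed test setup.  The run values below are placeholders, not data.
import statistics

forwarding_rate_mpps = [14.2, 14.1, 14.3, 14.2, 14.0]  # repeated runs

mean = statistics.mean(forwarding_rate_mpps)
stdev = statistics.stdev(forwarding_rate_mpps)
cv = stdev / mean  # coefficient of variation: lower means more stable
]]></sourcecode>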
      <section anchor="false-positive-rate">
        <name>False Positive Rate</name>
        <t>The proportion of legitimate traffic that the DUT incorrectly identifies as spoofed, measured over all legitimate traffic. This KPI reflects the SAV accuracy of the DUT.</t>
      </section>
      <section anchor="false-negative-rate">
        <name>False Negative Rate</name>
        <t>The proportion of spoofed traffic that the DUT incorrectly identifies as legitimate, measured over all spoofed traffic. This KPI reflects the SAV accuracy of the DUT.</t>
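        <t>This KPI and the false positive rate above reduce to simple ratios of Tester and DUT counters; a minimal sketch with hypothetical counter values is:</t>
        <sourcecode type="python"><![CDATA[
# Sketch: accuracy KPIs computed from hypothetical traffic counters.
def fp_rate(legit_sent, legit_blocked):
    # Fraction of legitimate packets the DUT wrongly discarded.
    return legit_blocked / legit_sent

def fn_rate(spoof_sent, spoof_permitted):
    # Fraction of spoofed packets the DUT wrongly permitted.
    return spoof_permitted / spoof_sent

fpr = fp_rate(legit_sent=900000, legit_blocked=9000)    # 0.01
fnr = fn_rate(spoof_sent=100000, spoof_permitted=2000)  # 0.02
]]></sourcecode>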
      </section>
      <section anchor="protocol-convergence-time">
        <name>Protocol Convergence Time</name>
        <t>The protocol convergence time is the period during which the SAV control plane protocol converges to update the SAV rules after a routing change, i.e., the time elapsed from the beginning of the routing change to the completion of the SAV rule update. This KPI indicates the convergence performance of the SAV protocol.</t>
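        <t>Operationally, this KPI can be derived as a difference of event timestamps; the log values in the following sketch are hypothetical.</t>
        <sourcecode type="python"><![CDATA[
# Sketch: convergence time as the interval between a routing-change
# event and the completion of the SAV rule update.  Timestamps
# (in seconds) are hypothetical log values.
events = {
    "routing_change_start": 100.000,
    "sav_rule_update_done": 100.850,
}

convergence_time = (events["sav_rule_update_done"]
                    - events["routing_change_start"])  # about 0.85 s
]]></sourcecode>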
      </section>
      <section anchor="protocol-message-processing-throughput">
        <name>Protocol Message Processing Throughput</name>
        <t>The protocol message processing throughput measures the rate at which the DUT processes control plane packets that communicate SAV-related information. It indicates the SAV control plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-sav-table-refreshing-rate">
        <name>Data Plane SAV Table Refreshing Rate</name>
        <t>The data plane SAV table refreshing rate refers to the rate at which a DUT updates its SAV table with new SAV rules, and it can reflect the SAV data plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-forwarding-rate">
        <name>Data Plane Forwarding Rate</name>
        <t>The data plane forwarding rate measures the SAV data plane throughput for processing data plane traffic, and it indicates the SAV data plane performance of the DUT. It is suggested to measure the data plane forwarding rate with SAV enabled and with SAV disabled on the DUT, in order to observe the proportional decrease in the forwarding rate. This helps analyze the efficiency of the DUT's SAV data plane implementation.</t>
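        <t>The suggested enabled-versus-disabled comparison can be expressed as a proportional decrease; the rates in the following sketch are hypothetical measurements.</t>
        <sourcecode type="python"><![CDATA[
# Sketch: proportional decrease in the data plane forwarding rate
# when SAV is enabled.  The rates (in Mpps) are hypothetical.
rate_sav_disabled = 14.8
rate_sav_enabled = 13.5

decrease = (rate_sav_disabled - rate_sav_enabled) / rate_sav_disabled
# about 8.8% overhead attributable to the SAV data plane
]]></sourcecode>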
      </section>
      <section anchor="resource-utilization">
        <name>Resource Utilization</name>
        <t>Resource utilization refers to the CPU and memory usage of the SAV processes within the DUT.</t>
      </section>
    </section>
    <section anchor="testcase-sec">
      <name>Benchmarking Tests</name>
      <section anchor="intra_domain_sav">
        <name>Intra-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Evaluate the false positive rate and false negative rate of the DUT in processing both legitimate and spoofed traffic across various intra-domain network scenarios. These scenarios include SAV implementations for customer/host networks, Internet-facing networks, and aggregation-router-facing networks.</t>
          <t>In the following, this document presents the test scenarios for evaluating intra-domain SAV performance on the DUT. Under each scenario, the generated spoofed traffic <bcp14>SHOULD</bcp14> include different types of forged source addresses, such as unused source addresses within the subnetwork, private network source addresses, internal-use-only source addresses of the subnetwork, and external source addresses. The ratios among these different types of forged source addresses <bcp14>SHOULD</bcp14> vary, since different SAV mechanisms may differ in their capability to block packets with forged source addresses of various types. Nevertheless, for all these types of spoofed traffic, the expected result is that the DUT <bcp14>SHOULD</bcp14> block them.</t>
          <figure anchor="intra-domain-customer-syn">
            <name>SAV for customer or host network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
| FIB on DUT            +~~~~~~~~~~+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +----------+                       |
|                       |   DUT    |                       |
|                       +----------+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |Tester (Sub Network)|
                    |    (10.0.0.0/15)   |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer or Host Network</strong>: <xref target="intra-domain-customer-syn"/> illustrates an intra-domain symmetric routing scenario in which SAV is deployed for a customer or host network. The DUT performs SAV as a customer/host-facing router and connects to Router 1 for Internet access. A sub network, which resides within the AS and uses the prefix 10.0.0.0/15, is connected to the DUT. The Tester emulates a sub network by advertising this prefix in the control plane and generating both spoofed and legitimate traffic in the data plane. In this setup, the Tester is configured so that inbound traffic destined for 10.0.0.0/15 arrives via the DUT. The DUT learns the route to 10.0.0.0/15 from the Tester, while the Tester sends outbound traffic with source addresses within 10.0.0.0/15 to the DUT, simulating a symmetric routing scenario between the two. The IP addresses used in this test case are examples; testers may substitute other addresses. The same applies to the other test cases.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain symmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To verify whether the DUT can generate accurate SAV rules for customer or host network under symmetric routing conditions, construct a testbed as depicted in <xref target="intra-domain-customer-syn"/>. The Tester is connected to the DUT and acts as a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both legitimate traffic (with source addresses in 10.0.0.0/15) and spoofed traffic (with source addresses in 10.2.0.0/15) toward the DUT. The prefix 10.2.0.0/15 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and allows legitimate traffic originating from the sub network.</t>
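          <t>Assuming, for illustration only, that the DUT's SAV rule on the Tester-facing interface reduces to membership in the advertised prefix 10.0.0.0/15, the expected per-packet verdicts for this test case can be sketched as:</t>
          <sourcecode type="python"><![CDATA[
# Sketch: expected verdicts for the symmetric-routing test case,
# assuming the interface's SAV rule is membership in the advertised
# prefix 10.0.0.0/15 (an assumption made for illustration).
import ipaddress

ALLOWED = ipaddress.ip_network("10.0.0.0/15")

def expected_verdict(src_ip):
    # True = permit (legitimate source), False = block (spoofed).
    return ipaddress.ip_address(src_ip) in ALLOWED

permit = expected_verdict("10.1.2.3")  # inside 10.0.0.0/15
block = expected_verdict("10.2.0.1")   # spoofed, in 10.2.0.0/15
]]></sourcecode>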
          <figure anchor="intra-domain-customer-asyn">
            <name>SAV for customer or host network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +~~~~~~~~~~+                          |
|                       | Router 2 |                          |
| FIB on DUT            +~~~~~~~~~~+   FIB on Router 1        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  Router 2  /         \/  10.1.0.0/16  Router 2  |
|               +----------+     +~~~~~~~~~~+                 |
|               |   DUT    |     | Router 1 |                 |
|               +----------+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                    +--------------------+
                    |Tester (Sub Network)|
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer or Host Network</strong>: <xref target="intra-domain-customer-asyn"/> illustrates an intra-domain asymmetric routing scenario in which SAV is deployed for a customer or host network. The DUT performs SAV as a customer/host-facing router. A sub network, i.e., a customer/host network within the AS, is connected to both the DUT and Router 1, and uses the prefix 10.0.0.0/15. The Tester emulates a sub network and handles both its control plane and data plane functions. In this setup, the Tester is configured so that inbound traffic destined for 10.1.0.0/16 is received only from the DUT, while inbound traffic for 10.0.0.0/16 is received only from Router 1. The DUT learns the route to prefix 10.1.0.0/16 from the Tester, and Router 1 learns the route to 10.0.0.0/16 from the Tester. Both the DUT and Router 1 then advertise their respective learned prefixes to Router 2. Consequently, the DUT learns the route to 10.0.0.0/16 from Router 2, and Router 1 learns the route to 10.1.0.0/16 from Router 2. The Tester sends outbound traffic with source addresses in 10.0.0.0/16 to the DUT, simulating an asymmetric routing scenario between the Tester and the DUT.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To determine whether the DUT can generate accurate SAV rules under asymmetric routing conditions, set up the test environment as shown in <xref target="intra-domain-customer-asyn"/>. The Tester is connected to both the DUT and Router 1 and emulates the functions of a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.1.0.0/16) and legitimate traffic (using source addresses in 10.0.0.0/16) toward the DUT. The prefix 10.1.0.0/16 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the sub network.</t>
          <figure anchor="intra-domain-internet-syn">
            <name>SAV for Internet-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                          |
|                          |   \/                          |
|                       +----------+                       |
|                       |    DUT   | SAV facing Internet   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
|                          |    \/                         |
|                  +--------------------+                  |
|                  |    Sub Network     |                  |
|                  |   (10.0.0.0/15)    |                  |
|                  +--------------------+                  |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><strong>SAV for Internet-facing Network</strong>: <xref target="intra-domain-internet-syn"/> illustrates the test scenario for SAV in an Internet-facing network under intra-domain symmetric routing conditions. The network topology resembles that of <xref target="intra-domain-customer-syn"/>, with the key difference being the positioning of the DUT. In this case, the DUT is connected to Router 1 and the Internet, while the Tester emulates the Internet. The DUT performs SAV from an Internet-facing perspective, as opposed to a customer/host-facing role.</t>
          <t>The <strong>procedure</strong> for testing SAV for an Internet-facing network in an intra-domain symmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for Internet-facing SAV under symmetric routing, set up the test environment as depicted in <xref target="intra-domain-internet-syn"/>. The Tester is connected to the DUT and emulates the Internet.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.0.0.0/15) and legitimate traffic (using source addresses in 10.2.0.0/15) toward the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and allows legitimate traffic originating from the Internet.</t>
          <figure anchor="intra-domain-internet-asyn">
            <name>SAV for Internet-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                             |
|                          |   \/                             |
|                       +----------+                          |
|                       |    DUT   |                          |
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
|                         \    \/                             |
|                  +--------------------+                     |
|                  |    Sub Network     |                     |
|                  |   (10.0.0.0/15)    |                     |
|                  +--------------------+                     |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><strong>SAV for Internet-facing Network</strong>: <xref target="intra-domain-internet-asyn"/> illustrates a test case for SAV in an Internet-facing network under intra-domain asymmetric routing conditions. The network topology is identical to that of <xref target="intra-domain-customer-asyn"/>, with the key distinction being the placement of the DUT. In this scenario, the DUT is connected to Router 1 and Router 2 within the same AS, as well as to the Internet. The Tester emulates the Internet, and the DUT performs Internet-facing SAV rather than customer/host-network-facing SAV.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for Internet-facing SAV under asymmetric routing, construct the test environment as shown in <xref target="intra-domain-internet-asyn"/>. The Tester is connected to the DUT and emulates the Internet.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.0.0.0/15) and legitimate traffic (using source addresses in 10.2.0.0/15) toward the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the Internet.</t>
          <figure anchor="intra-domain-agg-syn">
            <name>SAV for aggregation-router-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +----------+                       |
|                       |    DUT   | SAV facing Router 1   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |Tester (Sub Network)|
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Aggregation-router-facing Network</strong>: <xref target="intra-domain-agg-syn"/> depicts the test scenario for SAV in an aggregation-router-facing network under intra-domain symmetric routing conditions. The network topology in <xref target="intra-domain-agg-syn"/> is identical to that of <xref target="intra-domain-internet-syn"/>. The Tester is connected to Router 1 to emulate a sub network, enabling evaluation of the DUT's false positive and false negative rates when facing Router 1.</t>
          <t>The <strong>procedure</strong> for testing SAV in this aggregation-router-facing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for aggregation-router-facing SAV under symmetric routing, construct the test environment as shown in <xref target="intra-domain-agg-syn"/>. The Tester is connected to Router 1 and emulates a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both legitimate traffic (using source addresses in 10.0.0.0/15) and spoofed traffic (using source addresses in 10.2.0.0/15) toward Router 1. The prefix 10.2.0.0/15 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the direction of Router 1.</t>
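          <t>The false positive and false negative rates in these test cases can be derived directly from the Tester's transmit and receive counters. The following sketch illustrates the arithmetic (the function and counter names are illustrative assumptions, not part of any DUT interface): the false positive rate is the fraction of legitimate packets wrongly blocked, and the false negative rate is the fraction of spoofed packets wrongly permitted.</t>
          <sourcecode type="python"><![CDATA[
def sav_error_rates(legit_sent, legit_received, spoofed_sent, spoofed_received):
    """Derive SAV error rates from Tester counters.

    False positive rate: legitimate packets wrongly blocked by the DUT.
    False negative rate: spoofed packets wrongly permitted by the DUT.
    """
    fp_rate = (legit_sent - legit_received) / legit_sent
    fn_rate = spoofed_received / spoofed_sent
    return fp_rate, fn_rate

# Example: a 9:1 legitimate-to-spoofed mix with ideal DUT behavior.
fp, fn = sav_error_rates(legit_sent=900000, legit_received=900000,
                         spoofed_sent=100000, spoofed_received=0)
]]></sourcecode>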
          <figure anchor="intra-domain-agg-asyn">
            <name>SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +----------+                          |
|                       |    DUT   | SAV facing Router 1 and 2|
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/15  Network 1  /        \   10.0.0.0/15  Network 1 |
| 10.0.0.0/15  DUT       /         \/  10.1.0.0/15  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/15 \      / of 10.0.0.0/15              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                   +--------------------+
                   |Tester (Sub Network)|
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Aggregation-router-facing Network</strong>: <xref target="intra-domain-agg-asyn"/> illustrates the test case for SAV in an aggregation-router-facing network under intra-domain asymmetric routing conditions. The network topology in <xref target="intra-domain-agg-asyn"/> is identical to that of <xref target="intra-domain-internet-asyn"/>. The Tester is connected to both Router 1 and Router 2 to emulate a sub network, enabling evaluation of the DUT's false positive and false negative rates when facing Router 1 and Router 2.</t>
          <t>The <strong>procedure</strong> for testing SAV in this aggregation-router-facing asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under asymmetric routing conditions, construct the test environment as shown in <xref target="intra-domain-agg-asyn"/>. The Tester is connected to Router 1 and Router 2 and emulates a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish an asymmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.2.0.0/15) and legitimate traffic (using source addresses in 10.0.0.0/15) toward Router 1. The prefix 10.2.0.0/15 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the direction of Router 1 and Router 2.</t>
          <figure anchor="intra-domain-frr-topo">
            <name>Intra-domain SAV under Fast Reroute (FRR) scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|     +------------+                     +------------+       |
|     |   Router2  |---------------------|   Router3  |       |
|     +------------+                     +------------+       |
|          /\                                  /\             |
|          |                                   |              |
|          | backup path                       | primary path |
|          |                                   |              |
|     +-----------------------------------------------+       |
|     |                     DUT                       |       |
|     +-----------------------------------------------+       |
|                           /\                                |
|                           | Legitimate and                  |
|                           | Spoofed Traffic                 |
|                           |                                 |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                            |
                  +--------------------+
                  |Tester (Sub Network)|
                  +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV under Fast Reroute (FRR) Scenario</strong>: Fast Reroute (FRR) mechanisms such as Loop-Free Alternates (LFA) or Topology-Independent Loop-Free Alternates (TI-LFA) provide sub-second restoration of traffic forwarding after link or node failures. During FRR activation, temporary forwarding changes may occur before the control plane converges, potentially impacting SAV rule consistency and causing transient
false positives or false negatives.</t>
          <t>The <strong>procedure</strong> for testing SAV under the FRR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure the DUT and adjacent routers with FRR protection for the primary link (Router3–DUT).</t>
            </li>
            <li>
              <t>The Tester continuously sends legitimate and spoofed traffic toward the protected prefix.</t>
            </li>
            <li>
              <t>Trigger a link failure between Router3 and the DUT, causing FRR switchover to Router2.</t>
            </li>
            <li>
              <t>Measure false positive and false negative rates during the switchover and after reconvergence.</t>
            </li>
            <li>
              <t>Restore the primary link and verify that SAV rules revert correctly.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT should maintain correct SAV behavior throughout FRR activation and recovery. False positive and false negative rates <bcp14>SHOULD</bcp14> remain minimal during FRR events, and SAV rules <bcp14>SHOULD</bcp14> update promptly to reflect restored routing.</t>
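          <t>To make transient errors during the switchover visible, the Tester's per-packet results can be binned into fixed time windows, with the two rates reported per window rather than averaged over the whole run. A minimal sketch of such post-processing (the record format and window size are assumptions):</t>
          <sourcecode type="python"><![CDATA[
from collections import defaultdict

def per_window_rates(records, window_s=0.1):
    """records: (timestamp_s, is_spoofed, was_forwarded) per test packet.
    Returns {window_index: (fp_rate, fn_rate)} so that errors confined
    to the FRR switchover interval stand out."""
    bins = defaultdict(lambda: [0, 0, 0, 0])  # legit sent/fwd, spoof sent/fwd
    for ts, is_spoofed, forwarded in records:
        b = bins[int(ts // window_s)]
        if is_spoofed:
            b[2] += 1
            b[3] += int(forwarded)
        else:
            b[0] += 1
            b[1] += int(forwarded)
    rates = {}
    for w, (ls, lf, ss, sf) in bins.items():
        fp = (ls - lf) / ls if ls else 0.0
        fn = sf / ss if ss else 0.0
        rates[w] = (fp, fn)
    return rates

# Window 0 is clean; window 1 leaks one of two spoofed packets.
out = per_window_rates([(0.01, False, True), (0.02, True, False),
                        (0.11, True, True), (0.12, True, False)])
]]></sourcecode>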
          <figure anchor="intra-domain-pbr-topo">
            <name>Intra-domain SAV under Policy-based Routing (PBR) scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|             Test Network Environment           |
|                 +------------+                 |
|                 |  Router2   |                 |
|                 +------------+                 |
|                       /\                       |
|                        | default path          |
|                        |                       |
|               +----------------+               |
|               |       DUT      |               |
|               +----------------+               |
|                 /\           /\                |
|    policy-based /             \ default path   |
|           path /               \               |
|         +-----------+      +-----------+       |
|         |  Router3  |      |  Router1  |       |
|         +-----------+      +-----------+       |
|              /\                 /\             |
|              |                   |             |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
               |                   |
          +-----------------------------+
          |     Tester (Sub Network)    |
          +-----------------------------+
]]></artwork>
          </figure>
          <t><strong>SAV under Policy-based Routing (PBR) Scenario</strong>: Policy-based Routing (PBR) enables forwarding decisions based on user-defined match conditions (e.g., source prefix, DSCP, or interface) instead of the standard routing table. Such policies can create asymmetric paths that challenge the SAV mechanism if rules are derived solely from RIB or FIB information.</t>
          <t>The <strong>procedure</strong> for testing SAV under the PBR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure PBR on the DUT to forward traffic matching a specific source prefix (e.g., 10.1.0.0/16) to Router3, while other traffic follows the default path to Router1.</t>
            </li>
            <li>
              <t>The Tester sends both legitimate and spoofed traffic, including flows that match the PBR policy and flows that do not.</t>
            </li>
            <li>
              <t>Measure the false positive and false negative rates for both traffic types.</t>
            </li>
            <li>
              <t>Dynamically modify or remove the PBR policy and observe SAV rule adaptation.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT <bcp14>SHOULD</bcp14> continue to correctly filter spoofed packets and permit legitimate packets under the PBR scenario. SAV rules <bcp14>MUST</bcp14> adapt to policy-based forwarding paths without producing misclassifications.</t>
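          <t>The test above exercises the DUT's classification of packets by source prefix before forwarding. The policy configured in step 1 can be modeled as follows (an illustrative model using the example prefix and router names from the procedure, not a router implementation):</t>
          <sourcecode type="python"><![CDATA[
import ipaddress

POLICY_PREFIX = ipaddress.ip_network("10.1.0.0/16")  # PBR match condition

def pbr_next_hop(src_ip):
    """Return the egress selected by the PBR policy for a source address."""
    if ipaddress.ip_address(src_ip) in POLICY_PREFIX:
        return "Router3"  # policy-based path
    return "Router1"      # default path
]]></sourcecode>
          <t>A SAV rule derived solely from RIB or FIB information would not anticipate traffic for 10.1.0.0/16 taking the Router3 path, which is the kind of misclassification this test is designed to expose.</t>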
        </section>
        <section anchor="intra-control-plane-sec">
          <name>Control Plane Performance</name>
          <t><strong>Objective</strong>: Measure the control plane performance of the DUT, including both protocol convergence performance and protocol message processing performance in response to route changes caused by network failures or operator configurations. Protocol convergence performance is quantified by the convergence time, defined as the duration from the onset of a routing change until the completion of the corresponding SAV rule update. Protocol message processing performance is measured by the processing throughput, represented by the total size of protocol messages processed per second.</t>
          <t>Note that the control plane performance tests are <bcp14>OPTIONAL</bcp14> for a DUT performing intra-domain SAV. Only a DUT that implements the SAV mechanism using an explicit control-plane communication protocol, such as the SAV-specific information communication mechanism proposed in <xref target="intra-domain-arch"/>, <bcp14>SHOULD</bcp14> be tested for its control plane performance.</t>
          <figure anchor="intra-convg-perf">
            <name>Test setup for protocol convergence performance measurement.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
| Emulated Topology |------|   Tester    |<-------->|    DUT    |
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
]]></artwork>
          </figure>
          <t><strong>Protocol Convergence Performance</strong>: <xref target="intra-convg-perf"/> illustrates the test setup for measuring protocol convergence performance. The convergence process of the DUT, during which SAV rules are updated, is triggered by route changes resulting from network failures or operator configurations. In <xref target="intra-convg-perf"/>, the Tester is directly connected to the DUT and simulates these route changes by adding or withdrawing prefixes to initiate the DUT's convergence procedure.</t>
          <t>The <strong>procedure</strong> for testing protocol convergence performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the protocol convergence time of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester withdraws a specified percentage of the total prefixes supported by the DUT, for example, 10%, 20%, up to 100%.</t>
            </li>
            <li>
              <t>The protocol convergence time is calculated based on DUT logs that record the start and completion times of the convergence process.</t>
            </li>
          </ol>
          <t>Please note that for IGP, proportional prefix withdrawal can be achieved by selectively shutting down interfaces. For instance, if the Tester is connected to ten emulated devices through ten interfaces, each advertising a prefix, withdrawing 10% of the prefixes can be accomplished by randomly disabling one interface. Similarly, a 20% withdrawal corresponds to shutting down two interfaces, and so forth. This is one suggested method; other approaches that achieve the same effect should also be acceptable.</t>
          <t>The protocol convergence time, defined as the duration required for the DUT to complete the convergence process, should be measured from the moment the last “hello” message is received from the emulated device on the disabled interface until SAV rule generation is finalized. To ensure accuracy, the DUT should log the timestamp of the last hello message received and the timestamp when SAV rule updates are complete. The convergence time is the difference between these two timestamps.</t>
          <t>If the emulated device sends a “goodbye hello” message during interface shutdown, it is recommended to use the receipt time of this message, rather than that of the last standard hello, as the starting point; this provides a more precise measurement, as advised in <xref target="RFC4061"/>.</t>
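          <t>Given the two logged timestamps, the convergence time is their difference. A sketch of the computation (the timestamp format is an assumption; DUT logs may use any consistent format):</t>
          <sourcecode type="python"><![CDATA[
from datetime import datetime

FMT = "%Y-%m-%dT%H:%M:%S.%f"  # assumed DUT log timestamp format

def convergence_time(last_hello_ts, rules_complete_ts):
    """Convergence time in seconds: SAV rule completion minus last hello."""
    t0 = datetime.strptime(last_hello_ts, FMT)
    t1 = datetime.strptime(rules_complete_ts, FMT)
    return (t1 - t0).total_seconds()

t = convergence_time("2025-01-01T00:00:00.000000",
                     "2025-01-01T00:00:01.250000")
]]></sourcecode>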
          <t><strong>Protocol Message Processing Performance</strong>: The test for protocol message processing performance uses the same setup illustrated in <xref target="intra-convg-perf"/>. This performance metric evaluates the protocol message processing throughput, the rate at which the DUT processes protocol messages. The Tester varies the sending rate of protocol messages, ranging from 10% to 100% of the total link capacity between the Tester and the DUT. The DUT records both the total size of processed protocol messages and the corresponding processing time.</t>
          <t>The <strong>procedure</strong> for testing protocol message processing performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the protocol message processing throughput of the DUT, set up the test environment as shown in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends protocol messages at varying rates, such as 10%, 20%, up to 100%, of the total link capacity between the Tester and the DUT.</t>
            </li>
            <li>
              <t>The protocol message processing throughput is calculated based on DUT logs that record the total size of processed protocol messages and the total processing time.</t>
            </li>
          </ol>
          <t>To compute the protocol message processing throughput, the DUT logs <bcp14>MUST</bcp14> include the total size of the protocol messages processed and the total time taken for processing. The throughput is then derived by dividing the total message size by the total processing time.</t>
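          <t>As a concrete illustration of this derivation (the message volume and duration are arbitrary example values):</t>
          <sourcecode type="python"><![CDATA[
def processing_throughput(total_message_bytes, total_processing_seconds):
    """Protocol message processing throughput in bytes per second."""
    return total_message_bytes / total_processing_seconds

# Example: 500 MB of protocol messages processed in 4 seconds.
tput = processing_throughput(500_000_000, 4.0)
]]></sourcecode>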
        </section>
        <section anchor="intra-data-plane-sec">
          <name>Data Plane Performance</name>
          <t><strong>Objective</strong>: Evaluate the data plane performance of the DUT, including both data plane SAV table refresh performance and data plane forwarding performance. Data plane SAV table refresh performance is quantified by the refresh rate, which indicates how quickly the DUT updates its SAV table with new SAV rules. Data plane forwarding performance is measured by the forwarding rate, defined as the total size of packets forwarded by the DUT per second.</t>
          <t><strong>Data Plane SAV Table Refreshing Performance</strong>: The evaluation of data plane SAV table refresh performance uses the same test setup shown in <xref target="intra-convg-perf"/>. This metric measures the rate at which the DUT refreshes its SAV table with new SAV rules. The Tester varies the transmission rate of protocol messages, from 10% to 100% of the total link capacity between the Tester and the DUT, to influence the proportion of updated SAV rules and corresponding SAV table entries. The DUT records the total number of updated SAV table entries and the time taken to complete the refresh process.</t>
          <t>The <strong>procedure</strong> for testing data plane SAV table refresh performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the data plane SAV table refreshing rate of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends protocol messages at varying percentages of the total link capacity, for example, 10%, 20%, up to 100%.</t>
            </li>
            <li>
              <t>The data plane SAV table refreshing rate is calculated based on DUT logs that record the total number of updated SAV table entries and the total refresh time.</t>
            </li>
          </ol>
          <t>To compute the refresh rate, the DUT logs <bcp14>MUST</bcp14> capture the total number of updated SAV table entries and the total time required for refreshing. The refresh rate is then derived by dividing the total number of updated entries by the total refresh time.</t>
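          <t>The refresh rate computation can be repeated for each load level in the sweep. A sketch of the per-run arithmetic (the log tuple layout is an assumption):</t>
          <sourcecode type="python"><![CDATA[
def refresh_rates(runs):
    """runs: (load_percent, updated_entries, refresh_seconds) per run.
    Returns the entries-per-second refresh rate keyed by load level."""
    return {load: entries / seconds for load, entries, seconds in runs}

rates = refresh_rates([(10, 10000, 0.5),
                       (50, 50000, 2.0),
                       (100, 100000, 5.0)])
]]></sourcecode>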
          <t><strong>Data Plane Forwarding Performance</strong>: The evaluation of data plane forwarding performance uses the same test setup shown in <xref target="intra-convg-perf"/>. The Tester transmits a mixture of spoofed and legitimate traffic at a rate matching the total link capacity between the Tester and the DUT, while the DUT maintains a fully populated SAV table. The ratio of spoofed to legitimate traffic can be varied within a range, for example, from 1:9 to 9:1. The DUT records the total size of forwarded packets and the total duration of the forwarding process.</t>
          <t>The procedure for testing data plane forwarding performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the data plane forwarding rate of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends a mix of spoofed and legitimate traffic to the DUT at the full link capacity between the Tester and the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
            <li>
              <t>The data plane forwarding rate is calculated based on DUT logs that record the total size of forwarded traffic and the total forwarding time.</t>
            </li>
          </ol>
          <t>To compute the forwarding rate, the DUT logs <bcp14>MUST</bcp14> include the total size of forwarded traffic and the total time taken for forwarding. The forwarding rate is then derived by dividing the total traffic size by the total forwarding time.</t>
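          <t>A sketch of the final computation, additionally reporting the rate as a fraction of the Tester-DUT link capacity (the 10 Gb/s capacity is an assumed example value):</t>
          <sourcecode type="python"><![CDATA[
def forwarding_rate(total_bytes_forwarded, total_seconds,
                    link_bps=10_000_000_000):
    """Return (bits per second forwarded, fraction of link capacity)."""
    bps = total_bytes_forwarded * 8 / total_seconds
    return bps, bps / link_bps

# Example: 12.5 GB forwarded in 10 s saturates a 10 Gb/s link.
bps, util = forwarding_rate(12_500_000_000, 10.0)
]]></sourcecode>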
        </section>
      </section>
      <section anchor="inter_domain_sav">
        <name>Inter-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates-1">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Measure the false positive rate and false negative rate of the DUT when processing legitimate and spoofed traffic across multiple inter-domain network scenarios, including SAV implementations for both customer-facing ASes and provider-/peer-facing ASes.</t>
          <t>In the following, this document presents the test scenarios for evaluating inter-domain SAV performance on the DUT. Under each scenario, the generated spoofed traffic <bcp14>SHOULD</bcp14> include different types of forged source addresses, such as source addresses belonging to the local AS but not announced to external networks, private network source addresses, source addresses belonging to other ASes, and unallocated (unused) source addresses. The ratios among these different types of forged source addresses <bcp14>SHOULD</bcp14> vary, since different inter-domain SAV mechanisms may differ in their capability to block packets with forged source addresses of various origins. Nevertheless, for all these types of spoofed traffic, the expected result is that the DUT <bcp14>SHOULD</bcp14> block them.</t>
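          <t>On the Tester, the spoofed-source categories and their variable ratios can be driven by a small weighted mix table. A sketch (the example prefixes are placeholders chosen for illustration, not prescribed test addresses):</t>
          <sourcecode type="python"><![CDATA[
import random

SPOOF_SOURCES = {
    "local_unannounced": "10.9.0.0/16",  # local AS, not announced externally
    "private": "192.168.0.0/16",         # private address space
    "other_as": "203.0.113.0/24",        # addresses of another AS
    "unallocated": "240.0.0.0/8",        # unallocated/unused space
}

def pick_spoof_category(weights, rng):
    """Select a spoofed-source category according to the configured ratio."""
    cats = list(weights)
    return rng.choices(cats, weights=[weights[c] for c in cats], k=1)[0]

rng = random.Random(42)
cat = pick_spoof_category({"local_unannounced": 1, "private": 1,
                           "other_as": 1, "unallocated": 1}, rng)
]]></sourcecode>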
          <figure anchor="inter-customer-syn">
            <name>SAV for customer-facing ASes in inter-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|             \         |              \            \      |
|     P6[AS 1] \        |               \            \     |
|      P1[AS 1] \       |                \            \    |
|          (C2P) \      | (C2P/P2P) (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-syn"/> presents a test case for SAV in customer-facing ASes under an inter-domain symmetric routing scenario. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network environment, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which in turn is a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefixes P1 and P6 to AS 2 and the DUT, respectively. AS 2 then propagates routes for P1 and P6 to the DUT, enabling the DUT to learn these prefixes from both AS 1 and AS 2. In this test, the legitimate path for traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1 to evaluate the DUT's SAV performance for customer-facing ASes.</t>
          <t>The <strong>procedure</strong> for testing SAV in this scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for customer-facing ASes under symmetric inter-domain routing, construct the test environment as shown in <xref target="inter-customer-syn"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester sends both legitimate traffic (with source addresses in P1 and destination addresses in P4) and spoofed traffic (with source addresses in P5 and destination addresses in P4) to the DUT via AS 2. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-syn"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-customer-lpp">
            <name>SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|    P6[AS 1] \         | NO_EXPORT    \            \      |
|     P1[AS 1] \        |               \            \     |
|     NO_EXPORT \       |                \            \    |
|          (C2P) \      | (C2P)     (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-lpp"/> presents a test case for SAV in customer-facing ASes under an inter-domain asymmetric routing scenario induced by NO_EXPORT community configuration. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which is itself a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefix P1 to AS 2 with the NO_EXPORT community attribute, preventing AS 2 from propagating the route for P1 to the DUT. Similarly, AS 1 advertises prefix P6 to the DUT with the NO_EXPORT attribute, preventing the DUT from propagating this route to AS 3. As a result, the DUT learns the route for prefix P1 only from AS 1. The legitimate path for traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1 to evaluate the DUT's SAV performance for customer-facing ASes.</t>
          <t>The <strong>procedure</strong> for testing SAV in this asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under NO_EXPORT-induced asymmetric routing, construct the test environment as shown in <xref target="inter-customer-lpp"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester sends both legitimate traffic (with source addresses in P1 and destination addresses in P4) and spoofed traffic (with source addresses in P5 and destination addresses in P4) to the DUT via AS 2. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-lpp"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-customer-dsr">
            <name>SAV for customer-facing ASes in the scenario of direct server return (DSR).</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                       |
|                                +----------------+               |
|                Anycast Server+-+    AS 3(P3)    |               |
|                                +-+/\----+/\+----+               |
|                                   /       \                     |
|                         P3[AS 3] /         \ P3[AS 3]           |
|                                 /           \                   |
|                                / (C2P)       \                  |
|                       +----------------+      \                 |
|                       |     DUT(P4)    |       \                |
|                       ++/\+--+/\+--+/\++        \               |
|          P6[AS 1, AS 2] /     |      \           \              |
|               P2[AS 2] /      |       \           \             |
|                       /       |        \           \            |
|                      / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|      +----------------+       |          \           \          |
|User+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|      +----------+/\+--+       | P6[AS 1]   \           \        |
|          P6[AS 1] \           |             \           \       |
|           P1[AS 1] \          |              \           \      |
|                     \         |               \           \     |
|                      \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                    +----------------+        +----------------+ |
|                    |AS 1(P1, P3, P6)|        |    AS 5(P5)    | |
|                    +----------------+        +----------------+ |
|                         /\     |                                |
|                          |     |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                     +----------------+
                     |     Tester     |
                     | (Edge Server)  |
                     +----------------+

Within the test network environment, P3 is the anycast prefix and is only advertised by AS 3 through BGP.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-dsr"/> presents a test case for SAV in customer-facing ASes under a Direct Server Return (DSR) scenario. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which is itself a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. When users in AS 2 send requests to an anycast destination IP, the forwarding path is AS 2-&gt;DUT-&gt;AS 3. Anycast servers in AS 3 receive the requests and tunnel them to edge servers in AS 1. The edge servers then return content to the users with source addresses in prefix P3. If the reverse forwarding path is AS 1-&gt;DUT-&gt;AS 2, the Tester sends traffic with source addresses in P3 and destination addresses in P2 along the path AS 1-&gt;DUT-&gt;AS 2. Alternatively, if the reverse forwarding path is AS 1-&gt;AS 2, the Tester sends traffic with source addresses in P3 and destination addresses in P2 along the path AS 1-&gt;AS 2. In this case, AS 2 may serve as the DUT.</t>
          <t>The <strong>procedure</strong> for testing SAV in this DSR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under DSR conditions, construct the test environment as shown in <xref target="inter-customer-dsr"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish the DSR scenario.</t>
            </li>
            <li>
              <t>The Tester sends legitimate traffic (with source addresses in P3 and destination addresses in P2) to AS 2 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT permits legitimate traffic with source addresses in P3 received from the direction of AS 1.</t>
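The Tester's traffic-generation step can be sketched in Python as follows. This is a minimal illustration, not part of the methodology: it builds a raw IPv4 header whose source address is chosen freely by the Tester, which is all that legitimate DSR return traffic (source in P3, destination in P2) or spoofed traffic requires. The concrete prefix values (203.0.113.0/24 standing in for P3, 198.51.100.0/24 for P2) and the helper names are illustrative assumptions, not defined by this document.

```python
import ipaddress
import struct

def ipv4_checksum(header: bytes) -> int:
    """Standard one's-complement checksum over an IPv4 header."""
    if len(header) % 2:
        header += b"\x00"
    total = sum(struct.unpack(f"!{len(header)//2}H", header))
    while total > 0xFFFF:
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF

def build_ipv4_header(src: str, dst: str, payload_len: int = 0) -> bytes:
    """Build a 20-byte IPv4 header; the source address is arbitrary,
    so the same helper serves for legitimate and spoofed test traffic."""
    ver_ihl = (4 << 4) | 5            # IPv4, 5 x 32-bit header words
    total_len = 20 + payload_len
    header = struct.pack(
        "!BBHHHBBH4s4s",
        ver_ihl, 0, total_len,
        0x1234, 0,                    # identification, flags/fragment offset
        64, 17, 0,                    # TTL, protocol (UDP), checksum placeholder
        ipaddress.IPv4Address(src).packed,
        ipaddress.IPv4Address(dst).packed,
    )
    checksum = ipv4_checksum(header)
    return header[:10] + struct.pack("!H", checksum) + header[12:]

# Legitimate DSR return traffic: source in the assumed P3 prefix,
# destination in the assumed P2 prefix.
pkt = build_ipv4_header("203.0.113.10", "198.51.100.20")
```

A real Tester would attach a UDP or TCP payload and transmit at a controlled rate; only the source-address selection matters for the SAV test cases above.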
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-dsr"/> to evaluate its false positive rate using the same procedure. In these configurations, the DUT is expected to permit the legitimate DSR return traffic rather than block it.</t>
          <figure anchor="inter-customer-reflect">
            <name>SAV for customer-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P1')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |  Server+-+    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' denotes the source prefix P1 spoofed by the attacker, which is 
located inside AS 2 or connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-reflect"/> illustrates a test case for SAV in customer-facing ASes under a reflection attack scenario. In this scenario, a reflection attack using source address spoofing occurs within the DUT's customer cone. The attacker spoofs the victim's IP address (P1) and sends requests to server IP addresses (P5) that are configured to respond to such requests. The Tester emulates the attacker by performing source address spoofing. The arrows in <xref target="inter-customer-reflect"/> indicate the business relationships between ASes: AS 3 serves as the provider for both the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under reflection attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules in a reflection attack scenario, construct the test environment as shown in <xref target="inter-customer-reflect"/>. The Tester is connected to AS 2 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the reflection attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P1 and destination addresses in P5) toward AS 5 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P1 received from the direction of AS 2.</t>
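The false positive and false negative rates used throughout these test cases can be computed from the Tester's transmit and receive counters. A minimal sketch in Python, with illustrative counter names not mandated by this document:

```python
def sav_error_rates(legit_sent, legit_received, spoofed_sent, spoofed_received):
    """False positive rate: fraction of legitimate packets wrongly dropped.
    False negative rate: fraction of spoofed packets wrongly forwarded."""
    fp_rate = (legit_sent - legit_received) / legit_sent if legit_sent else 0.0
    fn_rate = spoofed_received / spoofed_sent if spoofed_sent else 0.0
    return fp_rate, fn_rate

# The reflection-attack test sends only spoofed traffic; an ideal DUT
# drops all of it, giving a false negative rate of zero.
fp, fn = sav_error_rates(legit_sent=1000, legit_received=1000,
                         spoofed_sent=1000, spoofed_received=0)
```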
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-reflect"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-customer-direct">
            <name>SAV for customer-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P5')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |          |    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P5' denotes the source prefix P5 spoofed by the attacker, which is 
located inside AS 2 or connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-direct"/> presents a test case for SAV in customer-facing ASes under a direct attack scenario. In this scenario, a direct attack using source address spoofing occurs within the DUT's customer cone. The attacker spoofs a source address (P5) and directly targets the victim's IP address (P1), aiming to overwhelm its network resources. The Tester emulates the attacker by performing source address spoofing. The arrows in <xref target="inter-customer-direct"/> indicate the business relationships between ASes: AS 3 serves as the provider for both the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under direct attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules in a direct attack scenario, construct the test environment as shown in <xref target="inter-customer-direct"/>. The Tester is connected to AS 2 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the direct attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P5 and destination addresses in P1) toward AS 1 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P5 received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-direct"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="reflection-attack-p">
            <name>SAV for provider/peer-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
                                   +----------------+
                                   |     Tester     |
                                   |   (Attacker)   |
                                   |      (P1')     |
                                   +----------------+
                                        |     /\
                                        |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment              \/     |                    |
|                                  +----------------+               |
|                                  |                |               |
|                                  |    AS 3(P3)    |               |
|                                  |                |               |
|                                  +-+/\----+/\+----+               |
|                                     /       \                     |
|                                    /         \                    |
|                                   /           \                   |
|                                  / (C2P/P2P)   \                  |
|                         +----------------+      \                 |
|                         |     DUT(P4)    |       \                |
|                         ++/\+--+/\+--+/\++        \               |
|            P6[AS 1, AS 2] /     |      \           \              |
|                 P2[AS 2] /      |       \           \             |
|                         /       |        \           \            |
|                        / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|        +----------------+       |          \           \          |
|Server+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|        +----------+/\+--+       | P6[AS 1]   \           \        |
|            P6[AS 1] \           | NO_EXPORT   \           \       |
|             P1[AS 1] \          |              \           \      |
|             NO_EXPORT \         |               \           \     |
|                        \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                      +----------------+        +----------------+ |
|              Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|                      +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' denotes the source prefix P1 spoofed by the attacker, which is 
located inside AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Provider/Peer-facing ASes</strong>: <xref target="reflection-attack-p"/> illustrates a test case for SAV in provider/peer-facing ASes under a reflection attack scenario. In this scenario, the attacker spoofs the victim's IP address (P1) and sends requests to server IP addresses (P2) that are configured to respond. The Tester emulates the attacker by performing source address spoofing. The servers then send overwhelming responses to the victim, exhausting its network resources. The arrows in <xref target="reflection-attack-p"/> represent the business relationships between ASes: AS 3 acts as either a provider or a lateral peer of the DUT and is the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under reflection attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for provider/peer-facing ASes in a reflection attack scenario, construct the test environment as shown in <xref target="reflection-attack-p"/>. The Tester is connected to AS 3 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the reflection attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P1 and destination addresses in P2) toward AS 2 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P1 received from the direction of AS 3.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="reflection-attack-p"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="direct-attack-p">
            <name>SAV for provider/peer-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
                           +----------------+
                           |     Tester     |
                           |   (Attacker)   |
                           |      (P2')     |
                           +----------------+
                                |     /\
                                |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment      \/     |                    |
|                          +----------------+               |
|                          |    AS 3(P3)    |               |
|                          +-+/\----+/\+----+               |
|                             /       \                     |
|                            /         \                    |
|                           /           \                   |
|                          / (C2P/P2P)   \                  |
|                 +----------------+      \                 |
|                 |     DUT(P4)    |       \                |
|                 ++/\+--+/\+--+/\++        \               |
|    P6[AS 1, AS 2] /     |      \           \              |
|         P2[AS 2] /      |       \           \             |
|                 /       |        \           \            |
|                / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|+----------------+       |          \           \          |
||    AS 2(P2)    |       | P1[AS 1]  \           \         |
|+----------+/\+--+       | P6[AS 1]   \           \        |
|    P6[AS 1] \           | NO_EXPORT   \           \       |
|     P1[AS 1] \          |              \           \      |
|     NO_EXPORT \         |               \           \     |
|                \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|              +----------------+        +----------------+ |
|      Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|              +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P2' denotes the source prefix P2 spoofed by the attacker, which is 
located inside AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Provider/Peer-facing ASes</strong>: <xref target="direct-attack-p"/> presents a test case for SAV in provider/peer-facing ASes under a direct attack scenario. In this scenario, the attacker spoofs a source address (P2) and directly targets the victim's IP address (P1), overwhelming its network resources. The Tester emulates the attacker by performing source address spoofing. The arrows in <xref target="direct-attack-p"/> represent the business relationships between ASes: AS 3 acts as either a provider or a lateral peer of the DUT and is the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under direct attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for provider-facing ASes in a direct attack scenario, construct the test environment as shown in <xref target="direct-attack-p"/>. The Tester is connected to AS 3 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the direct attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P2 and destination addresses in P1) toward AS 1 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P2 received from the direction of AS 3.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="direct-attack-p"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-domain-frr-topo">
            <name>Inter-domain SAV under FRR scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment           |
|          +-----------+            +-----------+      |
|          |   AS3     |------------|   AS2     |      |
|          +-----------+            +-----------+      |
|               /\                       /\            |
|               |                        |             |
| primary link  |            backup link |             |
|               | (C2P)                  | (C2P)       |
|        +-----------------------------------------+   |
|        |                   DUT                   |   |
|        +-----------------------------------------+   |
|                           /\                         |
|                           |                          |
|                           | Legitimate and           |
|                           | Spoofed Traffic          |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                            | (C2P)
                     +-------------+
                     |    Tester   |
                     +-------------+
]]></artwork>
          </figure>
          <t><strong>SAV under FRR Scenario</strong>: Inter-domain Fast Reroute (FRR) mechanisms, such as BGP Prefix Independent Convergence (PIC) or MPLS-based FRR, allow rapid failover between ASes after a link or node failure. These events may temporarily desynchronize routing information and SAV rules.</t>
          <t>The <strong>procedure</strong> for testing SAV under FRR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure FRR or BGP PIC on the DUT for the inter-AS links to AS 3 (primary) and AS 2 (backup).</t>
            </li>
            <li>
              <t>Continuously send legitimate and spoofed traffic from the Tester toward the DUT.</t>
            </li>
            <li>
              <t>Trigger a failure on the link between the DUT and AS 3 to activate the FRR path via AS 2.</t>
            </li>
            <li>
              <t>Measure false positive and false negative rates during and after switchover.</t>
            </li>
            <li>
              <t>Restore the link to AS 3 and verify SAV table consistency.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT <bcp14>MUST</bcp14> maintain consistent SAV filtering during FRR events. Transient topology changes <bcp14>SHOULD NOT</bcp14> lead to acceptance of spoofed traffic or unnecessary blocking of legitimate packets.</t>
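The measurement step in the procedure above (recording error rates during and after switchover) can be sketched by bucketing per-packet outcomes into fixed windows relative to the failover instant. The tuple layout, window length, and timestamps below are illustrative assumptions:

```python
from collections import defaultdict

def bucket_error_counts(results, failover_time, window=1.0):
    """results: iterable of (timestamp, is_spoofed, was_forwarded) tuples.
    Returns per-window counts of false positives (legitimate packets dropped)
    and false negatives (spoofed packets forwarded), keyed by window index
    relative to the failover instant (negative indices precede the failure)."""
    counts = defaultdict(lambda: {"fp": 0, "fn": 0, "total": 0})
    for ts, is_spoofed, forwarded in results:
        idx = int((ts - failover_time) // window)
        bucket = counts[idx]
        bucket["total"] += 1
        if is_spoofed and forwarded:
            bucket["fn"] += 1          # spoofed traffic leaked through
        elif not is_spoofed and not forwarded:
            bucket["fp"] += 1          # legitimate traffic wrongly dropped
    return dict(counts)

# Hypothetical trace: a spoofed packet leaks 0.3 s after failover, and a
# legitimate packet is dropped 1.2 s after failover.
counts = bucket_error_counts(
    [(9.5, False, True), (10.3, True, True), (11.2, False, False)],
    failover_time=10.0)
```

Plotting these per-window counts shows whether SAV errors are confined to the transient switchover interval or persist after convergence.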
          <figure anchor="inter-domain-pbr-topo">
            <name>Inter-domain SAV under PBR scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|               Test Network Environment           |
|     +-----------+            +-----------+       |
|     |   AS3     |------------|   AS2     |       |
|     +-----------+            +-----------+       |
|          /\                       /\             |
|           |                        |             |
|           | preferred path         | default path|
|           | (C2P)                  | (C2P)       |
|    +-----------------------------------------+   |
|    |                  DUT                    |   |
|    +-----------------------------------------+   |
|                        /\                        |
|                         | Legitimate and         |
|                         | Spoofed Traffic        |
|                         |                        |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                          | (C2P) 
                   +-------------+
                   |    Tester   |
                   +-------------+
]]></artwork>
          </figure>
          <t><strong>SAV under PBR Scenario</strong>: In inter-domain environments, routing policies such as local preference, route maps, or communities may alter path selection independently of shortest-path routing. Such policy-driven forwarding can affect how the SAV rules are derived and applied.</t>
          <t>The <strong>procedure</strong> for testing SAV under PBR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure a routing policy on the DUT (e.g., set local preference) to prefer AS 3 for specific prefixes while maintaining AS 2 as an alternative path.</t>
            </li>
            <li>
              <t>Generate legitimate and spoofed traffic from the Tester, matching both policy-affected and unaffected prefixes.</t>
            </li>
            <li>
              <t>Observe SAV filtering behavior before and after policy changes.</t>
            </li>
            <li>
              <t>Modify the routing policy dynamically and measure false positive and false negative rates.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT <bcp14>SHOULD</bcp14> maintain correct SAV filtering regardless of routing policy changes. Legitimate traffic rerouted by policy <bcp14>MUST NOT</bcp14> be dropped, and spoofed traffic <bcp14>MUST NOT</bcp14> be forwarded during or after policy updates.</t>
        </section>
        <section anchor="control-plane-performance">
          <name>Control Plane Performance</name>
          <t>The test setup, procedure, and metrics for evaluating protocol convergence performance and protocol message processing performance are the same as those in <xref target="intra-control-plane-sec"/>. Note that control plane performance tests for a DUT performing inter-domain SAV are <bcp14>OPTIONAL</bcp14>. Only a DUT that implements the SAV mechanism using an explicit control-plane communication protocol, such as the SAV-specific information communication mechanism proposed in <xref target="inter-domain-arch"/>, <bcp14>SHOULD</bcp14> be tested for control plane performance.</t>
        </section>
        <section anchor="data-plane-performance">
          <name>Data Plane Performance</name>
          <t>The test setup, procedure, and metrics for evaluating data plane SAV table refresh performance and data plane forwarding performance are the same as those in <xref target="intra-data-plane-sec"/>.</t>
        </section>
      </section>
      <section anchor="resource-utilization-1">
        <name>Resource Utilization</name>
        <t>When evaluating the DUT for both intra-domain (<xref target="intra_domain_sav"/>) and inter-domain SAV (<xref target="inter_domain_sav"/>) functionality, CPU utilization (for both control and data planes) and memory utilization (for both control and data planes) <bcp14>MUST</bcp14> be recorded. These metrics <bcp14>SHOULD</bcp14> be collected separately per plane to facilitate granular performance analysis.</t>
      </section>
    </section>
    <section anchor="reporting-format">
      <name>Reporting Format</name>
      <t>Each test follows a reporting format comprising both standardized global components and elements specific to the individual test. The following test configuration and SAV mechanism parameters <bcp14>MUST</bcp14> be documented in the test report.</t>
      <t>Test Configuration Parameters:</t>
      <ol spacing="normal" type="1"><li>
          <t>Test device hardware and software versions</t>
        </li>
        <li>
          <t>Network topology</t>
        </li>
        <li>
          <t>Test traffic attributes</t>
        </li>
        <li>
          <t>System configuration (e.g., physical or virtual machine, CPU, memory, caches, operating system, interface capacity)</t>
        </li>
        <li>
          <t>Device configuration (e.g., symmetric routing, NO_EXPORT)</t>
        </li>
        <li>
          <t>SAV mechanism</t>
        </li>
      </ol>
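The parameters listed above can be captured in a machine-readable record per test run; a minimal sketch in Python, in which the field names and example values are illustrative rather than mandated by this document:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SAVTestReport:
    """One record per test run; fields mirror the parameter list above."""
    device_hw: str
    device_sw: str
    topology: str
    traffic_attributes: dict
    system_configuration: dict
    device_configuration: dict
    sav_mechanism: str
    false_positive_rate: float = 0.0
    false_negative_rate: float = 0.0

    def to_json(self) -> str:
        # Serialize for archival alongside raw Tester counters.
        return json.dumps(asdict(self), indent=2)

# Hypothetical example values for one inter-domain DSR test run.
report = SAVTestReport(
    device_hw="example-router-hw",
    device_sw="example-os 1.0",
    topology="inter-customer-dsr",
    traffic_attributes={"frame_size": 64, "rate_pps": 100000},
    system_configuration={"cpu": "8 cores", "memory_gb": 16},
    device_configuration={"symmetric_routing": True},
    sav_mechanism="EFP-uRPF",
)
```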
    </section>
    <section anchor="IANA">
      <name>IANA Considerations</name>
      <t>This document has no IANA actions.</t>
    </section>
    <section anchor="security">
      <name>Security Considerations</name>
      <t>The benchmarking tests outlined in this document are confined to evaluating the performance of SAV devices within a controlled laboratory environment, utilizing isolated networks.</t>
      <t>The network topology employed for benchmarking must constitute an independent test setup and must remain disconnected from any devices that could relay test traffic into an operational production network.</t>
    </section>
  </middle>
  <back>
    <references anchor="sec-combined-references">
      <name>References</name>
      <references anchor="sec-normative-references">
        <name>Normative References</name>
        <reference anchor="RFC3704">
          <front>
            <title>Ingress Filtering for Multihomed Networks</title>
            <author fullname="F. Baker" initials="F." surname="Baker"/>
            <author fullname="P. Savola" initials="P." surname="Savola"/>
            <date month="March" year="2004"/>
            <abstract>
              <t>BCP 38, RFC 2827, is designed to limit the impact of distributed denial of service attacks, by denying traffic with spoofed addresses access to the network, and to help ensure that traffic is traceable to its correct source network. As a side effect of protecting the Internet against such attacks, the network implementing the solution also protects itself from this and other attacks, such as spoofed management access to networking equipment. There are cases when this may create problems, e.g., with multihoming. This document describes the current ingress filtering operational mechanisms, examines generic issues related to ingress filtering, and delves into the effects on multihoming in particular. This memo updates RFC 2827. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="3704"/>
          <seriesInfo name="DOI" value="10.17487/RFC3704"/>
        </reference>
        <reference anchor="RFC8704">
          <front>
            <title>Enhanced Feasible-Path Unicast Reverse Path Forwarding</title>
            <author fullname="K. Sriram" initials="K." surname="Sriram"/>
            <author fullname="D. Montgomery" initials="D." surname="Montgomery"/>
            <author fullname="J. Haas" initials="J." surname="Haas"/>
            <date month="February" year="2020"/>
            <abstract>
              <t>This document identifies a need for and proposes improvement of the unicast Reverse Path Forwarding (uRPF) techniques (see RFC 3704) for detection and mitigation of source address spoofing (see BCP 38). Strict uRPF is inflexible about directionality, the loose uRPF is oblivious to directionality, and the current feasible-path uRPF attempts to strike a balance between the two (see RFC 3704). However, as shown in this document, the existing feasible-path uRPF still has shortcomings. This document describes enhanced feasible-path uRPF (EFP-uRPF) techniques that are more flexible (in a meaningful way) about directionality than the feasible-path uRPF (RFC 3704). The proposed EFP-uRPF methods aim to significantly reduce false positives regarding invalid detection in source address validation (SAV). Hence, they can potentially alleviate ISPs' concerns about the possibility of disrupting service for their customers and encourage greater deployment of uRPF techniques. This document updates RFC 3704.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="8704"/>
          <seriesInfo name="DOI" value="10.17487/RFC8704"/>
        </reference>
        <reference anchor="RFC2544">
          <front>
            <title>Benchmarking Methodology for Network Interconnect Devices</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <author fullname="J. McQuaid" initials="J." surname="McQuaid"/>
            <date month="March" year="1999"/>
            <abstract>
              <t>This document is a republication of RFC 1944 correcting the values for the IP addresses which were assigned to be used as the default addresses for networking test equipment. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="2544"/>
          <seriesInfo name="DOI" value="10.17487/RFC2544"/>
        </reference>
        <reference anchor="RFC4061">
          <front>
            <title>Benchmarking Basic OSPF Single Router Control Plane Convergence</title>
            <author fullname="V. Manral" initials="V." surname="Manral"/>
            <author fullname="R. White" initials="R." surname="White"/>
            <author fullname="A. Shaikh" initials="A." surname="Shaikh"/>
            <date month="April" year="2005"/>
            <abstract>
              <t>This document provides suggestions for measuring OSPF single router control plane convergence. Its initial emphasis is on the control plane of a single OSPF router. We do not address forwarding plane performance.</t>
              <t>NOTE: In this document, the word "convergence" relates to single router control plane convergence only. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="4061"/>
          <seriesInfo name="DOI" value="10.17487/RFC4061"/>
        </reference>
        <reference anchor="RFC2119">
          <front>
            <title>Key words for use in RFCs to Indicate Requirement Levels</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <date month="March" year="1997"/>
            <abstract>
              <t>In many standards track documents several words are used to signify the requirements in the specification. These words are often capitalized. This document defines these words as they should be interpreted in IETF documents. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="2119"/>
          <seriesInfo name="DOI" value="10.17487/RFC2119"/>
        </reference>
        <reference anchor="RFC8174">
          <front>
            <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
            <author fullname="B. Leiba" initials="B." surname="Leiba"/>
            <date month="May" year="2017"/>
            <abstract>
              <t>RFC 2119 specifies common key words that may be used in protocol specifications. This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the defined special meanings.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="8174"/>
          <seriesInfo name="DOI" value="10.17487/RFC8174"/>
        </reference>
      </references>
      <references anchor="sec-informative-references">
        <name>Informative References</name>
        <reference anchor="intra-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Intra-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="inter-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-inter-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Inter-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="intra-domain-arch" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-architecture/">
          <front>
            <title>Intra-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="inter-domain-arch" target="https://datatracker.ietf.org/doc/draft-wu-savnet-inter-domain-architecture/">
          <front>
            <title>Inter-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
      </references>
    </references>

<section numbered="false" anchor="Acknowledgements">
      <name>Acknowledgements</name>
      <t>Many thanks to Aijun Wang, Nan Geng, Susan Hares, Giuseppe Fioccola, Minh-Ngoc Tran, Shengnan Yue, Changwang Lin, Yuanyuan Zhang, and Xueyan Song for their valuable comments and reviews of this document.
Apologies to anyone whose name the authors may have inadvertently omitted.</t>
    </section>
  </back>

</rfc>
