Enabling Uncoordinated Dynamic Spectrum Sharing Between LTE and NR Networks

Dynamic Spectrum Sharing (DSS) is an enabler for a seamless transition from 4G Long Term Evolution (LTE) to 5G New Radio (NR) by utilizing existing LTE bands without static spectrum re-farming. In this paper, we propose a cross-band DSS scheme that utilizes the Multimedia Broadcast Multicast Service over a Single Frequency Network (MBSFN) feature of an LTE network and the Multicast Broadcast Service (MBS) feature of an NR network. The proposed DSS scheme utilizes LTE and NR resource controllers to assign muted MBSFN subframes on the LTE band and muted MBS subframes on the NR band based on traffic needs. In contrast to the state-of-the-art, the proposed DSS scheme does not require a coordination signaling channel between the LTE and NR networks. Instead, a machine learning-based Technology Recognition and Traffic Characterization (TRTC) system is used to identify and characterize traffic patterns. The LTE and NR resource controllers use the TRTC to sense the muted subframes and offload traffic accordingly. On average, the proposed DSS, as compared to static band configuration, improves the LTE throughput, NR throughput, LTE band spectrum utilization efficiency, and NR band spectrum utilization efficiency by 13.5%, 8.3%, 11.8%, and 20.7%, respectively.

both FR1 (410–7125 MHz) and FR2 (24250–52600 MHz) 3GPP standard frequency ranges [2]. Even though operating 5G NR on high frequencies with a broader bandwidth may result in faster data rates, using these high frequency bands would be unfavorable in terms of coverage due to the significant signal loss experienced through signal propagation and penetration. In contrast, lower frequency bands (FR1), such as those in the C band or below, provide better coverage and penetration characteristics, making them more suitable for providing wide-area coverage and maintaining connectivity in challenging urban and indoor environments. However, there is an insufficient amount of available frequency resources in the C band to meet the 5G requirements.
In this direction, deploying 5G in the lower frequency bands, which are primarily occupied by 4G, can provide a significant improvement in spectrum utilization efficiency and capacity. Dynamic Spectrum Sharing (DSS) is a method that enables the utilization of the 4G LTE spectrum by 5G NR by implementing the co-existence of the two Radio Access Technologies (RATs) [3].
One of the LTE-NR DSS implementation options in the 3GPP standard is to utilize empty Multimedia Broadcast Multicast Service over a Single Frequency Network (MBSFN) subframes to enable the coexistence [4]. MBSFN is a feature utilized in Evolved Multimedia Broadcast Multicast Services (eMBMS)-based LTE point-to-multipoint transmissions. Within an LTE MBSFN frame, up to six subframes can be reserved for multicast transmissions, while the rest can be used for unicast traffic. In LTE-NR DSS solutions, this MBSFN feature of LTE is used to share resources between the NR and LTE networks. In MBSFN-based DSS, the LTE scheduler schedules its traffic in certain frames, while in the other frames, MBSFN subframes are allocated [4]. In each LTE MBSFN frame, the resources normally assigned for multicast traffic and control information are completely muted so that the duration reserved for these symbols can be used for 5G NR transmission rather than eMBMS. The NR scheduler, on the other hand, schedules its traffic in these muted LTE MBSFN subframes.
In the MBSFN-based DSS, this technique is used to enable the coexistence of LTE and NR traffic in a band primarily configured for an existing LTE network. However, when the traffic load on the 5G network is low, its spectrum may be underutilized. Therefore, this paper proposes a cross-band DSS scheme where both LTE and NR networks can share spectrum in the LTE and NR bands. Throughout this paper, the terms "LTE band" and "NR band" refer to the basic bands primarily configured for the LTE and NR networks, respectively.

1536-1276 © 2023 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
In the LTE band, the proposed cross-band DSS scheme uses standard muted MBSFN subframe-based DSS. An intelligent LTE resource controller that uses a dynamic resource pool management scheme to adaptively allocate resources to the LTE scheduler is proposed. The LTE resource pool management scheme allocates resources in certain subframes to the LTE scheduler based on the LTE traffic, while the resources in the remaining subframes are muted MBSFN subframes that can be exploited by the NR network.
For the NR band, this work proposes a novel muted Multicast and Broadcast Services (MBS) subframe-based coexistence scheme. The MBS feature of the 5G NR network, released in 3GPP Release 17 [5], can be used to schedule muted NR subframes, giving LTE traffic the opportunity to exploit them. As compared to the MBSFN parameter configuration of LTE [6], the flexibility of the NR physical layer enables a more flexible number of muted NR subframes. In this direction, this paper proposes an advanced NR resource controller that utilizes a dynamic resource pool management scheme to adaptively allocate resources to the NR scheduler. Based on the NR traffic, the NR resource controller assigns a selection of subframes to the NR scheduler. The remaining subframes are assigned as muted MBS subframes and can be utilized by the LTE network.
In the state-of-the-art, DSS between LTE and NR networks is achieved using a cell resource coordination procedure between the NR gNodeB (gNB) and LTE eNodeB (eNB) over the X2 interface [7]. However, this radio scheduling synchronization leads to additional signaling overhead for DSS between the LTE and NR networks, and it is infeasible for uncoordinated LTE and NR networks. As a solution, this work proposes a DSS scheme for uncoordinated LTE-NR networks. The proposed DSS scheme employs Technology Recognition and Traffic Characterization (TRTC) model-based spectrum sensing in both technologies to manage resource scheduling coordination. The NR resource controller uses the TRTC to sense the available spectrum in the LTE band and allocates resources to the NR scheduler accordingly. Similarly, the LTE resource controller uses the TRTC system to sense the available spectrum in the NR band.
In summary, the key contributions of this work are as follows:
• Novel spectrum-efficient resource pool management schemes are proposed for the LTE and NR schedulers. A dynamic LTE resource pool management scheme is introduced to allocate muted MBSFN subframes within the primary LTE band in response to varying LTE traffic loads. Additionally, a novel NR resource pool management scheme is proposed for the dynamic allocation of muted MBS subframes within the primary NR band, considering the NR traffic load. These muted subframes within each band enable the use of unused resources by the secondary network.
• A cross-band spectrum sharing scheme is proposed to offload LTE traffic to the NR band and vice versa. The resource controllers use the TRTC system to estimate the spectrum unused by the primary network of the band, and these resources are added to the resource pool of the secondary network scheduler.
The rest of this paper is structured as follows: Section II reviews recent studies on LTE-NR coexistence schemes and spectrum sensing mechanisms used in uncoordinated networks. Section III analyzes the challenge addressed in this work and mathematically describes the proposed solution. Section IV details the proposed resource coordination and resource management schemes. Section V presents and discusses the performance evaluation of the proposed mechanism. Finally, Section VI concludes this work and outlines potential future work.

II. RELATED WORK
Spectrum refarming has emerged as a noteworthy technical strategy for the introduction of 5G networks. The authors in [8] show that when the spectrum of the preceding-generation technology is partitioned and reallocated to the 5G network, there arises an insufficiency in spectrum availability to accommodate traffic from all users. Furthermore, executing spectrum refarming necessitates an effective design strategy aimed at minimizing any disruption to current users. To tackle this problem, 3GPP Release 15 [9] proposed DSS technology as a means to transition from 4G LTE to 5G NR. DSS was subsequently incorporated in Release 16 [10] and further improved in Release 17. Since its standardization, many DSS solutions have been proposed.
In [11], the authors presented two DSS interference mitigation techniques: buffer setting and rate matching. Similarly, in [12], Xin et al. demonstrate that dynamic spectrum sharing between 4G and 5G users can lead to substantial interference. They also point out that inter-cell interference, which is caused by cell reference signals from LTE cells, severely impacts the performance of the 5G system. The authors suggest several mechanisms to mitigate interference, including resource element-level rate matching, resource block-level rate matching, and control channel avoidance. To further enhance the 5G system's performance, the authors propose advanced techniques such as multiple Cell-specific Reference Signal (CRS) patterns and cross-carrier scheduling.
In [13], the development and primary implementation methods of the DSS solution are presented. The results of this work indicate that co-frequency interference can have a significant impact on the performance of DSS solutions. Similarly, the authors in [14] propose different sharing ratios for resource allocation within a frame. The analysis is done using a common resource sharing controller to compute resource sharing ratios between the LTE and NR technologies. In [15], a machine learning-based framework is proposed for an LTE-NR DSS solution. The authors propose a controller that can intelligently divide resources between LTE and NR. The authors of [16] also present a technique that allows operators to provide LTE and NR services utilizing the same band in an interleaved manner. The resources allocated to LTE and NR are managed by a common resource manager. This common resource manager is in charge of determining the resource sharing ratio and keeping it up to date based on traffic needs. However, the authors consider a controller that has coordinated LTE and NR schedulers, where the exchange of states and decisions is made through a control channel between the two technologies.
In [17], a DSS scheme that consists of NR and LTE resource allocation algorithms is proposed. The LTE and NR resource allocation algorithms interact with and depend on each other. Since the environment of the 4G/5G network system is constantly changing, a dual bargaining game model is used, in which each network agent makes control decisions in coordination with the others. Taking into account the current network system status, the two proposed algorithms operate cooperatively and communicate in real time.
Table I summarizes the state-of-the-art LTE-NR DSS schemes highlighted in this section. Most of the existing LTE-NR DSS solutions propose and analyze a spectrum sharing scheme assuming that both technologies fully share the same band. However, these DSS schemes require agile and synchronized schedulers that protect the control and synchronization signals of each technology [4], which makes them less feasible for implementation. Hence, this work proposes a cross-band DSS scheme, considering that each technology has its own primary band. On the other hand, the state-of-the-art LTE and NR coexistence schemes in the literature consider coordinated LTE and NR networks that deploy a signaling channel between the two technologies. To the best of our knowledge, existing DSS schemes do not utilize wireless technology identification solutions to enable the coexistence of LTE and NR networks without the need for a coordination channel. This paper proposes a novel TRTC system-based DSS solution for uncoordinated LTE and NR networks.

III. SYSTEM DESIGN AND PROBLEM FORMULATION
In this section, the resource utilization efficiency and system capacity maximization problems are described considering LTE-NR DSS in the primary LTE band and the primary NR band.

A. Assumptions
The proposed muted MBSFN and muted MBS subframe-based DSS schemes utilize the standard MBSFN and MBS features of the LTE and NR networks, respectively. These features are introduced for downlink multicast traffic in FDD LTE and NR networks. Hence, the DSS solution is proposed for spectrum sharing considering downlink traffic in FDD LTE and NR networks.
When the muted LTE MBSFN and muted NR MBS subframes are used to enable the LTE-NR DSS, the NR gNB and LTE eNB have to be synchronized at the subframe level. This enables the avoidance of overlapping transmissions, as the subframe timing of both technologies matches. As the MBSFN subframe pattern repeats periodically until a new configuration is selected, there is no need for synchronization at the frame level, i.e., the starting and ending times of LTE and NR frames do not need to be synchronized. Subframe-level synchronization can be achieved using a satellite-based Pulse Per Second (PPS) time synchronization protocol on the LTE eNB and NR gNB sides [18].

B. LTE-NR DSS on LTE Band
In MBSFN-based LTE-NR DSS, the LTE network utilizes subframes assigned for unicast traffic, while the MBSFN subframes assigned for multicast traffic and control information are completely muted. Based on the length of the configured non-MBSFN region, the first one or two OFDM symbols of each muted MBSFN subframe are used for the LTE PDCCH and CRS, while 5G NR can use the MBSFN region of the subframe. Fig. 1 shows the frame structure of LTE in the MBSFN-based DSS. The figure shows that an LTE frame with muted MBSFN subframes occurs every $M^{lb}_{\rho}$ frames, where $M^{lb}_{\rho}$ is the MBSFN frame periodicity in the LTE band. Based on the standard, every MBSFN frame can have one to six muted MBSFN subframes, with a possible MBSFN frame periodicity ranging from 1 to 32 [19].
In MBSFN-based DSS, each MBSFN frame has $N^{lb}_{nr}$ muted LTE subframes that are available for the NR scheduler, while the number of subframes assigned to the LTE scheduler ($N^{lb}_{lte}$) is given as follows:

$$N^{lb}_{lte} = 10 - N^{lb}_{nr} \quad (1)$$

The LTE capacity is determined by considering the total number of bits scheduled in the subframes available to the LTE scheduler. If an MBSFN configuration $[N^{lb}_{lte}, M^{lb}_{\rho}]$ is used for a window of $N_s$ subframes, the total number of LTE bits scheduled in the window becomes:

$$C^{lb}_{lte}(N^{lb}_{lte}, M^{lb}_{\rho}, N_s) = S^{lb}_{lte}(N^{lb}_{lte}, M^{lb}_{\rho}, N_s)\, b_{lte} \quad (2)$$

where $S^{lb}_{lte}(N^{lb}_{lte}, M^{lb}_{\rho}, N_s)$ is the number of subframes available for the LTE scheduler (out of the total $N_s$ subframes) and $b_{lte}$ is the total number of bits scheduled in each LTE subframe. The value of $b_{lte}$ depends on the configured bandwidth and the Modulation and Coding Scheme (MCS) [20]. Based on fixed $M^{lb}_{\rho}$ and $N^{lb}_{nr}$, the number of subframes available for the LTE scheduler becomes:

$$S^{lb}_{lte}(N^{lb}_{lte}, M^{lb}_{\rho}, N_s) = N_s - \frac{N_s N^{lb}_{nr}}{10 M^{lb}_{\rho}} \quad (3)$$

Replacing eq. 3 in eq. 2 and rearranging terms gives:

$$C^{lb}_{lte}(N^{lb}_{lte}, M^{lb}_{\rho}, N_s) = N_s\, b_{lte} \left(1 - \frac{N^{lb}_{nr}}{10 M^{lb}_{\rho}}\right) \quad (4)$$

Similarly, the NR capacity on the LTE band ($C^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, N_s)$) is computed considering the total number of subframes available for the NR scheduler within the window ($S^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, N_s)$) multiplied by the total number of bits scheduled in each subframe, which is expressed as:

$$C^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, N_s) = S^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, N_s) \cdot \frac{N^{lb}_{ms}}{N^{lb}_{ms} + N^{lb}_{ns}} \cdot b_{nr} \quad (5)$$

where $b_{nr}$ is the total number of bits scheduled in a subframe, and $N^{lb}_{ms}$ and $N^{lb}_{ns}$ represent the number of symbols in the MBSFN region and the non-MBSFN region, respectively. The fraction $N^{lb}_{ms}/(N^{lb}_{ms} + N^{lb}_{ns})$ is used because the NR traffic can be scheduled in the muted MBSFN region only. Like $b_{lte}$, the value of $b_{nr}$ corresponding to a specific MCS and bandwidth is selected based on the 3GPP specification [21].
For a given configuration, the number of subframes available for the NR scheduler (out of the $N_s$ subframes window) becomes:

$$S^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, N_s) = \frac{N_s N^{lb}_{nr}}{10 M^{lb}_{\rho}} \quad (6)$$

Replacing eq. 6 in eq. 5, the NR capacity in the LTE band becomes:

$$C^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, N_s) = \frac{N_s N^{lb}_{nr}}{10 M^{lb}_{\rho}} \cdot \frac{N^{lb}_{ms}}{N^{lb}_{ms} + N^{lb}_{ns}} \cdot b_{nr} \quad (7)$$

Considering all the resources in the LTE band, the total system capacity becomes:

$$C^{lb}_{s} = C^{lb}_{lte}(N^{lb}_{lte}, M^{lb}_{\rho}, N_s) + C^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, N_s) \quad (8)$$

The goal here is to propose a scheme that maximizes the total system capacity $C^{lb}_{s}$. Hence, we use adaptive MBSFN parameter configuration to maximize resource allocation efficiency. In MBSFN-based DSS, the MBSFN parameters can be updated based on the Multicast Channel (MCH) Scheduling Period ($S^{lb}_{\rho}$). The $S^{lb}_{\rho}$ can take on values ranging from 4 to 1024 ms (in a doubling geometric sequence). Hence, the LTE resource utilization efficiency ($\xi^{lb}_{lte}$) in an $S^{lb}_{\rho}$ duration can be calculated based on:

$$\xi^{lb}_{lte} = \frac{\min\left(Q^{lb}_{lte},\, C^{lb}_{lte}(N^{lb}_{lte}, M^{lb}_{\rho}, S^{lb}_{\rho})\right)}{C^{lb}_{lte}(N^{lb}_{lte}, M^{lb}_{\rho}, S^{lb}_{\rho})} \quad (9)$$

where $Q^{lb}_{lte}$ is the LTE traffic queue length and $C^{lb}_{lte}(N^{lb}_{lte}, M^{lb}_{\rho}, S^{lb}_{\rho})$, obtained by updating eq. 4 for $S^{lb}_{\rho}$ subframes, is:

$$C^{lb}_{lte}(N^{lb}_{lte}, M^{lb}_{\rho}, S^{lb}_{\rho}) = S^{lb}_{\rho}\, b_{lte} \left(1 - \frac{N^{lb}_{nr}}{10 M^{lb}_{\rho}}\right) \quad (10)$$

Similarly, the NR resource utilization efficiency ($\xi^{lb}_{nr}$) in an $S^{lb}_{\rho}$ duration can be calculated based on:

$$\xi^{lb}_{nr} = \frac{\min\left(Q^{lb}_{nr},\, C^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, S^{lb}_{\rho})\right)}{C^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, S^{lb}_{\rho})} \quad (11)$$

where $Q^{lb}_{nr}$ is the NR traffic queue length offloaded to the LTE band and $C^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, S^{lb}_{\rho})$ is the capacity of resources allocated to NR within $S^{lb}_{\rho}$. Updating the NR capacity equation (eq. 7) for $S^{lb}_{\rho}$ subframes, $C^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, S^{lb}_{\rho})$ becomes:

$$C^{lb}_{nr}(N^{lb}_{nr}, M^{lb}_{\rho}, S^{lb}_{\rho}) = \frac{S^{lb}_{\rho} N^{lb}_{nr}}{10 M^{lb}_{\rho}} \cdot \frac{N^{lb}_{ms}}{N^{lb}_{ms} + N^{lb}_{ns}} \cdot b_{nr} \quad (12)$$

In the muted MBSFN subframe-based DSS, the primary goal is to enhance the LTE resource efficiency by assigning resources based on its traffic demand while leaving the remaining resources for potential NR traffic. Hence, this work proposes an LTE resource allocation scheme that maximizes $\xi^{lb}_{lte}$ while keeping the following constraint in consideration:
• In muted MBSFN subframe-based DSS, a single subframe cannot be partially shared for LTE and NR traffic.
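To make the capacity and efficiency expressions above concrete, the following sketch evaluates them for an illustrative configuration. The function names and all numeric values (per-subframe bit counts $b_{lte}$ and $b_{nr}$, symbol counts, queue size) are assumptions for illustration only, not values from the paper.

```python
# Sketch of eqs. 1-12: LTE/NR capacities and utilization efficiencies
# in the LTE band. All numeric inputs below are illustrative only.

def lte_capacity(n_nr, m_rho, n_s, b_lte):
    """Eq. 4: LTE bits schedulable in a window of n_s subframes."""
    return n_s * b_lte * (1 - n_nr / (10 * m_rho))

def nr_capacity(n_nr, m_rho, n_s, b_nr, n_ms, n_ns):
    """Eq. 7: NR bits schedulable in the muted MBSFN regions only
    (hence the n_ms / (n_ms + n_ns) fraction)."""
    return (n_s * n_nr / (10 * m_rho)) * (n_ms / (n_ms + n_ns)) * b_nr

def efficiency(queue_bits, capacity_bits):
    """Eqs. 9/11: fraction of the allocated capacity actually used."""
    return min(queue_bits, capacity_bits) / capacity_bits

# Example: 2 muted MBSFN subframes per MBSFN frame, periodicity 1,
# over one scheduling period of 64 subframes (64 ms).
s_rho = 64
c_lte = lte_capacity(n_nr=2, m_rho=1, n_s=s_rho, b_lte=10_000)
c_nr = nr_capacity(n_nr=2, m_rho=1, n_s=s_rho, b_nr=12_000, n_ms=12, n_ns=2)
print(efficiency(400_000, c_lte), c_lte, c_nr)
```

A larger $N^{lb}_{nr}$ or smaller $M^{lb}_{\rho}$ shifts capacity from LTE to NR, which is exactly the trade-off the resource controllers tune.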

C. LTE-NR DSS on NR Band
In the muted MBS subframe-based LTE-NR DSS, the NR scheduler utilizes subframes assigned for unicast traffic, while subframes assigned for MBS traffic are muted. Unlike MBSFN in LTE, NR MBS allows a higher degree of flexibility in resource allocation between unicast and multicast traffic [5], [22]. Fig. 2 shows the frame structure of NR in the proposed MBS-based DSS on the NR band. The figure shows that an NR frame with muted MBS subframes occurs every $M^{nb}_{\rho}$ frames. Based on the standard, NR unicast and NR multicast traffic can be scheduled within the resources of a single subframe. However, only subframe-level synchronization is assumed for our proposed LTE-NR DSS. For this reason, we use a configuration that allocates $N^{nb}_{lte}$ muted MBS subframes, where $N^{nb}_{lte} \in \{1, 2, \ldots, 10\}$ for a non-SSB burst frame and $N^{nb}_{lte} \in \{1, 2, 3, 4, 5\}$ for an SSB burst frame.
In MBS-based DSS, each MBS frame has $N^{nb}_{lte}$ muted subframes that are available for the LTE scheduler, and the MBS frames are scheduled with a certain MBS frame periodicity ($M^{nb}_{\rho}$). In the MBS-based DSS in the NR band, the number of subframes assigned to the NR scheduler ($N^{nb}_{nr}$) is given as:

$$N^{nb}_{nr} = 10 - N^{nb}_{lte} \quad (13)$$

Using procedures similar to those used for the LTE-NR DSS on the LTE band, the NR capacity on the NR band ($C^{nb}_{nr}(N^{nb}_{nr}, M^{nb}_{\rho}, N_s)$) based on the configuration $[M^{nb}_{\rho}, N^{nb}_{nr}]$ over the $N_s$ subframes window becomes:

$$C^{nb}_{nr}(N^{nb}_{nr}, M^{nb}_{\rho}, N_s) = N_s\, b_{nr} \left(1 - \frac{N^{nb}_{lte}}{10 M^{nb}_{\rho}}\right) \quad (14)$$

where $b_{nr}$ is the total number of bits scheduled in an NR subframe [22].
On the other hand, the LTE network uses the muted MBS subframes to offload its traffic, and the LTE capacity in the NR band ($C^{nb}_{lte}(N^{nb}_{lte}, M^{nb}_{\rho}, N_s)$) becomes:

$$C^{nb}_{lte}(N^{nb}_{lte}, M^{nb}_{\rho}, N_s) = \frac{N_s N^{nb}_{lte}}{10 M^{nb}_{\rho}}\, b_{lte} \quad (15)$$

where $b_{lte}$ is the total number of bits scheduled in an LTE subframe, which is selected based on the MCS and bandwidth [21]. Considering all the resources in the NR band, the total system capacity becomes:

$$C^{nb}_{s} = C^{nb}_{nr}(N^{nb}_{nr}, M^{nb}_{\rho}, N_s) + C^{nb}_{lte}(N^{nb}_{lte}, M^{nb}_{\rho}, N_s) \quad (16)$$

This total system capacity in the NR band, $C^{nb}_{s}$, is maximized using an adaptive MBS parameter configuration that maximizes resource allocation efficiency. In MBS-based DSS, the MBS parameters can be updated based on the traffic queue within the MCH scheduling period ($S^{nb}_{\rho}$). Hence, the NR resource utilization efficiency in the NR band ($\xi^{nb}_{nr}$) becomes:

$$\xi^{nb}_{nr} = \frac{\min\left(Q^{nb}_{nr},\, C^{nb}_{nr}(N^{nb}_{nr}, M^{nb}_{\rho}, S^{nb}_{\rho})\right)}{C^{nb}_{nr}(N^{nb}_{nr}, M^{nb}_{\rho}, S^{nb}_{\rho})} \quad (17)$$

where $C^{nb}_{nr}(N^{nb}_{nr}, M^{nb}_{\rho}, S^{nb}_{\rho})$ is the capacity of resources allocated to NR within $S^{nb}_{\rho}$ subframes and $Q^{nb}_{nr}$ is the NR traffic queue length. By considering $S^{nb}_{\rho}$ subframes in the NR capacity equation (eq. 14), $C^{nb}_{nr}(N^{nb}_{nr}, M^{nb}_{\rho}, S^{nb}_{\rho})$ becomes:

$$C^{nb}_{nr}(N^{nb}_{nr}, M^{nb}_{\rho}, S^{nb}_{\rho}) = S^{nb}_{\rho}\, b_{nr} \left(1 - \frac{N^{nb}_{lte}}{10 M^{nb}_{\rho}}\right) \quad (18)$$

Similarly, the LTE resource efficiency in the NR band ($\xi^{nb}_{lte}$) measures how efficiently the LTE scheduler utilizes the muted MBS subframes in an $S^{nb}_{\rho}$ duration, which is given by:

$$\xi^{nb}_{lte} = \frac{\min\left(Q^{nb}_{lte},\, C^{nb}_{lte}(N^{nb}_{lte}, M^{nb}_{\rho}, S^{nb}_{\rho})\right)}{C^{nb}_{lte}(N^{nb}_{lte}, M^{nb}_{\rho}, S^{nb}_{\rho})} \quad (19)$$

where $Q^{nb}_{lte}$ is the LTE traffic queue length offloaded on the NR band and $C^{nb}_{lte}(N^{nb}_{lte}, M^{nb}_{\rho}, S^{nb}_{\rho})$ is the capacity of resources allocated to LTE traffic within the $S^{nb}_{\rho}$ period. Updating the LTE capacity equation (eq. 15) based on the scheduling period $S^{nb}_{\rho}$, $C^{nb}_{lte}(N^{nb}_{lte}, M^{nb}_{\rho}, S^{nb}_{\rho})$ becomes:

$$C^{nb}_{lte}(N^{nb}_{lte}, M^{nb}_{\rho}, S^{nb}_{\rho}) = \frac{S^{nb}_{\rho} N^{nb}_{lte}}{10 M^{nb}_{\rho}}\, b_{lte} \quad (20)$$

Generally, the proposed adaptive muted MBS subframe-based DSS aims to maximize resource utilization efficiency while keeping the following constraints in consideration:
• To enable synchronization at the subframe level, a single subframe cannot be partially shared for LTE and NR traffic.
• Muted MBS subframes can be scheduled in every subframe except for subframes in the SSB burst region.
• An NR UE decodes SIB20 and SIB21 to determine the MBS configuration used. Hence, the resource allocation update requires at least 80 ms, which is the minimum possible NR SIB20 and SIB21 periodicity [22].
• In the MBS-based DSS on the NR band, the LTE network uses unused resources to offload its traffic, and the resource allocation configuration is made based on the NR traffic load only.
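The per-scheduling-period capacities in the NR band (eqs. 18 and 20) can be sketched analogously. As before, the function names and numeric values are illustrative assumptions, not values from the paper.

```python
# Sketch of eqs. 18 and 20: capacities in the NR band per MCH
# scheduling period s_rho (in subframes). Inputs are illustrative only.

def nr_capacity_nb(n_lte, m_rho, s_rho, b_nr):
    """Eq. 18: NR capacity in the NR band within s_rho subframes."""
    return s_rho * b_nr * (1 - n_lte / (10 * m_rho))

def lte_capacity_nb(n_lte, m_rho, s_rho, b_lte):
    """Eq. 20: LTE capacity on the muted MBS subframes within s_rho."""
    return (s_rho * n_lte / (10 * m_rho)) * b_lte

# Example: 4 muted MBS subframes per MBS frame, periodicity 2.
c_nr = nr_capacity_nb(n_lte=4, m_rho=2, s_rho=80, b_nr=12_000)
c_lte = lte_capacity_nb(n_lte=4, m_rho=2, s_rho=80, b_lte=10_000)
# Analogue of eq. 16 over the period: total system capacity in the NR band.
c_total = c_nr + c_lte
```

Note the symmetry with the LTE-band case: here LTE is the secondary network and receives the full muted subframes, so no symbol-level fraction appears in eq. 20.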

IV. PROPOSED LTE-NR DSS SCHEME

A. Proposed Resource Coordination in LTE-NR DSS
One basic implementation challenge of DSS is the need for resource allocation synchronization between the LTE eNB and NR gNB [23]. In other words, the NR scheduler must be able to update its resource allocation based on the resources used by the LTE scheduler. Resource coordination between coordinated LTE and NR networks requires the establishment of a real-time signaling interface. With the help of this coordination channel, the NR scheduler can determine the pattern of the LTE Physical Downlink Shared CHannel (PDSCH) used by the LTE traffic, as well as the always-on LTE Physical Downlink Control CHannel (PDCCH), CRS, primary synchronization signal, secondary synchronization signal, and Physical Broadcast CHannel (PBCH) of the LTE network. The NR scheduler then schedules its transmission by protecting these LTE signals.
On the other hand, the LTE scheduler has to avoid interfering with the NR PDSCH used by the NR traffic, the NR PDCCH, and the NR Synchronization Signal Block (SSB) burst region.
Considering the aforementioned flexibility requirements, the implementation of single-band DSS requires software and hardware modifications on both the LTE and NR sides [7]. In order to minimize the need for these additional flexibility features, this work assumes that each technology has its own primary band and that DSS is used to offload user data traffic across the other. This allows each technology to transmit the basic, always-on control signals in its own primary band while providing the flexibility to exploit the unused resources in the other technology's primary band.
The concept of spectrum sharing is expected to be relevant for future networks that aim to utilize dynamic and efficient coverage expansion to ensure quality of service regardless of mobility and location [24]. In such dynamic networks, the cell resource coordination procedure between the NR gNB and LTE eNB over the X2 interface requires complex and dynamic synchronization signaling, which makes it less feasible. As a solution, this work proposes a DSS scheme that employs a TRTC system for resource scheduling coordination.
Fig. 3a and Fig. 3b depict the resource coordination methods in DSS mode, illustrating the coordination channel-based approach and the proposed TRTC-based approach, respectively. In the proposed DSS scheme, the NR gNB and LTE eNB use the TRTC system to estimate each other's transmission patterns. For the proposed TRTC-based spectrum sensing, the Radio Unit (RU) captures the IQ samples using its air interface, while the pre-processing and classification can be done on the Distributed Unit (DU), depending on the functional split used.
The TRTC system has two parts: (i) technology recognition and (ii) traffic characterization [25]. As shown in Fig. 3a, the implementation of resource coordination using a coordination channel in DSS mode demands the exchange of resources among base stations through a dedicated channel. As a result, the execution of this approach poses considerable challenges in terms of compatibility and interoperability, particularly when dealing with base stations operated by various private and public Mobile Network Operators (MNOs). To tackle this problem, our proposed TRTC-based scheme presents a solution that eliminates the need for direct coordination channel communication, thereby mitigating compatibility and interoperability concerns. The proposed TRTC-based resource coordination also enables scalability, as new emerging MNOs do not need to initiate a dedicated coordination channel; rather, the TRTC-based DSS can be used independently.
Another critical challenge in implementing LTE-NR DSS arises from the fact that legacy LTE and NR UE devices are designed to decode the respective LTE and NR reference signals. However, the continuous transmission of these reference symbols leads to significant interference between the LTE and NR technologies, even when only one technology sends traffic at a given time. On the other hand, altering the regular reference symbol pattern necessitates modifications at the UE level, which leads to user experience degradation and increased costs. Implementing such modifications at the UE level often leads to compatibility issues, service disruptions during transitions, and substantial investments in upgrading or replacing devices. As a solution, the proposed scheme leverages the MBSFN feature of LTE and the MBS feature of NR to schedule muted MBSFN and muted MBS subframes by the LTE eNB and NR gNB, respectively. This scheduling strategy enables the LTE eNB and NR gNB to signal LTE UEs and NR UEs, respectively, not to anticipate reference symbols during these specific subframes. Our proposed solution thus significantly mitigates interference from reference symbols without necessitating modifications in the LTE and NR UEs.

B. Proposed Resource Management Schemes in LTE Band
For the MBSFN-based DSS approach, an efficient resource allocation scheme that adaptively allocates muted MBSFN subframes based on the magnitude of unused resources in the LTE band is proposed. The LTE resource controller uses the proposed resource allocation scheme to determine the specific subframes required to schedule the traffic based on the LTE traffic load. This leaves the remaining subframes as muted MBSFN subframes that can be used by NR traffic. Resources in the non-muted subframes are added to the LTE scheduler resource pool.
According to the 3GPP eMBMS specifications [6], the Multicast Coordination Entity (MCE) determines the number of resources required per MCH only when a new service joins or leaves the MBSFN-based MBMS service group. This implies that the resource allocation information in SIB2 and SIB13 is not updated unless a service joins or leaves the MBMS service group. Similarly, the UE updates the resource allocation provided by a service bearer only when it joins the MBMS service group. In other words, the UE decodes the information in SIB2 and SIB13 when it joins a multicast service and uses the configuration until it leaves the service.
As mentioned before, the number of muted MBSFN subframes in a frame can be fixed to a value between one and six, and the MBSFN period can be varied for increased flexibility. However, fixing the number of muted MBSFN subframes and MBSFN frames to a specific value for the whole multicast service period is not spectrum efficient, since the resource allocation will not always be optimal due to the dynamic nature of the traffic. Therefore, it is essential to have a mechanism that adjusts the MBSFN parameter setup for demand-based resource allocation [26]. To this end, the LTE resource controller builds a lookup table of the possible LTE capacity values, where each entry $C^{lb}_{lte}(N^{lb}_{lte}, M^{lb}_{\rho}, S^{lb}_{\rho})$ is computed using eq. 10.
The binary search is used to select a target LTE capacity value that corresponds to the new LTE traffic queue size ($Q^{lb}_{lte}$). The corresponding capacity value is selected from the capacity lookup table in such a way that it satisfies:

$$n^{*} = \underset{n \in \mathcal{N}^{lb}}{\arg\min}\; \delta_{lte}[n] \quad \text{subject to} \quad \delta_{lte}[n] \geq 0$$

where $\mathcal{N}^{lb}$ is the set of possible indices of $\delta_{lte}$, and $\delta_{lte}$ is a vector that stores the difference between the possible LTE capacity values and the LTE traffic queue size, which is given by:

$$\delta_{lte}[n] = V_{C^{lb}_{lte}}[n] - Q^{lb}_{lte}$$

Once the required capacity value is selected, the corresponding MBSFN configuration is applied by the LTE resource controller. Algorithm 2 shows the process of the resource allocation algorithm (in the LTE band) for the NR scheduler.
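The lookup-table selection described above can be sketched as follows. This is a minimal illustration: the table entries, capacity values, and function names are assumptions, not values from the paper; only the selection rule (the smallest capacity whose difference from the queue size is non-negative) follows the text.

```python
import bisect

# Sketch of the capacity lookup-table selection: pick the smallest
# capacity that still covers the LTE traffic queue, i.e. minimize
# delta = C - Q subject to delta >= 0.

# Hypothetical lookup table: (capacity_bits, (n_lte, m_rho)) entries,
# sorted by capacity so that binary search applies.
table = sorted([
    (409_600, (4, 1)),
    (512_000, (8, 1)),
    (576_000, (9, 2)),
    (614_400, (9, 4)),
])

def select_config(q_lte):
    """Return the (capacity, configuration) entry covering queue q_lte."""
    caps = [c for c, _ in table]
    i = bisect.bisect_left(caps, q_lte)   # first capacity >= queue size
    if i == len(caps):
        return table[-1]                  # queue exceeds all capacities
    return table[i]

cap, (n_lte, m_rho) = select_config(450_000)
# -> selects the 512_000-bit entry, i.e. the configuration (8, 1)
```

Because the table is sorted by capacity, the binary search runs in O(log n) over the configuration space, which matters when the controller re-evaluates the configuration every scheduling period.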
Step 4 shows that the algorithm starts with an initial configuration of $N^{lb}_{nr} = 0$ and $M^{lb}_{\rho} = 0$ stored in the NR frame configuration in the LTE band ($F^{nrlb}_{con}$). In the Monitoring I phase, the NR resource controller initially uses Energy Detection (ED) to determine the number of muted MBSFN subframes ($N^{lb}_{nr}$) in an LTE frame. For the ED, IQ samples are collected and stored for every subframe ($S^{gnb}_{i}$) within a frame (Steps 6-8). After that, in Step 9, the total energy received in every subframe within the frame is compared with a threshold. If the received signal exceeds the threshold, the subframe is identified as a non-muted subframe; otherwise, it is identified as a muted subframe. This process is repeated for every frame until an LTE frame with $N^{lb}_{nr} > 0$ muted MBSFN subframes is detected. After that, in the Monitoring II phase, the periodicity of the muted MBSFN subframe pattern ($M^{lb}_{\rho}$) is determined by applying ED to the next consecutive frames. This process uses the gNB frame counter ($F^{gnb}_{c}$) value to track the frame indices, as shown in Steps 11-22 of the algorithm. As in the Monitoring I phase, ED-based muted MBSFN subframe pattern determination in a given frame is used (Steps 12-15). The value of $M^{lb}_{\rho}$ is determined by counting the frames until a frame with the same number of muted MBSFN subframes is detected (Steps 16-18). This process can take up to 32 frames, which is the maximum standard value of $M^{lb}_{\rho}$. In Step 19, the determined $N^{lb}_{nr}$ and $M^{lb}_{\rho}$ values are stored in the NR frame configuration in the LTE band ($F^{nrlb}_{con}$). After that, the TRTC system is initiated on the NR gNB. Based on the determined $F^{nrlb}_{con}$, the NR resource controller determines the resource pool of the NR scheduler. The NR scheduler then starts to schedule the traffic based on the determined NR resource pool (Main phase). In this phase, every subframe may be unused (free) or occupied by LTE, NR, or overlapping LTE and NR signals, and ED cannot identify the signal types. Hence, in Step 24, the proposed TRTC process is used to determine the frame pattern, i.e., the TRTC determines the sequence of LTE, NR, and overlap subframes in every new frame. If an unused subframe or a subframe with an overlapping signal is detected, it indicates that the number of muted MBSFN subframes assigned by the LTE resource controller has changed. Therefore, the NR resource controller goes back to Step 4 and triggers a new monitoring process, and the whole procedure is repeated. Otherwise, the determined resource allocation is used by the NR scheduler until a change in the frame pattern is detected (Step 28).
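The two ED-based monitoring phases described above can be sketched as follows. This is a simplified illustration: the detection threshold, the frame representation, and the function names are assumptions, not part of the paper; only the logic (per-subframe energy thresholding, then counting frames until the muted-subframe count repeats, bounded by the standard maximum of 32) follows the algorithm.

```python
import numpy as np

# Sketch of the Monitoring I and II phases. Each frame is modeled as a
# list of 10 per-subframe IQ-sample arrays; a subframe is "muted" if
# its received energy falls below an (illustrative) threshold.

THRESHOLD = 1e-3  # illustrative ED threshold

def muted_mask(frame_iq):
    """Monitoring I: ED over the 10 subframes of one frame."""
    return [float(np.sum(np.abs(sf) ** 2)) < THRESHOLD for sf in frame_iq]

def find_periodicity(frames_iq):
    """Monitoring II: count frames until a frame repeats the same
    number of muted subframes (bounded by the standard max of 32)."""
    first = sum(muted_mask(frames_iq[0]))
    for m_rho, frame in enumerate(frames_iq[1:33], start=1):
        if sum(muted_mask(frame)) == first:
            return m_rho
    return None  # no repetition observed within 32 frames
```

Once the Main phase begins, this ED logic is no longer sufficient, since a quiet subframe could be free, LTE, NR, or an overlap; that is where the TRTC classifier takes over.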

C. Proposed Resource Management Schemes in the NR Band
For the muted MBS subframe-based DSS, a resource scheduling algorithm that adaptively selects MBS parameters based on NR traffic is proposed. The procedures of the resource allocation scheme used by the NR resource controller are presented in Algorithm 3. First, an NR capacity lookup table is built for all possible configurations, where each entry $C^{nb}_{nr}(V_{N^{nb}_{nr}}[i], V_{M^{nb}_{\rho}}[j], S^{nb}_{\rho})$ is the capacity computed using eq. 18. $V_{N^{nb}_{nr}}$ and $V_{M^{nb}_{\rho}}$ are vectors that store all the possible values of $N^{nb}_{nr}$ and $M^{nb}_{\rho}$, respectively, and $\mathcal{I}^{nb}$ and $\mathcal{J}^{nb}$ are sets that store all possible indices of the $V_{N^{nb}_{nr}}$ and $V_{M^{nb}_{\rho}}$ vectors. During the execution process, a binary search is used to select a target NR capacity value that corresponds to the NR traffic queue size ($Q^{nb}_{nr}$) from the capacity lookup table in such a way that it satisfies:

$$n^{*} = \underset{n \in \mathcal{N}^{nb}}{\arg\min}\; \delta_{nr}[n] \quad \text{subject to} \quad \delta_{nr}[n] \geq 0$$

where $\mathcal{N}^{nb}$ is a set of $\delta_{nr}$ indices and $\delta_{nr}$ is a vector that stores all the possible difference values between each value stored in the NR capacity lookup table and the NR queue size, which is given by:

$$\delta_{nr}[n] = V_{C^{nb}_{nr}}[n] - Q^{nb}_{nr}$$

The values of $N^{nb}_{nr}$ and $M^{nb}_{\rho}$ corresponding to the selected capacity $C^{nb}_{nr}(N^{nb}_{nr}, M^{nb}_{\rho}, S^{nb}_{\rho})$ are selected as a new NR frame configuration in the NR band ($F^{nrnb}_{con}$). The newly selected frame configuration $F^{nrnb}_{con}$ is encoded into the NR SIBs and sent to the UE, and it is used for the following frames until a new configuration is selected based on the traffic queue after a minimum duration of $P^{nr}_{sib}$ frames. However, the selected configuration is used in all frames within the $P^{nr}_{sib}$ window except the NR frames that contain the SSB burst. In 5G NR, the SSB burst region is used for the Synchronization Signal (SS) and PBCH. Half of the radio frame window (5 subframes) is the maximum length of an SSB burst, which may contain one or more SS/PBCH blocks. For this reason, Step 9 shows that at least five subframes are reserved for the NR scheduler in an SSB burst frame. There is some flexibility in the SSB periodicity ($P_{ssb}$), which can range from 5 to 160 ms [21]. For a frame that contains an SSB burst, the number of NR subframes is set to at least 5, as follows:

$$N^{ssb}_{nr} = \max\left(N^{nb}_{nr},\, 5\right)$$

where $N^{ssb}_{nr}$ is the number of non-muted subframes in an NR frame that contains the SSB burst.
Step 10 indicates that this frame configuration F nrnb con with N ssb nr non-muted subframes is used for an NR frame with an SSB. For the remaining frames, the F nrnb con determined by the binary search is used (Step 12).
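As a concrete illustration, the lookup-table binary search described above can be sketched as follows. This is a hedged sketch, not the paper's implementation: the capacity function, candidate parameter sets, and tie-breaking rule are illustrative assumptions.

```python
import bisect

def build_capacity_table(n_values, m_values, capacity):
    """Store the capacity of every (N, M) combination, sorted by capacity."""
    return sorted((capacity(n, m), n, m) for n in n_values for m in m_values)

def select_config(table, queue_size):
    """Binary-search the smallest capacity that covers the traffic queue size.

    If the queue exceeds every entry, the largest configuration is returned.
    """
    caps = [c for c, _, _ in table]
    i = bisect.bisect_left(caps, queue_size)
    if i == len(table):           # demand above the maximum capacity
        i = len(table) - 1
    _, n, m = table[i]
    return n, m

# Toy capacity model (an assumption): capacity grows with the number of
# non-muted subframes N and shrinks with the muting periodicity M.
toy_capacity = lambda n, m: 1000 * n / m
table = build_capacity_table(range(1, 7), [1, 2, 4, 8, 16, 32], toy_capacity)
print(select_config(table, 2500))  # -> (5, 2): the smallest capacity >= 2500
```

Sorting the table once lets every per-window configuration update run in O(log n), which matters when the controller re-evaluates the queue at each SIB period.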
On the other hand, the LTE resource controller in the eNB senses the pattern of the muted MBS subframes in the NR band and allocates resources to the LTE scheduler accordingly. Algorithm 4 shows the process of the LTE scheduler resource allocation algorithm used by the LTE resource controller on the NR band.
Step 4 shows that the algorithm starts with an initial LTE frame configuration in the NR band (F ltenb con) with values N nb lte = 0 and M nb ρ = 0. Subsequently, ED is used to establish the pattern of muted MBS subframes (N nb lte) within NR frames on the NR band. Regarding the ED procedure, IQ samples are gathered and preserved for each subframe (S enb i) contained within a frame (Steps 6-8). Following this, during Step 9, the cumulative energy received in each subframe within the frame is compared against a predetermined threshold. If the received signal surpasses the threshold, the subframe is designated as non-muted; otherwise, it is labeled as a muted MBS subframe. This ED process is repeated for every frame until a muted MBS subframe is detected; in other words, the process in the Monitoring I phase is repeated until N nb lte ≠ 0. After that, the repeating pattern periodicity M nb ρ of the frame is found by applying ED on subsequent frames (Monitoring II phase). The value of M nb ρ is determined by counting the number of frames until a frame with the same number of muted MBS subframes is repeated. This procedure utilizes the eNB frame counter (F enb c). Subsequently, the TRTC system is initiated by the LTE eNB. The LTE resource controller employs the determined F ltenb con to establish the resource pool for the LTE scheduler. The LTE scheduler then commences scheduling traffic based on the designated LTE resource pool (Main phase). During this phase, each subframe can either remain unused (idle) or be occupied by LTE, NR, or overlapping LTE and NR signals. In such cases, utilizing ED for signal type identification becomes unfeasible.
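The ED step in the Monitoring I phase can be sketched as below. This is a minimal illustration under assumed values: the subframe count, sample counts, and threshold are not the paper's; only the rule (per-subframe energy below a threshold marks a muted subframe) follows the text.

```python
import numpy as np

def detect_muted_subframes(iq_frame, threshold, n_subframes=10):
    """Return indices of subframes whose received energy is below threshold."""
    subframes = np.array_split(iq_frame, n_subframes)
    energies = [np.sum(np.abs(sf) ** 2) for sf in subframes]
    return [i for i, e in enumerate(energies) if e <= threshold]

# Toy frame of noisy IQ samples: subframes 3 and 7 carry (almost) no signal.
rng = np.random.default_rng(0)
frame = rng.normal(size=10 * 100) + 1j * rng.normal(size=10 * 100)
frame[3 * 100:4 * 100] *= 0.01
frame[7 * 100:8 * 100] *= 0.01
print(detect_muted_subframes(frame, threshold=50.0))  # -> [3, 7]
```

In practice the threshold would be set from the noise floor; here it is simply chosen to separate the two energy levels of the toy frame.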
Thus, in Step 24, the proposed TRTC process is used to determine the frame pattern. The TRTC process determines the sequence of subframes with LTE, NR, and overlapping signals within each frame. The detection of an unused subframe or a subframe containing an overlapping signal indicates a change in the number of muted MBS subframes. Consequently, the LTE resource controller returns to Step 4, initiating a new monitoring process and repeating the entire cycle. Alternatively, if no change in the frame pattern is detected, Step 28 shows that the determined resource allocation remains in use by the LTE scheduler until a new muted MBS subframe pattern is identified.

A. Simulation Parameters
For the performance analysis, the Vienna LTE system-level simulator [27] and the 5G MATLAB Toolbox are used to model the LTE and NR networks, respectively. For the performance evaluation of the proposed DSS solutions, co-located LTE eNB and NR gNB with omnidirectional antennas are considered. Center frequencies of 2680 MHz and 2635 MHz are used for the primary LTE and NR bands, respectively. A 20 MHz bandwidth is used for both bands, and 15 kHz subcarrier spacing is used for the NR network. Each LTE eNB and NR gNB has N active UEs connected to it at a time, where N is randomly selected using a Poisson distribution between 1 and 100 UEs with a mean of 50 UEs. We also consider that an active UE can join a cell with one of the traffic types with the following probabilities: FTP (10%), HTTP (20%), VoIP (30%), video (20%), and gaming (20%) [28].
Once a UE is connected to a cell, a specific traffic type is selected with these probabilities, and it remains for N T Transmission Time Intervals (TTIs), where N T is randomly picked from an interval between 1 and 15. Considering the minimum and maximum values of N T, it can be observed that the traffic type of each active UE can change 20 to 300 times within the 300 TTI run-time. These values used for simulating dynamic traffic are adopted from [29]. The performance analysis is based on 4,000 runs, where each run has a run time of 300 TTIs.
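The traffic model above can be sketched as follows, with the Poisson draw truncated to [1, 100] and each UE's traffic type re-drawn every N_T TTIs; the function names and the truncation rule are illustrative assumptions.

```python
import numpy as np

TRAFFIC_TYPES = ["FTP", "HTTP", "VoIP", "video", "gaming"]
TRAFFIC_PROBS = [0.10, 0.20, 0.30, 0.20, 0.20]

def draw_active_ues(rng, mean=50, lo=1, hi=100):
    """Poisson-distributed number of active UEs, clipped to [lo, hi]."""
    return int(min(max(rng.poisson(mean), lo), hi))

def simulate_ue_traffic(rng, total_ttis=300, nt_max=15):
    """One UE's traffic-type sequence: each type persists N_T in [1, nt_max] TTIs."""
    sequence, t = [], 0
    while t < total_ttis:
        ttype = str(rng.choice(TRAFFIC_TYPES, p=TRAFFIC_PROBS))
        n_t = int(rng.integers(1, nt_max + 1))
        sequence.append((ttype, n_t))
        t += n_t
    return sequence

rng = np.random.default_rng(0)
print(draw_active_ues(rng))          # number of active UEs in one cell
print(simulate_ue_traffic(rng)[:3])  # first few (type, duration) pairs
```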
The periodicities of LTE SIB2/SIB13 and NR SIB20/23 are 160 ms and 80 ms, respectively. Based on these periodicity values, Algorithm 1 and Algorithm 3 are used to determine the resource pool sizes of the LTE scheduler and the NR scheduler, respectively. After determining the available resources, the Proportional Fair (PF) scheduler allocates a given number of resource blocks to each user. In the rest of the paper, the PF LTE scheduler and PF NR scheduler that use the proposed adaptive resource pool size are referred to as the "customized LTE PF scheduler" and "customized NR PF scheduler" for the LTE and NR networks, respectively. For comparison, we also consider a classical PF LTE scheduler and a classical PF NR scheduler that use static LTE and NR bands, respectively.
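For reference, the classic PF metric (instantaneous rate over exponentially averaged throughput) can be sketched as below; the averaging constant and per-TTI granularity are assumptions, not details from the paper's simulator.

```python
def pf_select(inst_rates, avg_thr, eps=1e-9):
    """Return the index of the UE maximizing the PF metric r_i / R_i."""
    metrics = [r / (a + eps) for r, a in zip(inst_rates, avg_thr)]
    return max(range(len(metrics)), key=metrics.__getitem__)

def pf_update(avg_thr, served, inst_rates, tc=100.0):
    """Exponential moving average of per-UE throughput (time constant tc)."""
    return [(1 - 1 / tc) * a + (inst_rates[i] / tc if i == served else 0.0)
            for i, a in enumerate(avg_thr)]

# UE 1 has a lower instantaneous rate but a much lower average throughput,
# so the PF metric favors it over UE 0.
print(pf_select([10.0, 4.0], [8.0, 1.0]))  # -> 1
```

The same selection rule applies whether the resource pool is the full static band or the adaptive pool produced by the proposed resource controllers; only the set of schedulable resources changes.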

B. Technology Recognition and Traffic Characterization
The CNN architecture comprises three layers, each with a specific purpose. In the first layer, 64 filters of size 1 × 3 capture basic features. The second layer uses 32 filters of size 2 × 3 to capture higher-level features. The third layer, with 16 filters of size 2 × 3, refines feature extraction further. To reduce spatial dimensions, max-pooling is used between the CNN layers. After feature extraction, the classification stage maps the extracted features to the three output classes. Additionally, a traffic pattern characterization scheme is proposed for estimating the pattern of the muted LTE MBSFN subframes and the muted NR MBS subframes that can be utilized by the NR and LTE schedulers. The technology recognition model is used to identify the type of signal present in each TRW. During the traffic pattern characterization procedure, consecutive TRWs that have been identified as belonging to the same technology are aggregated to determine the signal type in each subframe. As the proposed DSS algorithms use the detection of collisions to initiate a new pattern sensing window, misclassification of an Overlap class can trigger a new pattern sensing window. In the pattern sensing window, cross-band traffic offloading is stopped until the new pattern is estimated. Hence, frequent pattern-sensing windows triggered by misclassified overlap windows can lead to poor spectrum utilization. To minimize this, the detection of two consecutive overlap TRWs is used to indicate the occurrence of a collision and trigger spectrum sensing.
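The characterization steps above (aggregating consecutive TRW labels into per-subframe labels, and debouncing single misclassified overlap windows) can be sketched as follows; the majority-vote aggregation rule and function names are illustrative assumptions.

```python
from collections import Counter

def subframe_labels(trw_labels, trws_per_subframe=23):
    """Aggregate consecutive TRW labels into one label per 1 ms subframe."""
    return [Counter(trw_labels[i:i + trws_per_subframe]).most_common(1)[0][0]
            for i in range(0, len(trw_labels), trws_per_subframe)]

def collision_detected(trw_labels):
    """Trigger a new pattern-sensing window only when two consecutive TRWs
    are both classified as 'overlap', debouncing single misclassifications."""
    return any(a == b == "overlap" for a, b in zip(trw_labels, trw_labels[1:]))

print(subframe_labels(["LTE"] * 23 + ["NR"] * 23))              # ['LTE', 'NR']
print(collision_detected(["LTE", "overlap", "NR", "overlap"]))  # False
print(collision_detected(["NR", "overlap", "overlap", "NR"]))   # True
```

The two-in-a-row rule trades a one-TRW detection delay for robustness against isolated classifier errors, which directly reduces spurious pattern-sensing windows.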
In the proposed DSS schemes, resource allocation decisions are made based on the outcome of the TRTC system. Hence, an optimal batch size has to be selected in such a way that the TRTC reports the traffic pattern in the shortest possible duration. Fig. 5a shows the processing time required for the pre-processing stage, the CNN model execution, and the overall process for different batch sizes. The figure shows that the processing time keeps increasing as the batch size increases. The benchmarks were executed on a host machine equipped with an AMD Ryzen 9 5900X 12-core processor, an NVIDIA RTX 3090 GPU, and 64 GB of RAM. Fig. 5b shows the classification rate in every subframe (1 ms). For the 44 µs TRW, a 1 ms duration is composed of approximately 23 TRWs. This means that a minimum of 23 TRWs have to be classified within a 1 ms duration. The figure shows that a batch size of at least 16 TRWs is required to achieve the 23 TRWs/ms target classification rate.
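The 23 TRWs/ms target follows directly from the window length; rounding up the ratio of the subframe duration to the TRW duration gives:

```python
import math

TRW_US = 44         # TRW duration in microseconds
SUBFRAME_US = 1000  # 1 ms subframe

# Number of TRWs that must be classified per subframe (rounded up).
target_rate = math.ceil(SUBFRAME_US / TRW_US)
print(target_rate)  # -> 23
```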
Fig. 5c shows the probability of collision (overlap class) for different batch sizes. The probability of collision is determined from the statistics of identified TRWs collected throughout the simulation duration for different batch sizes. The lowest collision probability is obtained when a 16-TRW batch size is used. For batch sizes greater than 16 TRWs, the processing time delay keeps increasing, and this increases the probability of collision, as there is a higher delay before a new resource allocation is applied based on the outcome of the TRTC system. Similarly, for batch sizes smaller than 16 TRWs, the classification rate drops below the required 23 TRWs/ms, leading to a delay before a new resource allocation is applied based on the outcome of the TRTC system for the TRWs in 1 ms. This delay in applying a new resource allocation leads to an increased probability of collisions. Based on these results, a batch size of 16 TRWs is used for the TRTC system employed in the proposed DSS solution.

C. Performance of LTE Scheduler in LTE Band
Fig. 6a shows the CDF of the obtained LTE throughput for the LTE network using the PF LTE scheduler and the customized PF LTE scheduler. Below the median of the CDF, it can be observed that the LTE throughput obtained using the proposed customized PF LTE scheduler is lower than the LTE throughput obtained using the PF LTE scheduler. For example, the probability of getting an LTE throughput below 30 Mbps is 32% and 36% for the PF LTE scheduler and the proposed customized PF LTE scheduler, respectively. This happens because the customized PF LTE scheduler checks the traffic load of the active UEs and updates its scheduling every 160 TTIs (Algorithm 1). If the traffic demand increases within the 160 TTI window, the obtained throughput is limited to the maximum capacity of the resources defined based on the traffic load at the beginning of the scheduling window. If the traffic load is close to or higher than the system capacity of the LTE band, the customized PF LTE scheduler uses all the PRBs of the LTE band, as in the case of the PF LTE scheduler. Hence, the throughput CDFs are close for the higher throughput values.
The CDF of LTE spectrum utilization efficiency for the PF LTE scheduler and the customized PF LTE scheduler is given in Fig. 6b. The spectrum utilization efficiency (ξ s) of the schedulers in the simulation is computed in every scheduling window as ξ s = U rb / A rb (eq. 28), where U rb is the number of PRBs actually allocated to active UEs (based on the traffic) and A rb is the total number of available PRBs in the resource pool of the scheduler. For the PF LTE scheduler, all PRBs within the LTE band are available in the resource pool of the scheduler. On the other hand, the resource pool of the customized PF LTE scheduler is adaptively configured to include resources in certain TTIs only, while the rest of the TTIs are not part of the resource pool, as they are assigned to muted MBSFN subframes. At the median of the CDF (Fig. 6b), a spectrum utilization efficiency of 87% and 99% is obtained for the PF LTE scheduler and the proposed customized PF LTE scheduler, respectively, showing that the customized PF LTE scheduler has higher spectrum utilization efficiency, as the PRBs available to the scheduler are adjusted based on the traffic.
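The efficiency metric reduces to a per-window ratio; a minimal sketch, with toy PRB counts chosen to mirror the 87% vs. 99% comparison above:

```python
def spectrum_utilization(used_prbs, available_prbs):
    """xi_s = U_rb / A_rb: used PRBs over the PRBs in the scheduler's pool."""
    return used_prbs / available_prbs if available_prbs else 0.0

# Same traffic (87 PRBs used): a static 100-PRB pool vs. an adaptive pool
# trimmed to 88 PRBs yields a higher utilization for the adaptive pool.
print(spectrum_utilization(87, 100))           # 0.87
print(round(spectrum_utilization(87, 88), 2))  # 0.99
```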

D. Performance of NR Scheduler in NR Band
The CDF plots of the obtained NR throughput with the PF NR scheduler and the customized PF NR scheduler are depicted in Fig. 7a. The NR throughput obtained with the proposed customized NR scheduler is slightly lower than that obtained with the PF NR scheduler. As an example, the likelihood of achieving an NR throughput of less than 60 Mbps is 69% for the PF NR scheduler and 72% for the customized PF NR scheduler. The customized PF NR scheduler checks the traffic load of the active UEs and updates its scheduling every 80 TTIs (Algorithm 3). In the event that traffic demand rises during this window, the obtained throughput will be capped at the maximum capacity of the resources, as determined by the traffic load at the start of the scheduling window. However, the gap between the throughput CDF curves of the PF NR scheduler and the customized PF NR scheduler (Fig. 7a) is very small, and it is smaller than the gap between the throughput CDF curves of the PF LTE scheduler and the customized PF LTE scheduler (Fig. 6a). This can be explained by the fact that the customized PF NR scheduler updates its resource pool every 80 ms, which is shorter than the 160 ms period required to update the resource pool of the customized PF LTE scheduler.
Fig. 7b shows the CDF of the efficiency with which the PF NR scheduler and the customized PF NR scheduler use the corresponding available resource pools in the NR band. The results show that the customized PF NR scheduler has a higher spectrum utilization efficiency as compared to the PF NR scheduler. Similar to the LTE spectrum utilization efficiency, the NR spectrum utilization efficiency is computed using eq. 28. In the PF NR scheduler, all PRBs within the NR band are available in the resource pool of the scheduler, which leads to poor spectrum utilization efficiency. To counter this, the proposed NR resource controller determines the size of the resource pool of the customized PF NR scheduler based on the traffic load, and the remaining resources within the muted MBS subframes are not part of the resource pool of the scheduler.
The shorter resource pool update period used by the customized PF NR scheduler and its higher flexibility in allocating muted MBS subframes lead to higher spectrum utilization efficiency gains compared to the PF NR scheduler. As an example, the likelihood of getting less than 80% spectrum utilization efficiency is 40% and 9% for the PF NR scheduler and the customized PF NR scheduler, respectively.

E. LTE and NR Performance Using Cross-Band DSS
In this section, we present the performance of the LTE and NR networks considering cross-band DSS. Fig. 8c shows the spectrogram of LTE and NR signals from the simulator on a randomly selected sample frame on the LTE band. Based on the LTE traffic in the shown sample frame, the LTE resource controller limits the resource pool of the customized PF LTE scheduler to the resources in four subframes only, leaving the six remaining subframes as muted MBSFN subframes (Fig. 8a). The figure shows that the non-MBSFN region of the muted MBSFN subframes is reserved for the always-on LTE control signals. The NR resource controller, on the other hand, senses the LTE band and determines the pattern of the muted MBSFN subframes. The resources in these muted MBSFN subframes are used by the NR scheduler (Fig. 8b). Fig. 9 shows the CDF of LTE and NR throughput obtained with and without cross-band DSS. The results are obtained by using the proposed customized PF scheduler for both technologies. Without the use of DSS, the customized PF LTE scheduler and the customized PF NR scheduler have resource pools limited to the resources in the static LTE and NR bands, respectively. For this reason, the results show that there is an LTE and NR throughput gain using cross-band DSS as compared to the corresponding throughput obtained using a static band.
For the LTE network, the maximum obtained throughput is 53.2 Mbps and 81.4 Mbps using the static LTE band and cross-band DSS, respectively. As the number of active UEs varies according to a Poisson distribution between 1 and 100, with a mean of 50, the probability that the LTE band is at its maximum capacity is high. Based on the results, the highest capacity of the static LTE band-based LTE network ranges between 37.4 Mbps and 53.2 Mbps, depending on the location of the UEs and the channel conditions. For this reason, the throughput probability distribution is high in this range for the LTE network designed to utilize the static LTE band only. On the other hand, the LTE network that utilizes cross-band DSS has a throughput distribution that reaches up to 81.4 Mbps. As the NR network mainly utilizes the NR band, it is less probable to find a completely free NR band available to offload the LTE traffic. Hence, the probability of getting throughput values close to the maximum value is low.
For the NR networks considered with a static NR band and with cross-band DSS, the maximum throughput obtained is 80.1 Mbps and 105.3 Mbps, respectively. As in the case of the LTE network with a static LTE band, there is a significant possibility that the NR band is at maximum capacity, as the number of active NR UEs varies from 1 to 100 according to a Poisson distribution with a mean of 50. Depending on the distribution of UEs and the channel conditions, the results show that the maximum capacity of a static band-based NR network can be anywhere from 60.8 Mbps to 80.1 Mbps.

Fig. 9. LTE and NR throughput using static bands, using TRTC-based cross-band DSS, and using coordinated cross-band DSS.

By contrast, the cross-band DSS-based NR network can deliver a throughput of up to 105.3 Mbps. The CDF shows a slowly increasing trend as the throughput approaches its upper bound. The reason for this is that the LTE network predominantly utilizes the LTE band, making it less probable that a completely free LTE band will be available to offload the NR traffic. Taking into account the entire simulation time, the introduction of the TRTC-based uncoordinated cross-band DSS improves the average throughput of the LTE and NR bands by 13.5% and 8.3%, respectively.
Additionally, Fig. 9 illustrates that, by eliminating the need for a coordination channel and implementing a TRTC-based cross-band DSS scheme, there is a marginal drop in the achieved LTE and NR throughput as compared to the coordinated cross-band DSS approach. For the coordinated cross-band DSS, we consider that the eNB uses Algorithm 1 to schedule muted MBSFN subframes on the LTE band and the gNB uses Algorithm 2 to schedule muted MBS subframes on the NR band. Based on the minimum possible periodicity of the resource pool management scheme, the eNB uses a dedicated coordination channel to send scheduling updates to the gNB every 160 ms, and the NR scheduler uses the available resources on the LTE band based on the received scheduling information. Similarly, the gNB sends scheduling updates to the eNB every 80 ms via a coordination channel, and the LTE scheduler uses the available resources on the NR band based on the received scheduling information. In comparison to the coordinated cross-band DSS approach, the TRTC-based cross-band DSS scheme results in a reduction of 3.1% and 2.3% in the attained average LTE and NR throughput, respectively. The marginal drop occurs due to reactive resource management when overlapping transmissions or unused subframes occur in the TRTC-based cross-band DSS, while the coordinated cross-band DSS uses proactive resource management based on the scheduling information exchanged through the dedicated signaling channel.
The CDF of the spectrum utilization efficiency of the LTE and NR bands, considering LTE and NR networks with and without DSS, is shown in Fig. 10. The figure shows the CDF of spectrum utilization efficiency for the LTE and NR bands using the static configuration, the TRTC-based uncoordinated cross-band DSS, and the signaling channel-based coordinated cross-band DSS. In the simulation, the spectrum utilization efficiency on a specific band (ξ b) is periodically determined as ξ b = U rb / T rb, where U rb is the number of PRBs allocated to active UEs and T rb is the total number of available PRBs in the band. Without the use of DSS, PRBs are allocated to LTE and NR UEs in the LTE and NR bands, respectively. On the other hand, the PRBs in each band can be allocated to both LTE and NR UEs when the proposed cross-band DSS scheme is employed. Fig. 10 shows that the spectrum utilization efficiency is enhanced with the help of DSS in both the LTE and NR bands. With the introduction of the proposed DSS scheme, the spectrum utilization efficiency gain achieved in the NR band is higher than that obtained in the LTE band. As an example, the likelihood of getting less than 60% spectrum utilization efficiency is 23.2%, 23.6%, 10.6%, and 3.4% for the LTE band without DSS, the NR band without DSS, the LTE band with TRTC-based DSS, and the NR band with TRTC-based DSS, respectively. The reason for the higher spectrum utilization efficiency on the NR band (with TRTC-based DSS) is that the NR band is primarily used by the NR network, which has a shorter resource pool update period (80 ms) and higher flexibility to allocate muted MBS subframes, as compared to the LTE network, which can allocate a maximum of six muted MBSFN subframes with a longer resource pool update period (160 ms).
Considering the overall simulation duration, the introduction of TRTC-based DSS gives 11.8% and 20.7% improvements in the spectrum utilization efficiency of the LTE and NR bands, respectively. Fig. 10 also shows that the TRTC-based DSS scheme and the coordinated DSS approach have comparable performance with respect to spectrum utilization efficiency. As compared to the coordinated DSS, the TRTC-based DSS has a drop of less than 2% in spectrum utilization efficiency in the LTE and NR bands. This minor drop is attributed to the presence of unused subframes during the monitoring phases of the radio resource management schemes of the LTE and NR schedulers.
We have also evaluated the performance of the proposed DSS scheme in terms of the probability of buffer overflow. A buffer overflow occurs when the data buffer is full and new data arrives. When this happens, the buffer cannot accommodate any more data, and packets are dropped. We used a buffer size of 50,000 bits [31] for each UE, and the total number of times the traffic queue exceeds this value is counted to measure the average buffer overflow probability. The results show that the probability of buffer overflow is 14.6% without DSS, while it drops to 9.4% and 9.1% when the TRTC-based DSS and the coordinated DSS are used, respectively.
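The buffer-overflow metric can be sketched as follows; the arrival/service traces and the serve-before-arrive ordering are toy assumptions, with only the 50,000-bit buffer size taken from the text.

```python
BUFFER_BITS = 50_000  # per-UE buffer size [31]

def overflow_probability(arrivals, services):
    """Fraction of TTIs in which arriving bits cannot fully fit in the buffer."""
    level, overflows = 0, 0
    for arr, srv in zip(arrivals, services):
        level = max(level - srv, 0)      # serve the queue first
        if level + arr > BUFFER_BITS:    # arrival would overflow the buffer
            overflows += 1
            level = BUFFER_BITS          # excess bits are dropped
        else:
            level += arr
    return overflows / len(arrivals)

# Arrivals outpace service by 10,000 bits/TTI, so the buffer eventually fills.
print(overflow_probability([30_000] * 10, [20_000] * 10))  # -> 0.7
```

In the toy trace, the queue grows by 10,000 bits per TTI, fills after three TTIs, and every later arrival overflows, giving 7 overflows in 10 TTIs.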

VI. CONCLUSION AND FUTURE WORK
In this work, we present a cross-band DSS scheme utilizing the MBSFN feature of an LTE network and the MBS feature of an NR network. To enable the proposed cross-band DSS scheme, novel uncoordinated LTE and NR resource controllers are proposed. Based on the traffic requirements in the LTE band, the LTE resource controller dynamically assigns muted MBSFN subframes that the NR scheduler can use. Similarly, the LTE scheduler can take advantage of muted MBS subframes in the NR band. The proposed DSS technique differs from the state-of-the-art in that it does not require a coordination signaling channel between the LTE and NR networks. A TRTC system is utilized to recognize and characterize the traffic pattern of the co-located cells instead of a resource coordination channel. Based on the results of the TRTC, the LTE scheduler offloads its traffic onto the NR band and vice versa. The proposed cross-band DSS improves the average LTE throughput, average NR throughput, average LTE band spectrum utilization efficiency, and average NR band spectrum utilization efficiency by 13.5%, 8.3%, 11.8%, and 20.7%, respectively.
In this work, we assumed subframe-level synchronization between the LTE eNB and the NR gNB and developed our solution based on that assumption. Nonetheless, the exploration and assessment of synchronization mechanisms could be considered as potential future work. Additionally, the proposed DSS scheme uses the TRTC system, which assesses the existing traffic load and makes the subsequent resource allocation decision. This resource allocation decision can be extended so that decisions are executed proactively by integrating advanced machine learning-based traffic prediction techniques.

Fig. 3. Resource coordination in DSS mode: a) through a coordination channel for coordinated LTE and NR networks; b) using the proposed TRTC for uncoordinated LTE and NR networks.
The technology recognition model is trained and validated to classify three classes: LTE, NR, and an overlap of the two signals. The traffic characterization process uses the outcome of the technology recognition model to estimate the traffic pattern. Based on the identified Time Resolution Windows (TRWs), the pattern of the muted LTE MBSFN subframes is determined in the LTE band by the TRTC of the 5G network. Based on the pattern of the muted MBSFN subframes estimated using the TRTC system, the NR resource controller allocates resources (from the LTE band) to the NR scheduler. Similarly, the outcome of the technology recognition process in the LTE network is used to characterize the pattern of muted MBS NR subframes in the NR band. Based on the estimated pattern of the muted NR MBS subframes, the LTE radio resource controller allocates resources (from the NR band) to the LTE scheduler.
I lb and J lb are sets of the possible indices of the V N lb lte and V M lb ρ vectors. V N lb lte and V M lb ρ are vectors that store all the possible values of N lb lte and M lb ρ, respectively. According to the 3GPP standard [19], V N lb lte = {1, 2, 3, 4, 5, 6} and V M lb ρ = {1, 2, 4, 8, 16, 32}. At each possible combination of these values, the corresponding capacity is computed and stored in the lookup table.

Algorithm 4: Adaptive resource allocation for the LTE scheduler on the NR band. Input: eNB frame counter (F enb c). The frame counter is used to monitor frame indices, and the pattern of muted MBS subframes is determined using ED (Steps 12-15). The value of M nb ρ is established by counting the frames until a frame with an equivalent count of muted MBS subframes is identified (Steps 16-18). During Step 19, the determined N nb lte and M nb ρ values are stored as the LTE frame configuration for the NR band (F ltenb con). In the main phase, for every new frame (Step 23), the subframe pattern is updated using the TRTC (Step 24); if a pattern change is detected (Step 25), the algorithm returns to the monitoring phase (Step 26).

Fig. 4. Classification performance of the proposed CNN model at different SNR levels.

Fig. 5. Impact of batch size on the TRTC system: a) processing time; b) classification rate; c) probability of overlap TRWs.
Fig. 6. Performance of the PF LTE scheduler and the proposed customized PF LTE scheduler in the LTE band: a) CDF of LTE throughput; b) CDF of LTE spectrum utilization efficiency.

Fig. 7. Performance of the PF NR scheduler and the proposed customized PF NR scheduler in the NR band: a) CDF of NR throughput; b) CDF of NR spectrum utilization efficiency.

Fig. 8. Spectrogram of LTE, NR, and combined (using the proposed cross-band DSS) signals in a sample frame in the LTE band.

Fig. 9 shows the CDF of LTE and NR throughput obtained with and without cross-band DSS. The figure shows the CDF (over the whole simulation duration) of throughput obtained for an LTE network with a static LTE band, an NR network with a static NR band, an LTE network with TRTC-based uncoordinated cross-band DSS, an NR network with TRTC-based uncoordinated cross-band DSS, an LTE network with signaling channel-based coordinated cross-band DSS, and an NR network with signaling channel-based coordinated cross-band DSS.

Fig. 10. Spectrum utilization efficiency of LTE and NR bands without DSS, with TRTC-based cross-band DSS, and with coordinated cross-band DSS.
• To enable LTE-NR DSS without coordination signaling between the technologies, we developed a Convolutional Neural Network (CNN)-based technology recognition model to identify LTE, NR, and overlapping signals. The Fast Fourier Transform (FFT) of the collected In-phase and Quadrature (IQ) samples is used to train and validate the developed CNN-based technology recognition model.
• A traffic pattern characterization scheme is also proposed to estimate the pattern of the muted LTE MBSFN subframes and muted NR MBS subframes that can be exploited by the NR and LTE schedulers, respectively.

TABLE I. Related work: DSS schemes (including schemes based on a resource manager that keeps receiving traffic statistics from LTE and NR stations through a coordination channel).

Algorithm 1: Adaptive resource allocation for the LTE scheduler on the LTE band.
Algorithm 1 shows the procedures of the resource allocation scheme used by the LTE resource controller. Step 4 shows that the resource controller uses the eNB frame index (F enb c) to periodically check the LTE traffic queue size (Q lb lte) every P lte sib frames, where P lte sib is the periodicity of the LTE SIB2 and SIB13. Hence, the minimum possible value of P lte sib is 16 frames, which is 160 ms. Once the size of the traffic queue is determined (Step 5), a binary search is used in Step 6 to select the new configuration. For the binary search, a capacity lookup table (C lb table) is generated for each MCS by putting all possible combinations of the standard values of N lb lte and M lb ρ into eq. 10, and storing and sorting the obtained values. The N lb lte and M lb ρ values that yield the required capacity are used as the new configuration. As shown in Step 8, these values are stored in the LTE frame configuration on the LTE band (F ltelb con). Based on F ltelb con, the resource pool of the LTE scheduler is limited to the resources in the selected non-muted subframes. The newly selected configuration is encoded into the LTE SIBs and sent to the UE.

Algorithm 2: Adaptive resource allocation for the NR scheduler on the LTE band.