Professor Barry Evans
Academic and research departments
Institute for Communication Systems, School of Computer Science and Electronic Engineering

About
Biography
Professor Evans was educated at the University of Leeds, obtaining BSc (1st Class Hons) and PhD degrees in 1965 and 1968 respectively. He then joined the British Telecom-sponsored team at the University of Essex (Lecturer to Reader, 1968-83), where he was responsible for setting up the Telecommunication Systems postgraduate activities and for leading the radio systems group, researching radio propagation (initiating dual-polarisation satellite systems), speech coding and satellite systems, and initiating SCPC syllabic-companded FM, which became a satellite standard used by developing countries.
In 1983 he was appointed to the Alex Harley Reeves Chair of Information Systems Engineering at the University of Surrey. He championed the idea of independent research centres and was founder Director of the Centre for Satellite Engineering Research in 1990. He was a founder Director of the successful spin-out company Surrey Satellite Technology Ltd, which produces small satellites, now boasts over 500 employees and is located on the Surrey Research Park. In 1996 he became founder Director of the new Centre for Communication Systems Research (CCSR), a post he held until 2010. CCSR was the largest European academic research group in mobile and satellite communications: an extremely successful research centre with over 40 research fellows, 100+ PhD students and an annual research portfolio of £15m. CCSR played a central role in the UK's Mobile Virtual Centre of Excellence, participated heavily in the EU's Framework Research Programmes (including most of the satellite projects) and maintained wide industrial collaborations and strategic alliances (including Nokia, Ericsson, Vodafone, Thales, EADS-Astrium and Motorola). In 2013 CCSR became an independent research institute, the Institute for Communication Systems (ICS), at the University and was awarded £35m from UK Government and industry to set up a 5G Innovation Centre (5GIC) working between academia and industry.
Taking a wider role in the University, Professor Evans was Dean of Engineering from 1999 to 2001, then Pro-Vice-Chancellor for Research and Enterprise from 2001 to 2009. In the latter capacity he was responsible for research across the University and for the expanding role of technology transfer and spin-outs. In 2010 he returned to a full-time academic role in ICS, where he heads the satellite communications research activities.
Outside the University of Surrey, Professor Evans was responsible for setting up the Mobile Virtual Centre of Excellence in the UK (a collaboration of six universities and 20+ companies). He has been a participant in many EU-funded research projects and has served as an advisor to the EU on future research programmes. He chairs the Steering Board of SatNEx, the satellite communications network of excellence supported by the EU and now by ESA. His current research interests include channel modelling and MIMO, advanced modulation and coding, satellite on-board processing, the integration of satellites and terrestrial systems in 5G, and spectrum and cognitive radio studies. His earlier work on secure speech over mobile systems resulted in a spin-out company, Mulsys Ltd, of which he is a Director.
Professor Evans has also been involved in the UK Foresight programmes in Communications and ITEC, EPSRC Strategic Advisory Committees and MoD-DSAC Committees, served as an adviser to the Director General of OFTEL and as a board member of BNSC-TNAB, and sat on ITU, ETSI and EU advisory committees. He was until recently a member of the Ofcom Spectrum Advisory Board and the Steering Council of the European Technology Platform Integral Satcom Initiative, and is now on the steering board of the EU technology platform NETWORLD2010. He was created a Fellow of the Royal Academy of Engineering, the UK's national academy of engineering, in 1991, and in 2013 was awarded the Ambrose Fleming Medal by the IET in recognition of his achievements in satellite communications and in bringing industry and academia together.
Satellite achievements:
Our successes in the satellite arena go back to the first publication of a satellite-based mobile cellular system in the 1980s, LEONET, which first proposed the basic ideas on which the later Iridium system was based. Research from Surrey was responsible for the basic concepts of network management and radio resource management (RRM) in such constellations. Our subsequent work on constellations produced optimised LEO orbits on which the SKYBRIDGE system was based, and more recently the spectrum used for this has been adopted for the ONEWEB constellation of LEO satellites proposed for 2020 onwards.
Integration of satellite systems with terrestrial ones has been a theme of our research for many years, going back to the 1990s when, in the EU project SATIN, we proposed the idea of joint multimedia delivery between a satellite and a terrestrial component. This was followed up by the further projects MoDiS and MAESTRO, through to an actual demonstration of the system in a 4G network in Monaco and the launch of a joint-venture satellite system (SolarisMobile) by Eutelsat and SES. In addition, this paved the way for S-band spectrum harmonisation across the EU for these systems. Another example of our commitment to integration was the EU project BATS, which concerned the multi-linking of traffic between satellite, mobile radio and terrestrial lines to provide improved QoE for users. With an intelligent user gateway in the domestic premises, broadband users achieved enhanced performance, and this was demonstrated in the lab and in the field across 50 households in Europe. We have produced for OFCOM a study of various broadband IP applications over satellite, with QoE results which they have used on their website to make the case for satellite.
Channel measurement and modelling has been another feature of our research and we were responsible for the early work on L/S band satellite channel modelling and the later work on MIMO channel modelling in this context. We have also produced modelling on low elevation satellite channels at Ku band and more recently on Q/V and W band modelling for High Throughput Satellite systems.
In 2007 we proposed to ESA the ideas behind a Terabit/s high-throughput satellite (HTS) and convinced them of its feasibility in a SatNEx study. This was followed by industrial contracts from ESA, in which we participated, for the design of such satellites and their smart-gateway ground segments. Most major satellite manufacturers have now adopted these ideas and are producing commercial satellites based upon them. In a complementary project, CoRaSat, we proposed a mechanism for satellite and terrestrial users to share the Ka-band spectrum by incorporating terrestrial databases in the satellite network gateways, so that carriers can be allocated to avoid interference. This was validated using data from several EU countries, has been adopted by CEPT and is being integrated by several terminal manufacturers.
On the networking side, we were among the first to research IP multicast over satellite in the EU projects SatLife, SATSIX and EuroNGI, and the outcomes of this work were fed into ETSI and IETF standards work. Multicasting via satellite is now being pursued for integrated content delivery to the edge in 5G networks. We have also worked extensively with industry on security in satellite networks, e.g. the dynamics of key management in secure satellite multicast. This work again fed into ETSI and IETF standards and is currently in the ITU standards roadmap. Very recently, the security work has used blockchain ideas to strengthen security over satellite.
Coming right up to date, our research now concentrates on the integration of satellites into 5G networks. We are currently involved in the EU project SaT5G, which has proposed an integrated architecture for satellites in a 5G core network. We are working towards demonstrating this architecture in backhaul, content delivery to the edge and delivery to moving platforms, using our 5G test bed at the 5GIC at Surrey. This involves the virtualisation of the satellite component and its integration with the core network, as well as the end-to-end (E2E) orchestration of the integrated connection. Additional work with ESA is ongoing to develop a modified New Radio air interface that will be efficient over the satellite channel.
News
Publications
SatNEx has brought together 24 partners from European research organizations and academia to form a pan-European research network. A major objective of SatNEx is to rectify the fragmentation in satellite communications research by bringing together leading European academic research organizations in a durable way. The paper presents the organization of the Network of Excellence and the "Air Interface" research activities. © 2007 by Prof. Michel Bousquet.
The second generation standard for Digital Video Broadcasting via Satellite (DVB-S2) which was designed for fixed terminals is being considered for the delivery of broadband services to mobile platforms (such as airplanes, ships and trains). In this paper, we investigate the existing frequency synchronization algorithms that were proposed for DVB-S2 fixed terminals, in the specified mobile environment. Based on the simulation results obtained, a new algorithm is proposed to enhance frequency synchronization performance for DVB-S2 in the mobile channel. © 2007 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
With the worldwide evolution of 4G and the revolution in the information and communications technology (ICT) field to meet the exponential increase of mobile data traffic in the 2020 era, a hybrid satellite and terrestrial network based on software-defined features is proposed from a 5G perspective. In this paper, an end-to-end architecture of a hybrid satellite and terrestrial network under the control and user plane (C/U) split concept is studied and its performance is analysed based on stochastic geometry. The relationship between spectral efficiency (SE) and energy efficiency (EE) is investigated, taking into consideration overhead costs, transmission and circuit power, backhaul of the gateway (GW), and the density of small cells. Numerical results show that, by optimizing the key parameters, the hybrid satellite and terrestrial network can achieve nearly 90% EE gain with only 3% SE loss in relatively dense networks, and achieve both higher EE and SE gains (20% and 5% respectively) in sparse networks, toward future 5G green communication networks.
Wireless networks are experiencing a paradigm shift from focusing on traditional data transfer to accommodating rapidly increasing multimedia traffic. Hence, their scheduling algorithms have to concern not only network-oriented quality-of-service (QoS) profiles, but also application-oriented QoS targets. This is particularly challenging for satellite multimedia networks that lack fast closed-loop power control and reliable feedback. In this paper, we present a cross-layer packet scheduling scheme, namely Hybrid Queuing and Reception Adaptation (HQRA), which performs joint adaptations by considering the traffic information and QoS targets from the applications, the queuing dynamics induced from the network, as well as the end-to-end performance and channel variations from respective users. By jointly optimizing multiple performance criteria at different layers, the scheme enjoys quality-driven, channel-dependent, and network-aware features. HQRA can well accommodate return link diversity and imperfect feedback, whilst ensuring robustness in highly heterogeneous and dynamic satellite environments. We evaluate its performance over diverse network and media configurations in comparison with the state-of-the-art solutions. We observe noticeable performance gains on application-oriented QoS, bandwidth utilization, and objective video quality, together with favorable fairness and scalability measures.
Advances in digital signal processing and telecommunication technologies have led to the development of ATM and B-ISDN. Satellite communications systems can play an important role in the development of the initial experimental systems and also in the fully developed networks, thanks to their flexible wide coverage independent of ground distances and geographical constraints, multiple access, and multipoint broadcast. This paper presents an implementation structure for ATM via satellite and its capability to support B-ISDN, based on a demonstration system developed within the RACE CATALYST project.
Cognitive radio (CR) is a potentially promising solution to the spectrum crunch problem that faces both future terrestrial and satellite systems. This paper discusses the applicability of CR in satellite/terrestrial spectrum sharing scenarios by modelling interference relations between these systems. It analyses the relative impact of several design parameters that can be tuned in order to reach a particular interference target. A realistic path loss model is considered and gain patterns of directional antennas are taken into account which are found to be efficient in minimising the interference. A generic model that is not restricted to particular systems is developed, and typical parameters are used to analyse the co-existence feasibility in a realistic sharing scenario. The results show that both satellite and terrestrial systems could potentially operate in the same band without degrading each other’s performance if appropriate considerations are taken into account and an appropriate design of the interfering system is carried out.
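The interference-target reasoning in the abstract above can be illustrated with a toy link budget. This is a minimal sketch under assumed numbers (transmit power, gains, distance and the -130 dBW target are all hypothetical), using free-space path loss rather than the realistic path loss model of the paper; it merely shows how a directional victim antenna's low off-axis gain helps meet an interference target.

```python
import math

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space path loss in dB (the paper uses a more realistic model)."""
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / 3.0e8)

def interference_dbw(tx_power_dbw: float, tx_gain_dbi: float,
                     rx_offaxis_gain_dbi: float, freq_hz: float,
                     dist_m: float) -> float:
    """Interference power at the victim receiver via a simple link budget."""
    return (tx_power_dbw + tx_gain_dbi + rx_offaxis_gain_dbi
            - fspl_db(freq_hz, dist_m))

# Hypothetical numbers: a terrestrial transmitter 10 km from a Ka-band
# satellite terminal whose directional antenna points away from it, so the
# victim sees only a -25 dBi off-axis gain.
i_dbw = interference_dbw(tx_power_dbw=0.0, tx_gain_dbi=30.0,
                         rx_offaxis_gain_dbi=-25.0, freq_hz=28e9, dist_m=10e3)
target_dbw = -130.0
print(f"interference: {i_dbw:.1f} dBW, target met: {i_dbw <= target_dbw}")
```

With an omnidirectional victim (0 dBi in the interferer's direction) the same geometry would exceed the target by ~19 dB, which is the point the paper makes about directional antennas minimising interference.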
A fundamental challenge in orthogonal-frequency-division-multiple-access (OFDMA)-based cellular networks is intercell interference coordination, and to meet this challenge, various solutions using fractional frequency reuse (FFR) have been proposed in the literature. However, most of these schemes are either static in nature, dynamic on a large time scale, or require frequent reconfiguration for event-driven changes in the environment. The significant operational cost involved can be minimized with the added functionality that self-organizing networks bring. In this paper, we propose a solution based on the center of gravity of users in each sector. This enables us to have a distributed and adaptive solution for interference coordination. We further enhance our adaptive distributed FFR scheme by employing cellular automata as a step toward achieving an emergent self-organized solution. Our proposed scheme achieves performance close to strict FFR and better performance than SFR in terms of the edge users' sum rate.
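The center-of-gravity idea in the abstract above can be sketched in a few lines. This is an illustrative toy only: the distance rule, the 0.6 threshold and the sub-band names are assumptions, not the paper's actual adaptation logic or its cellular-automata enhancement.

```python
import numpy as np

def sector_center_of_gravity(user_xy: np.ndarray) -> np.ndarray:
    """Centroid ('center of gravity') of the user positions in one sector."""
    return user_xy.mean(axis=0)

def pick_subband(user_xy: np.ndarray, site_xy: np.ndarray,
                 cell_radius: float, edge_fraction: float = 0.6) -> str:
    """If the centroid sits toward the cell edge, serve the sector on its
    protected edge sub-band; otherwise reuse the centre band.
    Threshold and band names are hypothetical."""
    cog = sector_center_of_gravity(user_xy)
    dist = np.linalg.norm(cog - site_xy)
    return "edge_subband" if dist > edge_fraction * cell_radius else "center_band"

# Example: a 500 m cell at the origin, one sector with users clustered near
# the edge and another with users near the site.
site = np.array([0.0, 0.0])
edge_users = np.array([[380.0, 30.0], [420.0, -10.0], [400.0, 20.0]])
centre_users = np.array([[80.0, 10.0], [120.0, -20.0], [100.0, 40.0]])
print(pick_subband(edge_users, site, 500.0))    # edge_subband
print(pick_subband(centre_users, site, 500.0))  # center_band
```

The attraction of such a rule is that each sector decides from local measurements only, which is what makes the scheme distributed and adaptive.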
In this paper we investigate the next but one generation of fixed satellite systems and the technological challenges that face this generation which we define as operational by 2020. Various technologies and architectures are presented with a view to identifying the most promising to pursue. © 2010 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
The elasticity of transmission control protocol (TCP) traffic complicates attempts to provide performance guarantees to TCP flows. The existence of different types of networks and environments on the connections’ paths only aggravates this problem. In this paper, simulation is the primary means for investigating the specific problem in the context of bandwidth on demand (BoD) geostationary satellite networks. Proposed transport-layer options and mechanisms for TCP performance enhancement, studied in the single connection case or without taking into account the media access control (MAC)-shared nature of the satellite link, are evaluated within a BoD-aware satellite simulation environment. Available capabilities at MAC layer, enabling the provision of differentiated service to TCP flows, are demonstrated and the conditions under which they perform efficiently are investigated. The BoD scheduling algorithm and the policy regarding spare capacity distribution are two MAC-layer mechanisms that appear to be complementary in this context; the former is effective at high levels of traffic load, whereas the latter drives the differentiation at low traffic load. When coupled with transport layer mechanisms they can form distinct bearer services over the satellite network that increase the differentiation robustness against the TCP bias against connections with long round-trip times. We also explore the use of analytical, fixed-point methods to predict the performance at transport level and link level. The applicability of the approach is mainly limited by the lack of analytical models accounting for prioritization mechanisms at the MAC layer and the nonuniform distribution of traffic load among satellite terminals.
Next generation networks will have to provide global connectivity to ensure success. Both satellite and terrestrial networks cannot guarantee this on their own. This incapability is attributed to capacity coverage issues in densely populated areas for satellites and lack of infrastructure in rural areas for terrestrial networks. Therefore, we consider a hybrid terrestrial-satellite mobile system based on frequency reuse. However, this frequency reuse introduces severe co-channel interference (CCI) at the satellite end. To mitigate CCI, we propose an OFDM based adaptive beamformer implemented on-board the satellite with pilot reallocation at the transmitter side. Results show that the proposed scheme outperforms the conventional approach.
In this paper we review some of the work of the 'Satellite Working Group' in the European Technology Platform NETWORLD2020 towards a strategy for satellites in 5G. We first review the 5G vision and its drivers as defined by the terrestrial mobile community via the 5GPPP Association. We then outline the areas in which satellite can contribute to an integrated system within 5G and detail the research challenges that this provides. Finally we give views on a technology roadmap to meet these challenges such that satellites are ready by 2020 to play their part in the integrated 5G roll out.
Satellite communication data traffic will increase dramatically over the coming years. High-throughput multibeam satellite networks in Ka band can potentially accommodate the upcoming high data rate demands. However, there is only 500 MHz of exclusive band for the downlink and the same amount for the uplink. This spectrum shortage imposes a barrier to satisfying the increasing demands. Cognitive satellite communication in Ka band is considered in this paper in order to potentially provide an additional 4.4 GHz of bandwidth for downlink and uplink fixed satellite services. In this way, it is expected that the problem of spectrum scarcity for future generations of satellite networks can be alleviated to a great extent. The underlying scenarios and enabling techniques are discussed in detail, and finally we investigate the implementation issues related to the considered techniques.
Single Carrier Frequency Domain Equalization (SC-FDE) has recently attracted significant interest, particularly in the wireless communications literature, as an alternative air-interface solution to OFDM. Both SC-FDE and OFDM combat the detrimental effects of the frequency-selective channel in a very efficient way: through the use of FFT/IFFT processing and parallel single-tap equalizers for each frequency bin. Despite their strong similarities, the two techniques differ in that in SC-FDE both the FFT and IFFT processing are performed at the receiving end. In the context of satellite communications, where power efficiency is a key objective, this difference translates into a much stronger resilience of SC-FDE to the non-linearity of the on-board High Power Amplifier (HPA). By contrast, the very high Peak-to-Average Power Ratio (PAPR) that characterizes OFDM signals requires the employment of predistortion techniques and/or back-off. In this paper simulation results are provided which show that SC-FDE offers some improvements over conventional OFDM in an S-UMTS downlink.
This paper presents the findings of an effort to establish the performance of ACM, as defined in the DVB-S2 standard, in the context of vehicular satellite communications. The effects of multipath fading as well as rain attenuation and the high round-trip delays associated with geostationary satellite systems are discussed. Performance results have been obtained by means of link-level simulations. The work has been carried out under the FP6 project MOWGLY, whose objective is the design, testing and implementation of a prototype system for broadband satellite communications to aircraft, maritime vessels and high-speed trains. © 2007 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
The authors introduce the types of satellite constellation networks, and examine how overall performance of TCP communications carried across such a network can be affected by the choice of routing strategies used within the network.
Session Initiation Protocol (SIP) is an application-layer signalling protocol used in the IP-based UMTS network for establishing multimedia sessions. With a satellite component identified to play an integral role in UMTS, there is a need to support SIP-based session establishment over Satellite-UMTS (S-UMTS) as well. Due to the inherent characteristics of SIP, the transport of SIP over an unreliable wireless link with a large propagation delay is inefficient. To improve session setup performance, link-layer retransmission based on the Radio Link Control acknowledged mode (RLC-AM) mechanism is utilised. However, the current UMTS RLC-AM procedure is found to cause undesirable redundant retransmissions when applied over the satellite. This paper therefore proposes an enhancement to the RLC protocol through a timer-based retransmission scheme. Simulation results reveal that not only can system capacity be improved through this redundant-retransmission avoidance scheme, but better system performance in terms of session setup delay and failure is also gained.
The synergy between satellite and terrestrial mobile networks is regarded as a promising approach for the delivery of broadcast and multicast services to mobile users. This paper evolves around a hybrid satellite-terrestrial system, featuring a unidirectional satellite component that is responsible for the delivery of point-to-multipoint services. It proposes a systematic approach for partitioning the satellite system capacity between streaming and push-and-store services and for the radio bearer configuration within the satellite access layer. The approach takes into account the service requirements, estimates of the traffic demand and popularity of individual services, and preliminary link dimensioning exercises. A capacity analysis is carried out in the end to check the efficiency of the approach, concluding at the same time on the feasibility of this hybrid system solution.
Satellite communication has recently been included as one of the key enabling technologies for 5G backhauling, especially for the delivery of bandwidth-demanding enhanced mobile broadband (eMBB) applications in 5G. In this paper, we present a 5G-oriented network architecture that is based on satellite communications and multi-access edge computing (MEC) to support eMBB applications, which is investigated in the EU 5GPPP Phase-2 SaT5G project. We specifically focus on using the proposed architecture to assure Quality-of-Experience (QoE) of HTTP-based live streaming users by leveraging satellite links, where the main strategy is to realise transient holding and localization of HTTP-based (e.g., MPEG-DASH or HTTP Live Streaming) video segments at 5G mobile edge while taking into account the characteristics of satellite backhaul link. For the very first time in the literature, we carried out experiments and systematically evaluated the performance of live 4K video streaming over a 5G core network supported by a live geostationary satellite backhaul, which validates its capability of assuring live streaming users’ QoE under challenging satellite network scenarios.
Quantification of the effects of distortion on UWB system performance, in terms of positioning error, is analysed in this research. UWB multipath distorted channels are simulated in each frequency subband over 2-11 GHz. Their characteristics are modelled according to multipath clusters along the propagation paths. The classification of clusters and physics-based distortion mechanisms are generalized for inclusion in the simulation algorithm. Finally, distortion impacts on system performance with regard to frequency-dependent characteristics and positioning errors are investigated.
The high speed downlink packet access (HSDPA) system has been investigated for adaptation in the GEO satellite environment in order to achieve high packet user throughput and system efficiency. This paper discusses the performance of the so called satellite-HSDPA (S-HSDPA) system, where the impacts of the power amplifier non-linearity, space time transmit diversity (STTD) and multicode transmission, are examined. The S-HSDPA performance is obtained from simulations of a modified terrestrial HSDPA link simulator in a rich multipath urban environment with three intermediate module repeaters (IMR). The results indicate an appropriate choice of system parameters. © 2008 IEEE.
In order to improve the manageability and adaptability of future 5G wireless networks, the software orchestration mechanism known as software-defined networking (SDN), with control and user plane (C/U-plane) decoupling, has become one of the most promising key techniques. Based on these features, the hybrid satellite-terrestrial network is expected to support flexible and customized resource scheduling for both massive machine-type communication (MTC) and high-quality multimedia requests while achieving broader global coverage, larger capacity and lower power consumption. In this paper, an end-to-end hybrid satellite-terrestrial network is proposed and the performance metrics, e.g., coverage probability and spectral and energy efficiency (SE and EE), are analysed in both sparse networks and ultra-dense networks. The fundamental relationship between SE and EE is investigated, considering the overhead costs, fronthaul of the gateway (GW), density of small cells (SCs) and multiple quality-of-service (QoS) requirements. Numerical results show that, compared with current LTE networks, the hybrid system with C/U split can achieve approximately 40% and 80% EE improvement in sparse and ultra-dense networks respectively, and greatly enhance the coverage. Various resource management schemes, bandwidth allocation methods, and on-off approaches are compared, and applications of the satellite in future 5G networks with software-defined features are proposed.
This paper aims at presenting different concepts for a broadband access satellite. A hybrid system using Q/V-bands on the feeder links and Ka-band on the user links has been considered. European coverage with a large number of beams allows a large frequency reuse factor depending on the number of colors of the cluster pattern. The use of Q/V band for the feeder links requires site diversity to counteract the strong tropospheric fading occurring at Q/V band and to obtain acceptable link availability. The paper gives an overview of the various trade-offs for frequency-reuse scenarios, size and number of beams, number of gateways, site diversity options and gateway handover. © 2011 by Prof. Michel Bousquet. Published by the American Institute of Aeronautics and Astronautics, Inc.
Spectrum sensing is one of the key technologies to realize dynamic spectrum access in cognitive radio (CR). In this paper, a novel database-augmented spectrum sensing algorithm is proposed for secondary access to the TV White Space (TVWS) spectrum. The proposed database-augmented sensing algorithm is based on an existing geo-location database approach for detecting incumbents such as Digital Terrestrial Television (DTT) and Programme Making and Special Events (PMSE) users, but is combined with spectrum sensing to further improve the protection of these primary users (PUs). A closed-form expression for the secondary users' (SUs) spectral efficiency is also derived for their opportunistic access of TVWS. By implementing a previously developed power-control-based geo-location database and an adaptive spectrum sensing algorithm, the proposed database-augmented sensing algorithm demonstrates better spectrum efficiency for SUs, and better protection for incumbent PUs, than the existing stand-alone geo-location database model. Furthermore, we analyze the effect of unregistered PMSE on the reliable use of the channel by SUs.
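The two-stage decision described above — consult the geo-location database for registered incumbents, then sense for unregistered ones such as PMSE — can be sketched as a toy. The database contents, location key, fixed 3 dB margin and signal model below are all illustrative assumptions, not the paper's algorithm or threshold design.

```python
import numpy as np

rng = np.random.default_rng(0)

def channel_free_in_database(database: dict, channel: int, location: str) -> bool:
    """Stage 1: geo-location database lookup for registered incumbents."""
    return channel not in database.get(location, set())

def energy_detect(samples: np.ndarray, noise_power: float,
                  margin_db: float = 3.0) -> bool:
    """Stage 2: energy detection; True if a signal is deemed present.
    A fixed margin over the noise floor stands in for a properly
    derived false-alarm threshold."""
    threshold = noise_power * 10 ** (margin_db / 10)
    return np.mean(np.abs(samples) ** 2) > threshold

def channel_available(database, channel, location, samples, noise_power) -> bool:
    """Database-augmented decision: the SU may transmit only if the database
    shows no registered incumbent AND sensing detects no unregistered user."""
    return (channel_free_in_database(database, channel, location)
            and not energy_detect(samples, noise_power))

# Toy scenario: channel 40 is registered for DTT; channel 41 is free in the
# database but occupied by an unregistered PMSE microphone; channel 42 is idle.
db = {"guildford": {40}}
noise = 1.0
idle = rng.normal(0.0, 1.0, 1000)                    # noise only
pmse = idle + 3.0 * np.sin(0.1 * np.arange(1000))    # strong unregistered signal
print(channel_available(db, 40, "guildford", idle, noise))  # False: in database
print(channel_available(db, 41, "guildford", pmse, noise))  # False: sensed
print(channel_available(db, 42, "guildford", idle, noise))  # True
```

The channel-41 case is exactly the gap the paper targets: a stand-alone database would wrongly report the channel as free, while the sensing stage protects the unregistered PMSE user.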
In this paper, we propose a channel-aware scheduling algorithm and feedback implosion suppression (FIS) technique that exploits reported Channel State Information (CSI) through the return link from a subset of users in a multicast group for reliable multicast delivery over a geostationary satellite network. Reliability is guaranteed via a MFTP-like transport protocol that retransmits lost segments to the group. The deployed scheduling mechanism uses CSI collected from group members before making a decision as to whether or not to transmit a data segment in the forward link. As such, the algorithm aims at avoiding unfavourable channel conditions to reduce the forward link resources that would be wasted in retransmission. However, the users' feedback collected from a large pool would result in the feedback implosion problem. Hence, we propose a FIS technique to complement the CSI collection policy to reduce traffic implosion in the return link as well as reducing the probability to transmit unnecessary CSI values. A change detection scheme run at the users' terminal is implemented using rectangular sliding test window to update a smoothed CSI value depending on its discrepancies with a nominal model from a reference window. The scheduling algorithm together with the CSI collection and FIS policies aims to reduce file transfer delay (FTD) by trading off total number of CSI updates in the face of L-band mobile satellite channel conditions. © 2007 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
A simple method for frequency estimation of a PSK-modulated carrier in complex additive white Gaussian noise is proposed. Although this feedforward pilot-aided technique is sub-optimal in terms of the Cramer-Rao bound (CRB) for one pilot field, its performance greatly improves (in comparison to some established methods) when used over consecutive pilot fields. At low signal-to-noise ratios (SNR) where there is an inherent use of averaging over consecutive pilot fields to achieve the required estimation accuracy, this method achieves a similar speed and accuracy to Luise and Reggiannini (L&R)'s method for fine frequency recovery but with a significantly lower complexity. It also provides a competitive speed and accuracy for coarse frequency offset estimation without a threshold region as associated with Kay's method. Simulations have been run based on the DVB-S2 standard (with all examined methods benefiting from the averaging over multiple pilot fields) and the results are presented. © 2007 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
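For context on the pilot-aided estimation discussed above, the Luise & Reggiannini (L&R) method mentioned as a benchmark can be sketched on a single pilot field. This is an illustrative toy under assumed parameters (64 QPSK pilots, 25 Mbaud, 100 kHz offset), not the paper's proposed algorithm or its multi-field averaging.

```python
import numpy as np

def lr_freq_estimate(pilots_rx: np.ndarray, pilots_ref: np.ndarray,
                     symbol_rate: float) -> float:
    """Frequency offset estimate via the L&R autocorrelation method:
    wipe the known pilot modulation, average autocorrelations R(m) for
    m = 1..N, and read the offset from the phase of their sum."""
    z = pilots_rx * np.conj(pilots_ref)   # modulation-wiped pilot samples
    L0 = len(z)
    N = L0 // 2                           # standard L&R design choice
    R = [np.mean(z[m:] * np.conj(z[:-m])) for m in range(1, N + 1)]
    return np.angle(np.sum(R)) * symbol_rate / (np.pi * (N + 1))

# Toy example: 64 QPSK pilot symbols at 25 Mbaud, 100 kHz true offset, AWGN.
rng = np.random.default_rng(1)
rs, f0, n = 25e6, 100e3, 64
ref = np.exp(1j * (np.pi / 4) * (2 * rng.integers(0, 4, n) + 1))  # QPSK pilots
rx = ref * np.exp(2j * np.pi * f0 * np.arange(n) / rs)
rx = rx + 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))   # ~23 dB SNR
print(f"estimated offset: {lr_freq_estimate(rx, ref, rs) / 1e3:.1f} kHz")
```

Averaging the modulation-wiped samples over consecutive pilot fields, as the abstract describes, is what pushes such estimators toward the required accuracy at low SNR.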
In this paper, we review gateway diversity schemes for Q/V band feeder links in a future high throughput satellite (HTS) system. As each gateway will carry very high capacity, it is essential to ensure very high availability for the feeder links, typically greater than 99.9%. To do so, spatial diversity is required, since Adaptive Coding and Modulation (ACM) and Uplink Power Control (UPC) are unable to cope with the fade dynamics encountered at these high frequency bands. In an effort to reduce the total number of gateways in the system and arrive at a more cost-effective design, the concept of smart gateways has emerged. We first review different architecture implementations and highlight their similarities and differences. We then outline the technological challenges and the issues that the envisaged solutions should consider during the design process. Finally, we give views on a technology roadmap to meet these challenges.
SatNEx is a European Network of Experts for satellite communications, coordinated by the German Aerospace Center DLR. The first two phases of SatNEx were funded by the EU from 2004 to 2009. The third phase, SatNEx-III, comprises 17 partners and is funded by ESA from 2010 to 2013. A core team consisting of DLR, the University of Surrey, and the University of Bologna is coordinating the SatNEx-III research activities. Specific research tasks are contracted to partners under annual “Call-off Orders”.
Multibeam satellite networks in Ka band have been designed to accommodate the increasing traffic demands expected in the future. However, these systems are spectrum limited due to the current spectrum allocation policies. This paper investigates the potential of applying cognitive radio techniques in satellite communications (SatCom) in order to increase the spectrum opportunities for future generations of satellite networks without interfering with the operation of incumbent services. These extra spectrum opportunities can potentially amount to 2.4 GHz of bandwidth in the downlink and 2 GHz of bandwidth in the uplink for high density fixed satellite services (HDFSS).
An improved method for estimating the frequency of a single complex sinusoid in complex additive white Gaussian noise is proposed. The method uses a modified version of the weighted linear predictor to achieve optimal accuracy at low/moderate SNR while retaining its speed and wide acquisition range. Consequently, it has an advantage over known methods that use the weighted phase averager since they suffer from an increased threshold effect at frequencies approaching the full estimation range.
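The baseline (unmodified) weighted linear predictor that the proposed method builds on can be sketched as follows; this is the standard Kay-style form with parabolic smoothing weights, not the paper's modified version:

```python
import numpy as np

def wlp_frequency_estimate(z, fs):
    # Weighted linear predictor: combine one-lag phase increments
    # with Kay's parabolic smoothing weights (which sum to 1).
    N = len(z)
    k = np.arange(1, N)
    w = 6 * k * (N - k) / (N * (N ** 2 - 1))
    dphi = np.angle(z[1:] * np.conj(z[:-1]))
    return fs * np.sum(w * dphi) / (2 * np.pi)
```

Because only one-lag phase increments are used, the acquisition range spans the full (-fs/2, fs/2); the threshold effect near the range edges is what the improved method targets.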
This paper presents a novel Connection Admission Control (CAC) strategy for vehicular DVB-RCS satellite networks. Using estimates of arriving and departing traffic in each spot-beam, our proposal aims to maximize user satisfaction while minimizing resource reservation. Via GPS measurements, terminals periodically estimate their time to handoff and encapsulate mobility information within signaling bursts. Upon reception of a mobility update, the NCC (Network Control Centre) is able to estimate the amount of traffic roaming around the spot-beams as well as the probabilities that active terminals will eventually hand off. As a consequence, the NCC reserves only the necessary amount of resources for handover purposes in each spot-beam, minimizing the percentage of connections forced into termination. No overhead is introduced by our CAC solution, as it makes use of the existing DVB-RCS signaling to provide the NCC with the extra mobility parameters driving the admission control module. Through accurate ns-2 modeling of existing DVB-RCS signaling mechanisms, we demonstrate that our lightweight CAC scheme outperforms static channel reservation schemes in terms of handover failure rate, and traffic prediction strategies using mobility information in terms of channel utilization.
Cognitive radio technologies have attracted increasing interest in recent years for their possible gains in spectrum usage with respect to unshared approaches. While most of the attention has been devoted to cognitive coexistence between terrestrial systems, coexistence between terrestrial and satellite communications is also seen as a viable option. Cognitive Radio for Satellite Communications (CoRaSat) was a European Commission Seventh Framework Programme project funded under ICT Call 8. CoRaSat aimed at investigating, developing, and demonstrating cognitive radio techniques in satellite communication systems for flexible and dynamic spectrum access. In this paper, the CoRaSat cognitive approaches and techniques investigated, developed, and demonstrated as most relevant to satellite communications are described. In particular, the focus is on spectrum awareness, that is, database and spectrum sensing approaches, and on spectrum exploitation algorithms, that is, resource allocation and beamforming algorithms, to enable the use of shared bands for satellite communications.
The number of broadband users has been growing rapidly in recent years. It is not only the number of users that increases but also the average data volume per user; a consequence of the increased number of users connected via broadband techniques is that the demand for audio and video content is also increasing. In this paper we describe an integrated satellite-terrestrial UMTS architecture and investigate the minimization of the delivery cost. The proposed telecommunication system can offer SDMB (Satellite Digital Multimedia Broadcast) services to mobile users through the satellite or terrestrial UMTS downlink segment. Within this scenario we propose a simple and efficient cost model for choosing the most suitable bearer (satellite or terrestrial) in order to reduce service delivery cost; moreover, we design a new signalling strategy based on user location information to support this optimal choice. The simulations performed show the effectiveness of the proposed strategy across several mobile operator networks as the number of users requesting the SDMB service varies.
Global connectivity cannot be guaranteed by terrestrial networks due to the lack of infrastructure in rural areas. Neither can satellite networks assure this, due to lack of signal penetration and capacity coverage issues in densely populated areas. To bridge this gap, we propose an orthogonal frequency division multiplexing (OFDM) based hybrid architecture in which users are served by existing mobile networks in urban areas and by satellite in rural areas. In such a system, terrestrial and satellite networks can reuse the portion of spectrum dedicated to each of these systems, resulting in a significant increase in overall capacity, wider coverage and reduced cost. This frequency reuse induces severe co-channel interference (CCI) at the satellite end, and our work focuses on its mitigation using OFDM-based adaptive beamforming.
The next generation of mobile radio communication systems, so-called 5G, will bring major changes relative to the generations to date. The ability to cope with huge increases in data traffic at reduced latencies and improved quality of user experience, together with major reductions in energy usage, presents big challenges. In addition, future systems will need to embody connections to billions of objects (the so-called Internet of Things, IoT), which raises new challenges. Visions of 5G are now available from regions across the world and research is ongoing towards new standards. The consensus is a flatter architecture that adds a dense network of small cells operating in the millimetre wave bands, adaptable and software controlled. But what place is there for satellites in such a vision? The paper examines several potential roles for satellite, including coverage extension, content distribution, resilience provision, improved spectrum utilisation and integrated signalling systems.
Satellite communication has recently been included as one of the enabling technologies for 5G backhauling, in particular for the delivery of bandwidth-demanding enhanced mobile broadband (eMBB) application data in 5G. In this paper we introduce a 5G-oriented network architecture empowered by satellite communications for supporting emerging mobile video delivery, which is investigated in the EU 5GPPP Phase 2 SAT5G Project. Two complementary use cases are introduced: (1) the use of satellite links to support offline multicasting and caching of popular video content at the 5G mobile edge, and (2) real-time prefetching of DASH (Dynamic Adaptive Streaming over HTTP) video segments by the 5G mobile edge through satellite links. In both cases, the objective is to localize content objects close to consumers in order to achieve assured Quality of Experience (QoE) in 5G content applications. In the latter case, in order to circumvent the large end-to-end propagation delay of satellite links, testbed-based experiments have been carried out to identify specific prefetching policies to be enforced by the Multi-access Edge Computing (MEC) server for minimizing user-perceived disruption during content consumption sessions.
This paper presents the results from the EU FP7 project BATS, aimed at integrated broadband (BB) access across the EU for 2020 and beyond. The BB access is integrated between DSL, LTE and satellite, and features a broadband intelligent user terminal. The satellite component is a cluster of two multibeam HTS satellites providing a lower cost per bit than today's satellites. The system architecture embedding the gateways and user terminals is presented, as well as the design for the advanced satellites. The detailed design concepts of the intelligent router are also provided. We present the results of controlled lab tests on an emulated test bed as well as initial results from a field trial in which the intelligent routers were placed in households in Spain and Germany and connected to local DSL and LTE as well as the Hylas satellite.
Broadband access by satellite in Ka-band will become constrained by spectrum availability. In this context, the European Union (EU) FP7 project CoRaSat is examining the possible spectrum extension opportunities that could be exploited by a database approach in Ka-band via the use of cognitive mechanisms. Spectrum-sharing scenarios between Fixed Satellite Services (FSS), Fixed Services (FS) and Broadcast Satellite Service (BSS) feeder links are considered. Database statistics for several EU countries are also provided for database analysis. Interference in the downlink scenarios is evaluated by the database approach using real databases and propagation models. The importance of using correct terrain profiles and accurate propagation models is shown. For the case of BSS interference into the FSS downlink (17.3-17.7 GHz), it is demonstrated that in the UK an area of less than 2% is adversely affected. FS interference into the FSS downlink (17.7-19.7 GHz) is shown, for the UK, to affect only a small percentage of the band at any location. Some initial preliminary findings on earth stations on moving platforms are also presented. It is concluded that by using a database approach to allocate frequencies it is possible to use most of the band across different locations for satellite services in the shared Ka-band.
In recent years, spectrum scarcity has become one of the major issues in the development of new communication systems. Cognitive Radio (CR) approaches have gained ever-increasing attention from system designers and operators, as they promise a more efficient utilization of the available spectral resources. In this context, while the application of CR in terrestrial scenarios has been widely considered from both theoretical and practical viewpoints, its exploitation in satellite communications is still a rather unexplored area. In this paper, we define several satellite communications scenarios where cognitive radio techniques promise to introduce significant benefits, and we discuss the major enablers and the associated challenges. © 2014 ICST.
Currently there is a clear trend towards vehicular satellite systems, which are designed in high frequency bands (Ku/Ka) and support broadband data services. One such system is designed within the European project MOWGLY for broadband service provision in aeronautical, maritime and railroad environments, and features suitably adapted DVB-S2 and DVB-RCS standards as the basic underlying communications technology. This paper presents simulation results for assessing various aspects of the performance of the DVB-S2/RCS based satellite system in different mobile environments. In particular, the effects of different fading mechanisms on the Block Error Rate performance of the system are analyzed through link-level simulations. Additionally, system-level simulation results are provided that, apart from complementing the link-level simulations, include the modeling of propagation impairments such as path loss and rain attenuation that cannot be captured in link-level simulations. Copyright © 2006 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
This paper investigates collaboration among neighboring Base Stations (BSs) in OFDMA based cellular networks in the absence of a centralized control unit, which is a defining characteristic of 4G wireless networks. We propose a novel scheme for collaboration between the base stations. Monte Carlo simulation based performance analysis demonstrates effectiveness of collaborative resource allocation among adjacent base stations for OFDMA systems, particularly for the users in the cell edges. © 2011 IEEE.
This paper presents a feasibility study for a fixed broadband access High Throughput Satellite terabit/second system by considering appropriate state‐of‐the‐art communication technologies. For the investigated system model, DVB‐S2 and DVB‐RCS2 are assumed as the air interfaces for the forward and return link, respectively. The performance of DVB‐S2 and DVB‐RCS2 Adaptive Coding and Modulation is examined along with potential extensions of these standards. For example, the performance of very low rate DVB‐RCS turbo codes is investigated and evaluated. In addition, a performance comparison for M‐ary (M=16, 32, 64) constellations in the presence of a linear and a high power amplifier non‐linear channel is carried out. Various frequency reuse schemes and different antenna models are also considered, and their performance is analyzed and evaluated. It is demonstrated that, by using the Q/V (40/50 GHz) bands for the gateways and the Ka (20-30 GHz) band for the user terminals, around 20 gateways and 200 beams are required for the proposed satellite system to provide terabit/second capacity. The obtained performance evaluation results show that the forward link is limited by noise rather than interference, whereas the return link is interference limited. Additionally, some further aspects of the system design in relation to the total number of gateways and the payload are discussed.
Next generation Internet requires the capability of providing quality of service (QoS) with service differentiation. Broadband satellite networks, as an integral part of the global broadband network infrastructure, are moving towards the same goal of providing service differentiation. Due to the unique properties of the space environment, providing service differentiation in satellite networks has additional challenges, such as long propagation delay. In this paper, we present an innovative profile-based probabilistic dropping scheme for the provision of relative loss differentiation in a geostationary (GEO) bandwidth on demand (BoD) satellite network which follows the DVB-RCS standard, approved and published by ETSI for interactive broadband satellite networks. The network is structured to support a finite number of ordered service classes. We adopt the proportional differentiated service (PDS) model, which strikes a balance between the strict QoS guarantee of Integrated Services (IntServ) and the softer QoS guarantee of Differentiated Services (DiffServ), to provide proportional loss differentiation to different priority classes. Our scheme controls the loss rates by computing the appropriate packet drop probability based on the congestion level within each satellite terminal (ST), independent of the operating conditions of other terminals and without requiring a central controller or monitor. Unlike previous proposals designed for terrestrial and wireless networks, where correct differentiation is only achieved locally on a per-hop basis (i.e. the queues for different classes must be co-located in a node sharing a common buffer and communication channel), our scheme is able to maintain network-wide proportional loss even though the queues are physically distributed. Additionally, in the design of our solution, we ensure simplicity of the algorithm to minimize overhead incurred in the satellite network, as well as intra- and inter-node differentiation consistency.
We extend the ns-2 simulator with a BoD capability analogous to the DVB-RCS system, in which STs first request resources (i.e. time slots) and only start transmitting packets following reception of the burst time plan (BTP). We implement our loss differentiation algorithm and attach it to each ST. Simulation results show that the scheme is able to achieve the PDS model in a heterogeneous ST environment. Our results also suggest that the predictability property of the PDS model may be violated if the service provider configures the performance gaps between service classes to be too close. However, our scheme can fully achieve the controllability property of the model. © 2007 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
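The core of proportional loss differentiation can be illustrated with a minimal sketch (this is the generic PDS idea, not the paper's profile-based scheme; the class parameters and congestion measure are illustrative assumptions): per-class drop probabilities keep fixed ratios set by the class parameters, while the absolute level tracks the terminal's local congestion.

```python
def pds_drop_probs(congestion, sigmas):
    # Proportional Differentiated Services sketch: drop probabilities
    # whose ratios follow the class parameters sigma_i, with the
    # absolute level driven by a local congestion measure in [0, 1].
    base = max(0.0, min(1.0, congestion))
    top = max(sigmas)
    return [min(1.0, base * s / top) for s in sigmas]
```

Because each ST computes this from its own congestion level only, no central controller is needed, which matches the distributed-queue setting described above.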
Over the last decade, the explosive increase in demand for high-data-rate video services and massive-access machine type communication (MTC) requests have become the main challenges for the future 5G wireless network. The hybrid satellite-terrestrial network based on the control and user plane (C/U) separation concept is expected to support flexible and customized resource scheduling and management towards global ubiquitous networking and a unified service architecture. In this paper, centralized and distributed resource management strategies (CRMS and DRMS) are proposed and compared comprehensively in terms of throughput, power consumption, spectral and energy efficiency (SE and EE) and coverage probability, using mature stochastic geometry tools. Numerical results show that, compared with the DRMS strategy, U-plane cooperation between the satellite and terrestrial networks under the CRMS strategy can improve throughput and EE by nearly 136% and 60% respectively in ultra-sparse networks, and greatly enhance the U-plane coverage probability (by approximately 77%). An efficient resource management mechanism is suggested for the hybrid network according to the network deployment for the future 5G wireless network.
The exploitation of fluctuating channel conditions in link adaptation techniques for unicast transmission has been shown to provide large system capacity gains. However, the problem of choosing transmission rates for multicast transmission has not been thoroughly investigated. In this paper, we investigate multicast adaptive techniques for reliable data delivery in GEO satellite networks. An optimal multicast link adaptation scheme is proposed with the aim of maximising terminal throughput whilst increasing resource utilization and fairness in the face of diverse channel conditions. Via simulation results and theoretical analysis, the proposed algorithm is shown to outperform alternative multicast link adaptation techniques, especially when the terminals experience rapidly varying channel conditions.
In the literature, a Gaussian input is assumed in power optimization algorithms. However, this assumption is unrealistic, as practical systems use Finite Symbol Alphabet (FSA) inputs (e.g., M-QAM). In this paper, we consider the optimal power for joint interweave and underlay CR systems given FSA inputs. We formulate our problem as a convex optimization and solve it through general convex optimization tools. We observe that the total SU transmit power is always less than the power budget and remains in the interference-limited region over the considered distance range. Therefore, we re-derive the optimal power with the interference constraint only, in order to reduce the complexity of the algorithm by solving it analytically. Numerical results reveal that, for the considered distance range, the transmit power saving and the rate gain with the proposed algorithm are in the ranges 16-92% and 7-34%, respectively, depending on the modulation scheme (i.e., BPSK, QPSK or 16-QAM) used.
With the unique broadcast nature and ubiquitous coverage of satellite networks, the synergy between satellite and terrestrial networks provides opportunities for delivering wideband services to a wide range of audiences over extensive geographical areas. This article concerns the optimization techniques pertinent to packet scheduling to facilitate multimedia content delivery over satellite with a return channel via a terrestrial network. We propose a novel hierarchical packet scheduling (HPS) scheme that allocates resources at different parts of the network in response to network dynamics and link variations while respecting the system power/resource constraints. Simulations show that the HPS scheme can effectively improve end-to-end performance and resource utilization with desirable scalability and fairness features.
In this paper, we propose a low complexity gradient-based approach for enabling the Tone Reservation (TR) technique to reduce the Peak-to-Average Power Ratio (PAPR) of Orthogonal Frequency Division Multiplexing (OFDM) signals. The performance of the proposed algorithm is evaluated for different pilot locations in the frequency domain, and also in combination with the Discrete Fourier Transform (DFT) spreading technique proposed in [6], in order to further reduce the PAPR. Simulation results show that the new technique achieves significant PAPR reductions, which are further enhanced when it is combined with DFT spreading. The simulation results also show that the performance of the technique is dependent on the pilot positions. In addition, a further investigation was performed in which the reduction tones are constrained to the average power mask of the data tones, by a simple projection rule in the frequency domain, both for the TR scheme and for the combined scheme. Simulation results show that the contiguous pilot arrangement provides better PAPR reduction performance in both cases when the peak-cancellation signal is constrained in the frequency domain.
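The TR principle can be sketched as a simple iterative clip-and-project loop (a generic sketch, not the paper's gradient algorithm; the iteration count and clipping ratio are illustrative assumptions): time-domain peaks are clipped, the clipping error is transformed back to the frequency domain, and only its components on the reserved tones are kept, so the data tones are never disturbed.

```python
import numpy as np

def tone_reservation(X, reserved, n_iter=20, clip_ratio=2.0):
    # Clip time-domain peaks, transform the clipping error back,
    # and accumulate only its components on the reserved tones.
    N = len(X)
    C = np.zeros(N, dtype=complex)          # peak-cancelling tones
    for _ in range(n_iter):
        x = np.fft.ifft(X + C)
        rms = np.sqrt(np.mean(np.abs(x) ** 2))
        mag = np.maximum(np.abs(x), 1e-12)  # avoid divide-by-zero
        clipped = np.where(mag > clip_ratio * rms,
                           clip_ratio * rms * x / mag, x)
        err = np.fft.fft(clipped - x)
        C[reserved] += err[reserved]        # data tones stay untouched
    return X + C
```

The frequency-domain power-mask constraint discussed above would correspond to an extra projection of `C[reserved]` onto the allowed mask after each update.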
This paper studies the use of W band in a future high throughput satellite (HTS) system alongside the Q/V band. First, the available spectrum in W band is reviewed along with the propagation effects that contribute to signal degradation. Then a mathematical framework for the assessment of the global feeder link availability is presented and the design challenges of a combined Q/V+W gateway system are analysed. The advantages and disadvantages are highlighted. A preliminary assessment of the feeder link availability of a combined Q/V+W band system is given.
Land Mobile Satellite (LMS) networks, forming a key component of future mobile Internet and broadcasting, can benefit from Multiple-Input Multiple-Output (MIMO) techniques to improve spectral efficiency and outage performance. LMS-MIMO networks can be realised using multiple satellites with single-polarization antennas and spatial multiplex channel coding, or a single satellite with dual-polarization antennas providing polarization multiplex channel coding. In this paper, a guide is presented showing the steps required to implement a simple empirical-stochastic dual circularly polarized LMS-MIMO narrowband channel model, validated both with and without a line of sight. The model is based on an S-band tree-lined road measurement campaign using dual circular polarization at low elevations. The model is aimed at LMS-MIMO physical layer researchers and system designers who need an easy-to-implement and reliable model representative of typical LMS-MIMO channel conditions.
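A generic flavour of such a model can be sketched as follows. This is not the measured empirical-stochastic model of the paper: it simply draws 2x2 dual-polarized channel matrices with Rician co-polar entries and cross-polar leakage set by an assumed cross-polar discrimination (XPD); the K-factor and XPD values are illustrative assumptions.

```python
import numpy as np

def lms_mimo_samples(n, k_db=10.0, xpd_db=15.0, seed=None):
    # Illustrative 2x2 dual-circular-polarized narrowband channel draws:
    # Rician co-polar entries, attenuated cross-polar entries.
    rng = np.random.default_rng(seed)
    K = 10 ** (k_db / 10)                       # Rician K-factor
    leak = 10 ** (-xpd_db / 20)                 # cross-polar amplitude
    sigma = np.sqrt(1 / (2 * (K + 1)))          # diffuse component scale
    H = sigma * (rng.standard_normal((n, 2, 2))
                 + 1j * rng.standard_normal((n, 2, 2)))
    H[:, 0, 0] += np.sqrt(K / (K + 1))          # LOS on co-polar paths
    H[:, 1, 1] += np.sqrt(K / (K + 1))
    H[:, 0, 1] *= leak                          # attenuate cross-polar
    H[:, 1, 0] *= leak
    return H
```

The co-polar entries have unit mean power by construction; removing the LOS terms gives the non-line-of-sight case.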
The potential of using EHF frequencies on a satellite for aeronautical broadband communication provision has been discussed in this chapter. Currently used Ka band frequencies will soon be unable to cope with the increased Internet demands of aircraft passengers. There do not appear to be any major regulatory barriers to adopting Q/V and W bands, except perhaps around airports. It has been shown that the tropospheric propagation impairments that currently prevent the use of those bands for satellite user links are not a major issue for aeronautical applications, as their magnitude decreases significantly with altitude; they are almost negligible at cruise level. The various tools available to size the propagation margins have been detailed. An outcome of the analysis is that the margins required to ensure more than 99.9% availability could be lower than 10 dB for most flight configurations at Q/V and W band. In order to gauge the performance improvement brought by the use of these higher frequency bands, current aeronautical terminal and satellite characteristics have been extrapolated to EHF. It has been shown that the capacities provided can be enhanced by the use of conformal antennas, giving 4- to 10-fold increases over current Ka band systems. These would appear to accommodate the predicted requirements of around 200 Mbps per aircraft made for 2020 and beyond. This demonstrates the feasibility of EHF satellite systems meeting future aero passenger requirements, leaving bandwidth for ground-based applications at lower frequency bands.
We investigate the physical layer performance of HSDPA via GEO satellites for use in S-UMTS and SDMB. The impact of large round trip delay on link adaptation is discussed and link-level results are presented on the performance of HARQ for a variable number of retransmissions and different categories of UE in a rich multipath urban environment with three IMRs. It is shown that the N-channel SAW HARQ protocol can significantly increase the average throughput particularly for 16-QAM but the large round trip delay also requires an increase in the number of parallel HARQ channels resulting in high memory requirements at the UE. Receive antenna diversity with varying degrees of antenna correlation is also investigated as a possible performance enhancing method. The results presented here will help in specifying the physical layer of satellite HSDPA. Copyright © 2006 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
This special issue of the journal on ‘constellations’ comes at a critical time in their development, as a second wave of such non‐geostationary satellite orbit (NGSO) systems is being planned and deployed. These mega‐constellations, as they have become known, are, with a few exceptions, very much larger than those in the first wave and are focused on broadband and 5G applications rather than the speech and narrowband data services of the first wave deployed during the 1990s. However, as we explain in this editorial, there are many similarities in design and business plans to the first wave and, perhaps, many similar lessons to be learned.
This paper investigates 28 GHz band sharing between fixed satellite services (FSS) and the fifth generation (5G) new radio (NR) cellular system. In particular, it focuses on modelling a sharing scenario between the uplink of the FSS system and the uplink of the 5G NR enhanced mobile broadband (eMBB) cellular system. Such a scenario could generate interference from the FSS terminals towards the 5G base station, known as the next generation Node-B (g-NodeB). We provide detailed interference modelling, sharing constraint derivations and performance analysis under realistic path loss models and antenna radiation patterns based on the latest system characteristics of the third generation partnership project (3GPP) 5G NR Release 15. Several scenarios for seamless coexistence of the two systems are considered by evaluating the efficiency and the signal-to-interference-plus-noise ratio (SINR) at the NR g-NodeB, and using the block error rate (BLER) as a sharing constraint. A single FSS terminal is considered, and the impact of several parameters, such as the distance to the g-NodeB and the FSS elevation angle, on the g-NodeB spectrum efficiency is evaluated. In addition, the impact of the g-NodeB antenna array size on reducing the FSS/g-NodeB protection distance is evaluated, and dynamic beam steering is proposed to minimise the protection distance.
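The notion of a protection distance can be illustrated with a minimal free-space sketch (all parameter values here are illustrative assumptions, not the 3GPP Release 15 models or the paper's scenario values): compute the interference received at the g-NodeB from an FSS terminal side lobe, then scan outwards for the first distance at which it falls below a sharing threshold.

```python
import math

def fss_interference_dbm(d_km, f_ghz, eirp_dbm, rx_gain_dbi):
    # Side-lobe interference received at the g-NodeB over
    # free-space path loss (FSPL, dB, with d in km and f in GHz).
    fspl = 92.45 + 20 * math.log10(d_km) + 20 * math.log10(f_ghz)
    return eirp_dbm + rx_gain_dbi - fspl

def protection_distance_km(i_max_dbm, f_ghz=28.0, eirp_dbm=30.0, g_rx=5.0):
    # Smallest distance (coarse 10 m scan) at which the received
    # interference drops below the sharing threshold.
    d = 0.01
    while fss_interference_dbm(d, f_ghz, eirp_dbm, g_rx) > i_max_dbm:
        d += 0.01
    return d
```

In the scenarios above, realistic path loss models, antenna patterns and a BLER-based constraint replace the free-space loss and fixed gains used in this sketch.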
In the literature, optimal power allocation assuming a Gaussian input has been evaluated in OFDM-based Cognitive Radio (CR) systems to maximize the capacity of the secondary user while keeping the interference introduced into the primary user band within a tolerable range. However, the Gaussian input assumption is not practical, and Finite Symbol Alphabet (FSA) input distributions, i.e., M-QAM, are used in practical systems. In this paper, we consider the power optimization problem under FSA inputs as used in practical systems, and derive an optimal power allocation strategy by capitalizing on the relationship between mutual information and minimum mean square error. The proposed scheme is shown to save transmit power in a CR system compared to its conventional counterpart that assumes a Gaussian input. In addition to extra allocated power, i.e., power wastage, the conventional power allocation scheme also causes nulling of more subcarriers, leading to a reduced transmission rate compared to the proposed scheme. The proposed optimal power algorithm is evaluated and compared with the conventional Gaussian-input algorithm through simulations. Numerical results reveal that for interference threshold values ranging between 1 mW and 3 mW, the transmit power saving with the proposed algorithm is in the ranges 55-75%, 42-62% and 12-28%, whereas the rate gain is in the ranges 16.8-12.4%, 13-11.8% and 3-5.8%, for BPSK, QPSK and 16-QAM inputs, respectively.
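For context, the conventional Gaussian-input baseline that the FSA-input scheme is compared against is waterfilling under a total power budget with per-subcarrier caps (e.g. interference-threshold driven). A minimal sketch, with illustrative gains and caps, bisecting on the water level:

```python
import numpy as np

def waterfilling(gains, p_total, p_caps):
    # Gaussian-input waterfilling: p_i = clip(mu - 1/g_i, 0, cap_i),
    # with the water level mu found by bisection so sum(p) = p_total.
    lo, hi = 0.0, p_total + 1.0 / np.min(gains)
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        p = np.clip(mu - 1.0 / gains, 0.0, p_caps)
        if p.sum() > p_total:
            hi = mu
        else:
            lo = mu
    return p
```

The FSA-input solution replaces the `mu - 1/g_i` rule with one derived from the mutual information-MMSE relationship, which is what avoids the power wastage and subcarrier nulling of this baseline.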
Their inherent broadcasting capabilities over very large geographical areas make satellite systems one of the most effective vehicles for multicast service delivery. Recent advances in spot-beam antennas and high-power platforms further accentuate the suitability of satellite systems as multicasting tools. The focus of this article is reliable multicast service delivery via geostationary satellite systems. Starburst MFTP is a feedback-based multicast transport protocol that is distinct from other such protocols in that it defers the retransmission of lost data until the end of the transmission of the complete data product. In contrast to other multicast transport protocols, the MFTP retransmission strategy does not interrupt fresh data transmission with retransmissions of older segments. Thanks to this feature, receivers enjoying favourable channel conditions do not suffer unnecessarily extended transfer delays due to receivers that experience bad channel conditions. Existing research studies on MFTP's performance over satellite systems assume fixed-capacity satellite uplink channels dedicated to individual clients on the return link. Such fixed-assignment uplink access mechanisms are considered too wasteful a use of uplink resources for the sporadic and thin feedback traffic generated by MFTP clients. Indeed, such mechanisms may prematurely limit the scalability of MFTP as the multicast client population grows. In contrast, the reference satellite system considered in this article employs demand-assignment multiple access (DAMA) with contention-based request signalling on the uplink. DAMA MAC (Medium Access Control) protocols in satellite systems are well known in the literature for their improved resource utilisation and scalability.
Moreover, DAMA forms the basis for the uplink access mechanisms in prominent satellite networks such as Inmarsat's BGAN (Broadband Global Area Network), and return link specifications such as ETSI DVB-RCS. However, in comparison with fixed-assignment uplink access mechanisms, DAMA protocols may introduce unpredictable delays for MFTP feedback messages on the return link. Collisions among capacity requests on the contention channel, temporary lack of capacity on the reservation channel, and random transmission errors on the uplink are the potential causes of such delays. This article presents the results of a system-level simulation analysis of MFTP over a DAMA GEO satellite system with contention-based request channels. Inmarsat's BGAN system was selected as the reference architecture for the analyses. The simulator implements the full interaction between the MFTP server and MFTP clients overlaid on top of the Inmarsat BGAN uplink access mechanism. The analyses aim to evaluate and optimise MFTP performance in the Inmarsat BGAN system in terms of transfer delay and system throughput as a function of available capacity, client population size, data product size, channel error characteristics, and MFTP protocol settings. Copyright © 2006 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
The paper presents studies of traffic management in the satellite ATM bridge developed in the RACE II project CATALYST (R2074). The project aims to develop a satellite ATM bridge that can support future B-ISDN services over satellite communications. An ATM bridge can interconnect the ATM network with existing networks such as DQDB, FDDI and Ethernet. These interfaces have different bit rates, hence the need to control traffic and manage resources on the bridge to prevent overload and preserve the quality of the services on the satellite. The satellite ATM bridge will be used in the initial interconnection of B-ISDN islands and in the development of B-ISDN for mobile telecommunications and direct broadcast services.
Satellite communication systems are a promising solution to extend and complement terrestrial networks in unserved or under-served areas, as reflected by recent commercial and standardization endeavors. In particular, 3GPP recently initiated a study item for new radio (NR), i.e., 5G, non-terrestrial networks, aimed at deploying satellite systems either as a stand-alone solution or as an integration with terrestrial networks in mobile broadband and machine-type communication scenarios. However, typical satellite channel impairments, such as large path losses, delays, and Doppler shifts, pose severe challenges to the realization of a satellite-based NR network. In this paper, based on the architecture options currently being discussed in the standardization fora, we discuss and assess the impact of the satellite channel characteristics on the physical and medium access control layers, both in terms of transmitted waveforms and procedures, for enhanced mobile broadband and narrowband-Internet of Things applications. The proposed analysis shows that the main technical challenges are related to the PHY/MAC procedures, in particular random access, timing advance, and hybrid automatic repeat request; depending on the considered service and architecture, different solutions are proposed.
A computationally simple cross-correlation based approach for improving timing estimation and frame detection performance in OFDM systems is proposed. The new technique requires only one pilot symbol, which can further be used for frequency-offset estimation by employing standard available techniques. Simulation results show that the proposed approach is more robust than state-of-the-art autocorrelation based approaches. Copyright © 2006 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
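The cross-correlation idea can be sketched in a few lines: correlate the received samples against the single known pilot symbol and take the peak of the metric as the frame-start estimate. This is a generic illustration, not the paper's exact algorithm; the pilot sequence, lengths and noise level below are all illustrative assumptions.

```python
import numpy as np

def cross_corr_timing(rx, pilot):
    """Slide the known pilot over the received samples and return the
    delay with the largest cross-correlation magnitude; sketch only."""
    L = len(pilot)
    metric = np.array([np.abs(np.vdot(pilot, rx[d:d + L]))
                       for d in range(len(rx) - L + 1)])
    return int(np.argmax(metric))

# Illustrative setup: a random unit-power pilot embedded in noise.
rng = np.random.default_rng(2)
pilot = (rng.standard_normal(64) + 1j * rng.standard_normal(64)) / np.sqrt(2)
rx = np.concatenate([np.zeros(50, complex), pilot, np.zeros(50, complex)])
rx += 0.1 * (rng.standard_normal(len(rx)) + 1j * rng.standard_normal(len(rx)))
print(cross_corr_timing(rx, pilot))  # peak near the true start (sample 50)
```

Because the metric compares against a known waveform rather than the signal's own periodicity, the peak is sharp, which is what gives cross-correlation its robustness edge over autocorrelation metrics.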
As we move from 5G to 6G networks, satellites will play an increasingly key role in providing coverage and resilience. Here we outline the timescales and some of the issues facing satellites in the 6G world.
In this paper, interference-aware radio resource management (RRM) algorithms are presented for the forward and return links of a geostationary orbit (GEO) high throughput satellite (HTS) communication system. For the feeder link, satellite-switched smart gateway diversity is combined with two scheduling methods to improve the feeder link availability in rain conditions. For the user link, interference-aware scheduling (IAS) for the forward link and scheduling based on multi-partite graph matching for the return link are shown to enable full frequency reuse (FR) in multi-beam satellite systems. The performance assessment of the scheduling algorithms is carried out in a system-level simulator with realistic channel models and system assumptions. The improvements in system capacity and user rates are evaluated.
Frequency sharing in satellite-terrestrial cellular networks can help to achieve increased capacity. However, it has the undesirable effect of generating increased interference due to the introduction of satellite-terrestrial interference paths. In this paper we investigate the factors affecting the carrier-to-interference power ratio (C/I) and the possibilities of achieving an optimum C/I in these networks. It is shown that a proper scaling of some relevant parameters such as EIRP and cell size would help to achieve C/I values comparable to those achievable in satellite-only and terrestrial-only cellular networks. © 2011 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
This chapter presents initial results available from the European Commission H2020 5G PPP Phase 2 project SaT5G (Satellite and Terrestrial Network for 5G) [1]. It specifically elaborates on the selected use cases and scenarios for satellite communications (SatCom) positioning in the 5G usage scenario of eMBB (enhanced mobile broadband), which appears the most commercially attractive for SatCom. After a short introduction to the satellite role in the 5G ecosystem and the SaT5G project, the chapter addresses the selected satellite use cases for eMBB by presenting their relevance to the key research pillars (RPs), to the 5G PPP key performance indicators (KPIs), to the 3rd Generation Partnership Project (3GPP) SA1 New Services and Markets Technology Enablers (SMARTER) use case families, and to the key 5G market verticals, together with an assessment of their market size. The chapter then continues by providing a qualitative high-level description of the multiple scenarios associated with each of the four selected satellite use cases for eMBB. Useful conclusions are drawn at the end of the chapter.
An introduction to self organizing cellular networks is presented. This topic has generated a lot of research interest over the past few years, as operators have identified it as a necessary feature in future wireless communication systems. We review projects which have studied self organization and, drawing on system model design knowledge from computing, suggest design rules for developing robust and efficient self organizing algorithms. We finally demonstrate a channel assignment example based on the concept of sectorial neighbours, where the system autonomously changes its allocation scheme based on external factors in the environment (e.g. geographical location, interfering sectors and demand for resources). Further research directions are also highlighted.
Multibeam satellite networks in Ka band have been designed to accommodate the increasing traffic demands expected in the future. However, these systems are spectrum limited due to the current spectrum allocation policies. This paper investigates the potential of applying cognitive radio techniques in satellite communications (SatCom) in order to increase the spectrum opportunities for future generations of satellite networks without interfering with the operation of incumbent services. These extra spectrum opportunities can potentially amount to 2.4 GHz of bandwidth in the downlink and 2 GHz of bandwidth in the uplink for high density fixed satellite services (HDFSS).
Future internet traffic demands are increasing dramatically year on year. Terrestrial systems are unable to satisfy these demands in all geographical areas, and thus broadband access by satellite is a key service provision platform. Considering the traffic demands, the raw capacity should approach a Terabit/s by 2020. The satellite communications network will be a star-based topology, where User Terminals (UT) from multiple beams communicate via central Gateway Earth Stations (GES). The return link from UT to satellite will use the DVB-RCS2 Multi-Frequency Time Division Multiple Access (MF-TDMA) transmission scheme in Ka band (30 GHz), while the return feeder link from satellite to GES will operate in Q band (40 GHz). Due to the generation of a large number of narrow user beams, interference becomes a limiting factor in the system's dimensioning. Herein, interference coordination schemes borrowed from terrestrial cellular systems are examined in terms of applicability and C/I performance. In addition, an algorithm for dynamic interference coordination is proposed to schedule the transmissions of the users in the time-frequency domain of the return link, aiming to improve the C/I. The performance of these schemes and the proposed algorithm is assessed over a 302 user beam satellite system with practical antenna radiation patterns.
This paper presents initial results available from the European Commission Horizon 2020 5G Public Private Partnership Phase 2 project "SaT5G" (Satellite and Terrestrial Network for 5G). After describing the concept, objectives, challenges, and research pillars addressed by the SaT5G project, this paper elaborates on the selected use cases and scenarios for satellite communications positioning in the 5G usage scenario of enhanced mobile broadband.
The continuous increase of traffic demands for satellite networks motivates the evolution of telecommunication satellite technology towards wider channels and multiple beam operation with frequency re-use across the coverage. This is made possible by the use of higher frequency bands. Recent research projects have investigated multi-beam coverages with more than 200 user beams operated in Ka band, to offer very large data throughputs over Europe. Since 2012, the European Commission project Broadband Access via integrated Terrestrial and Satellite systems (BATS) has explored a similar concept based on a dual satellite solution offering around 302 user beams over EU27 and Turkey, targeting the 2020 timeframe, see Figure 1. In all these systems, so as to maximize the user link capacity, the whole civil band allocated to Fixed Satellite Services (FSS) in Ka band (20/30 GHz) is dedicated to the user links. The feeder links thus have to be operated in another band. An attractive option is to rely on Q/V bands (30/40 GHz) to provide the gateway-to-satellite links. Despite the large bandwidth available in Q/V band (5 GHz in each direction), the very large aggregated user bandwidth requires several tens of gateways to be implemented to provide the necessary capacity.
The integration of Space Information Networks (SIN) with terrestrial infrastructures has been attracting significant attention in the context of 5G, where satellite communications can be leveraged for additional capabilities such as backhauling between the core network and remote mobile edge sites. However, the simple addition of SIN capabilities to terrestrial 5G does not automatically lead to enhanced service performance without systematic scheduling of the coexisting resources. In this article we focus on the scenario of multi-link video streaming over parallel Geostationary Earth Orbit (GEO) satellite and terrestrial 5G backhaul links for enhancing user Quality of Experience (QoE) and network efficiency. The distinct challenge is the complex optimization of scheduling video segment delivery via two parallel channels with very different characteristics while striving to enhance video quality and resource optimality. We carried out systematic experiments based on a real-life 5G testing framework with integrated GEO satellite and terrestrial backhaul links. The experimental results demonstrate the effectiveness of our proposed 5G edge computing based solution for holistically achieving assured user experiences and optimised network resource efficiency in terms of video traffic offloading.
In this article we examine the role of satellite communications in future telecommunication networks and service provision. Lessons from the past indicate that satellites are successful as a result of their wide area coverage or speed to market for new services. Niche areas such as coverage of air and sea will persist, but for land masses convergence of fixed, mobile, and broadcasting will dictate that the only way forward for satellites is in an integrated format with terrestrial systems. We outline future ways forward for satellites, and discuss the research challenges and technology advances needed to facilitate this integrated approach.
A robust and efficient technique for frame/symbol timing and carrier frequency synchronization in OFDM systems is presented. It uses a preamble consisting of only one training symbol with two identical parts to achieve reliable timing and frequency accuracy in the time domain, over a wide frequency estimation range which can be up to half of the signal sampling frequency. Also, it has a low complexity which is adaptive to the degree of channel distortion. Computer simulations in the Rayleigh fading ISI channel show that the proposed method achieves superior performance to existing techniques in terms of timing and frequency accuracy. Also, its operation in the time domain helps to achieve faster synchronization convergence.
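The two-identical-halves preamble structure underlying this family of techniques can be sketched as a generic autocorrelation timing metric. This is the well-known basic idea the abstract builds on, not the paper's exact low-complexity variant; the normalisation by the energy of both halves, and all parameter values, are illustrative choices.

```python
import numpy as np

def timing_metric(r, L):
    """Autocorrelation timing metric for a preamble whose two halves of
    length L are identical: correlate each window's first half with its
    second half and normalise by the energy of both halves. Sketch only."""
    N = len(r) - 2 * L
    M = np.zeros(N)
    for d in range(N):
        first = r[d:d + L]
        second = r[d + L:d + 2 * L]
        P = np.abs(np.vdot(first, second))   # half-to-half correlation
        R = 0.5 * (np.sum(np.abs(first) ** 2) + np.sum(np.abs(second) ** 2))
        M[d] = (P / (R + 1e-12)) ** 2
    return M

# Illustrative preamble: one training symbol made of two identical halves.
rng = np.random.default_rng(0)
L = 32
half = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) / np.sqrt(2)
signal = np.zeros(264, complex)
signal[100:100 + 2 * L] = np.tile(half, 2)
noisy = signal + 0.05 * (rng.standard_normal(264) + 1j * rng.standard_normal(264))
M = timing_metric(noisy, L)
print(int(np.argmax(M)))  # peaks at the preamble start (near sample 100)
```

The metric peaks where the two halves of the window match, which holds exactly at the preamble start regardless of any common frequency offset — the property that also makes the same preamble reusable for frequency estimation.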
This paper presents a signal-to-noise ratio (SNR) estimation algorithm for advanced Digital Video Broadcasting - Return Channel via Satellite (DVB-RCS) systems using adaptive coding and modulation (ACM) in the reverse link of broadband satellite systems. Due to the absence of a repetitive pilot symbol structure, SNR estimation has to be performed using the fixed symbol preamble data. Moreover, the sporadic nature of data traffic on the return link causes variations in the interference level from slot to slot and, therefore, the estimation has to be done within one traffic slot duration. Hence, it becomes necessary to use a combination of data-aided (DA) and decision-directed (DD) algorithms so as to make use of the traffic data. A non-data-aided (NDA) estimator that was previously proposed by the authors for binary phase shift keying (BPSK) and QPSK schemes is extended to 8-PSK in a decision-directed manner. This estimator shows improved performance over existing estimators. The inherent bias of the DD approach at low values of SNR is reduced by using a hybrid approach, i.e. using the proposed estimator at moderate/high values of SNR and the moments-based estimator (M2M4) at low values of SNR. The overall improved performance of the proposed hybrid estimator, in terms of accuracy and complexity, makes it an attractive choice for implementing ACM in advanced DVB-RCS systems.
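The moments-based (M2M4) estimator used at low SNR admits a compact closed form for constant-modulus (PSK) signals in complex AWGN: with M2 = E[|r|²] and M4 = E[|r|⁴], the signal power is √(2M2² − M4) and the noise power is the remainder of M2. The sketch below is the generic NDA estimator only, not the paper's hybrid DA/DD scheme; the QPSK test signal is an illustrative assumption.

```python
import numpy as np

def m2m4_snr(r):
    """Moments-based (M2M4) SNR estimate in dB for constant-modulus (PSK)
    signals in complex AWGN; generic NDA sketch."""
    m2 = np.mean(np.abs(r) ** 2)
    m4 = np.mean(np.abs(r) ** 4)
    s = np.sqrt(max(2 * m2 ** 2 - m4, 0.0))   # signal-power estimate
    n = max(m2 - s, 1e-12)                    # noise-power estimate
    return 10 * np.log10(s / n)

# Unit-power QPSK at a known true SNR of 10 dB.
rng = np.random.default_rng(1)
snr_db, n_sym = 10.0, 200_000
idx = rng.integers(0, 4, n_sym)
symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * idx))
noise_pow = 10 ** (-snr_db / 10)
noise = np.sqrt(noise_pow / 2) * (rng.standard_normal(n_sym)
                                  + 1j * rng.standard_normal(n_sym))
est = m2m4_snr(symbols + noise)
print(round(est, 2))   # close to the true 10 dB
```

Because it needs no symbol decisions, the estimator is bias-free even when decisions would be unreliable, which is exactly why it is the preferred fallback at low SNR.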
A low complexity time-domain channel estimation scheme for OFDM systems is presented. It uses a training symbol (preamble) to estimate the channel impulse response (CIR) in the presence of timing synchronization errors. Its computational complexity is much lower than that of the popular MMSE technique and this can be further reduced if there is prior knowledge of the channel delay spread. Consequently, channel delay spread estimation is also addressed by using threshold setting on the optimized CIR. Computer simulations show that the proposed scheme achieves near-ideal accuracy in quasi-static channels.
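One common low-complexity way to obtain a time-domain CIR estimate from a single training symbol is circular correlation with a constant-amplitude preamble whose periodic autocorrelation is ideal, after which only the taps within the assumed delay spread are kept. This sketch illustrates that generic idea, not the paper's exact scheme; the Zadoff-Chu preamble, channel taps and tap count are illustrative assumptions.

```python
import numpy as np

def estimate_cir(rx, preamble, n_taps):
    """Time-domain CIR estimate by circular correlation with a known
    preamble that has an ideal periodic autocorrelation; keeping only
    n_taps taps implements the delay-spread truncation. Sketch only."""
    N = len(preamble)
    corr = np.fft.ifft(np.fft.fft(rx) * np.conj(np.fft.fft(preamble)))
    return corr[:n_taps] / N

# Zadoff-Chu preamble (odd length, root coprime with N): CAZAC sequence
# whose circular autocorrelation is a delta, so correlation = CIR.
N, root = 63, 5
n = np.arange(N)
zc = np.exp(-1j * np.pi * root * n * (n + 1) / N)

h = np.array([1.0, 0.5 - 0.3j, 0.2j, 0.0, 0.1])       # example multipath CIR
rx = np.fft.ifft(np.fft.fft(zc) * np.fft.fft(h, N))   # circular convolution
h_hat = estimate_cir(rx, zc, n_taps=8)
print(np.allclose(h_hat[:len(h)], h, atol=1e-8))      # recovers the CIR
```

Truncating to `n_taps` is where prior knowledge of the delay spread pays off: fewer retained taps mean both lower complexity and less noise in the estimate, which motivates the threshold-based delay spread estimation the abstract mentions.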
The rapid development of modern communication services results in high data rate requirements from the end user. It is challenging to meet these requirements because of prevailing issues such as spectrum scarcity and spectrum underutilization due to the fixed spectrum assignment policy. Cognitive Radio (CR), being the enabler of dynamic spectrum management techniques, has the capability to tackle these issues by proficiently implementing spectrum sharing schemes using Multicarrier Modulation (MCM) techniques. In a CR system, where the Primary User (PU) and the Secondary User (SU) co-exist in the same frequency band, mutual interference (i.e., from SU to PU and vice versa) is a limiting factor on the achievable capacity of both the PU and the SU. Power allocation in MCM based CR systems aims to dynamically control the transmit power on each subcarrier of the SU in order to reduce the mutual interference. Furthermore, combining multiple antennas with MCM is regarded as a very attractive solution for CR communications to effectively enhance the data rate without demanding additional bandwidth and transmit power.
The emerging 5G wireless networks will pose extreme requirements such as high throughput and low latency. Caching, as a promising technology, can effectively decrease latency and provide customized services based on group user behaviour (GUB). In this paper, we carry out an energy efficiency analysis in cache-enabled hyper cellular networks (HCNs), where macro cells and small cells (SCs) are deployed heterogeneously with the control and user plane (C/U) split. Benefiting from the assistance of macro cells, a novel access scheme is proposed according to both user interest and fairness of service, where the SCs can turn into semi-sleep mode. Expressions for the coverage probability, throughput and energy efficiency (EE) are derived analytically as functions of key parameters, including the cache ability, search radius and backhaul limitation. Numerical results show that the proposed scheme in HCNs can increase the network coverage probability by more than 200% compared with single-tier networks. The network EE can be improved by 54% compared with the nearest access scheme, with a larger search radius and higher SC cache capacity under lower traffic load. Our performance study provides insights into the efficient use of caching in 5G software defined networking (SDN).
The demand for wider bandwidths has motivated the need for wireless systems to migrate to higher frequency bands. In line with this trend is an envisaged deployment of Ka-band (or mmWave) cellular infrastructure. Further, to improve spectral efficiency, the development of full-duplex radio transceivers is gaining momentum. In view of this move, the paper proposes the possibility of reusing the satellite feeder uplink band in full-duplex small cells. The motivation for such a reuse is twofold: (a) there is virtually no interference from the small cells to the incumbent in-orbit satellite receiver, and (b) directive feeder antennas, possibly with additional isolation and processing, cause negligible interference to the small cells. The presented interference analysis clearly supports the proposed coexistence.
To reach a terabit per second of throughput, telecommunication satellites cannot rely only on frequencies below Ka band. Therefore, the use of a broad portion of the spectrum available in Q/V band (40/50 GHz) is foreseen for the feeder link. This study evaluates the performance of different macro-diversity schemes that may mitigate the deep fades experienced in Q/V bands by introducing cooperation and a limited redundancy between the different gateways of the system. Two different solutions are first described, and the resulting performances are then derived.
Satellites will play an indispensable part in 5G roll out, and the common use of the new radio (NR) air interface will enable this. Satellite-terrestrial integration requires adaptations to the existing NR standards and demands further study on the potential areas of impact. From a physical layer perspective, the candidate waveform has a critical role in addressing design constraints to support non-terrestrial networks (NTN). In this paper, the adaptability of frequency-localized orthogonal frequency division multiplexing (OFDM)-based candidate waveforms and solutions is discussed in the context of the physical layer attributes of non-linear satellite channel conditions. The performance of the new air interface waveforms is analysed in terms of spectral confinement, peak-to-average power ratio (PAPR), power amplifier efficiency, robustness against non-linear distortions, and carrier frequency offset (CFO).
In this paper, we present a satellite-integrated 5G testbed that was produced for the EU-commissioned Satellite and Terrestrial Networks for 5G (SaT5G) project. We first describe the testbed's 3GPP Rel. 15/6-compliant mobile core and radio access network (RAN) that have been established at the University of Surrey. We then detail how satellite NTN UE and gateway components were integrated into the testbed using virtualization and software-defined orchestration. The satellite element provides 5G backhaul, which in concert with the terrestrial/mobile segment of the testbed forms a fully integrated end-to-end (E2E) 5G network. This hybrid 5G network exercised and validated the four major use cases defined within the SaT5G project: cell backhaul, edge delivery of multimedia content, multicast and caching for media delivery, and multilinking using satellite and terrestrial links. In this document, we describe the MEC implementations developed to address each of the aforementioned use cases and explore how each MEC system integrates into the 5G network. We also provide measurements from trials of the use cases over a live GEO satellite system and indicate in each case the improvements that result from the use of satellite in the 5G network.
In Orthogonal Frequency Division Multiplexing (OFDM) based cognitive radio systems, power optimization algorithms have been evaluated to maximize the achievable data rates of the Secondary User (SU). However, unrealistic assumptions are made in the existing work, i.e., a Gaussian input distribution and a traditional interference model that assumes a frequency division multiplexing modulated Primary User (PU) with perfect synchronization between the PU and the SU. In this paper, we first derive a practical interference model by assuming an OFDM modulated PU with imperfect synchronization. Based on the new interference model, the power optimization problem is proposed for the Finite Symbol Alphabet (FSA) input distribution (i.e., M-QAM), as used in practical systems. The proposed scheme is shown to save transmit power and to achieve higher data rates compared to the Gaussian optimized power allocation and the uniform power loading schemes. Furthermore, a theoretical framework is established in this paper to estimate the power saving by evaluating the optimal power allocation for the Gaussian and the FSA inputs. Our theoretical analysis is verified by simulations and shown to be accurate. It provides guidance for the system design and gives deeper insights into the choice of parameters affecting power saving and rate improvement.
One of the key performance targets on the European Commission's Digital Agenda is to provide at least 30‐Mbit/s broadband coverage to all European households by 2020. The deployment of existing terrestrial technologies will not be able to satisfy the requirements in the most difficult‐to‐serve locations, either due to a lack of coverage in areas where the revenue potential for terrestrial service providers is too low or due to technological limitations that diminish the available throughput in rural environments. In this paper, we investigate a hybrid broadband system combining satellite and terrestrial access networks. The system design and the key building blocks of the intelligent routing entities (referred to as intelligent gateways) are presented. To evaluate the hybrid broadband system's performance subjectively, lab trials have been performed with an integrated multiple access network emulator and a variety of typical multimedia applications that have varying requirements. The results of the lab trials suggest that the quality of experience is consistently improved thanks to the utilisation of intelligent gateway devices, when compared with using a single access network at a time.
In Cognitive Radio (CR) systems, the data rate of the Secondary User (SU) can be maximized by optimizing the transmit power, given a threshold for the interference caused to the Primary User (PU). In conventional power optimization algorithms, a Gaussian input distribution is assumed, which is unrealistic, whereas the Finite Symbol Alphabet (FSA) input distribution (i.e., M-QAM) is more applicable to practical systems. In this paper, we consider the power optimization problem in multiple input multiple output orthogonal frequency division multiplexing based CR systems given FSA inputs, and derive an optimal power allocation scheme by capitalizing on the relationship between mutual information and minimum mean square error. The proposed scheme is shown to save transmit power compared to its conventional counterpart. Furthermore, our proposed scheme achieves a higher data rate compared to the Gaussian optimized power due to fewer subcarriers being nulled. The proposed optimal power algorithm is evaluated and compared with the conventional power allocation algorithms using Monte Carlo simulations. Numerical results reveal that, for distances between the SU transmitter and the PU receiver ranging from 50 m to 85 m, the transmit power saving with the proposed algorithm is in the range of 13-90%, whereas the rate gain is in the range of 5-31%, depending on the modulation scheme (i.e., BPSK, QPSK or 16-QAM) used.
The design of an efficient radio resource management (RRM) strategy has become a key technique for next-generation wireless systems to provision multimedia data services with appropriate QoS. In this paper, the authors propose a novel cross-layer joint priority (CLJP) packet scheduling scheme for QoS provisioning of multimedia traffic in satellite broadcast/multicast systems. The scheme incorporates both the application and transport layers' QoS requirements and the RLC layer's queuing status into MAC layer packet scheduling decisions, aiming to guarantee diverse QoS demands and achieve efficient resource allocation subject to power/resource constraints. In comparison with existing schemes, the simulation results show that the proposed scheme achieves better QoS and fairness, and enhances the throughput/channel utilisation. © 2007 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
This paper addresses the smart gateway diversity scheme to be employed in future High Throughput Satellite (HTS) systems at Ka, Q/V bands or above. The availability of a network of N gateways is presented both with P redundant gateways and with N active multiplexed gateways. A new time-switched architecture is compared with the more conventional frequency-division system for the N-active scheme, while retaining transparent satellite transponders. In addition, resource allocation algorithms are compared end to end, which maximize the throughput and balance the traffic amongst the user beams when gateway links are degraded by rain. The time-switched gateway approach allows simplified satellite payloads and a reduced ground segment size, and hence cost.
Adaptation of TCP for satellite links has been the subject of many research studies in the last decade. In many such studies, Performance Enhancing Proxies (PEPs) and modified TCP congestion control algorithms have been reported to considerably improve TCP performance over satellite systems. In parallel to Transport layer developments, Link layer mechanisms as well as satellite hardware design are also evolving, providing more reliable transmission and higher capacity utilisation. Under the new light of these evolutions, this paper presents a simulation campaign, comparing the performance of different TCP variants over a satellite network using PEPs. The second contribution of this paper is the performance assessment of a novel TCP congestion control mechanism based on cross-layer design. Cross-layer adaptation is a relatively new idea in network design, based on the vertical integration of protocol layers. Simulation results for this mechanism demonstrate the strong relationship between the Transport and Link layers and illustrate the benefits of joint analysis and design. © 2007 by University of Surrey.
In this paper, we investigate and compare the performance of the IEEE mobile WiMAX and the 3GPP LTE terrestrial air interfaces over GEO satellite links with amplifier nonlinearity and mobile wideband fading, in order to have an insight into which would be a better candidate for ubiquitous mobile broadband delivery. Both air-interfaces are compared in terms of achievable single-link capacities for a 5MHz channel. Link-level computer simulations are also carried out to establish their block-error-rate performance. Although both standards share a lot of similarity, simulation results show that they can differ in performance as a result of their subcarrier multiplexing schemes and the variants of OFDMA implemented. © 2010 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
The explosive growth of various services boosts innovation and development in terrestrial communication systems for the implementation of next generation mobile communication networks. However, the limited resources of terrestrial communication networks make it difficult to support massive quality of service (QoS)-aware requirements, and it is hard to guarantee seamless coverage in far remote regions. Leveraging the intrinsic merits of high altitude and the ability of multicasting or broadcasting, satellite communication systems provide an opportunity for novel mobile communication networks through their tight interaction and complementary characteristics with traditional terrestrial networks. It is believed that the convergence of satellite and terrestrial networks can solve the problems existing in current mobile communication systems and have a profound effect on global information dissemination. In this paper, we present a comprehensive survey on the convergence of satellite and terrestrial networks. First, the motivations and requirements of satellite-terrestrial network convergence are identified. Then, we summarize the architectures proposed in the existing literature, present a taxonomy of research on satellite-terrestrial networks, and review performance evaluation work on different satellite-terrestrial networks. After that, the state of the art in standardization, projects and the key application areas of satellite-terrestrial networks are also reviewed. Finally, we conclude the survey by highlighting the open issues and future directions.
This article surveys the literature of the last decade on the emerging field of self organisation as applied to wireless cellular communication networks. Self organisation has been extensively studied and applied in ad hoc networks, wireless sensor networks and autonomic computer networks; however, in the context of wireless cellular networks, this is the first attempt to put the various efforts in perspective in the form of a tutorial/survey. We provide a comprehensive survey of the existing literature, projects and standards in self organising cellular networks. Additionally, we aim to present a clear understanding of this active research area, identifying a clear taxonomy and guidelines for the design of self organising mechanisms. We compare the strengths and weaknesses of existing solutions and highlight the key research areas for further development. This paper serves as a guide and a starting point for anyone willing to delve into research on self organisation in wireless cellular communication networks.
We consider, in this paper, the maximization of throughput in a dense network of collaborative cognitive radio (CR) sensors with limited energy supply. In our case, the sensors are mixed varieties (heterogeneous) and are battery powered. We propose an ant colony-based energy-efficient sensor scheduling algorithm (ACO-ESSP) to optimally schedule the activities of the sensors to provide the required sensing performance and increase the overall secondary system throughput. The proposed algorithm is an improved version of the conventional ant colony optimization (ACO) algorithm, specifically tailored to the formulated sensor scheduling problem. We also use a more realistic sensor energy consumption model and consider CR networks employing heterogeneous sensors (CRNHSs). Simulations demonstrate that our approach improves the system throughput efficiently and effectively compared with other algorithms.
This paper investigates adaptive implementation of the linear minimum mean square error (MMSE) detector in code division multiple access (CDMA). From linear algebra, Cimmino's reflection method is proposed as a possible way of achieving the MMSE solution blindly. Simulation results indicate that the proposed method converges four times faster than the blind least mean squares (LMS) algorithm and has roughly the same convergence performance as the blind recursive least squares (RLS) algorithm. Moreover, the proposed algorithm is numerically more stable than the RLS algorithm and also exhibits parallelism for pipelined implementation. © 2009 IEEE.
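For readers unfamiliar with it, Cimmino's method solves a linear system by averaging the projections of the current iterate onto the hyperplanes defined by each equation; the averaging over rows is what gives the parallelism noted above. A minimal non-blind sketch of the classical iteration, assuming a known system matrix (the paper's blind CDMA formulation instead works from received-signal statistics; names and the relaxation factor are illustrative):

```python
import numpy as np

def cimmino(A, b, iters=2000, lam=1.0):
    """Cimmino iteration for A x = b: each step moves the iterate by
    the average of its corrections toward the hyperplanes
    a_i . x = b_i, scaled by a relaxation factor lam in (0, 2)."""
    m, n = A.shape
    x = np.zeros(n)
    row_norms = np.sum(A * A, axis=1)        # ||a_i||^2 per row
    for _ in range(iters):
        residuals = b - A @ x                # b_i - a_i . x, all rows at once
        x = x + (lam / m) * (A.T @ (residuals / row_norms))
    return x
```

Because every row's correction is computed independently before being averaged, the per-iteration work maps naturally onto a pipelined or parallel implementation.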
An improved time diversity technique is proposed for deploying the 3GPP LTE air-interface over satellite mobile links. It overcomes the FEC performance loss resulting from the use of a short transmit-time-interval (TTI) in LTE. By using transport block expansion, code block segmentation, advanced rate matching and automatic retransmissions, the FEC coded block is split over as many TTIs as is necessary to break the channel correlation without any reduction in the user data rate.
Satellite communication has recently been included as one of the key enabling technologies for 5G backhauling, especially for the delivery of bandwidth-demanding enhanced mobile broadband (eMBB) applications in 5G. In this paper, we present a 5G-oriented network architecture that is based on satellite communications and multi-access edge computing to support eMBB applications, which is investigated in the EU 5GPPP phase-2 satellite and terrestrial network for 5G project. We specifically focus on using the proposed architecture to assure quality-of-experience (QoE) of HTTP-based live streaming users by leveraging satellite links, where the main strategy is to realize transient holding and localization of HTTP-based (e.g., MPEG-DASH or HTTP live streaming) video segments at 5G mobile edge while taking into account the characteristics of satellite backhaul link. For the very first time in the literature, we carried out experiments and systematically evaluated the performance of live 4K video streaming over a 5G core network supported by a live geostationary satellite backhaul, which validates its capability of assuring live streaming users' QoE under challenging satellite network scenarios.
An improved method for estimating the frame/symbol timing offset in preamble-aided OFDM systems is presented. It uses a conventional preamble structure and combines autocorrelation techniques with restricted cross-correlation to achieve a near-ideal timing performance without significant increase in complexity. Computer simulations show that the method is robust in both AWGN and fading multipath channels, achieving better performance than the existing methods.
This paper proposes a meshed Very Small Aperture Terminal (VSAT) satellite communications network which uses an On Board Processing (OBP) satellite with spot beams and cell switching capabilities. A novel approach is used for maximizing the bandwidth utilization of the satellite by performing statistical multiplexing on board the satellite. MF-TDMA is used as the satellite multiple access protocol since it takes advantage of the flexibility and statistical multiplexing capabilities of ATM. Finally, the cell loss resulting from the limited bandwidth of the satellite link can be prevented by effective traffic control functions. A preventive control scheme has been used for this purpose. The Leaky Bucket (or Generic Cell Rate Algorithm), used for Usage Parameter Control (UPC), checks the source traffic parameters for conformance with the traffic contract. Furthermore, a rate-based flow control is used to control ABR services. The results of the performance analysis of the proposed system indicate that a large increase in satellite bandwidth utilization can be achieved compared to circuit-switched satellite systems.
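The Leaky Bucket / Generic Cell Rate Algorithm mentioned above has a standard "virtual scheduling" formulation (from the ATM traffic management specifications): each cell is checked against a theoretical arrival time (TAT), and cells arriving earlier than TAT minus the tolerance are non-conforming. A compact sketch, with time units and parameter names chosen for illustration:

```python
def gcra(arrivals, increment, limit):
    """Virtual-scheduling GCRA. increment (T) = 1/contracted rate,
    limit (tau) = cell delay variation tolerance. Returns one
    conforming/non-conforming flag per arrival time."""
    tat = 0.0                        # theoretical arrival time
    flags = []
    for t in arrivals:
        if t < tat - limit:
            flags.append(False)      # cell too early: non-conforming
        else:
            flags.append(True)       # conforming: advance the TAT
            tat = max(t, tat) + increment
    return flags
```

With a zero tolerance, cells spaced at exactly the contracted interval all conform, while a back-to-back burst is policed down to the contracted rate.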
In the context of orthogonal frequency division multiplexing (OFDM)-based systems, pilot-based beamforming (BF) exhibits a high degree of sensitivity to the pilot sub-carriers. Increasing the number of reference pilots significantly improves BF performance as well as system performance. However, this increase comes at the cost of data throughput, which inevitably shrinks due to transmission of additional pilots. Hence an approach where reference signals available to the BF process can be increased without transmitting additional pilots can exhibit superior system performance without compromising throughput. Thus, the authors present a novel three-stage iterative turbo beamforming (ITBF) algorithm for an OFDM-based hybrid terrestrial-satellite mobile system, which utilises both pilots and data to perform interference mitigation. Data sub-carriers are utilised as virtual reference signals in the BF process. Results show that when compared to non-iterative conventional BF, the proposed ITBF exhibits bit error rate gain of up to 2.5 dB with only one iteration.
The close synergy between a satellite system and terrestrial mobile cellular networks for cost-efficient content delivery to mobile users lies at the core of the satellite digital multimedia broadcasting system concept. Having already rich research and design work behind it, much of which has been carried out in the framework of European R&D projects, the system has recently seen the end of its first field trials, which relied on an experimental platform representing the future operational system. The article reports the outcomes of these trials that confirmed the technical feasibility of the system concept and provided valuable hints for the subsequent system design stages.
It has been recognised that satellites can play a very important role in supporting B-ISDN services based on ATM technology. There have been several projects to exploit ATM over satellite for broadband services since 1992. These include the European RACE II CATALYST project, which developed a satellite ATM demonstrator, and the EPSRC project which studied the interconnection of broadband ATM islands via satellite. In a broadband network environment, ATM over satellite can be used for inter-network connections as a transit link and for terminal access as an access link. For a transit link, a small number of earth stations require a high bit rate link; static bandwidth reservation based on estimated fixed rates provides a simple solution. For terminal access, however, a large number of terminals require low bit rate links. Since the traffic is expected to have large fluctuations, a dynamic reservation system is a more efficient but more complex solution. Dynamic reservation Time Division Multiple Access (TDMA) appears to be the best solution, as it takes advantage of the flexibility and statistical multiplexing capabilities of ATM and supports all traffic classes. The paper shows that ATM over satellite can implement a flexible and efficient bandwidth resource management mechanism which allows the satellite link to be configured to meet the requirements of broadband services from low bit rate to high bit rate. © 1996 The Institution of Electrical Engineers. Printed and published by the IEE.
The ambitious fifth generation (5G) cellular system requirements and performance targets motivated standardisation bodies to consider wide bandwidth allocations for 5G in the mm-wave band. Nevertheless, parts of the considered band are already allocated to satellite services in several regions. We tackle this challenge by proposing a co-existence framework for 5G and fixed satellite services (FSS). We focus on the uplink of both systems and consider realistic 5G deployment scenarios with multiple users and multiple radio access network (RAN) cells. We propose a generic and controllable co-existence constraint applicable to different 5G numerologies and configurations. In addition, we derive a protection distance to guarantee the co-existence constraint and utilise several 5G system features to define soft constraints. The 5G/FSS co-existence is investigated based on the performance of the 5G user plane. Simulation results show that the 5G deployment scenario is a key factor in setting the protection distance. In addition, the FSS elevation has a significant effect on the identified distance. The results suggest that both systems can operate in the same band without a very large protection distance at a controllable expense of a small, e.g., 1% - 5%, performance loss.
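The notion of a protection distance can be illustrated with a free-space-only link budget: the minimum separation is the distance at which the interferer's EIRP, after path loss and the victim's antenna gain, falls to the tolerable interference level. This sketch deliberately ignores clutter, terrain and antenna discrimination, all of which the scenario-based analysis above would account for; the function names and parameter values are assumptions:

```python
import math

def fspl_db(d_km, f_mhz):
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

def protection_distance_km(eirp_dbm, g_rx_dbi, i_max_dbm, f_mhz):
    """Smallest separation at which the received interference
    eirp + g_rx - FSPL(d) drops to the tolerable level i_max.
    Solves eirp + g_rx - fspl(d) = i_max for d in closed form."""
    fspl_needed = eirp_dbm + g_rx_dbi - i_max_dbm
    return 10 ** ((fspl_needed - 32.44 - 20 * math.log10(f_mhz)) / 20)
```

Raising the tolerable interference threshold (the "controllable co-existence constraint") directly shrinks the required distance, which is the trade-off the abstract quantifies as a small user-plane performance loss.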
Network densification is envisioned as the key enabler for the 2020 vision that requires cellular systems to grow in capacity by hundreds of times to cope with the unprecedented traffic growth trends witnessed since the advent of broadband on the move. However, increased energy consumption and the complex mobility management associated with network densification remain the two main challenges to be addressed before further densification can be exploited on a wide scale. In the wake of these challenges, this paper proposes and evaluates a novel dense network deployment strategy for increasing the capacity of future cellular systems without sacrificing energy efficiency or compromising mobility performance. Our deployment architecture consists of smart small cells, called cloud nodes, which provide data coverage to individual users on a demand basis while taking into account the spatial and temporal dynamics of user mobility and traffic. The decision to activate the cloud nodes, such that certain system-level performance objectives are targeted, is carried out by the overlaying macrocell based on a fuzzy-logic framework. We also compare the proposed architecture with a conventional macrocell-only deployment and a pure microcell-based dense deployment in terms of blocking probability, handover probability and energy efficiency, and discuss and quantify the trade-offs therein.
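The fuzzy-logic activation decision can be caricatured with a two-input Mamdani-style rule such as "activate a cloud node when macrocell load is HIGH and user speed is LOW" (slow or static users benefit most from small cells). The membership shapes, inputs and 0.5 threshold below are purely illustrative assumptions, not the calibrated framework of the paper:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c], peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def activate_cloud_node(load, speed_kmh):
    """Toy rule: fire when load is HIGH AND speed is LOW.
    Mamdani AND is taken as min; defuzzification is a simple threshold."""
    high_load = tri(load, 0.4, 1.0, 1.6)        # load normalised to [0, 1]
    low_speed = tri(speed_kmh, -30.0, 0.0, 30.0)
    return min(high_load, low_speed) > 0.5
```

The appeal of the fuzzy approach is that soft, overlapping categories like "high load" avoid the ping-pong activation that a hard threshold on raw measurements would cause.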
In this paper, we discuss and analyse an architecture of a future multibeam High Throughput Satellite (HTS) system capable of offering greater flexibility to both feeder links and user links, allowing the resources to be allocated more efficiently and resulting in a system optimised holistically. This architecture considers a time switched payload. Beam hopping is used in the user links and a similar form in the feeder links. A user beam is served by a number of gateways and not just one. This scheme allows the progressive deployment of gateways in the system, without compromising the service or the coverage of the system.
Simultaneous frequency sharing in integrated satellite and terrestrial networks can help to achieve increased network capacity based on an efficient reuse of frequency resources in a way that minimizes interference. In this paper, we investigate the use of the 2 GHz MSS frequency allocation for satellite and terrestrial mobile services based on LTE. Results show that uplink frequency sharing is not feasible due to the predominant terrestrial interference into the satellite service. On the other hand, downlink frequency sharing is feasible for pedestrian microcells and vehicular macrocells but not for the wider rural cells. In this regard, a satellite multimedia broadcast service can be deployed simultaneously with a two-way terrestrial mobile service in urban and suburban areas. Furthermore, dynamic radio resource allocation combined with interference coordination can be used to optimize performance throughout the integrated network.
This paper analyses the New Radio (NR) air interface waveforms and numerologies in the context of current activities and studies of 3GPP related to the feasibility and standardisation of necessary adaptations for the 5G NR to support integrated satellite-terrestrial networks with low earth orbit (LEO) satellites. Frequency-localized orthogonal frequency division multiplexing (OFDM)-based candidate waveforms are recommended by 3GPP as the waveforms for the NR in order to preserve the advantages of OFDM as well as maintain backward compatibility. 5G New Radio enables diverse service support, efficient synchronization and channel adaptability using a multi-numerology concept, which defines a family of parameters of the parent waveform that are related to each other by scaling. The major design challenges in the LEO satellite scenario are a power-limited link budget and high Doppler effects, which can be addressed by choosing waveforms with a small peak to average power ratio (PAPR) and by sub-carrier bandwidth adaptation, respectively. Hence, the selection of the right waveform and numerology is of prime relevance for the proper adaptation of 5G NR for LEO satellite-terrestrial integration. The performance evaluation of the new air interface waveforms, with different numerologies, is carried out under the effects of carrier frequency offset (CFO), multipath, non-linearity, phase noise and additive white Gaussian noise (AWGN).
Broadband access by satellite in Ka band is currently constrained by spectrum availability. In this context, the EU FP7 project CoRaSat is examining the possible ways in which improved frequency utilization could be achieved in Ka band via the use of cognitive mechanisms. A database approach, utilizing spectrum-sharing scenarios between Fixed Satellite Services (FSS), Fixed Services (FS) and Broadcast Satellite Service (BSS) feeder links, is considered. Interference in the downlink from BSS and FS is evaluated using real databases and propagation models. Database statistics for several EU countries are also evaluated. The importance of using correct terrain profiles and accurate propagation models is shown. For the case of BSS interference to the FSS downlink, it is demonstrated that for the UK an area of less than 2% is affected, and thus the additional 400 MHz spectrum band (17.3 to 17.7 GHz) can be used by FSS over the majority of the country. The operational challenges of the database approach across the EU are also discussed.
The global telecommunication market aims to fulfil future ubiquitous coverage and rate requirements by integrating terrestrial communications with multiple spot beam high throughput satellites (HTS). In this paper a new scheme is proposed to connect multiple Low Earth Orbit (LEO) satellites in a constellation to a single gateway to support integrated satellite-terrestrial networks. A single gateway with multiple steerable antenna arrays is proposed to reduce gateway numbers and cost. Using a power allocation strategy, the target is to maximize the gateway link capacity of the HTS LEO satellites, in line with the feasible 3GPP use case studies for the necessary adaptations in a 5G system. Firstly, an objective function is established to find the optimal power levels required. Secondly, the interference from neighbouring satellite beams is considered to achieve maximum capacity. Mathematical formulations are developed for this non-convex problem. Simulation results show that the proposed system architecture improves capacity and meets the dynamic demand better than traditional methods.
In broadcast wireless networks, the options for reliable delivery are limited when there is no return link or a return link is not deemed cost-efficient due to the system resource requirements it introduces. In this paper we focus our attention on two reliable transport mechanisms that become relevant for the non real time delivery of files: packet-level Forward Error Correction (FEC) and data carousels. Both techniques perform error recovery at the expense of redundant data transmission and content repetition respectively. We demonstrate that their joint design may lead to significant resource savings.
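Packet-level FEC in its simplest form adds one parity packet per block of k source packets, from which any single lost packet can be rebuilt; practical broadcast systems use stronger codes (e.g., Reed-Solomon or Raptor), but the XOR case shows the principle behind trading redundant transmission for reliability without a return link. A self-contained sketch with illustrative names:

```python
def xor_parity(packets):
    """Single-parity packet-level FEC: the repair packet is the
    byte-wise XOR of the k equal-length source packets."""
    parity = bytearray(len(packets[0]))
    for pkt in packets:
        for i, byte in enumerate(pkt):
            parity[i] ^= byte
    return bytes(parity)

def recover(received, parity):
    """Rebuild the single missing packet: XOR the parity with
    every surviving source packet."""
    missing = bytearray(parity)
    for pkt in received:
        for i, byte in enumerate(pkt):
            missing[i] ^= byte
    return bytes(missing)
```

A data carousel, by contrast, simply re-broadcasts the same block periodically; the joint design discussed above tunes how much of the repair budget goes to parity versus repetition.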
This article focuses on link- and network-layer mobility issues in vehicular DVB-RCS satellite networks. The article presents a discussion of these topics before proposing a position-based spotbeam handover mechanism in some detail. The mechanism relies on regular position updates being available at mobile terminals. The NCC is responsible for making spotbeam handover decisions according to handover need information from mobile terminals, with the objective of balancing the load and maximising overall utilisation. Particular emphasis has been placed on signalling issues. Backward-compatible, lightweight, add-on mechanisms are crucial for the commercial viability of a mobile DVB-RCS. Similarly, accurate modelling of existing DVB-RCS signalling mechanisms is necessary to reach useful conclusions regarding the performance of new handover mechanisms. The article briefly presents our ns-based DVB-RCS simulator and presents the performance analyses of our spotbeam handover mechanism in terms of failure rate and delay. © 2007 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
The design of efficient radio resource management (RRM) is crucial for the multimedia content delivery in the satellite digital multimedia broadcasting (SDMB) system. In this paper, a novel packet scheduling scheme with a new power allocation algorithm has been proposed, and its performance has been evaluated via simulations. In comparison with existing schemes, the proposed strategy achieves significant performance improvement, especially on delay and jitter. Copyright © 2006 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
This paper reports on satellite beacon measurements at 12.7 GHz from late October 2010 until the end of June 2012 at two geographically separated Arqiva Teleport sites in Southern England, undertaken to obtain propagation data for use in operational decision making and to improve the scientific understanding of such effects. These sites operate at very low elevation angles (4.1 and 5.3 degrees) and therefore experience significant propagation effects. Measurements are ongoing to achieve meaningful long-term statistics; however, there is value in comparing the results obtained at both sites over the common measurement period to date (approximately 20 months). The measurements are taken at half-second intervals and statistical analysis is applied. The primary objective of this work is to assess the influence of tropospheric scintillation and rain attenuation from an operational perspective and, to a lesser extent, from an academic viewpoint; consequently, the paper focuses on scintillation, with some lesser material on rain fade effects. The measurements and analysis have been performed as a collaborative effort between Arqiva and staff at the University of Surrey. © 2009-2013 - IOS Press and the authors. All rights reserved.
The paper addresses the TCP performance enhancing proxy techniques broadly deployed in wireless networks. Drawing on available models for TCP latency, we describe an analytical model for the latency and the buffer requirements related to the split-TCP mechanism. Although the model applicability is broad, we present and evaluate the model in the context of geostationary satellite networks, where buffering requirements may become more dramatic. Simulation results are compared with the analytical model estimates and show that the model captures the impact of various parameters affecting the dynamics of the component connections traversing the terrestrial and the satellite network.
This paper investigates radio resource management (RRM) issues in a heterogeneous macro-femto network. The objective of femto deployment is to improve the coverage, capacity, and experienced quality of service of indoor users. The location and density of user-deployed femtos are not known a priori, which makes interference management crucial. In particular, with co-channel allocation (to improve resource utilization efficiency), RRM becomes involved because of both cross-layer and co-layer interference. In this paper, we review the resource allocation strategies available in the literature for heterogeneous macro-femto networks. Then, we propose a self-organized resource allocation (SO-RA) scheme for an orthogonal frequency division multiple access based macro-femto network to mitigate co-layer interference in the downlink transmission. We compare its performance with existing schemes such as Reuse-1, adaptive frequency reuse (AFR), and AFR with power control (one of our proposed modifications to the AFR approach) in terms of 10th-percentile user throughput and fairness to femto users. The performance of the AFR with power control scheme matches closely with Reuse-1, while the SO-RA scheme achieves improved throughput and fairness performance. The SO-RA scheme ensures a minimum throughput guarantee to all femto users and exhibits better performance than the existing state-of-the-art resource allocation schemes.
The addition of devices of many different form factors to the network has resulted in high demand for broadband access. To improve network capacity, frequency spectrum regulators have recommended that the fifth generation (5G) network be deployed in one of the high frequency bands, owing to their huge contiguous bandwidth. Since such bands have already been allocated to satellite networks by the regulatory bodies, 5G must coexist with satellite systems, so a feasibility study of the coexistence of 5G with the incumbent satellite systems is needed. This paper studies the coexistence feasibility of a 5G terrestrial Base Station (BS) with a Fixed Satellite Service (FSS) terminal at 28 GHz, considering only interference from the satellite terminal into the 5G terrestrial BS. The study uses the signal to interference plus noise ratio (SINR) available at the sectors of a 3-sector-cell 5G terrestrial BS as a protection parameter. The available SINR on each sector is used to estimate the impact of the FSS terminal transmit power on the deployment parameters of the 5G system. Moreover, the study uses a more realistic path loss model and the 5G antenna pattern recently released by the 3rd Generation Partnership Project (3GPP). The results show that the transmission power and elevation angle of the FSS terminal affect the deployment parameters of the 5G terrestrial BS. Finally, the results suggest that coexistence of the two systems is feasible in residential areas with only one FSS terminal if the deployment parameters of the 5G BS are carefully selected.
The iterative soft output Viterbi algorithm (SOVA) is a sub-optimum algorithm when used to decode turbo codes. By normalizing its extrinsic information we obtain a performance improvement over the standard SOVA. In particular, if the extrinsic information is increased in the last decoding iteration, an additional coding gain is observed; for example, this is 0.25 dB for a frame length of 1000 bits in the additive white Gaussian noise (AWGN) channel, as well as in an uncorrelated Rician fading channel, at low bit error rate (BER). Also, this normalized SOVA is only about 0.25 dB worse than a turbo decoder using the Log-MAP algorithm, both in the AWGN channel and in an uncorrelated Rayleigh fading channel at low BER. Furthermore, with an 8-state component code, a frame length of 500 bits performs 0.125 dB better than a 16-state bidirectional SOVA (Bi-SOVA) turbo decoder at low BER in the AWGN channel.
The rosette satellite constellation network with intersatellite links (ISLs) presents unique properties, in providing locally separate ascending and descending network surfaces of interconnected satellites with which the ground terminal can communicate. We present a novel approach exploiting this rosette geometry, by use of control of handover and management of satellite diversity, to determine which surface a ground terminal will select for communication.
Future smart homes will be equipped with a variety of devices that are connected to the Internet to provide a multitude of services to consumers. An adequate and reliable broadband connection is a must to support smart home services such as e-health, home automation, smart energy management and seamless usage of multimedia. To realize such advanced networked applications, especially for households in geographical areas that are unserved or underserved with broadband connectivity, the European Union's 7th Framework Programme funded the BATS integrated project (Broadband Access via integrated Terrestrial and Satellite systems). The main objective of the BATS project is to integrate satellite and terrestrial networks to provide broadband connectivity to households through a home gateway. The envisaged home gateway is intelligent in that it is able to determine in real time the QoS requirements of applications and accordingly make routing decisions that optimize the Quality of Experience (QoE) of the end user. In this paper we introduce future smart home applications and devices that would benefit from the BATS concept and discuss the different aspects of designing an intelligent home gateway that is both QoE-aware and standard-compliant. © 2013 Authors.
We investigate the use of fixed-point methods for predicting the performance of multiple TCP flows sharing geostationary satellite links. The problem formulation is general in that it can address both error-free and error-prone links, proxy mechanisms such as split-TCP connections and account for asymmetry and different satellite network configurations. We apply the method in the specific context of bandwidth on demand (BoD) satellite links. The analytical approximations show good agreement with simulation results, although they tend to be optimistic when the link is not saturated. The main constraint upon the method applicability is the limited availability of analytical models for the MAC-induced packet delay under nonhomogeneous load and prioritization mechanisms.
In the coming era of telecommunication, the integration of satellite capabilities with emerging 5G technologies has been considered a promising solution to achieve assured user experiences in bandwidth-hungry content applications. In this paper, we present our design for emerging Multi-access Edge Computing (MEC) based Video on Demand (VoD) services, which efficiently utilizes a satellite and terrestrial integrated 5G network. Based on this framework, we propose and analyse the Video-segment Scheduling Network Function (VSNF), which is able to deliver an enhanced quality of video consumption experience to end-users. We specifically consider the layered video scenario, where it is possible to intelligently schedule layers of video segments via parallel satellite and terrestrial backhaul links in 5G. The key technical challenge is to optimally schedule the layered video segments over the two network links, which have distinct characteristics, while attempting to enhance the Quality of Experience (QoE) for all end-users in a fair manner. We have conducted an extensive set of experiments using a real 5G testing framework in which the gNB is integrated with the core network using Geostationary Earth Orbit (GEO) satellite and terrestrial backhaul links. The results highlight the capability of our proposed content delivery framework to holistically deliver assured QoE, fairness among multiple video sessions, and optimised network resource efficiency. Index terms: HTTP adaptive streaming, satellite and terrestrial network integration, 5G networks, Quality of Experience.
An analytical evaluation of the performance of amplitude phase shift keying constellations in additive white Gaussian noise is performed, and an expression to approximate the error probability is presented. The expression accounts for a number of symbols equal to the number of concentric rings in the constellation. It is shown to significantly reduce the computational complexity associated with previously known error performance expressions, while closely predicting the error rates in the relevant signal-to-noise ratio range.
This paper gives an introduction to a key tool that can help provide various self organised solutions in wireless cellular networks. We begin by describing the need for self organised networks and the challenges associated with it. An introduction to cellular automata theory is then given with an easy-to-follow example. We provide an analysis of such solutions and discuss their stability and convergence criteria. Finally, we formulate a specific problem for Inter-Cell Interference Coordination (ICIC), describing how the cellular automata approach can be applied to achieve ICIC. In addition, we discuss the prospective use case of applying cellular automata to achieve energy efficiency in heterogeneous networks. This paper will serve as a guide for anyone willing to develop self organising solutions in wireless cellular networks. © VDE VERLAG GMBH.
The present invention relates to digital receivers in a frequency-selective (wideband) channel and in particular to frame/symbol timing and carrier frequency synchronization in the reception of orthogonal frequency-division multiplexing (OFDM) signals. The present invention provides a method of synchronizing a receiver to a received signal, wherein the signal includes a training symbol having two identical parts, the method including determining a degree of autocorrelation between parts of the signal at the spacing of the two identical parts, and a degree of cross-correlation between parts of the received signal and a reference copy of the training symbol, and multiplying together the autocorrelation and cross-correlation to determine at least one of the timing and the frequency of the received signal. This may be achieved by multiplying the cross-correlation and autocorrelation directly, or by multiplying a function, which may be referred to as a timing metric, of one of them with the other, or by multiplying functions of both of them together. A method and apparatus are provided for rapid frame/symbol timing and carrier frequency synchronization of a digital receiver to an orthogonal frequency division multiplexing (OFDM) signal and other transmissions (either continuous or burst) over a frequency-selective (wideband) channel. The method makes use of the autocorrelation and cross-correlation properties of one training symbol having two or more identical parts. Its timing estimation performance approaches the ideal with a very low mean square error, while its frequency accuracy is competitive when compared with conventional methods. It also has a wide frequency estimation range of half the signal bandwidth, yet it has a low, scalable and adaptive complexity that can be optimized to any specific application.
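The combined metric described above can be sketched in a few lines: the half-symbol autocorrelation gives a plateau-shaped, frequency-offset-tolerant coarse metric, the cross-correlation with the known preamble gives a sharp peak, and their product keeps the sharp peak while suppressing spurious cross-correlation peaks. The signal parameters and the brute-force search below are illustrative assumptions (a practical receiver would restrict the cross-correlation to a window around the coarse estimate, as the description suggests):

```python
import numpy as np

rng = np.random.default_rng(0)
L = 32                                    # half-length of the training symbol
half = rng.standard_normal(L) + 1j * rng.standard_normal(L)
train = np.concatenate([half, half])      # two identical parts

# Plant the preamble at a known offset in a low-noise received signal.
offset = 100
signal = (rng.standard_normal(300) + 1j * rng.standard_normal(300)) * 0.05
signal[offset:offset + 2 * L] += train

def timing_metric(r, ref, L):
    """Product of half-symbol autocorrelation magnitude and
    preamble cross-correlation magnitude at each trial timing d."""
    n = len(r) - 2 * L
    metric = np.empty(n)
    for d in range(n):
        auto = np.sum(np.conj(r[d:d + L]) * r[d + L:d + 2 * L])
        cross = np.sum(np.conj(ref) * r[d:d + 2 * L])
        metric[d] = np.abs(auto) * np.abs(cross)
    return metric

d_hat = int(np.argmax(timing_metric(signal, train, L)))
```

The magnitude of the autocorrelation is used so that a carrier frequency offset (which rotates its phase) does not degrade the timing estimate; the phase itself is what yields the frequency estimate.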
This paper presents a signal-to-noise ratio (SNR) estimation algorithm for advanced digital video broadcasting return channel via satellite (DVB-RCS) systems using adaptive coding and modulation (ACM). Due to the absence of a repetitive pilot symbol structure, SNR estimation has to be performed using the fixed symbol preamble data. Moreover, the sporadic nature of data traffic on the return link causes the interference level to vary from slot to slot and, therefore, the estimation has to be done within one traffic slot duration. Hence, it becomes necessary to use a combination of data-aided and decision-directed (DD) algorithms so as to make use of traffic data. A non-data-aided estimator that was previously proposed by the authors for binary and quadrature phase shift keying schemes is extended to 8-PSK in a decision-directed manner. The inherent bias of the DD approach at low values of SNR is reduced by using a hybrid approach, that is, using the proposed estimator at moderate/high values of SNR and the moments-based estimator (M2M4) at low values of SNR. The overall improved performance of the proposed hybrid estimator, in terms of accuracy and complexity, makes it an attractive choice for implementing ACM in advanced DVB-RCS systems.
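The moments-based (M2M4) estimator used at low SNR admits a compact sketch: for a constant-envelope (PSK) signal in complex AWGN, the second moment gives S + N and the fourth moment S² + 4SN + 2N², so the signal power follows as √(2M₂² − M₄). The simulation parameters below are illustrative, not the paper's test setup.

```python
import cmath
import math
import random

def m2m4_snr(r):
    """Moments-based (M2M4) SNR estimate for a constant-envelope (PSK)
    signal in complex AWGN; assumes unit signal kurtosis (|s| constant)."""
    n = len(r)
    m2 = sum(abs(x) ** 2 for x in r) / n   # E|r|^2 = S + N
    m4 = sum(abs(x) ** 4 for x in r) / n   # E|r|^4 = S^2 + 4SN + 2N^2
    s = math.sqrt(max(2 * m2 ** 2 - m4, 0.0))  # estimated signal power
    noise = max(m2 - s, 1e-12)                 # estimated noise power
    return s / noise

# Illustrative check at a true SNR of 10 dB with random 8-PSK symbols
random.seed(1)
snr_lin = 10.0
sigma = math.sqrt(1.0 / snr_lin / 2.0)  # per-dimension noise std, unit power
rx = [cmath.exp(1j * (math.pi / 4) * random.randrange(8))
      + complex(random.gauss(0, sigma), random.gauss(0, sigma))
      for _ in range(20000)]
print(10 * math.log10(m2m4_snr(rx)))  # close to 10 dB
```

Note that no symbol decisions are needed, which is why the paper can fall back on M2M4 at low SNR where decision errors would bias a DD estimator.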
The authors introduce the types of satellite constellation networks, and examine how overall performance of TCP communications carried across such a network can be affected by the choice of routing strategies used within the network.
This paper presents initial results of the recently kicked-off FP7 ICT STREP project "CoRaSat" (Cognitive Radio for Satellite Communications) [1]. The focus is on the preliminary identification of scenarios suitable for the application of Cognitive Radio technology to Satellite Communications (SatCom). The considered frequency bands include Ka-band, Ku-band, C-band and S-band, where regulatory and coordination constraints exist. An initial mapping of broadband and narrowband SatCom use cases onto each identified scenario is also provided. Moreover, several challenges associated with applying Cognitive Radio to SatCom in the identified scenarios are presented, which form the basis of the market/business, regulatory, standardization and technological framework of CoRaSat. Furthermore, ongoing and future work of the CoRaSat project is outlined. © 2013 Authors.
An improved method for estimating the frame/symbol timing offset in preamble-aided OFDM systems is presented. It uses a conventional preamble structure and combines autocorrelation techniques with restricted cross-correlation to achieve near-ideal timing performance without a significant increase in complexity. Computer simulations show that the method is robust in both AWGN and fading multipath channels, achieving better performance than existing methods.
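The two-stage idea above can be sketched as follows: a cheap autocorrelation search gives a coarse timing estimate, and cross-correlation is then restricted to a small window around it, keeping complexity low. The window size, signal layout and names are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch of restricted cross-correlation: coarse autocorrelation peak first,
# then cross-correlation only near that peak. Parameters are illustrative.

def coarse_then_fine(r, ref_half, window=2):
    """Two-stage timing estimate over a preamble of two identical halves."""
    L = len(ref_half)
    # Stage 1: coarse peak of the half-symbol autocorrelation
    auto = [abs(sum(r[d + m].conjugate() * r[d + m + L] for m in range(L)))
            for d in range(len(r) - 2 * L + 1)]
    d0 = auto.index(max(auto))
    # Stage 2: cross-correlate only within +/- window of the coarse estimate
    lo, hi = max(d0 - window, 0), min(d0 + window, len(r) - L)
    return max(range(lo, hi + 1),
               key=lambda d: abs(sum(ref_half[m].conjugate() * r[d + m]
                                     for m in range(L))))

# Noise-free example: preamble of two identical halves starting at offset 4
half = [1 + 0j, -1 + 0j, 1 + 0j, 1 + 0j]
r = [0j] * 4 + half + half + [0j] * 4
print(coarse_then_fine(r, half))  # → 4
```

Restricting stage 2 to a few candidate offsets is what keeps the cost close to autocorrelation-only methods while removing their timing plateau.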
In this paper, we study the applicability of terrestrial mobile waveforms in the return link of a high throughput satellite (HTS) communication system. These include orthogonal frequency division multiple access (OFDMA), single-carrier frequency division multiple access (SC-FDMA) and filter bank multi-carrier (FBMC). Key solutions to the challenges in a geostationary orbit (GEO) satellite channel, such as synchronization and non-linear distortion, are presented. A global positioning system (GPS) based approach for synchronization acquisition is proposed, while suitable algorithms are studied for timing/frequency offset estimation and synchronization tracking. The spectral and power efficiencies of the schemes are optimized by means of an intermodulation interference (IMI) cancelling receiver, and these are compared to state-of-the-art time division multiple access (TDMA). Finally, end-to-end simulations validate the system performance.