Smart Servers From Space
This month’s topic considers the impact of optical storage and optical computing on hyperscale data centres, server architectures and server connectivity, and the longer-term economic and environmental benefits of moving servers into space.
The three topics together provide background to a Cambridge Wireless webinar being presented on the afternoon of Wednesday October 20th.
Information on the webinar can be found here.
https://www.cambridgewireless.co.uk/events/rf-and-optical-integration/
It is a chargeable event for non-Cambridge Wireless members, but you can book a free ticket by registering here and quoting 'CWGV21' when prompted.
We look forward to you joining us at this event.
Read on
Google and Microsoft (Azure Orbital) have recently announced agreements with SpaceX to deliver cloud services from the Starlink constellation. Similar arrangements are in place between Microsoft and SES (the O3b MEO constellation) and between Telesat and CloudOps.
For SES this includes joint investment in Azure Orbital ground stations to support their MEO and earth observation customers. The ground stations are co-located with Azure data centres.
This is similar to the ‘ground stations as a service’ model adopted by AWS with Blue Origin and Project Kuiper.
Defence vendors are promoting the aggregation of multiple data sets (visible and infra-red imaging and synthetic aperture radar from space) with cloud-based terrestrial voice and video battlefield command and control systems. Telesat have stated this as a target application for their Lightspeed constellation.
New market entrants are planning constellations that are claimed to be optimised for cloud computing, using suppliers geared to maximising the added value of space data.
All of the above are based on the concept of bringing data down from space into terrestrial servers.
It would of course be possible to move this server capacity into space. Present modelling suggests that data centres consume just over 200 terawatt hours (TWh) of electricity per year, equivalent to about 1% of global electricity consumption, with probably a broadly similar contribution to carbon emissions.
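As a back-of-envelope check, the short Python sketch below reproduces the 1% figure; note that the global electricity total and the grid carbon intensity are our own illustrative assumptions, not figures from the modelling referenced above.

```python
# Back-of-envelope check of the data centre electricity claim.
# The 200 TWh figure is from the text; global electricity demand
# (~23,000 TWh/year) and grid carbon intensity (~0.45 kg CO2/kWh)
# are illustrative assumptions.

data_centre_twh = 200            # TWh per year (from the text)
global_electricity_twh = 23_000  # TWh per year (assumed)
grid_intensity = 0.45            # kg CO2 per kWh (assumed global average)

share = data_centre_twh / global_electricity_twh
co2_megatonnes = data_centre_twh * 1e9 * grid_intensity / 1e9  # kWh -> kg -> Mt

print(f"Share of global electricity: {share:.1%}")        # ~0.9%
print(f"Implied emissions: ~{co2_megatonnes:.0f} Mt CO2")  # ~90 Mt
```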
A hyperscale data centre is defined as a facility taking up a minimum of 10,000 square feet, hosting a minimum of 500 cabinets supporting a minimum of 5,000 servers.
Some of the largest data centres now house over a million servers, so the definition is rather out of date. A typical Facebook data centre today takes up at least 100,000 square feet on a thousand acres of land, costs about a billion dollars and uses about 220 megawatts of power.
There are more than 500 hyperscale data centres worldwide, with over 40% owned and operated by companies such as AWS, Microsoft, Google, IBM, Facebook, Twitter, eBay, Alibaba, Baidu and Apple.
That amount of real estate would be equivalent to having 7,000 International Space Station-sized satellites in low earth orbit, weighing something of the order of 3 million metric tons. Even Mr Musk’s biggest rocket would require of the order of thirty thousand trips into low earth orbit to lift that mass, as the sketch below suggests.
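The arithmetic behind that launch count is straightforward. In the sketch below, the ISS mass and the payload of a Starship-class vehicle are our own assumed round numbers.

```python
# Rough scaling behind the '7,000 ISS equivalents' comparison.
# ISS mass (~420 t) and a ~100 t-to-LEO payload for a Starship-class
# vehicle are assumptions; the 7,000 figure is from the text.

iss_mass_t = 420          # metric tons, approximate ISS mass (assumed)
station_count = 7_000     # ISS-sized platforms (from the text)
payload_to_leo_t = 100    # tons per launch, Starship-class (assumed)

total_mass_t = iss_mass_t * station_count
launches = total_mass_t / payload_to_leo_t

print(f"Total mass: ~{total_mass_t / 1e6:.1f} million metric tons")  # ~2.9
print(f"Launches needed: ~{launches:,.0f}")                          # ~29,400
```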
However several things could happen which together would make space based computing and memory assets more valuable than their terrestrial equivalents.
The nice thing about space is that electricity is free (it is sunnier in space), there are no landlord costs (there is plenty of space in space) and, once in orbit, computing and storage hardware has a near-zero operational carbon footprint. Computing power in space could reduce the cost and carbon impact of mining bitcoin and other cryptocurrencies, and of other computationally expensive processes.
But it is still expensive.
A Falcon Heavy rocket can deliver 30-ton payloads to GSO for about $150 million ($5 million per ton). This cost has been halving every 18 months, but that is because it has historically been expensive to launch a rocket. Once Mr Musk has extracted maximum value from a reusable first stage, the rate of cost reduction per ton can be expected to slow due to the more or less constant cost of rocket fuel. The Space Shuttle weighed 200,000 pounds but its fuel weighed twenty times as much. Modern rocket engines use fuel more efficiently but operate in essentially the same way as regular jet engines, through a process of rapid but subsonic combustion known as deflagration. Rockets also have to carry their own oxygen, which makes them less weight efficient.
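The argument that cost reduction will slow can be illustrated with a simple projection in which cost per ton halves every 18 months but is bounded below by a roughly fixed fuel and operations cost. The fuel-floor figure in the sketch below is our own illustrative assumption, not a published number.

```python
# Illustrative launch cost projection: historical halving every
# 18 months, bounded below by a roughly fixed fuel cost. Only the
# $5M/ton starting point and the halving period come from the text.

start_cost = 5.0        # $M per ton to GSO today (from the text)
fuel_floor = 0.5        # $M per ton, propellant + fixed ops (assumed)
halving_months = 18     # historical halving period (from the text)

for years in range(0, 16, 3):
    halvings = years * 12 / halving_months
    cost = fuel_floor + (start_cost - fuel_floor) * 0.5 ** halvings
    print(f"Year {years:2d}: ~${cost:.2f}M per ton")
# The projection flattens towards the fuel floor rather than
# continuing to halve indefinitely.
```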
Small incremental performance improvements in jet engines and rocket engines are still possible, but thermal efficiency remains stubbornly below 50%.
This could be improved to close to 100% by adding a detonation process as a precursor to combustion. A series of detonations creates overlapping waves of supersonic energy which tumble into a combustion chamber, which then burns fuel at close to 100% efficiency. These are known as pulse detonation engines, but the problem has been that it is hard to control and sustain the detonation process for more than a few milliseconds.
The most promising solution so far is the continuous rotating detonation engine, in which fuel is burned by transverse shock waves travelling at four to five times the speed of sound that spin within an annulus, with the shock acting as a bladeless compressor feeding a secondary combustion chamber.
These engines could make hydrogen weight-economic as an aircraft fuel and could mean that a single-stage rocket could fly into space with a significantly higher payload per ton of fuel.
However, even if it became economic to deliver servers into space, there would be other problems to overcome, including radiation damage and collision risk.
Collision risk could be mitigated by placing the servers in geostationary orbit, though the added latency might be an issue (see the sketch below). There is an ever-increasing need to manage traffic in LEO and MEO orbits; it would help if all the satellites sharing an orbital shell were travelling in the same direction.
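To put that latency penalty in numbers, the sketch below computes one-way propagation delay from orbital altitude. The GEO altitude and the speed of light are physical constants; the LEO comparison altitude is an assumption.

```python
# One-way propagation delay to geostationary orbit, the latency
# penalty mentioned above, compared with a typical LEO shell.

C_KM_S = 299_792          # speed of light, km/s
GEO_ALT_KM = 35_786       # geostationary altitude, km
LEO_ALT_KM = 550          # typical LEO shell altitude (assumed)

geo_one_way_ms = GEO_ALT_KM / C_KM_S * 1000
leo_one_way_ms = LEO_ALT_KM / C_KM_S * 1000

print(f"GEO one-way: ~{geo_one_way_ms:.0f} ms "
      f"(~{4 * geo_one_way_ms:.0f} ms request/response)")  # ~119 / ~477 ms
print(f"LEO one-way: ~{leo_one_way_ms:.1f} ms")            # ~1.8 ms
```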
Radiation damage could be mitigated by using optical computers and optical storage.
In our May and June technology topics/postings we talked about Smart Quantum from Space, inter-satellite and inter-constellation optical cross-connect, and optical downlinks and uplinks as intercoupled enabling technologies.
Optical computing and optical storage are another part of that same story.
Photonic computers would not be as fast as quantum computers but would be more tolerant of temperature change and noise, faster and more size and weight efficient than existing electronic hardware, and largely immune to radiation damage.
The starting point is an optical transistor, probably using optical crystals with a non-linear refractive index, where the intensity of incoming light influences the intensity of the light transmitted through the material in a similar manner to the current response of a bipolar transistor. This opens up a world of optical logic gates.
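As an illustration of the principle rather than a model of any real device, the toy sketch below treats this non-linear transmission as a soft intensity threshold and builds an AND gate from two combined beams. The saturation intensity, exponent and beam level are arbitrary values of our own choosing.

```python
# Toy model of the optical-transistor idea: a medium whose
# transmission rises steeply with input intensity acts as a switch,
# from which logic gates can be built. All parameters are arbitrary.

def transmitted(intensity, i_sat=1.0, n=6):
    """Non-linear transmission: low below i_sat, high above it."""
    return intensity ** n / (intensity ** n + i_sat ** n)

def optical_and(a, b, beam=0.7):
    """Two beams of 0.7 * i_sat each, combined on the medium.
    One beam alone stays below the knee; two together switch it on."""
    total = beam * (a + b)
    return 1 if transmitted(total) > 0.5 else 0

for a in (0, 1):
    for b in (0, 1):
        print(f"AND({a}, {b}) = {optical_and(a, b)}")  # 0, 0, 0, 1
```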
Colours also provide a potentially interesting way to represent vectors and can be applied to complex numbers.
Storage could be based on utilising colour (frequency/wavelength) together with brightness and saturation, with nano-crystals as the storage medium, yielding a compact way of representing and searching large volumes of data.
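One way to picture colour as a data carrier is the standard ‘domain colouring’ trick for complex numbers, mapping phase to hue and magnitude to brightness. The sketch below is purely illustrative and implies nothing about how a nano-crystal medium would actually be addressed.

```python
# Encode a complex number as a colour: phase -> hue,
# magnitude -> brightness, saturation held constant.

import cmath
import colorsys

def complex_to_rgb(z, max_magnitude=2.0):
    """Map a complex number to an (r, g, b) triple in 0..1."""
    hue = (cmath.phase(z) / (2 * cmath.pi)) % 1.0   # phase -> hue
    value = min(abs(z) / max_magnitude, 1.0)        # magnitude -> brightness
    return colorsys.hsv_to_rgb(hue, 1.0, value)

for z in (1 + 0j, 0 + 1j, -1 + 0j, 1 + 1j):
    r, g, b = complex_to_rgb(z)
    print(f"{z}: RGB = ({r:.2f}, {g:.2f}, {b:.2f})")
```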
It could be argued that optical computers and optical storage remain too far in the future to be of interest to strategy teams and investors. Ubiquitous optical computing and optical storage are probably still thirty years away, but in the greater scheme of telecom technology economics thirty years is not a long time.
More specifically, optical computing and optical storage in space would make servers in space economic, particularly if implemented across large numbers of optically interconnected satellites in LEO, MEO and GSO orbits, coupled to earth through widely distributed ground stations with their own optical buffer and storage bandwidth and optical computing power.
In previous technology topics/postings we have suggested that optical asset value will outpace RF asset value over the next twenty to thirty years and space asset value will outpace terrestrial asset value over a similar time scale.
If this is true then the consequences for terrestrial operators and their vendors are profound.
Servers in space are part of that emerging narrative of challenge and change.
Ends
For more background on these topics, buy a copy of our latest book
5G and Satellite Spectrum, Standards and Scale
Available from Artech House; you can order a copy online using the code VAR25 to give you a 25% discount.
http://uk.artechhouse.com/5G-and-Satellite-Spectrum-Standards-and-Scale-P1935.aspx
For information on our South East Asia consultancy services, bespoke research and in house virtual or on site/off site facilitation workshops e-mail [email protected]
About RTT Technology Topics
RTT Technology Topics reflect areas of research that we are presently working on. We aim to introduce new terminology and new ideas to help inform present and future technology, engineering, market and business decisions.
The first technology topic (on GPRS design) was produced in August 1998. 22 years on, there are over 240 technology topics archived on the RTT website.
Do pass these Technology Topics and related links on to your colleagues, encourage them to join our Subscriber List and respond with comments.
Contact RTT
RTT and Niche Markets Asia are presently working on research and forecasting projects in the mobile broadband, public safety radio, satellite and broadcasting industries and related copper, cable and fibre delivery options.
If you would like more information on this work then please contact [email protected]
00 44 7710 020 040