Four CIO ‘hot tech’ topics to contemplate, from quantum considerations to 5G’s dirty little secret (via Qpute.com)

How will developments in four tech areas make their presence felt this year, and in ways that ought to change how CIOs and their IT teams think about and plan their future operations?

That was the agenda for discussion set out when I caught up with John Roese, global Chief Technology Officer for Dell Technologies, with the four referred to being 5G, edge computing, quantum computing and the semi-conductor silicon ecosystem.

None of these should be new to diginomica readers, of course, but as technology lead of one of the most wide-ranging tech brands in the world, Roese’s pitch on these was clearly of interest, if for no other reason than to provide another perspective for CIOs to clarify their own thinking on these subjects.

Get ready for quantum

That said, the first item on his list, quantum computing, is the one technology no one is likely to implement this year. Certainly, there will be no desktop or laptop quantum hardware systems around, though quantum-related software will be in evidence. The point to take onboard is that quantum computing is a very different kettle of IT fish, so learning what is and isn’t possible when it arrives is important. Roese explained:

We shouldn’t view quantum as a replacement for conventional compute. You’re not going to run a web browser on a quantum computer, it makes no sense to do that.

Instead, quantum provides a new mechanism for processing a different class of mathematics, one that is going to open the door to the development of new algorithms that conventional computer technologies can, at best, only tackle badly. One example is factoring numbers. Quantum systems are expected to be able to factor large numbers into their primes at extremely high performance levels, which threatens today’s asymmetric key management protocols, he said:

If that happens, and you can crack them, you break a lot of the fundamental cryptography. So there’s already work underway in various governmental entities, at the industry level, to essentially replace our key management infrastructure with something a bit more quantum safe. However, the way that we’re going to do it is not to just replace one algorithm with another, because we’re not actually sure what algorithms will truly be quantum safe.

Instead, we’re moving into an era of what we call crypto-agility, which means that we’re going to build architectures in which the key management protocols and encryption architectures are not hard coded. They’re actually modular, you can change them out. So if we have to move to lattice or to some other algorithm to overcome a quantum threat, we can do that, or have multiples running.
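The crypto-agility idea Roese describes can be sketched in a few lines: rather than hard-coding one cipher, the algorithm is resolved from a registry at runtime, so swapping in a quantum-safe scheme becomes a configuration change. This is an illustrative sketch only; the registry API and the toy XOR "cipher" below are invented for the example and the XOR stand-in is not secure.

```python
# Illustrative crypto-agility sketch: the cipher is looked up by name at
# runtime instead of being hard-coded, so a deployment can swap algorithms
# (e.g. to a lattice-based scheme) without touching application code.
# The "toy-xor" cipher is a hypothetical, insecure placeholder.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Cipher:
    name: str
    encrypt: Callable[[bytes, bytes], bytes]
    decrypt: Callable[[bytes, bytes], bytes]

_REGISTRY: Dict[str, Cipher] = {}

def register(cipher: Cipher) -> None:
    _REGISTRY[cipher.name] = cipher

def get_cipher(name: str) -> Cipher:
    return _REGISTRY[name]

# Toy stand-in algorithm: repeating-key XOR (NOT secure, demo only).
def _xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

register(Cipher("toy-xor", _xor, _xor))

# Application code names the algorithm via configuration, so moving to a
# quantum-safe scheme is a registry change, not a code change.
active = get_cipher("toy-xor")
ct = active.encrypt(b"secret payload", b"key")
pt = active.decrypt(ct, b"key")
```

The point of the pattern is the indirection: only the registered name changes when an algorithm has to be retired, which is exactly the "change them out" modularity Roese describes.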

There may not be any quantum hardware yet, but Roese sees 2021 as a year where IT teams need to start developing and practising their quantum skills and prototyping quantum applications using simulation software. He foresees applications in areas such as ultra-high performance trading and material sciences. Dell now has a simulator running on off-the-shelf technology, albeit one that is, he concedes, very slow. The firm has also started to partner with the likes of IBM, Microsoft, and Honeywell to make experimental environments available to the user community.
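The simulation-first approach Roese advocates is approachable even without a vendor platform: a quantum state is just a vector of complex amplitudes, and gates are matrices applied to it. As a minimal sketch (real simulators generalise this to many qubits), here is a single-qubit statevector simulation of a Hadamard gate using only the standard library:

```python
# Minimal single-qubit statevector simulation, standard library only.
# Illustrates the principle behind quantum simulators: states are complex
# amplitude vectors, gates are matrix multiplications, and measurement
# probabilities follow the Born rule (|amplitude|^2).
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix into a 2-element statevector."""
    return [
        gate[0][0] * state[0] + gate[0][1] * state[1],
        gate[1][0] * state[0] + gate[1][1] * state[1],
    ]

def probabilities(state):
    """Born rule: measurement probability is squared amplitude magnitude."""
    return [abs(a) ** 2 for a in state]

# Hadamard gate: puts a basis state into an equal superposition.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

ket0 = [1 + 0j, 0 + 0j]            # qubit initialised to |0>
superposed = apply_gate(H, ket0)   # equal superposition of |0> and |1>
probs = probabilities(superposed)  # measurement gives 0 or 1 with p = 0.5
```

Production simulators are far faster and handle entangled multi-qubit registers, but the underlying linear algebra is the same, which is why Roese argues the mathematics can be learned years before the hardware ships.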

Quantum hardware may still be four or five years away, but it may well take developers that long to understand the mathematics and algorithms that are likely to be needed.

The semi-conductor ecosystem

The consideration here for IT teams is the coming demise of what Roese calls the “era of homogeneous compute” – standardised x86 processing that has made hardware simple and universal and enabled the cloud era with its ubiquitous access to pools of capacity, regardless of who provides it. But the factors that drove its growth, such as Moore’s Law, have now stalled, so the opportunities for dramatically improving performance are fading fast.

Emerging now is the “era of heterogeneous compute”, when general purpose compute chips are packaged together with devices that offer much greater opportunity for differentiation in terms of the capabilities offered to users. This means far more user sweetspots can be hit far more directly and purposefully than the many ‘approximately close’ implementations available with just general purpose devices:

The entire semi-conductor ecosystem is reforming to adapt to look like the heterogeneous compute ecosystem, and that will change everything. From semi-conductor to software modernization to the definition of what a server is…there are a lot of big deals here that folks aren’t really recognising.

For users, this will certainly mean getting involved with software modernization. Programming these new, composite devices and sub-systems is likely to be very different to programming or virtualizing a general purpose processor. Roese suggested this will in turn require better software architectures and new abstraction models, because the general unit of compute is now going to be a diverse set of technologies. The integration platforms are going to matter even more as servers evolve and new ways to package the chips and the software into systems emerge.
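One way to picture the kind of abstraction model this implies: workloads declare a required capability, and a placement layer picks the best-fit device from a heterogeneous pool rather than assuming general-purpose x86 everywhere. The device names, capability labels, and scoring below are invented for illustration, not any vendor's actual scheduler.

```python
# Hypothetical sketch of heterogeneous-compute placement: workloads are
# matched to specialised devices by capability, falling back to general
# purpose compute when nothing specialised fits.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    capabilities: set
    perf_score: int  # higher = faster at its specialised capability

POOL = [
    Device("x86-general", {"general"}, 1),
    Device("gpu-0", {"matrix"}, 8),
    Device("fpga-0", {"stream-parse"}, 10),
]

def place(required: str) -> Device:
    """Pick the fastest device advertising the required capability;
    fall back to general-purpose compute if nothing matches."""
    candidates = [d for d in POOL if required in d.capabilities]
    if not candidates:
        candidates = [d for d in POOL if "general" in d.capabilities]
    return max(candidates, key=lambda d: d.perf_score)

best = place("matrix")  # matrix-heavy work lands on the accelerator
```

The interesting design consequence is the fallback path: code written against the abstraction still runs when a specialised device is absent, which is what lets the "unit of compute" diversify without breaking portability.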

5G’s dirty little secret

As Roese observed, the big sub-plot with the arrival of 5G communications services is that the stronghold of all mobile comms to date, the consumer marketplace, is actually the least important objective for 5G:

It wasn’t any faster than 4G, it was just called 5G, but basically the same experience. The reason for that is the first wave of 5G was about extending the 4G ecosystem with a slightly new radio access network for broadband service for consumers. That was a good thing to do to get started, but 2021 will be the first year where we start seeing upscale standalone 5G environments where it’s not just about enhanced mobile broadband, it’s all of the other capabilities and what is known as release 16 and release 17 of the 5G standards.

He expects to see massive machine-type communication available as-a-service, based on very dense connectivity to low-power sensors, with densities of a million sensors per square kilometre, all managed intelligently. At the same time, ultra-reliable low-latency communication will emerge, offering one millisecond round-trip latency across the radio access network for high performance data transmission. This will be able to run services such as drone telemetry feeds and similar services. And this is where 5G is really aimed.

What he calls its “dirty little secret” is that the architecture is designed for enterprise use cases. That means users will now have the technology available to start building out enterprise use cases, and 5G will start to dominate the technical landscape. As an example, he pointed to auto manufacturers around the world that are already investing heavily in early private 5G to automate factories.

Roese also sees this year as the one where OpenRAN, the open Radio Access Network designed to allow the products and services of multiple vendors to integrate and interoperate with the minimum of engineering, becomes the dominant underpinning of communications architectures. This will allow the widespread and rapid uptake of dis-aggregated, software-defined, standardized services and components.

This, in turn, is already leading to a merging of IT, communication services and engineering resources, allowing a wider, richer scope for exploiting data across what had been difficult-to-manage boundaries. Vendors are already picking up on the possibilities, with the established communications ecosystem players, such as Nokia, Ericsson, Huawei, and ZTE, now being joined by the likes of Dell, Corning, Microsoft, Google and, to a lesser extent, Amazon, with others that have traditionally been outside the telecoms sector coming along shortly.

Correcting mistakes at the edge

Roese sees this year as the start of a major re-think around the edge, an area of computing where he feels some huge architectural mistakes have been made so far:

Most people thought that edge was an extension of a cloud operating model. But because we’re in the multi-cloud world, there are many cloud operating models. These are all very different architectures, it’s not one common architecture. What we’ve realised is that as people started to extend their public and private cloud services into their factories and hospitals and physical environments, each of those architectures required a different infrastructure. This idea that the edge is now going to be populated with many different distinct infrastructures is a bad thing, because, if it continues, it would be possible in a typical enterprise to have dozens of edge footprints in your factory to accommodate your multi-cloud architecture. So we think this year is a year where we will re-think this.

He sees edge platforms turning horizontal, able to run multiple edge workloads – each a software-defined service consuming capacity – on the same edge platform. This will allow users to avoid the proliferation of edge infrastructure while still having multiple edges, with dynamic software-defined edges coming and going out in the real world. To this end, Dell is already selling an edge platform that supports four different edge architectures, built on the same hardware layer:

2021 will be a year where we start to realise that edge is incredibly important. There are going to be many edge workloads and edge architectures and extensions of cloud models out to the edge. But those should be treated as software functions. And they should live on a more stable horizontal edge platform that can actually provide the compute storage, the networking, the resource management, out in that environment as the edges come and go.
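The horizontal-platform idea above can be sketched simply: edge workloads are admitted onto, and retired from, one shared pool of capacity as software functions, rather than each arriving with its own hardware footprint. The workload names and capacity units below are hypothetical, invented for the example.

```python
# Illustrative sketch of a "horizontal" edge platform: several edge
# workloads (each an extension of a different cloud model) share one
# hardware footprint, and edges "come and go" as software deployments
# rather than as new infrastructure.

class EdgePlatform:
    def __init__(self, cpu_capacity: int):
        self.cpu_capacity = cpu_capacity
        self.workloads = {}  # workload name -> cpu units reserved

    def used(self) -> int:
        return sum(self.workloads.values())

    def deploy(self, name: str, cpu: int) -> bool:
        """Admit a workload only if shared capacity remains."""
        if self.used() + cpu > self.cpu_capacity:
            return False
        self.workloads[name] = cpu
        return True

    def retire(self, name: str) -> None:
        """Retiring an edge frees capacity with no hardware change."""
        self.workloads.pop(name, None)

platform = EdgePlatform(cpu_capacity=16)
platform.deploy("private-cloud-extension", cpu=6)
platform.deploy("video-analytics", cpu=6)
ok = platform.deploy("iot-gateway", cpu=8)   # rejected: exceeds capacity
platform.retire("video-analytics")
ok2 = platform.deploy("iot-gateway", cpu=8)  # now fits on the same hardware
```

The contrast with the "dozens of edge footprints" problem Roese describes is the single capacity pool: adding or removing an edge architecture is an admission decision, not a hardware purchase.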

My take

An interesting round-up of coming developments that will impact CIOs’ decision-making over the coming years, and without the sound of too many Dell trumpets being blasted in self-aggrandisement. There was a good level of common sense set out as well, especially in the area of quantum computing. It may be five or more years before even the equivalent of the old IBM 360 mainframe starts appearing on company inventories, but getting a few team members involved in learning how to get the best out of them would be a good investment. I can see, however, that it will also be the start of a sellers’ market in people with such skills, so many CIOs will also need to start forward-booking slots in the gig economy schedules for such people – while making sure they remember which company invested in that early training.

