Kathleen Martin
Web 2.0 brought us user-generated content and interactivity – think Twitter, Facebook, Slack and Zoom. It enabled online startup Dollar Shave Club to build global recognition in 2012 with a YouTube video that cost a mere $4,500 (£3,300) to produce. The company rattled the previously bomb-proof incumbents in its market to such an extent that Unilever reportedly paid $1bn to acquire it four years later.
Web 3.0 has brought us even more disruption, in the shape of technologies such as big data, machine learning and blockchain. Companies are understandably keen, therefore, to get ahead of Web 4.0. Definitions of it vary, but this iteration promises immersive and highly personalised online services, blurring the physical and the digital. 
What will that look like? The metaverse is both the biggest Web 4.0 promise and the biggest threat. It’s a threat to incumbents such as Facebook (now Meta), largely because of its decentralised nature. It’s a promise to startups, which may be able to harness the technology in ways that give them a competitive edge.
All sectors are keen to get on the front foot. Microsoft recently spent $68.7bn on gaming company Activision Blizzard. JPMorgan, meanwhile, has created its own metaverse lounge – in which visitors are greeted by, of all things, a virtual tiger.
A less headline-grabbing matter is what CIOs need to do to make such technology work well. 
Firms that offer augmented reality (AR) and virtual reality (VR) in the metaverse will almost certainly have to reconfigure their computing power for the experiences to feel properly immersive. Centralised cloud computing provision, however punchy, won’t be enough. The reason for that is latency – the time lag on the network. 
Swedish telco Ericsson has pointed out that time-critical video games, such as first-person shooters, need end-to-end network latency of 30 milliseconds or less to ensure a high-quality experience. The further the data centre is from the end device, the greater the latency. Even on the fastest fibre links, there is a latency of 5 microseconds (0.005 milliseconds) for every kilometre of cable the data travels, according to research by Infinera.
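Putting those two figures together shows why distance matters. The rough calculation below is illustrative only: the distances are assumptions, the 30ms budget and 5μs-per-km figure are the Ericsson and Infinera numbers quoted above, and it ignores routing, queueing and processing delays, which only make things worse.

```python
# Rough latency-budget sketch using the figures quoted above:
# a 30 ms end-to-end budget (Ericsson) and 5 microseconds of fibre
# delay per kilometre of cable (Infinera). Distances are illustrative.

FIBRE_US_PER_KM = 5   # one-way fibre propagation delay per km
BUDGET_MS = 30        # end-to-end budget for a fast-paced game

def fibre_round_trip_ms(distance_km: float) -> float:
    """Fibre delay out and back, in milliseconds (propagation only)."""
    return 2 * distance_km * FIBRE_US_PER_KM / 1000

for km in (50, 500, 1500):
    rtt = fibre_round_trip_ms(km)
    print(f"Data centre {km:>4} km away: {rtt:4.1f} ms round trip, "
          f"{rtt / BUDGET_MS:.0%} of the budget on fibre alone")
```

At 1,500km, half the budget is spent before a single packet has been processed. That is the gap edge computing aims to close.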
That is why serious gamers use expensive hardware that can do the processing then and there. The problem for firms is that consumers are unlikely to want to spend much on special hardware to access metaverse services. What to do?
Gaining a competitive edge
Edge computing, where the processing muscle is placed closer to the data being crunched, is the next step on from the cloud. 
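One way to picture the idea: instead of sending every request to a single distant region, a client (or a routing layer in front of it) steers traffic to whichever nearby site responds fastest. The sketch below is a toy illustration, not any vendor’s product; the edge hostnames are hypothetical, and real deployments typically use anycast or DNS-based routing rather than client-side probing.

```python
# Toy illustration: pick the lowest-latency edge site by timing a TCP
# handshake to each candidate. Hostnames are hypothetical placeholders.
import socket
import time

EDGE_SITES = ["edge-lon.example.com", "edge-fra.example.com",
              "edge-nyc.example.com"]

def handshake_ms(host: str, port: int = 443, timeout: float = 1.0) -> float:
    """Return the TCP connect time in milliseconds, or infinity on failure."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.perf_counter() - start) * 1000
    except OSError:
        return float("inf")

best = min(EDGE_SITES, key=handshake_ms)
print(f"Routing traffic to nearest edge site: {best}")
```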
“People have been talking about the edge for two decades, but it has been limited to niche use cases,” says Ishu Verma, emerging technology evangelist at Red Hat, a provider of open-source enterprise software. “Now the idea of placing computing and storage closer to the data sources is being adopted more widely across industry and consumer applications.” 
One important reason for this is that data systems have become much more capable, cost-effective and energy-efficient, so deploying them at the edge on a large scale is far more feasible than it was. 
“In the cloud, you scale up capacity. At the edge, you scale it out to millions of sites,” says Verma, who adds that there is demand from every sector that needs low-latency services or simply wants to avoid batch processing.
Continue reading: https://www.raconteur.net/technology/edge-computing-customer-experience/
 
