Edge computing comes in many forms and brings many challenges, including bandwidth limitations, network reliability issues, and limited space. This episode of Utilizing Edge brings Brian Chambers, Alastair Cooke, and Stephen Foskett together to discuss the state of the edge in 2023. Practical applications are emerging in retail, multi-tenant environments, and industrial IoT, but defining the edge remains an ongoing exploration. Implementation varies, from repurposing existing technologies to adopting modern approaches like containers and function-as-a-service. The debate between virtual machines and containers continues, driven by organizational comfort. Despite constraints, edge environments offer greater control and accountability. The future promises more innovation and adoption, cementing edge computing’s significance in the tech landscape.
Stephen Foskett, Publisher of Gestalt IT and Organizer of Tech Field Day. Find Stephen’s writing at GestaltIT.com, on Twitter at @SFoskett, or on Mastodon at @[email protected].
In this episode of the Utilizing Tech podcast, Stephen Foskett, Allyson Klein, and Gina Rosenthal discuss dark data in edge computing. Dark data is unutilized or unknown data collected by organizations. The distributed nature of edge deployments and the use of third-party apps can make it challenging to handle dark data, limiting insights and posing security risks. Establishing a stronger IT-business connection is crucial. Observability solutions and data analytics can aid in discovering and centralizing dark data. AI has potential for data hygiene improvement, but human-driven cleaning is still necessary. Despite these challenges, edge computing offers better data management due to controlled deployments.
Although we use Intel’s NUC as a shorthand for the type of hardware deployed at the far edge, this recently cancelled platform isn’t all there is. This episode of Utilizing Edge looks beyond the NUC, to platforms from Lenovo, Nvidia, and more, with Julian Chesterfield of Sunlight, Andrew Green, and Stephen Foskett. ARM-based solutions, many using the Nvidia Jetson platform, are particularly interesting given their low cost and power consumption and strong GPUs for edge AI. A hyperconverged stack runs all of the components required for high availability, including storage and networking, in software spanning all of the nodes in a cluster, and this is commonly deployed on low-cost devices at the far edge. The trend toward deploying applications at the edge is driven both by new hardware and software capabilities and by the changing expectations of consumers and businesses.
Edge infrastructure is susceptible to many of the same security risks as datacenter and cloud, but is often run in less protected environments. This episode of Utilizing Edge features Craig Nunes, Co-Founder and COO of Nebulon, talking to Brian Chambers and Stephen Foskett about the provision of reliable infrastructure services at the edge. Nebulon’s product presents storage to servers in a managed way, monitoring and protecting storage in real time. Edge servers must have a known-good system image to ensure that they are secure, yet this is difficult to achieve in remote devices.
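One common building block for the known-good image check mentioned above is comparing a cryptographic hash of the deployed image against a trusted reference value (real systems layer this with secure boot and remote attestation). A minimal sketch, not based on Nebulon's actual implementation:

```python
import hashlib

def image_matches_known_good(image_path: str, expected_sha256: str) -> bool:
    """Hash a system image in chunks and compare against a trusted reference."""
    h = hashlib.sha256()
    with open(image_path, "rb") as f:
        # Read in 1 MiB chunks so large images don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256
```

The hard part at the edge is not the hash itself but keeping the reference value trustworthy on a remote device, which is why managed services anchor it outside the server being verified.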
Perhaps no company is more important to the datacenter than VMware, but how are the company’s technologies applied at the edge? This episode of Utilizing Edge features Saadat Malik, VP and GM of Edge Computing at VMware, discussing the evolution of VMware at the edge with Brian Chambers and Stephen Foskett. The discussion highlights the differences between datacenter and edge environments in terms of people and technology. Malik emphasizes the importance of outcome- and product-focused mindsets in edge environments, as well as the constraints posed by limited physical resources. VMware’s technologies in connectivity, storage, security, and management are showcased as key enablers of successful edge computing. The episode also touches on the growing significance of AI and machine learning at the edge and the need for standardized solutions to drive edge growth and transformation.
Datacenter IT is used to having tight control over infrastructure and applications, but this is challenging to maintain at the edge. This episode of Utilizing Edge features Pierluca Chiodelli of Dell Technologies discussing the modern edge application platform with Allyson Klein and Stephen Foskett. A typical edge environment features many different platforms, devices, and connections that must be deployed, managed, and controlled remotely. When looking at the modern edge, Chiodelli recognizes the different personas and needs and constructs a plan to achieve the required outcome at each location. Modern applications need specialized hardware and connectivity that must be supported, deployed, and managed.
Between the so-called last mile and first mile lies the middle mile, the realm of colocation and network service providers. This episode of Utilizing Tech features Roy Chua and Allyson Klein, discussing the middle mile with Stephen Foskett. This middle area includes content delivery services like Varnish and Akamai, as well as companies like Cloudflare that are delivering content and compute there. The middle network includes providers like Equinix, Digital Realty, and Megaport, which provide connectivity to the cloud and service providers, the hyperscalers themselves, and some interesting networking startups like PacketFabric and Graphiant. We must also consider observability, with companies like cPacket and Kentik as well as companies like Cisco and Juniper Networks.
When it comes to edge computing, money is not limitless. Joining us for this episode of Utilizing Edge is Carlo Daffara of NodeWeaver, who discusses the unique economic challenges of edge with Alastair Cooke and Stephen Foskett. Cost is always a factor for technology decisions, but every decision is multiplied when designing edge infrastructure with hundreds or thousands of nodes. Total Cost of Ownership is a critical consideration, especially operations and deployment on-site at remote locations, and the duration of deployment must also be taken into consideration. Part of the solution is designing a very compact and flexible system, but the system must also work with nearly any configuration, from virtual machines to Kubernetes. Another issue is the fact that technology will change over time and the system must be adaptable to different hardware platforms. It is critical to consider not just the cost of hardware but also the cost of maintenance and long-term operation.
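The multiplication effect described here is easy to see with back-of-envelope arithmetic. The figures below are purely hypothetical, chosen only to show why per-site deployment and operations costs, not just hardware price, dominate fleet-scale TCO:

```python
def edge_tco(sites: int, hw_cost: float, deploy_cost: float,
             annual_ops: float, years: int) -> float:
    """Total cost of ownership across a fleet of identical edge sites."""
    per_site = hw_cost + deploy_cost + annual_ops * years
    return sites * per_site

# Hypothetical numbers: $1,500 hardware, $800 one-time on-site deployment,
# $600/year operations, over a 5-year lifetime, across 1,000 locations.
total = edge_tco(sites=1000, hw_cost=1500, deploy_cost=800,
                 annual_ops=600, years=5)
print(total)  # 5300000 -- operations ($3M) outweigh hardware ($1.5M)
```

A $200 saving per node looks minor in isolation, but across 1,000 sites it is $200,000, and avoiding a single truck roll per site per year can dwarf even that.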
Although everyone wants high availability from IT systems, the cost to achieve it must be weighed against the benefits. This episode of Utilizing Edge focuses on HA solutions at the edge with Bruce Kornfeld of StorMagic, Alastair Cooke, and Stephen Foskett. It might be tempting to build the same infrastructure at the edge as in the datacenter, but this can get very expensive. With multi-node server clusters and RAID storage, the risk of so-called split brain means that not just two but three nodes must be deployed in most cases. StorMagic addresses this issue in a novel way, with a remote node providing a quorum witness and reducing the need for on-site hardware. Edge infrastructure also relies on so-called hyperconverged systems, which use software to create advanced services on simple and inexpensive hardware.
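The quorum logic behind the split-brain problem can be sketched in a few lines: a partitioned side may continue serving only if it holds a strict majority of votes. With two nodes, a network partition leaves each side with exactly half, so neither can proceed; a third vote, such as a remote quorum witness, breaks the tie. A minimal illustration, not based on any vendor's code:

```python
def has_quorum(votes_held: int, total_votes: int) -> bool:
    """A partition may continue only if it holds a strict majority of votes."""
    return votes_held > total_votes / 2

# Two-node cluster, network partition: each side holds 1 of 2 votes.
# Neither side has a majority, so neither may safely continue.
assert not has_quorum(1, 2)

# Add a remote quorum witness (a third vote): the side that can still
# reach the witness holds 2 of 3 votes and keeps serving; the isolated
# node holds 1 of 3 and stops, preventing split brain.
assert has_quorum(2, 3)
assert not has_quorum(1, 3)
```

This is why the witness can live far away over a slow link: it only casts a tie-breaking vote, so no third server needs to sit on-site.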
The edge isn’t the same thing to everyone: Some talk about equipment for use outside the datacenter, while others talk about equipment that lives in someone else’s location. The difference between this far edge and near edge is the topic of this episode of Utilizing Edge, with Andrew Green and Alastair Cooke, Research Analysts at GigaOm, and Stephen Foskett. Andrew draws the line at a 20 ms round trip, the point at which a user feels that a resource is remote rather than local. From the perspective of an application or service, this limit requires a different approach to delivery. One approach is to distribute points of presence around the world closer to users, including compute and storage, not just caching. This would entail deploying hundreds of points of presence around the world, and perhaps even more. Technologies like Kubernetes, serverless, and function-as-a-service are being used today, and these are being deployed even beyond service provider locations.
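A back-of-envelope check shows why a 20 ms round-trip budget forces points of presence closer to users. Assuming light in fibre travels at roughly 200,000 km/s (about two-thirds of c), propagation alone consumes the budget before queuing, processing, or multiple round trips per request are counted:

```python
# ~200,000 km/s in fibre works out to about 200 km per millisecond, one way.
FIBRE_KM_PER_MS = 200

def max_one_way_distance_km(rtt_budget_ms: float) -> float:
    """Farthest a point of presence can be, counting fibre propagation only."""
    one_way_ms = rtt_budget_ms / 2
    return one_way_ms * FIBRE_KM_PER_MS

print(max_one_way_distance_km(20))  # 2000.0 km -- an absolute best case
```

Real fibre paths are indirect and each request may need several round trips, so the practical radius is far smaller, which is consistent with the hundreds of points of presence discussed above.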