Hardware Beyond the NUC with Julian Chesterfield of Sunlight

Although we use Intel’s NUC as a shorthand for the type of hardware deployed at the far edge, this recently cancelled platform isn’t all there is. This episode of Utilizing Edge looks beyond the NUC to platforms from Lenovo, Nvidia, and more, with Julian Chesterfield of Sunlight, Andrew Green, and Stephen Foskett. ARM-based solutions, many built on the Nvidia Jetson platform, are particularly interesting given their low cost, low power consumption, and strong GPUs for edge AI. A hyperconverged stack runs all of the components required for high availability, including storage and networking, in software spanning every node in a cluster, and this approach is commonly deployed on low-cost devices at the far edge. The trend toward deploying applications at the edge is driven both by new hardware and software capabilities and by the changing expectations of consumers and businesses.

Building Resilient Infrastructure at the Edge with Craig Nunes of Nebulon

Edge infrastructure is susceptible to many of the same security risks as datacenter and cloud infrastructure, but it is often run in less protected environments. This episode of Utilizing Edge features Craig Nunes, Co-Founder and COO of Nebulon, talking to Brian Chambers and Stephen Foskett about providing reliable infrastructure services at the edge. Nebulon’s product presents storage to servers in a managed way, monitoring and protecting it in real time. Edge servers must have a known-good system image to ensure that they are secure, yet this is difficult to achieve on remote devices.
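
As a rough illustration of the known-good image idea discussed here (a sketch only, not Nebulon’s actual mechanism; the digest value and file path are hypothetical placeholders), a check like this compares a boot image against a trusted digest before the server is allowed into service:

```python
# Hypothetical sketch: verify an edge server's boot image against a
# known-good digest before it is allowed to join the fleet.
# This is illustrative only, not Nebulon's implementation.
import hashlib

# Digest published by a trusted management plane (placeholder value).
KNOWN_GOOD_SHA256 = "c0ffee0000000000000000000000000000000000000000000000000000000000"

def image_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the image file and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_good(path: str) -> bool:
    """Compare the on-disk image against the trusted digest."""
    return image_digest(path) == KNOWN_GOOD_SHA256

if __name__ == "__main__":
    print("image trusted:", is_known_good("/boot/edge-image.img"))
```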

VMware’s Journey into Edge Computing with Saadat Malik

Perhaps no company is more important to the datacenter than VMware, but how are the company’s technologies applied at the edge? This episode of Utilizing Edge features Saadat Malik, VP and GM of Edge Computing at VMware, discussing the evolution of VMware’s presence at the edge with Brian Chambers and Stephen Foskett, and highlighting the differences between datacenter and edge environments in terms of people and technology. Malik emphasizes the importance of outcome- and product-focused mindsets in edge environments, as well as the constraints posed by limited physical resources. VMware’s technologies in connectivity, storage, security, and management are showcased as key enablers of successful edge computing. The episode also touches on the growing significance of AI and machine learning at the edge and the need for standardized solutions to drive edge growth and transformation.

Delivering Mature IT Platforms at the Edge with Pierluca Chiodelli

Datacenter IT teams are used to having tight control over infrastructure and applications, but this is challenging to maintain at the edge. This episode of Utilizing Edge features Pierluca Chiodelli of Dell Technologies discussing the modern edge application platform with Allyson Klein and Stephen Foskett. A typical edge environment features many different platforms, devices, and connections that must be deployed, managed, and controlled remotely. When looking at the modern edge, Chiodelli recognizes the different personas and their needs and constructs a plan to achieve the required outcome at each location. Modern applications need specialized hardware and connectivity that must be supported, deployed, and managed.

The Middle Mile is the Heart of the Edge

Between the so-called last mile and first mile lies the middle mile, the realm of colocation and network service providers. This episode of Utilizing Tech features Roy Chua and Allyson Klein discussing the middle mile with Stephen Foskett. This middle area includes content delivery services like Varnish and Akamai, as well as companies like Cloudflare that deliver both content and compute there. The middle-mile network includes providers like Equinix, Digital Realty, and Megaport, which provide connectivity to the cloud and service providers; the hyperscalers themselves; and some interesting networking startups like PacketFabric and Graphiant. We must also consider observability, with specialists like cPacket and Kentik as well as larger vendors like Cisco and Juniper Networks.

Economics of Edge Computing with Carlo Daffara of NodeWeaver

When it comes to edge computing, money is not limitless. Joining us for this episode of Utilizing Edge is Carlo Daffara of NodeWeaver, who discusses the unique economic challenges of the edge with Alastair Cooke and Stephen Foskett. Cost is always a factor in technology decisions, but the impact of every decision is multiplied when designing edge infrastructure with hundreds or thousands of nodes. Total Cost of Ownership is a critical consideration, especially operations and on-site deployment at remote locations, and the duration of the deployment must also be taken into account. Part of the solution is designing a very compact and flexible system, but the system must also work with nearly any configuration, from virtual machines to Kubernetes. Another issue is that technology changes over time, so the system must be adaptable to different hardware platforms. It is critical to consider not just the cost of hardware but also the cost of maintenance and long-term operation.
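
To make the multiplication effect concrete, here is a back-of-the-envelope TCO sketch. All figures are illustrative placeholders, not NodeWeaver pricing; the point is simply that small per-site differences compound across a large fleet:

```python
# Back-of-the-envelope edge TCO sketch. All numbers are made-up
# placeholders; the takeaway is that per-site costs multiply across the fleet.
def edge_tco(sites: int, hw_per_site: float, deploy_visit: float,
             annual_ops_per_site: float, years: int) -> float:
    """Total cost of ownership across a fleet of edge sites."""
    capex = sites * hw_per_site                 # hardware at every site
    deployment = sites * deploy_visit           # on-site installation labor and travel
    opex = sites * annual_ops_per_site * years  # remote management, truck rolls, support
    return capex + deployment + opex

# At 1,000 sites, a $200 difference per node is a $200,000 difference overall.
baseline = edge_tco(1000, 2500, 800, 600, 5)
leaner   = edge_tco(1000, 2300, 800, 600, 5)
print(f"baseline: ${baseline:,.0f}  leaner: ${leaner:,.0f}  delta: ${baseline - leaner:,.0f}")
```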

Achieving High Availability at the Edge with StorMagic

Although everyone wants high availability from IT systems, the cost to achieve it must be weighed against the benefits. This episode of Utilizing Edge focuses on HA solutions at the edge with Bruce Kornfeld of StorMagic, Alastair Cooke, and Stephen Foskett. It might be tempting to build the same infrastructure at the edge as in the data center, but this can get very expensive. With multi-node server clusters and RAID storage, the risk of so-called split brain means that three nodes, not just two, must be deployed in most cases. StorMagic addresses this issue in a novel way, with a remote node providing a quorum witness and reducing the need for on-site hardware. Edge infrastructure also relies on so-called hyperconverged systems, which use software to create advanced services on simple, inexpensive hardware.
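
A minimal sketch of the quorum idea described above (illustrative only, not StorMagic’s actual witness protocol): with only two nodes, a network partition leaves each side with a single vote and no majority, but adding a lightweight remote witness gives one side a 2-of-3 majority so it can keep serving data.

```python
# Minimal quorum sketch: a two-node cluster plus a remote witness.
# Whichever partition can still reach a majority of the three voters
# (2 of 3) keeps serving data, avoiding split brain without a third
# on-site server. Names are hypothetical.
VOTERS = {"node-a", "node-b", "remote-witness"}

def has_quorum(reachable: set[str]) -> bool:
    """True if this partition holds a strict majority of the voters."""
    return len(reachable & VOTERS) > len(VOTERS) // 2

# Node A is cut off from node B but can still reach the witness service.
print(has_quorum({"node-a", "remote-witness"}))  # True  -> node A keeps running
print(has_quorum({"node-b"}))                    # False -> node B stops to avoid split brain
```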

The Near Edge and the Far Edge with Andrew Green

The edge isn’t the same thing to everyone: some talk about equipment for use outside the datacenter, while others talk about equipment that lives in someone else’s location. The difference between this far edge and near edge is the topic of this episode of Utilizing Edge, with Andrew Green and Alastair Cooke, research analysts at GigaOm, and Stephen Foskett. Andrew draws the line at a 20 ms round trip, the point at which a user feels that a resource is remote rather than local. From the perspective of an application or service, this limit requires a different approach to delivery. One approach is to distribute points of presence around the world closer to users, including compute and storage, not just caching. This would entail deploying hundreds of points of presence, and perhaps even more. Technologies like Kubernetes, serverless, and function-as-a-service are being used today, and these are being deployed even beyond service provider locations.
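
As a rough illustration of the 20 ms rule of thumb (the threshold comes from the discussion; the code and endpoint are hypothetical), one could time a TCP connection to a point of presence and classify whether it “feels local”:

```python
# Hypothetical sketch of the ~20 ms round-trip rule of thumb: time a TCP
# connect to a point of presence and classify it as near or far.
import socket
import time

RTT_BUDGET_MS = 20.0  # threshold discussed in the episode

def connect_rtt_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Approximate the round-trip time using the TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0

def feels_local(host: str) -> bool:
    """True if the measured RTT fits within the 20 ms budget."""
    return connect_rtt_ms(host) <= RTT_BUDGET_MS

if __name__ == "__main__":
    host = "example.com"  # placeholder endpoint
    print(f"{host}: ~{connect_rtt_ms(host):.1f} ms, feels local: {feels_local(host)}")
```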

Designing a Scalable Edge Infrastructure with Scale Computing

One of the main differentiators for edge computing is developing a scalable architecture that works everywhere, from deployment to support to updates. This episode of Utilizing Edge welcomes Dave Demlow of Scale Computing to discuss the need for a scalable architecture at the edge. Scale Computing discussed Zero-Touch Provisioning and Disposable Units of Compute at their Edge Field Day presentation, and we kick off the discussion with these concepts. We also consider the undifferentiated heavy lifting of cloud infrastructure and the tools for infrastructure as code and patch management in this different environment. Ultimately, the differentiator is scale, and the key challenge in designing infrastructure for the edge is making sure it can be deployed and supported at hundreds or thousands of sites.
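
As a hedged illustration of what declarative, zero-touch provisioning looks like at fleet scale (a sketch under assumed names, not Scale Computing’s actual tooling or API), every site is described by the same version-controlled spec and new nodes simply pull and apply it on first boot:

```python
# Illustrative zero-touch provisioning sketch (hypothetical, not a real API):
# every edge site is described by the same declarative spec, and a new node
# applies it unattended on first boot.
from dataclasses import dataclass

@dataclass
class SiteSpec:
    """Desired state for one edge site, kept in version control."""
    site_id: str
    image_version: str
    apps: tuple[str, ...]

FLEET_DEFAULT = SiteSpec(site_id="template", image_version="2024.1", apps=("pos", "telemetry"))

def provision(site_id: str, template: SiteSpec = FLEET_DEFAULT) -> SiteSpec:
    """A new node phones home, receives the template, and applies it unattended."""
    spec = SiteSpec(site_id=site_id, image_version=template.image_version, apps=template.apps)
    # Apply the image, join the cluster, deploy apps: no hands on keyboards at the site.
    return spec

# The same call scales from three sites to thousands.
fleet = [provision(f"store-{n:04d}") for n in range(3)]
print(fleet)
```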

Developer-Friendly Edge Application Management with Avassa

There is a long-standing gulf between developers and operations, let alone infrastructure, and it is made worse by the scale and limitations of edge computing. This episode of Utilizing Edge features Carl Moberg of Avassa discussing the application-first mindset of developers with Brian Chambers and Stephen Foskett. As we’ve been discussing, it’s critical to standardize infrastructure to make it supportable at the edge, yet we must also build platforms that are attractive to application owners.