
Considering the Diversity of Edge Solutions with Brian Chambers and Alastair Cooke

Edge computing comes in many forms and brings many challenges, including bandwidth limitations, network reliability issues, and limited space. This episode of Utilizing Edge brings Brian Chambers, Alastair Cooke, and Stephen Foskett together to discuss the state of the edge in 2023. Industries like retail, multi-tenant environments, and industrial IoT find practical applications, but defining the edge remains an ongoing exploration. Implementation varies, from repurposing existing technologies to adopting modern approaches like containers and function as a service. The debate between virtual machines and containers continues, driven by organizational comfort. Despite constraints, edge environments offer greater control and accountability. The future promises more innovation and adoption, cementing edge computing’s significance in the tech landscape.

Stephen Foskett, Organizer of the Tech Field Day Event Series, part of The Futurum Group. Find Stephen’s writing at GestaltIT.com, on Twitter at @SFoskett, or on Mastodon at @[email protected].

Alastair Cooke, independent analyst and consultant working with virtualization and data center technologies. Connect with Alastair on LinkedIn and Twitter. Read his articles on his website.

Brian Chambers, Technologist and Chief Architect at Chick-fil-A. Connect with Brian on LinkedIn and Twitter. Read his blog on Substack.

Follow the podcast on Twitter at @UtilizingTech, on Mastodon at @[email protected], or watch the video version on the Gestalt IT YouTube channel.

Transcript:

Stephen Foskett: Welcome to Utilizing Tech, the podcast about emerging technology from Gestalt IT. This season of Utilizing Tech focuses on edge computing, which demands a new approach to compute, storage, networking, and more. I’m your host, Stephen Foskett, organizer of Tech Field Day and publisher of Gestalt IT. Joining me today for this special mid-season check-in are Alastair Cooke and Brian Chambers. Welcome to the show, Alastair.

Alastair Cooke: Thanks Stephen, it’s a joy to be here. It’s been a while since I’ve been on one of the episodes, since I’m not the only co-host on the show. Nice to be here with Brian as well.

Stephen: Yeah Brian, it’s nice to see you again.

Brian Chambers: Yeah, good to see you guys. Alastair, it’s been a while, great to be back and catching up mid-season to talk about some of the things that we’ve learned so far, so I’m looking forward to it.

Stephen: Yeah, and I think that’s kind of where we’re going to go today. We’ve recorded a few episodes, we’ve talked to many of the companies that we saw at Edge Field Day earlier this year, and, if you’re paying close attention, we’ve talked to a bunch of the companies we probably will see at Edge Field Day later this year. We’ve also talked to a bunch of other folks from the community who are deeply interested in this, including a few episodes recently featuring our friends Allyson Klein, Gina Rosenthal, Andrew Green, and Roy Chua. All of these episodes have hit on a lot of different topics. So let me just throw this to you guys first: Brian, what’s the big takeaway that you’ve had so far from this season of Utilizing?

Brian: Yeah that’s a great question. Probably the thing that has come up the most times and that resonates the most with me is we spent a lot of time talking about the constraints that exist at the edge. There are many, right? They’re bandwidth related they’re network reliability related, they’re footprint related, environmental related, etc. In spite of those constraints, I guess the fact that there’s still a lot of people who are charging forward into this kind of frontier and who see a lot of value in building edge solutions and in the process of doing so it’s, been really interesting to see where the paradigms that we’re used to in the cloud or even data center worlds in the past have been similar or sometimes identical even, the same technologies existing in both places and in some cases where people are taking ground-up approaches and doing something completely new and different using, you know, new and emerging things. So it’s been interesting to see I guess that kind of juxtaposition of the uh the oldand the new uh coming together at the edge.

Alastair: I always think it’s interesting to hear how the things we’ve done in the past fit together to lead to what we’re doing now and in the future, the sort of anthropology or archeology of the technology: seeing the reuse of the technologies we had on premises and then in the cloud, and seeing how you do something similar but not quite exactly the same at the edge. One of the things I’ve taken from this series of episodes, and the diversity of conversations and vendors and expertise we’ve had, is that the edge is a very diverse thing; there isn’t a single defining characteristic for all of the use cases. The edge that Brian has is very much focused on retail and very constrained environments; other times the edge really means getting close to your customers in a multi-tenant environment, and we see lots of variation in that. I’m always intrigued by the solutions people come up with, the ways people use technologies that weren’t how most people are using them, or potentially how the vendor thought the technology was going to be used, and I hope we get to see some more of that as we carry on through the series.

Stephen: Yeah it’s been interesting because as we predicted in episode zero of this season, you know, what is the edge is probably one of the big questions that is going to come up. Well guess what? It came up, it came up a lot, you know we talked about I think the retail and restaurant edge a lot, thanks Brian. We’ve also talked a lot about other things, you know, I mean we’ve gone into networking, you know we had an episode about the near edge and the network edge as well as the middle Mile and connectivity. We talked about industrial IoT, we talked about as you say Al, kind of the the bringing you know basically the the tear down and built from scratch versus trying to have continuity. It’s a big, it’s a big picture, it’s a big place right?

Alastair: And there’s a lot of diversity in the approaches customers take. I mean I’ve said before the dirty secret of the edge is that often the thing that you’re running at the edge is an old Windows machine and that that drives the technologies and the things that you’re actually going to point out to those physical edge locations. But that’s not always the case. Sometimes you have the joy of building from scratch a whole new application that’s going to fit to how you’re doing business and sometimes you have to do that because of what the edge means. Some of the the edge use cases are a truck driving around with some logistics components attached inside that truck or unlikely to be running a whole bunch of Windows servers or as I saw at one retail environment a long time ago, I saw two warp servers 10 years after that product got discontinued, there is just this long tail of tech that affects a lot of people but there are also people who are bravely striking out and abandoning all of the sunk investment and saying I need to build it from New I need to build using containers or I haven’t yet seen people running functions as a service at the edge, at least not on the platform they built themselves. That is something that the AWS snow family will do for you. So there’s there is still that huge diversity of how people are approaching building applications for the edge.

Brian: I definitely agree with that. One of the key takeaways, as we’ve talked to a lot of people and reflected on a lot of these solutions, is that we have our edge environment in restaurants, but you don’t really want to do that unless you absolutely have to. You want to be putting all of your workloads in the places where you have the fewest concerns, and the edge is really one of the places where you probably have the most concerns, the most things you have to be responsible for, to manage, to deal with. So it’s kind of a sub-optimal solution from an overhead and management perspective, but sometimes justified by the potential business value it brings. But there are a lot of cases emerging where people are doing things at the edge that are like your Cloudflare Workers type edge or your CDN PoPs, your different solutions like that, and actually moving some of the compute, some of the business logic, some of the data storage out of a data center region or a cloud region into these closer-to-the-user locations and actually doing things there. But I do agree, I don’t hear people doing that on platforms they’ve rolled themselves. They’re doing it on top of either the hyperscale cloud providers or the other major solutions like Cloudflare that I mentioned, so that’s been interesting to observe as well.

Stephen: Yeah and that’s especially true in networking and edge network applications I think that there’s definitely a lot of that stuff happening, sort of like we talked about in the near edge. In other words at the edge sure, but not at the far edge, not on-premises, but you know at the edge of the network and at network points of presence and so on. And as you say I think that it’s possible that we could see more and more applications inhabiting those sort of remote data centers, you know the Equinix kind of model and as well as the Cloudflare model which I mean where does Cloudflare live? I don’t know, everywhere? You know and that’s the power of it right, it’s that it does live everywhere and Al you know to the point about function as a service, sure I mean you know you think about Cloudflare workers and things like that um very very powerful stuff that doesn’t exist and certainly doesn’t exist in the data center, I think you could classify that as edge right would you?

Alastair: I would definitely consider functions running very close to users as edge, and Cloudflare likes to say they’re within 10 milliseconds of ninety-something percent of the population. That number changes periodically, but they are close to your users, which is, I think, one of the most fundamental definitions of edge, so yeah, Cloudflare is an interesting edge. And it gets away from some of the constraints that concern Brian, in that the compute resources are sitting in a trusted, controlled location rather than sitting out in a near-public location like a quick service restaurant. So you can see there are some huge benefits to being in those more secure, more regulated, more easy to automate locations, places that are much more cookie-cutter deployments than you can even get in a quick service restaurant, because Brian, I don’t imagine every single one of the restaurants has exactly the same services and that you can deploy exactly the same applications to every location you have. But this is the idea of these near edge platforms such as Cloudflare, or running things like Lambda at the edge, and I may be AWS biased because I teach their training courses, but also things like the AWS Wavelength zones where they’re putting AWS services inside cell points of presence for 5G networks. These are all elements of edge where the constraints of being on-premises are not causing you so much grief, yet you’re getting nice and close to your users.
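
For readers who haven’t touched this kind of near-edge function platform, here is a minimal sketch of the sort of workload Alastair describes, written against the Cloudflare Workers module syntax with types from @cloudflare/workers-types. The KV binding name and the greeting logic are illustrative assumptions, not anything from the episode.

```typescript
// A minimal, illustrative Cloudflare Worker (module syntax).
// EDGE_CACHE is a hypothetical KV namespace binding configured in wrangler.toml.
export interface Env {
  EDGE_CACHE: KVNamespace;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // request.cf carries metadata about the Cloudflare PoP that handled the
    // request; "colo" is the IATA code of that data center.
    const cf = request.cf as IncomingRequestCfProperties | undefined;
    const colo = cf?.colo ?? "unknown";

    // Read a value from edge-local KV storage; fall back to a computed default.
    const cached = await env.EDGE_CACHE.get("greeting");
    const body = cached ?? `Hello from the edge, served by PoP ${colo}`;

    return new Response(body, {
      headers: { "content-type": "text/plain" },
    });
  },
};
```

Deployed with Cloudflare’s Wrangler CLI, the unit of deployment here is a function, not a VM or even a container, which is the distinction the conversation keeps returning to.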

Stephen: You know, Al, you bring up a really good point, and I love what you said, that the far edge is probably the worst place, I’m paraphrasing you, the worst place to be running this stuff, because so many of the conversations we’ve had at Field Day and here on the show have been about trying to overcome those worst-place problems: security, reliability, cost, bandwidth and interrupted communications, power, zero-touch provisioning, and all these things. If you look at it from that perspective, the reason we’re doing these things is because it just is such a bad place to be running this hardware. And yet that’s where we kind of have to run it if we have to solve for some of those things, and it’s almost like a self-fulfilling prophecy, right? If we have to stay up even if connectivity goes down, and connectivity might go down, ergo we have to run it at the far edge, right? Is that the kind of thought process that you think drives this stuff to the far edge, even if that’s not really the best place to be running it?

Brian: Yeah I think that’s exactly right, Stephen. So in our case for example which I can speak best about, it’s been about business continuity and making sure that we can run the things that are critical at any time and so that’s what drove us to do that. Again I’ll say it over and over again, we didn’t really want to have to do that, it was just something that we needed to do based on the types of solutions that we wanted to deploy. The more and more we can take advantage of you know, The CDN points of presence or the AWS wavelengths or those types of services or the cloud itself, the better. So that’s our default and that’s where we want to run things if at all possible. But there are cases for us and we’ve heard a lot of stories from others, whether it was here or Edge Field Day you know I always think of the remote oil field use case where things need to run but there may just be no connectivity. So these great solutions that are you know really close to most users, these are some of the users that they’re not close to and they have no options, so they have to have some sort of on-prem solution to sustain their business and so they’re investing in these things to make that happen. So I think that’s what drives people there not because it’s the best place to run a workload or it’s the most fun thing to do or all the problems are solved and it’s just turnkey. I think it’s you’re driven there by business need and then you have to solve a bunch of problems to make that work and you know that’s why we see a lot of these solutions emerging. So I would have maybe thought it was a niche problem a little bit but it seems like we’ve seen a lot of cases where you know, commercial vendors are building solutions for these reasons. You know and then companies are adopting them because they feel this sense of need to not be entirely connectivity dependent for their solutions. So that’s kind of what I’m seeing overall.

Alastair: I think it is absolutely a niche product, in that there’s a series of specific things that drive you to requiring an edge solution, and they’re not all the same for all customers, but there are so many of these different niches: ships at sea, public access kiosks, photo printing, a whole collection of them that require a paradigm that’s not well served by cloud plus on-premises. That’s what’s driving vendors to see an opportunity to build a platform that will work across multiple of these different use cases, these niches, and make it easier to actually deliver to the relatively hostile location of the far edge, or to deliver services in more near edge locations, where it can lower the latency if your users don’t all come to one place, right? Far edge tends to be predicated on your users coming to a relatively small number of locations: they buy the food at the restaurant, they don’t buy the food as they’re walking down a country lane. Whereas with the near edge solutions, your users are all over the place and you need to deliver a service close to them. That seems to be one of the separations of use cases, and we see platforms that are really well suited to the far edge, and quite different, much more data center-like and cloud-like platforms being delivered into the near edge.

Stephen: And spoiler alert for a future episode: another use case we’re going to be talking about is media and entertainment, specifically production, and the fact that there it’s not about business continuity as much as it is about throughput, and how you get data from point A to point B. There are a bunch of different solutions for that, you know, bulk data. I’ve got a couple of companies we’ve been talking to that are doing some very cool things applicable to media, to oil and gas exploration, to, as you mentioned, ships at sea, autonomous driving, things like that, all sorts of really cool use cases where, yeah, it’s got to be at the edge and connectivity is important, but it really is a throughput question more than anything. So we’ll see about that. Another thing I wanted to get into, though, which came up when we started this conversation, is the adaptive versus starting-from-scratch question. Let’s turn the page and talk about that, because it’s come up again and again in our conversations, to the point that when we talked to a company like VMware, I think the conclusion was you’ve got to be able to run VMs at the edge because you just can’t start from scratch. Is that true? I guess first, are VMs inherently linked to this adaptive mode, versus containers and cloud functions that might be more suited to starting from scratch? Is that logical? Is that true? Is that necessary? I don’t know. Al, you brought it up, so do you want to dive in?

Alastair: So I’ve on recent, well last year’s briefings and writings, I upset VMware by describing building things in virtual machines as being a legacy approach and there’s a negative connotation to legacy but there’s also a very positive connotation to legacy. That’s what we normally refer to as production. For the majority of organizations, production applications running virtual machines with their on-premises or in the cloud and so there’s a huge sunk investment of quantifying how your business operates written into these Windows applications and one of the hardest things when you have to move to a new platform is if you have to rewrite those applications and whenever you’re moving to something like functions as a service or using fully rich services from a cloud, you can only do this by rewriting your code containers. Seems to have been the last time we were just able to repackage our code and ship it as it was so I think VMware is right. There are a huge number of locations where the business value is delivered by sitting on top of the existing business value being delivered by virtual machine based applications but if there isn’t a virtual machine based application, that will deliver the business value. I don’t think many organizations are starting from scratch and writing that business application as virtual machines instead of writing containers and functions, they’re writing smaller units of code rather than having big monolithic code bases. So it’s the innovator’s dilemma if you sunk a lot of effort into getting a lot of value out of one piece of technology, switching to a different piece of technology is going to remove all of that value and that differentiation in the market. So we definitely see this continuing to use virtual machines where you’ve already got virtual machines, but starting from the beginning. You possibly aren’t going to be using virtual machines.

Brian: Yeah, I tend to agree. I think it will depend a little bit on the customer and the solution. If people are doing a completely greenfield approach, if they have the luxury of doing that, I think they’re probably more inclined to go the route of something like containers, or something even smaller, you know, WASM, maybe even things like functions. But I think the reality is that the majority of organizations are probably not finding themselves in those situations; they’re probably starting with something, and what we’re calling edge might even be a modernization effort in a lot of places, trying to take something really legacy and bring it to something that gets some of the benefits of the cloud, but maybe not all of the ones that could potentially be achieved, maybe not going all the way to containers. So I do think companies that already have virtual machine strength internally, and maybe already have virtual machines running in some sort of edge location, are probably going to keep using virtual machines in a lot of cases, and I think that’s probably perfectly fine, assuming the ability to manage them in those places is there. But I think greenfield people are probably not going to go that route quite as much, because there are some benefits to containers, and it’s the way people are used to developing in the cloud. So I think that’s what the greenfield ones will do, but there’s going to be a hybrid of both, and probably some new emerging things over the next several years. And again, all of this is contingent on what we mean when we say edge. Do we mean the far edge, like a restaurant location, or do we mean something like a Cloudflare Worker, which is already super modern and can just take a function and run it? You’re not going to see VMs there, I don’t think, but you probably will see a lot of them in the far, distant edge locations for some time to come.

Alastair: I think there’s really an important aspect that you touched on in there is the organizational comfort, that adopting new technologies is not always natural to an organization and so often continuing to use the tools that the teams and the developers and the management structures and all the things we’re used to working with is much easier for an organization and sometimes that even drives down to a financial effect of virtual machines tend to run on platforms where we buy them and run them for three, five, seven years, more transient workloads may run on platforms that you pay per execution, per run time rather than for multiple years and so that, again, that difference between RPX and Capex oriented is often significant right. Lots of non-technical requirements, lots more that are more about how the organization operates and its comfort.

Brian: Yeah, and let’s not forget that these are not mutually exclusive paradigms. You can run, you know, k3s with containers inside of a virtual machine, so they can co-exist as well. I would expect you’d see a lot of people take that hybrid approach: maybe they keep running VMs at the foundation so they can do VM-based apps, but then they can also start to introduce new paradigms if they make business sense, and bring containers and other types of things along if they need them. I think the tipping point is going to be when you see vendors bring solutions that people want to buy off the shelf and put in edge locations that actually assume a container is the unit of entry, as opposed to a virtual machine, and I haven’t seen that yet. I don’t know if you guys have, but I haven’t seen anything like that tip the scales.

Alastair: I have I’ve seen that with things like higher self where their standard deployment is a Kubernetes cluster and so they’re expecting you to run just containers and what’s the standard time right the future is here, it’s just not evenly distributed. Some people are running all containers everywhere and so those platforms that don’t support virtual machines, they don’t care about. But I think the majority of organizations have a much longer legacy and so they are still getting value out of those legacy platforms that they build using Windows tools, sometimes Linux tools in virtual machines in the past.

Stephen: Yeah, one of the things that came up on the last episode, when we were talking about dark data, is this interesting paradox for me when it comes to edge: in a way, edge environments are naturally somewhat better controlled than data center or cloud environments, simply because they are the domain of the business, because they have to be. Let me connect the dots here and see if this makes any sense to you. If edge is really not the best place to deploy things, and I would personally say that cloud or the data center is probably the best place, then the only reason you’re going to deploy at the edge is because you have to, because of the needs of the business. It’s not going to be, “oh man, you know where we should run this?” No, absolutely not. The business is going to say, look, either the volume of data is too great, or our needs for business continuity and high availability are too great, or our users are too distributed, or whatever. There’s a compelling reason to deploy things here, so we’ve got to do it. What that means, and like I said this came up during the dark data discussion but it’s also come up all season long, is that with apps it’s not a free-for-all. It’s not the wild west, it’s not like your desktop where people could be running literally anything at any time, completely unconstrained. The only things that are going to be running there are things that need to run there, which means there is greater control, or at least greater accountability: this line of business said we need to run this application here, this third party provider said they need to run this application here, we decided we need to collect this data here. Because of that, it actually is a much more controllable and predictable environment, which means we could actually have more greenfield at the edge than we do in other areas, simply because it’s really only going to be three or four or five business units and three, four, five applications, whatever they need to run. I don’t know. Does this make any sense to you guys at all?

Alastair: I think so, and I think it comes through from what we saw at Edge Field Day 1 with Mako Networks, that they were very strong on the idea that everything is about governance, everything is about control, everything is about security. And whilst they didn’t yet extend out into the compute layer, that was simply making sure the network layer was very tightly locked down; the approach being that this is a scary, difficult place, so we must be very careful about it. We can’t be complacent, we must look at all of the details, we must be sure that we’re delivering business value, because this is a relatively high risk activity. I think there is something in that: because it’s such a hostile and constrained place, we give it more attention, whereas if I’m just deploying new virtual machines on my data center virtualization platform, I’m not too concerned about it. I’m not going to think so deeply about it. I’m just going to casually deploy a new virtual machine, which is what I’m going to do this morning once I finish here.

Brian: Yeah, I think it probably forces a greater degree of management than you’re forced to do in the cloud, when we talk about the far edge at a remote location. In the cloud you don’t even have to care how big your container image is; it’s 17 gigabytes, who cares? In a far edge scenario, 17 gigs should really scare you, because if you’re talking about bad, slow connections, it might take you weeks to actually successfully download that thing, if ever, to be able to run it and meet the business needs. So you have to govern things that you can take for granted, or even be sloppy about, in the cloud; you’ve got to be very precise about how you do them in an edge environment. And then of course the things you guys mentioned about security. It should probably scare people a little bit that they potentially have their business data sitting in a very remote location, or in an office somewhere, where it could be taken; there’s no physical security to stop that. Those kinds of things hopefully drive a lot of thinking about better encryption solutions and things that get you parity with a data center or cloud environment through some other means. But it forces you to think about that stuff differently, which maybe actually makes it a better managed environment, even though I don’t think that necessarily gives you a different outcome. I think we’re saying it makes you do more work and pay more attention to get the same benefits you get most of the time from the cloud, but you have to do it.
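
Brian’s 17-gigabyte example is worth making concrete. Here is a quick back-of-the-envelope sketch; the link speeds and the smaller image size are assumptions chosen for illustration, not figures from the episode.

```typescript
// Rough, idealized transfer time for pulling a container image over a
// constrained edge link; ignores retries, protocol overhead, and contention.
function transferTimeHours(imageSizeGB: number, linkMbps: number): number {
  const bits = imageSizeGB * 8 * 1e9;      // image size in bits
  const seconds = bits / (linkMbps * 1e6); // seconds at the given throughput
  return seconds / 3600;
}

// 17 GB over an effective 1 Mbps link: roughly 38 hours of uninterrupted transfer.
console.log(transferTimeHours(17, 1).toFixed(1), "hours");

// The same image at 256 kbps: roughly 6 days, before any failed retries.
console.log((transferTimeHours(17, 0.256) / 24).toFixed(1), "days");

// A trimmed 150 MB image at 1 Mbps: about 20 minutes.
console.log((transferTimeHours(0.15, 1) * 60).toFixed(0), "minutes");
```

Which is presumably why image size, incremental updates, and local caching come up so often in far-edge designs.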

Alastair: So Brian, while you’re getting scared about things, are you scared about Intel deciding they’re not going to build any more NUCs, and that they’re going to hand the NUC over to other vendors?

Brian: That was a day of great sorrow for the Intel NUC, very sad. But no, I’m not super concerned. It looks like it’s become a bit of an open spec, for lack of a better term. I know there’s been a lot of news about ASUS taking over and delivering those, so I think the NUC is going to live on, which is great, because a lot of people have liked them for solutions like ours that are super lightweight, or just for prototyping and things like that. So I’m glad they’re going to live on. I’m sad it won’t be the Intel NUC anymore, but I’m not really scared about it. I’m glad to see there’s going to be some continuity and they’re going to be available elsewhere, from a business continuity perspective for sure. But yeah, it’ll be interesting to see how that evolves and how it goes forward. What about you?

Alastair: Well I’ve had collections of NUCs in the past and have moved towards things that are a little more powerful in my shed quarters at home, but you know the NUCs are, have always been interesting. I do still have one sitting in in the bottom of a rack and the shed quarters, but what I’ve seen is as a proliferation of similar designs, so it’s not that Intel needed to necessarily be the leader in producing these things. We’ve seen lots of designs coming out of the minor vendors that are more Compact and providing a lot of compute resource in a relatively small space. It’s definitely seen a resurgence in the idea of using this small volume PCs for hobbyist kind of applications where Raspberry Pi’s got short. So yeah I’m not concerned that the that Intel is stepping out of this market is a problem. They’ve defined a market and now the rest of the market is going to carry on producing in it.

Stephen: Yeah, I would second that, Alastair. I think Intel’s first contribution was to show that a small form factor device made sense, and it certainly does. The greatest thing about the NUC is that it’s small, it’s low power, it’s cheap, and it’s a real PC that runs everything you need to run. Fantastic. The second contribution, though, and I think this should not be overlooked, and they deserve a big pat on the back for it, is the continuity of support and sales. The best thing about the NUC, in my mind, is that it’s not some weird company nobody’s ever heard of that might go out of business or drop the product. The thing has been supported. They’ve got BIOS updates, they’ve got patches, it doesn’t run a bunch of bloatware and weirdness; basically, it’s a really solid, supported, useful platform, and Intel has never been shy about sharing technical details, specifications, driver updates, etc. It’s all widely available, it’s all easy to access, and for me as a user that’s the thing I loved about this platform. But I think the coolest thing is that Intel has shown the world that there’s a market for these things, and so they kind of don’t need to do it anymore, because essentially the whole point of the NUC business unit, in my mind, was to show the world you could make a product like this, and then the world made a product like that. I think they’ve handled it very professionally in terms of saying, okay, we’re not going to be making these anymore; here’s the design, ASUS, you can take on the support for this; other companies, you’re free to make your own designs like this; and we’re going to continue to support what we support. Hopefully those other companies have gotten the message that it’s not just about making a mini PC, it’s about making a mini PC that is industry standard and well supported, and that, I think, is the killer thing. Brian, is that what attracted you to the platform?

Brian: Yeah, I think you nailed it. And generation to generation there’s a high degree of consistency, too. It’s not some radically, whimsically different thing or form factor; there’s not, like you said, a lot of bloat, not a lot of extra stuff you have to deal with. I actually think its simplicity is the key. It’s about as simple as you can get, and I think that’s what people love, because we have our use case for it, but I hear of tons of NUCs behind digital menu boards and kiosks and different things like this. They just need to run an application and do it reliably, and they’ve been phenomenal at doing that. So I think you hit exactly on the reason it’s been attractive to enterprises. Even though it more or less feels like a consumer grade device, it’s been attractive to organizations because it gives them something they can build on top of easily, and they get the results they expect in terms of the tech, the organization behind it, and the way they’ve shown up. So I think that’s been awesome, and I hope it continues.

Stephen: Yeah, I really hope it continues. I hope I’m not off base in saying that companies have learned that lesson. I’m a little worried that I am, because with some of the big vendors of products out there, there’s a lot of abandonware: we’re not going to produce a BIOS update, or there hasn’t been an update for this thing in ten years, or whatever. I know the NUC hasn’t been out that long, but you know what I mean. I hope the vendors are hearing that people need consistency, they need supportability, and they need to not hide things behind paywalls and so on, but we’ll see. I guess the cat’s out of the bag on that one. The other thing, of course, that I loved about it is that they used industry standard components. They were very strong proponents of that, in terms of expandability, but also in terms of, you know, what network chip is this, and things like that. It was just the standard stuff that works. So, looking forward to the second half of the season, looking forward to another Edge Field Day event, and of course looking forward to a future without the NUC. What are you looking at for the second half of 2023, Brian?

Brian: Yeah, I’m really interested to see, we all hear about AI every day especially LLMs and such. I’m really interested to see what we learn about AI use cases at the edge, like are they real, are they really coming, are they really just cloud use cases that a user uses somewhere or are there real ones that maybe move out from cloud data centers into you know less resourced environments whether that’s you know sort of midterm edges or like far edge solutions like what we have at Chick-fil-A. So I’m curious to see that. I hear a lot about it drivers like computer vision, you know, voice is a new interface for people to interact with applications and again, we go right back to the same questions of you could do it in the cloud but the continuity question comes up or these things that are customer facing and in the critical flow, do you have a good network and you know and other things you can depend on or do you end up running that locally to try and make sure you deliver the best possible experience to customers and such. So I’m really curious about that another completely different realm for the same thing would be you know the whole self-driving car world there’s compute there and and it’s living at a disconnected or semi-connected edge in a lot of cases, so curious to see a lot of those things and how those continue to develop. So I think that whole AI thing is IT hype or is it real that’ll be fun to dig into more and talk more about it.

Alastair: Yeah, seeing some real business value being delivered with AI at the edge would be something; I’m not sure we’re confident it’s there yet, and it would be nice to hear from some customers and people who are actually receiving that business value. I’m also interested in what technologies you need in order to get that business value, because we know things like driverless cars use dedicated hardware to deliver the AI, using things like the Jetson from NVIDIA, and I’m interested to see whether that means all of the AI components are going to be on their own special, separate platform, or whether there’s going to be a more general purpose platform that can deliver that AI at the edge. I’m always interested in strange ways customers have used products to solve their business problems, because it all has to tie back to that business problem; as we were talking about before, you only run your applications out at the edge because it solves business problems, it brings business value, and I’m interested in seeing how that value can be delivered in unusual and innovative ways. What I hope we’ll see over the progression of this series is also some looking at the interface between IoT and the edge: the autonomous devices that are generating data and generating insights, or at least data we can generate insights from, things like cameras doing AI analytics and face recognition inside the camera itself and then feeding back into a more central system. Because of course you don’t just have one camera, you might have 30 cameras at a site, so there needs to be a coordinating intelligence as well as the individual intelligence on the cameras. That kind of integrated system, a collection of IoT devices plus some edge devices, possibly even, we can throw them into the mix, near edge to make it easier to get out to your far edge: that integrated solution, that platform for building your future, I think is an interesting part of where edge is going and where we’re seeing vendors building.

Stephen: Yeah, I would agree with you both on those. I definitely am interested in seeing where these things go. Another thing I’ll be watching, especially short term, is the level of interest from hyperscalers and service providers, network service providers, in this market. I’m talking more and more to the cloud and network companies, the familiar names in the industry, who are actively, aggressively rolling out products targeting edge environments, targeting exactly what you both have talked about here. In the second half of the year we’re going to see conferences from Google and from Amazon Web Services, and I guess I don’t need a show of hands for how many people think AWS is going to introduce more edgy stuff at re:Invent this year. I think we can all agree they’re probably aware of this market. I also think that they’re, I don’t want to say scared, but they realize they need to have a big play in this market too, because it is sort of exploding around them. The competition from the network service providers is huge, along with the traditional service providers and telco companies, the competition from new ideas at the far edge is huge, and I think Amazon especially realizes they need to play there, as do Microsoft and Google. So I definitely think that’s something we’re going to see a lot more of in the second half of this year and going forward. And of course I’ll be keeping an eye on what happens next in the post-NUC world; certainly people are talking about Arm and RISC-V along with x86 platforms from companies like Dell, HPE, Lenovo, and Supermicro, so many companies doing so many cool things with the hardware as well. So on the one hand you’ve got hyperscalers, on the other hand you’ve got hardware. I think that’s what makes this thing interesting, and of course self-driving cars and NVIDIA Jetsons and all sorts of crazy things, so it’s really an interesting space. I really appreciate y’all joining me on the podcast this season, at Edge Field Day earlier this year and hopefully later this year, and just generally publishing and speaking about this really interesting world of IT. So thank you both for joining us. As we wrap this up, tell us where we can connect with you, and also what you’re interested in, your key topics for the second half of the year. Alastair, why don’t you go first?

Alastair: Sure, you can find me online as DemitasseNZ, and at demitasse.co.nz. I’ll be at VMware Explore next month, which is coming up awfully fast, and I’m looking forward to catching up with people in the community there. My interest continues to be around how people solve problems at the edge, and I have some thoughts about that that I really need to get out and get published. It’s been a little while since I’ve published anything on the blog, so I’ll take an action out of this to get some more things written.

Stephen: How about you Brian? What’s new in your world?

Brian: Yeah, well, as for where folks can find me: first of all, Brian Chambers on LinkedIn, you can find me there. On Twitter as well, b-r-i-c-h-a-m-b, if you want to find me there. I’m still writing my Chamber of Tech Secrets on Substack, which has actually been a lot of fun. It’s been great to challenge myself to come up with something to talk about once a week and then try to write something thoughtful, so that’s been good. That’s at brianchambers.substack.com. I’m interested in a lot of things as it relates to the edge, very similar to what we talked about. I’m really interested in the problem of observing the area around you, so think about, for us in a restaurant, how do you really know operationally what all is happening? There are a bunch of interesting things there with computer vision or lidar or other tech like that, and I’m really interested in exploring that problem more and seeing how edge supports it, or whether it’s less needed, whether we can do things without as big of a footprint as we currently have. So I think there are some interesting things to think about there as it relates to the edge through the second half of the year. That’s what I’m interested in right now.

Stephen: Yeah, thanks. And Alastair, I’ll be at VMware Explore as well. I’m also going to some other things, some storage conferences, Flash Memory Summit, Storage Developer Conference, and I’ll probably be at re:Invent, and I’m going to be using all of this to continue to learn and grow. You can find me online at SFoskett; I’m pretty active on Mastodon right now, so look that up if you want, that’s been my chosen post-Twitter/X platform, along with of course LinkedIn, and I would love to catch people there. So thank you very much for listening to Utilizing Edge, which is part of the Utilizing Tech podcast series. If you enjoyed this discussion, we would love to hear from you; please reach out to us on your favorite social media site. Yeah, I’m still on the Twitters, LinkedIn, email, carrier pigeon, whatever you’ve got, we’d love to hear from you. Also, please do give us a rating, give us a review, and subscribe in your favorite channel. You can also find us on YouTube, where we would love to hear from you too. This podcast is brought to you by GestaltIT.com, your home for IT coverage from across the enterprise. For show notes and more episodes, head over to our dedicated site, utilizingtech.com, or find us on Twitter and, yes, Mastodon at Utilizing Tech. Thanks for listening, and we will see you next week.
