Industrial cameras and sensors are generating more data than ever, and companies are increasingly moving machine learning to the edge to keep pace.
AI applications typically require massive volumes of data and multiple devices within the datacenter.
Training and optimizing a machine learning model takes substantial compute resources, but what if we used ML to optimize ML? Luis Ceze created Apache Tensor Virtual Machine (TVM) to optimize ML models and has now founded a company, OctoML, to leverage this technology.
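As a rough sketch of the compilation flow TVM provides (not Ceze's exact workflow), the snippet below imports a model into TVM's Relay IR and compiles it for a CPU target; the ONNX file name and input shape are hypothetical placeholders, and the ML-driven tuning alluded to above is TVM's auto-tuning machinery layered on top of a flow like this.

```python
# A minimal sketch of compiling a model with Apache TVM, assuming a
# hypothetical ONNX file "model.onnx" with one float32 input named "input".
import onnx
import tvm
from tvm import relay
from tvm.contrib import graph_executor

onnx_model = onnx.load("model.onnx")        # hypothetical model file
shape_dict = {"input": (1, 3, 224, 224)}    # assumed input shape

# Import the model into Relay, TVM's high-level intermediate representation.
mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)

# Compile for a generic CPU target; TVM's optimization passes run here.
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target="llvm", params=params)

# Wrap the compiled artifact in a runtime module ready for inference.
dev = tvm.cpu(0)
module = graph_executor.GraphModule(lib["default"](dev))
```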
AI applications combine large data volumes with many concurrent clients, and conventional storage systems aren't a good fit.
AI and analytics need access to massive volumes of data, but we are constantly reminded of the importance of securing that data.
Most organizations have a vast amount of so-called unstructured data, and this poses a major operational risk.
Big data really wasn't all that big until modern analytics and machine learning applications appeared, but now storage solutions have to scale in both capacity and performance like never before.
When it comes to AI, it's garbage in, garbage out: a model is only as good as the data used to train it.
Biases can creep into any data set, and they can cause trouble when that data is used to train an AI model.
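One simple form of such bias is class imbalance. A trivial, hypothetical sketch of how that skew can be surfaced before training, using invented labels and an assumed representation threshold:

```python
# A minimal sketch of auditing class balance in a labeled training set
# before it is used to train a model; the labels are hypothetical data.
from collections import Counter

labels = ["cat"] * 900 + ["dog"] * 100   # hypothetical, heavily skewed labels

counts = Counter(labels)
total = sum(counts.values())
for cls, n in counts.most_common():
    print(f"{cls}: {n} samples ({n / total:.0%})")

# Flag any class that falls below a simple representation threshold.
THRESHOLD = 0.25  # assumed minimum share per class, for illustration only
underrepresented = [c for c, n in counts.items() if n / total < THRESHOLD]
if underrepresented:
    print("Warning: underrepresented classes:", underrepresented)
```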
Productive use of AI requires adapting existing models to new tasks through a process called transfer learning.
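As a minimal sketch of the idea, assuming PyTorch and torchvision (neither is named above), the snippet below reuses a pretrained ResNet-18 backbone and retrains only a new classification head for a hypothetical 10-class task.

```python
# A minimal transfer-learning sketch: freeze a pretrained ResNet-18
# backbone and train only a newly attached classification head.
import torch.nn as nn
import torch.optim as optim
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained backbone so its weights are not updated.
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer for a hypothetical 10-class task.
num_classes = 10  # assumed target task
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Only the new head's parameters are handed to the optimizer.
optimizer = optim.SGD(model.fc.parameters(), lr=1e-3, momentum=0.9)
```

Because only the small head is trained, this reuses the features the original model already learned, which is what makes transfer learning so much cheaper than training from scratch.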