The edge is an endpoint where data is generated through some type of interface, device or sensor. The technology itself is nothing new, but in light of rapid innovation across a myriad of categories, the edge has become a major growth business.
“The edge brings the intelligence as close as possible to the data source and the point of action,” said Teresa Tung, who is the Managing Director at Accenture Labs. “This is important because while centralized cloud computing makes it easier and cheaper to process data at scale, there are times when it doesn’t make sense to send data off to the cloud for processing.”
This is especially critical for AI: consumers and businesses want near-instant performance from their applications, and every round trip to a distant data center adds latency.
“Currently AI training produces vast volumes of data that are almost exclusively implemented and stored in the cloud,” said Flavio Bonomi, who is the board advisor to Lynx Software. “But by placing compute at the edge, this allows for looking at patterns locally. We believe this can evolve the training models to become simpler and more effective.”
The edge may even allow for improved privacy with AI models. “Having federated learning means that no end-user data is centralized or communicated between nodes,” said Sean Leach, who is the Chief Product Architect at Fastly.
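To see why federated learning preserves privacy, consider a minimal sketch of federated averaging using a toy linear model in NumPy. Everything here is illustrative (the function names and the model are invented, not anything from Fastly): each edge node fits its own private data and shares only the resulting model weights, which a coordinator averages. The raw data never leaves the device.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=20):
    """One node's gradient-descent pass on its private local data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w  # only the weights leave the device

def federated_average(global_w, node_data):
    """The coordinator averages weights; it never sees X or y."""
    updates = [local_update(global_w, X, y) for X, y in node_data]
    return np.mean(updates, axis=0)

# Simulate three edge nodes, each holding its own private samples.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
nodes = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.01, size=50)
    nodes.append((X, y))

w = np.zeros(2)
for _ in range(10):  # ten federated rounds
    w = federated_average(w, nodes)
```

The averaged model converges toward the weights that fit all the nodes' data, even though no node ever shared a single data point.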
What Can Be Done At The Edge
The most notable use case for the edge and AI is the self-driving car. The complexities are mind-boggling, which is why the development of this technology has taken so long.
But of course, there are many other use cases that span a myriad of industries. Just look at manufacturing. “In monitoring manufacturing processes where seconds or minutes could mean millions of dollars in losses, for example, machine learning models embedded in sensors and devices where the data is being collected enables operators to preemptively mitigate serious production issues and optimize performance,” said Santiago Giraldo, who is the Senior Product Marketing Manager of Machine Learning at Cloudera.
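In code, the pattern Giraldo describes can be sketched as a lightweight check that runs right where the data is collected, so an operator is alerted in seconds rather than after a cloud round trip. This is a hypothetical illustration (the class name, window size and threshold are made up, not Cloudera's implementation):

```python
from collections import deque

class EdgeMonitor:
    """Flags readings more than `k` standard deviations from a rolling mean."""
    def __init__(self, window=50, k=3.0):
        self.buf = deque(maxlen=window)
        self.k = k

    def check(self, reading):
        if len(self.buf) >= 10:  # need a baseline first
            mean = sum(self.buf) / len(self.buf)
            var = sum((x - mean) ** 2 for x in self.buf) / len(self.buf)
            std = var ** 0.5
            if std > 0 and abs(reading - mean) > self.k * std:
                self.buf.append(reading)
                return True  # anomaly: act locally, then report upstream
        self.buf.append(reading)
        return False

monitor = EdgeMonitor()
normal = [100 + (i % 5) * 0.1 for i in range(60)]   # steady sensor readings
alerts = [monitor.check(v) for v in normal]
spike_alert = monitor.check(250.0)  # simulated bearing-temperature spike
```

The steady readings pass silently, while the spike trips the alert immediately on the device, with no cloud dependency in the critical path.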
Here are some other examples:
- Chris Bergey, the Senior Vice President and General Manager of Infrastructure Line of Business at Arm: “AI and the edge can explore the impacts of urbanization and climate change with software-defined sensor networks, pinpoint the origins of power outages in smart grids with data provenance, or enhance public safety initiatives through data streaming.”
- Adam Burns, the Vice President of IoT and the Director of Edge Inference Products at Intel: “CORaiL, which was a project with Accenture and the Sulubaaï Environmental Foundation, can analyze coral reef resiliency using smart cameras and video analytics powered by Intel Movidius VPUs, Intel FPGAs and CPUs, and the OpenVINO toolkit.”
- Jason Shepherd, the Vice President of Ecosystems at ZEDEDA: “TinyML will enable AI in more appliances, connected products, healthcare wearables, etc., for fixed functions triggered locally by simple voice and gesture commands, common sounds (a baby crying, water running, a gunshot), location and orientation, environmental conditions, vital signs, and so on.”
- Michael Berthold, the CEO and cofounder at KNIME: “In the future, we will also see models that update themselves and potentially recruit new data points on purpose for retraining.”
- Ari Weil, who is the Global Vice President of Product and Industry Marketing at Akamai: “Consider medical devices like pacemakers or heart rate monitors in hospitals. If they signal distress or some condition that requires immediate attention, AI processing on or near the device will mean the difference between life and death.”
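Shepherd's TinyML example hints at how simple these locally triggered fixed functions can be. As a toy illustration (the threshold and function names are invented, and a real deployment would run a trained model on a microcontroller), a loud-sound trigger might reduce to a frame-energy check:

```python
def frame_energy(samples):
    """Mean squared amplitude of one audio frame."""
    return sum(s * s for s in samples) / len(samples)

def loud_events(frames, threshold=0.25):
    """Return indices of frames whose energy crosses the trigger threshold."""
    return [i for i, f in enumerate(frames) if frame_energy(f) > threshold]

quiet = [[0.01, -0.02, 0.015, -0.01]] * 3   # background noise
bang = [[0.9, -0.8, 0.85, -0.7]]            # a gunshot-like burst
events = loud_events(quiet + bang)
```

A check this small fits comfortably in the kilobytes of memory available on a battery-powered device, which is the point of TinyML.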
But successfully bringing AI to the edge will face challenges and likely take years to reach critical mass. “The edge has relatively lower resource capabilities in comparison to data centers, and edge deployments will require lightweight solutions focused on security and supporting low latency applications,” said Brons Larson, PhD, the AI Strategy Lead at Dell Technologies.
There will also need to be heavy investments in infrastructure and the retooling of existing technologies. “For NetApp, this is a large opportunity but one that we have to re-invent our storage to support,” said Ross Ackerman, who is the Head of Customer Experience and Active IQ Data Science at NetApp. “A lot of the typical ONTAP value prop is lost at the edge because clones and snapshots have less value. The data at the edge is mostly ephemeral, needing only a short time to be used in making a recommendation.”
Then there are the cybersecurity risks. In fact, they could become more dangerous than typical threats because of the impact on the physical world.
“As the edge is being used with applications and workflows, there is not always consistent security in place to provide centralized visibility,” said Derek Manky, who is the Chief of Security Insights and Global Threat Alliances at Fortinet’s FortiGuard Labs. “Centralized visibility and unified controls are sometimes being sacrificed in favor of performance and agility.”
Given the issues with the edge and AI, there needs to be a focus on building quality systems but also rethinking conventional approaches. Here are some recommendations:
- Prasad Alluri, the Vice President of Corporate Strategy at Micron: “The increase in AI also means that it's increasingly important that edge computing is near 5G base stations. So soon, in every base station, every tower might have compute and storage nodes in it.”
- Debu Chatterjee, the Senior Director of AI Platform Engineering at ServiceNow: “There will need to be newer chips with tensor capabilities seen in GPUs or their alternative, or specialized with specific inference models burnt into FPGAs. A hardware/software combo will be required to provide a zero-trust security model at the edge.”
- Abhinav Joshi, the Global Product Marketing Leader at OpenShift Kubernetes Platform at Red Hat: “Many of these challenges can be successfully addressed at the start by approaching the project with a focus on an end-to-end solution architecture built on the foundation of containers, Kubernetes, and DevOps best practices.”
When it comes to AI and the edge, though, the best strategy is probably to start with the low-hanging fruit. This should help avoid failed projects.
“Enterprises should begin by applying AI to smaller, non-mission critical applications,” said Bob Friday, who is the Chief Technology Officer at Mist Systems, which is a Juniper Networks company. “By paying close attention to details such as finding the right edge location and operational cloud stack, it can make operations easier to manage.”
But regardless of the approach, the future looks promising for the edge, and AI initiatives should weigh its potential use cases carefully to capture the full value.
Tom (@ttaulli) is an advisor/board member to startups and the author of Artificial Intelligence Basics: A Non-Technical Introduction, The Robotic Process Automation Handbook: A Guide to Implementing RPA Systems and Implementing AI Systems: Transform Your Business in 6 Steps. He also has developed various online courses, such as for the COBOL and Python programming languages.