Imagine a farmer in rural California struggling to increase productivity. The challenges he’s facing have led to dips in revenue, making it difficult to keep his business afloat. Now, imagine how different his life might be if his tractor, old or new, were connected to camera technology, an overhead drone, and artificial intelligence (AI), giving him the ability to scan his 300-acre farm, understand each crop’s health, and let his platform decide where to focus his limited resources. Supported by edge computing, that technology could help him mitigate connectivity challenges, simplify grueling tasks, and maximize yield in the blink of an eye.
It is a scenario that Kevin Hannah, vice president of strategy and product marketing at Epik Systems, a leading provider of complex technology solutions, knows well.
“Perhaps the augmented reality (AR) application is more future thinking, albeit something we believe will be critical in driving the adoption of precision farming by farmers who do not have the latest and greatest equipment,” Hannah said.
How Did We Get Here?
For those old enough to remember the company “computer system”, the first compute platform was a hulking piece of mainframe equipment connected to its users through dumb green-screen terminals.
In the 1990s, computers were getting faster, and not just the centralized servers but also the end-user PCs. The second compute platform was born: client–server. Processing could now occur locally.
However, “it could take us weeks to purchase and spin-up server hardware to support products. Business wasn’t always happy with IT’s ability to keep up with demand or the expense,” said Tim O’Neal, CEO of Epik Systems. “We were now building and managing our own data centers.”
Innovation arrived on the scene in the form of distributed processing solutions that could take advantage of often underutilized PCs and turn them into a grid that could, for example, help pharmaceutical companies with complex drug discovery research and development requiring brute force processing power. However, the type of application that could take advantage of this distributed compute capability was limited.
Cloud arrived as the third compute platform. “Demand for on-premise or near-premise enterprise computing is obliterated by the emergence of Software-as-a-Service (SaaS) and Public Cloud that re-directs the attention of the buyers of compute away from desktop or corporate server-based offerings,” O’Neal said. “And what do you know… we cycled back to what are effectively centralized servers, albeit in someone else’s data center, connected to unintelligent browser-based access.” Applications became containerized.
In the face of practical realities, including cost, security, and performance concerns, the emphasis on public cloud shifted to hybrid cloud, where organizations were encouraged to take advantage of both public and private deployments.
Today, Internet of Things (IoT) devices are proliferating, along with an explosion of the data they generate and the ability to use AI to turn that data into tangible business value. But moving this data from where it is created to where it can be used by AI, or by other disruptive technologies such as AR, is too slow and often too expensive.
“Latency is now a primary driving factor,” O’Neal said. “Processing at the Edge is here, and we cycle back to a distributed paradigm — the fourth compute platform, or a next-generation grid running containerized applications, that spans the entire ecosystem from Cloud to IoT.”
The Case for the Edge
Aside from the rural farmer in California struggling to keep his business afloat, there are several use cases where processing at the Edge would be beneficial, said Elizabeth Rose, vice president of marketing and public relations for Epik Systems.
For one, local governments working to become Smart Communities would benefit from the Edge because they could cut the amount of data they send across networks to the cloud, which reduces network congestion, transport costs, and latency.
In addition, “The latency issue pops up again in public safety scenarios, where AI is processing data regarding predictive maintenance for public transportation or computer vision data at large public gatherings,” Rose said. “In either of these cases, running AI algorithms at the Edge delivers information faster, allowing for faster decision-making and response times from maintenance crews or first responders.”
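That pattern, running the model where the data is born and sending only what matters upstream, can be sketched in a few lines of Python. The sensor read, the scoring function, and the cloud endpoint below are all hypothetical stand-ins, not a real public-safety system; the point is the shape of the loop, which decides locally and pushes only compact alerts over the network.

```python
import json
import random
import time
from urllib import request

CLOUD_ENDPOINT = "https://example.invalid/ingest"  # hypothetical collector URL
ALERT_THRESHOLD = 0.8                               # assumed risk cutoff


def read_sensor_frame() -> dict:
    """Stand-in for a camera frame or vibration reading captured at the Edge."""
    return {"timestamp": time.time(), "signal": random.random()}


def score_locally(frame: dict) -> float:
    """Placeholder for an AI model running on the Edge node.

    A real deployment would call an optimized inference runtime here;
    passing the signal through keeps the sketch self-contained.
    """
    return frame["signal"]


def send_to_cloud(summary: dict) -> None:
    """Forward only a compact summary upstream, never the raw data."""
    body = json.dumps(summary).encode("utf-8")
    req = request.Request(CLOUD_ENDPOINT, data=body,
                          headers={"Content-Type": "application/json"})
    try:
        request.urlopen(req, timeout=2)
    except OSError:
        pass  # the Edge node keeps operating even if the uplink is down


def edge_loop(iterations: int = 10) -> None:
    for _ in range(iterations):
        frame = read_sensor_frame()
        risk = score_locally(frame)
        if risk >= ALERT_THRESHOLD:
            # The decision is made locally; only a few hundred bytes
            # describing the alert ever cross the network.
            send_to_cloud({"timestamp": frame["timestamp"], "risk": risk})
        time.sleep(0.1)


if __name__ == "__main__":
    edge_loop()
```

Because both the decision and the raw data stay on the node, a congested or broken uplink delays reporting but never the response itself.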
Just like local governments, utility companies could benefit from the Edge by being able to continue service during an outage, or simply to act faster in an emergency.
Manufacturing is another sector where low latency would improve processes and product quality.
“The top drivers include the need for low-latency processing to improve quality assurance and predictive maintenance, less use of network bandwidth, security and data sovereignty, the ability to consolidate and connect on-premise equipment that may only have wired or short range wireless connections, and the ability to continue operations in the event of a broader network outage,” Rose said.
It may be difficult for the average person to understand the impact the Edge could have on our everyday lives, but the results are clear.
“It is not always apparent to human users what are the benefits of microseconds saved in a manufacturing scenario, for example,” Rose said. “But microseconds can save thousands or even millions of dollars or more in wasted raw materials or damaged manufacturing equipment.”
When to Tap the Edge
Hannah, vice president of strategy and product marketing at Epik Systems, said that when it comes to deciding where compute should run, every factor plays an important role.
“The evolution from on-premise to cloud to hybrid to Edge and IoT has resulted in a compute, storage and network topography best thought of as what we term a ‘Next-Generation Grid’ built using distributed compute architecture,” he said. “In this environment, there is no singular important vector. It is a balance between latency, cost, performance, resiliency and — not to forget — security and compliance.”
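One way to picture the balance Hannah describes is as a weighted placement score computed over candidate sites. The sites, numbers, and weights in this Python sketch are invented for illustration and are not Epik Systems’ method; hard constraints such as compliance act as filters, while the softer factors are traded off against each other.

```python
from dataclasses import dataclass


@dataclass
class Site:
    name: str
    latency_ms: float      # round-trip time to the data source
    cost_per_hour: float   # compute cost in dollars
    resiliency: float      # 0..1, higher is better
    compliant: bool        # meets data-sovereignty and security rules


# Illustrative weights only; a real planner would tune these per workload.
WEIGHTS = {"latency": 0.5, "cost": 0.3, "resiliency": 0.2}


def score(site: Site) -> float:
    """Lower is better; non-compliant sites are excluded outright."""
    if not site.compliant:
        return float("inf")
    return (WEIGHTS["latency"] * site.latency_ms
            + WEIGHTS["cost"] * site.cost_per_hour * 100
            - WEIGHTS["resiliency"] * site.resiliency * 10)


candidates = [
    Site("public-cloud-region", latency_ms=80, cost_per_hour=0.09,
         resiliency=0.99, compliant=True),
    Site("on-prem-datacenter", latency_ms=25, cost_per_hour=0.20,
         resiliency=0.95, compliant=True),
    Site("edge-node-factory-3", latency_ms=2, cost_per_hour=0.35,
         resiliency=0.90, compliant=True),
]

best = min(candidates, key=score)
print(f"Place the workload on: {best.name}")
```

With these made-up numbers the Edge node wins on latency despite its higher cost, which is exactly the kind of trade-off, rather than a single deciding vector, that Hannah describes.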
The picture becomes even more complicated when other factors, such as data sets, compute resource characteristics, and hardware environments, are taken into account.
“One thing is clear,” Hannah said. “We will need AI to manage the complexity. If we look at just the network, AI will be critical to network automation and optimization. Real-time, intelligent decisioning is required to support traffic characterization, meeting end-to-end quality of service.”
Looking forward, Hannah said he expects the Edge to become more common across all industries.
“This massive increase in connected devices will result in Edge deployments with hundreds of thousands of, or perhaps more, nodes that need to be managed unsupervised by IT staff,” Hannah said. “Again, AI, we believe, will be required to support the control plane for automation, management, Kubernetes orchestration of primarily containerized applications and security. The Edge has to be Smart.”
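Hannah’s point about unsupervised scale is, in effect, a call for a self-driving control plane. The toy reconciliation loop below uses hypothetical node records and a fixed rule where he envisions AI, and it sketches the idea rather than Kubernetes internals or Epik Systems’ platform: compare each node’s observed state with the desired state and queue corrective actions with no operator in the loop.

```python
from __future__ import annotations

import random
from dataclasses import dataclass

DESIRED_VERSION = "2.4.1"   # assumed target application version


@dataclass
class EdgeNode:
    node_id: str
    app_version: str
    cpu_load: float  # 0..1


def observe_fleet(size: int) -> list[EdgeNode]:
    """Stand-in for telemetry collected from a very large Edge fleet."""
    return [EdgeNode(f"node-{i:05d}",
                     random.choice([DESIRED_VERSION, "2.3.9"]),
                     random.random())
            for i in range(size)]


def reconcile(node: EdgeNode) -> str | None:
    """Decide, without human input, what action (if any) a node needs.

    A production control plane would use learned models here; fixed
    thresholds keep the sketch simple.
    """
    if node.app_version != DESIRED_VERSION:
        return f"roll out {DESIRED_VERSION} to {node.node_id}"
    if node.cpu_load > 0.95:
        return f"rebalance workloads off {node.node_id}"
    return None


if __name__ == "__main__":
    actions = [a for n in observe_fleet(100_000) if (a := reconcile(n))]
    print(f"{len(actions)} corrective actions queued with no operator input")
```

At the scale Hannah describes, loops like this one, hardened and driven by AI, are what would make the Edge “Smart.”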