Single-board computers (SBCs) have changed significantly over the last two decades. What used to be simple 8-bit CPUs with limited RAM have been transformed into quad-core data-crunching machines as small as a pack of gum. Now that edge computing is becoming more popular, how can SBCs benefit? This article looks at why artificial intelligence (AI) is being driven away from data centers and how SBCs are being made AI-friendly.
The Pros and Cons of AI
The use of AI in products continues to grow because of the many advantages it brings to the table. AI helps create products that can be customized for individual customers while simultaneously improving the product for all customers.
For example, the popular Google Assistant platform can learn and tailor its responses to an individual, and what it learns can then be generalized to customize responses for an entire group of users. This, in turn, improves the experience for all customers.
Getting AI into a product can be quite difficult. The most common implementation method is cloud-based AI, in which the main AI algorithm runs at a data center. Having customer devices send data to, and receive results from, a remote data center works, but it inherently introduces several issues for a robust AI system integration.
Privacy has become a primary and sensitive concern with AI. Sensitive information is sent to an unknown location where it could potentially be accessed by unauthorized individuals. Consider Amazon's popular Alexa-based products. Alexa has AI capabilities that let users ask questions and get responses, but like a telephone call, each question is sent to a data center for AI processing rather than being processed locally on the device. The privacy concern is that Alexa could potentially record conversations and store them without customer knowledge or consent, making them accessible to a wide range of Amazon employees with access to the AI data or systems.
Latency is the next issue. Products that use a remote data center need to send the data, wait for it to be processed, and then receive the result. Because no internet connection is instantaneous, there will always be some delay, and that latency varies with network traffic. Furthermore, as the number of internet users increases, so will system latency, which can make products feel unresponsive.
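The round-trip cost described above can be illustrated with a short simulation. This is a sketch, not a benchmark: the function names are hypothetical, the "inference" is a trivial stand-in, and the 150 ms network delay is an assumed illustrative figure, not a measurement.

```python
import time

def local_inference(query: str) -> str:
    # Stand-in for on-device AI: the only cost is local compute time.
    return query.upper()

def cloud_inference(query: str, network_delay_s: float = 0.15) -> str:
    # Stand-in for a cloud round trip: the same work, plus simulated
    # send and receive latency (0.15 s each way is an assumption).
    time.sleep(network_delay_s)   # request travels to the data center
    result = query.upper()
    time.sleep(network_delay_s)   # response travels back to the device
    return result

def timed(fn, *args):
    # Return the function's result along with its wall-clock duration.
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

_, local_s = timed(local_inference, "turn on the lights")
_, cloud_s = timed(cloud_inference, "turn on the lights")
print(f"local: {local_s * 1000:.1f} ms, cloud: {cloud_s * 1000:.1f} ms")
```

Even with a fixed delay, the gap is obvious; in practice the cloud figure also fluctuates with network congestion, which is exactly the unresponsiveness users notice.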
Another issue related to latency is internet access itself. An always-on device that relies on a remote data center needs a continuous internet connection. It is not unheard of for website providers and DNS servers to have hiccups that make sites inaccessible; when that happens, any product that depends on a data center will not be fully reliable. Locations with unreliable or limited data connections are therefore poorly suited to internet-reliant devices.
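One common mitigation is to degrade gracefully: try the cloud first, and fall back to simpler local processing when the connection fails. The sketch below assumes a hypothetical endpoint; the non-routable address 10.255.255.1 is used deliberately so the request fails, simulating an unreachable data center.

```python
import socket
import urllib.request

def cloud_answer(query: str) -> str:
    # Hypothetical cloud endpoint. 10.255.255.1 is non-routable, so
    # this call will time out or fail, standing in for an outage.
    url = "http://10.255.255.1/ask?q=" + query
    with urllib.request.urlopen(url, timeout=0.5) as resp:
        return resp.read().decode()

def local_answer(query: str) -> str:
    # Minimal on-device fallback: a canned-response table.
    canned = {"lights": "Turning on the lights."}
    return canned.get(query, "Sorry, I can't reach the cloud right now.")

def answer(query: str) -> str:
    try:
        return cloud_answer(query)
    except (OSError, socket.timeout):
        # No connection (or a slow one): fall back to local
        # processing instead of failing outright.
        return local_answer(query)

print(answer("lights"))  # falls back to the local canned response
```

A real device would replace the canned table with an on-device model, but the control flow is the same: the product stays usable even when the data center is not.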
While not all SBCs come with an AI co-processor, incorporating one into your design (especially for embedded systems) can be hugely beneficial. Boards that lack a built-in AI co-processor can still use an external accelerator such as the Coral USB Accelerator. Either way, AI in embedded devices will become commonplace in the next decade, and at that point even the simplest devices will have a minimum level of intelligence.