Big AI, Little AI, Cardboard Box: The Coming Shift in AI

The recent release of open-source AI models for local devices underscores a significant shift from big to little AI. But what can they do differently?

Last week, Apple announced the release of three new open-source AI models, signalling a shift from predominantly cloud-based AI solutions to powerful, locally operated AI right in users’ pockets. Although these are just developer models, the move underscores a broader trend in the AI landscape: an increasing emphasis on embedding sophisticated AI capabilities directly into personal devices.

Big AI: Cloud-Based Powerhouses

In the realm of “big AI,” solutions are primarily cloud-based, leveraging massive server farms to process and analyse extensive datasets. A prime example can be seen in the healthcare industry, particularly in diagnostic labs where AI models analyse large volumes of medical imaging data. These models, which can consist of hundreds of millions to over a billion parameters, require significant computational resources that are typically available only in cloud environments. By utilising high-performance GPUs and TPUs, these systems can detect patterns and anomalies that might be missed by human eyes, providing crucial insights that assist in early disease detection and personalised treatment planning.

Little AI: Local Efficiency and Privacy

Conversely, “little AI” refers to AI systems that operate on local devices, such as smartphones or portable diagnostic instruments. For instance, in a lab setting, a portable diagnostic tool might use a compact AI model to analyse blood samples for pathogens without needing to connect to the internet. These local models are significantly smaller, often containing only a few million parameters, so that they run efficiently on less powerful, less power-hungry processors.
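To make that size difference concrete, a quick back-of-envelope calculation helps. The parameter counts below are illustrative assumptions for the sake of the arithmetic, not figures for any specific model:

```python
# Back-of-envelope weight-memory estimate; parameter counts here are
# illustrative assumptions, not figures for any specific model.

def model_size_mb(num_params: int, bytes_per_param: int) -> float:
    """Approximate in-memory size of a model's weights in megabytes."""
    return num_params * bytes_per_param / 1e6

big_ai = 1_000_000_000   # hypothetical cloud-scale model, ~1B parameters
little_ai = 5_000_000    # hypothetical on-device model, ~5M parameters

# 32-bit floats (4 bytes) are typical for cloud models; 8-bit integers
# (1 byte) are common for quantised on-device models.
print(f"Big AI at fp32:    {model_size_mb(big_ai, 4):>7,.0f} MB")     # ~4,000 MB
print(f"Little AI at fp32: {model_size_mb(little_ai, 4):>7,.0f} MB")  # ~20 MB
print(f"Little AI at int8: {model_size_mb(little_ai, 1):>7,.0f} MB")  # ~5 MB
```

On these assumed figures, the little model’s weights fit comfortably within a phone’s memory, while the big model’s demand gigabytes before any computation even begins.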

The advantage of little AI is twofold: enhanced privacy and accessibility. Because data is processed locally, sensitive information, like health records, never has to be sent over the network, reducing privacy risks. Additionally, local processing enables functionality in remote or poorly connected areas, ensuring that diagnostic tools remain operational regardless of internet availability.

Cardboard Box: What’s Inside Counts

The difference in parameter count between cloud-based and local AI solutions is stark. Big AI models, with their extensive parameter counts, are capable of more comprehensive and nuanced understanding and predictions, but at the cost of requiring more power and generating more heat, which necessitates sophisticated cooling systems in data centres. Little AI models, however, are engineered to balance performance and efficiency. Techniques like model pruning, where redundant parameters are eliminated, and quantisation, which reduces the precision of computations, are commonly used to fit these models into the limited memory and compute budgets of portable devices.
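Both techniques are straightforward to sketch. The snippet below is a minimal illustration in PyTorch, applied to a toy model that stands in for a real diagnostic network; a production pipeline would also fine-tune after pruning and validate accuracy on held-out data:

```python
# A minimal sketch of pruning and dynamic quantisation in PyTorch.
# The toy classifier below is a placeholder, not a real diagnostic model.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)

# Pruning: zero out the 50% of weights with the smallest magnitude in
# each Linear layer, then make the sparsity permanent.
for module in model:
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")

# Quantisation: convert Linear-layer weights from 32-bit floats to
# 8-bit integers, roughly quartering their memory footprint.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(quantized)
```

Pruning trades a small amount of accuracy for a sparser, lighter model, while quantisation shrinks each remaining weight; combined, they are what let a model of a few million parameters run comfortably on a phone-class processor.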

The distinction between big and little AI is not a hard line, but it is a useful one for understanding the shift in how AI is deployed across various sectors. While big AI continues to excel in environments where complexity and data volume are high, little AI brings AI capabilities directly to the end user, ensuring privacy and functionality even in the most remote locations. Apple’s new AI models are a testament to the growing capability of little AI, promising a future where high-performance computing is increasingly ubiquitous, portable, and tailored to users’ privacy needs.

Staff Writer

Our in-house science writing team has prepared this content specifically for Lab Horizons
