Apple Chips Get Unleashed for AI With io.net’s Groundbreaking Cloud Support

io.net has announced its latest platform upgrade, aimed at broadening access to powerful computing resources for AI and machine learning.

With the upgrade, io.net becomes the first cloud service provider to support clustering of Apple silicon chips, such as the M1, M2, and the upcoming M3, for these workloads.

The Silicon Valley startup’s platform-level upgrade means that, for the first time, machine learning engineers and data scientists can deploy thousands of Apple’s powerful system-on-chip (SoC) designs in a single, distributed cluster within seconds through io.net’s cloud offering.

“We are thrilled to be the first cloud service provider to support Apple chips for machine learning,” said company founder Ahmad Shadid. “This is a massive step forward in democratizing access to powerful computing resources, and paves the way for millions of Apple users to earn rewards for contributing to the AI revolution.”

What are Apple’s silicon chips?

Apple silicon chips, such as the M1 and M2 families, are found in the latest Mac, iPad, and iPhone models and have earned a reputation for excellent performance per watt, thanks to their tightly integrated CPU, GPU, and dedicated Neural Engine on a single chip. This unified memory architecture results in lower latency and more efficient data processing, making the chips well suited to AI and machine-learning inference workloads.

The upcoming M3 chip takes this prowess even further. With up to a 40-core GPU, an enhanced Neural Engine capable of 18 trillion operations per second, and a unified memory architecture supporting up to 128GB of RAM, the M3 is expected to outperform even high-end GPUs like the NVIDIA A100 on certain AI tasks involving large models. 

Unlocking access for millions

By supporting Apple’s silicon chips on its platform, io.net is opening up new opportunities for both machine learning practitioners and Apple’s vast installed user base of hundreds of millions of devices worldwide.

On one hand, ML engineers can build affordable, decentralized Apple-chip clusters tailored to their specific workloads. On the other, everyday Apple users can opt in to contribute their devices’ spare compute, putting otherwise idle Apple chips to work and earning rewards through io.net’s orchestration layer.

World’s largest decentralized GPU network

In 2023, io.net launched the world’s largest decentralized GPU network in a bid to meet the growing demand for Graphics Processing Units (GPUs). By aggregating one million GPUs from independent computing power providers, io.net aimed to reshape the landscape of AI processing.

Wrapping up

io.net has announced an upgrade that makes it easier to use powerful computers for AI and machine learning. It is the first provider to let users cluster Apple’s M1, M2, and soon M3 chips in the cloud, meaning machine learning experts can pool thousands of Apple chips online to speed up their work. Founder Ahmad Shadid says this will help many Apple users join the AI boom with their own devices. With io.net’s new feature, machine learning practitioners can set up affordable clusters for their projects, while everyday Apple users can lend their devices’ power when idle and earn rewards. In 2023, io.net also launched the biggest network of shared GPUs to meet the growing need for AI computing power, gathering a million GPUs from different sources.

Source: https://coincodex.com/article/39076/apple-chips-get-unleashed-for-ai-with-ionets-groundbreaking-cloud-support/