This Week in AI: US, China clash; Amazon eyes in-house chips

This week, the United States Department of Commerce (USDOC) sent a letter to Taiwan Semiconductor Manufacturing Co. (TSMC) (NYSE: TSM) directing the company to suspend shipments of advanced artificial intelligence (AI) chips to Chinese companies.

The USDOC issued this order after a TSMC chip was found in a Huawei AI processor, raising concerns about international entities’ compliance with U.S. trade restrictions. Huawei has been on the U.S. restricted trade list for several years, and any company doing business with it must first secure a U.S. license, particularly if the technology involved could enhance Huawei’s capabilities. If the U.S. judges that a transaction threatens its security or gives China an edge in innovation or development, it is likely to deny that license.

TSMC responded to the USDOC’s order by notifying its Chinese clients that it would cease shipping semiconductors to them. The episode reflects more than a single trade dispute; it underscores the escalating tension between the U.S. and China over technological advancements in AI.

As the global leader in AI, the U.S. is determined to protect its position and aims to restrict access to technology that could catalyze or support innovation outside its borders. The fear is that advanced AI capabilities in countries like China could pose a security threat, prompting increasingly tight controls on tech exports. As the TSMC case shows, the U.S. not only controls domestic exports but also uses its influence to regulate international companies working with entities it deems a potential threat.

Amazon and the AI chip race

Amazon (NASDAQ: AMZN) continues to advance its position in the AI chip market with Trainium 2, the latest addition to its growing line of in-house AI chips. According to Amazon Web Services (AWS), the company does not plan to move away from NVIDIA’s (NASDAQ: NVDA) chips. Instead, it wants to offer clients a cost-effective alternative, Amazon’s own chips, that it hopes will appeal to businesses looking to optimize their AI infrastructure.

Amazon’s focus on in-house chip production isn’t new. In recent years, it has introduced chips like Inferentia, which reportedly cuts the cost of serving AI model responses by up to 40%. Those potential savings appeal to companies running large-scale operations where AI costs can reach millions, if not billions, of dollars annually.
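To put that figure in context, here is a minimal back-of-envelope sketch; the annual spend amount is a hypothetical assumption chosen for illustration, not a number from Amazon or AWS.

```python
# Back-of-envelope illustration of the "up to 40%" cost-reduction claim.
# The annual inference spend below is a hypothetical assumption, not an AWS figure.
annual_inference_spend = 5_000_000   # assumed $5M/year spent serving AI model responses
claimed_reduction = 0.40             # "up to 40%" reduction cited above

potential_savings = annual_inference_spend * claimed_reduction
remaining_spend = annual_inference_spend - potential_savings

print(f"Potential annual savings: ${potential_savings:,.0f}")  # $2,000,000
print(f"Remaining annual spend:   ${remaining_spend:,.0f}")    # $3,000,000
```

Even under these assumed numbers, the savings only materialize if workloads actually run well on the custom silicon, which is why Amazon positions its chips as an option alongside NVIDIA’s rather than a replacement.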

Building chips internally is a strategy many tech giants are adopting. Apple (NASDAQ: AAPL) was an early adopter, designing proprietary chips for better and cheaper integration across its devices, and OpenAI recently announced plans to follow suit with in-house chips of its own.

There are numerous benefits to having a vertically integrated operation. Companies can reduce long-term costs and improve efficiency by designing chips tailored to specific operations. Customized chips can often execute tasks faster and more efficiently than general-purpose ones made by third parties, providing an advantage for companies looking to maintain a competitive edge in their specific niche.

Elon Musk’s X to launch free AI chatbot Grok

Elon Musk’s X (formerly Twitter) is rolling out a free-to-use version of its AI chatbot, Grok, making it accessible to a broader audience. Grok was initially launched exclusively for Premium users, who pay a monthly fee for various perks on the social media platform, but Grok’s free version is currently being tested in select countries.

However, the free version of Grok comes with limitations: users can make up to 10 queries every two hours with the Grok-2 model, up to 20 queries every two hours with the Grok-2 mini model, and ask up to three image-analysis questions per day.

For tech giants, having a proprietary chatbot is almost an expectation—if competitors have one, they must, too. This proliferation of chatbots raises an important question: How many do we really need? While more chatbot options seem like a positive development, some users—including myself—begin to wonder why the world needs so many AI chatbots that effectively do the same thing.

Each new chatbot claims unique features, but for most users the functionality remains largely the same. Some do offer specialized capabilities, yet most people use chatbots for similar tasks, which makes the slight differences from one chatbot to the next inconsequential.

At the moment, OpenAI’s ChatGPT stands as the frontrunner in this field, and that doesn’t look likely to change anytime soon, especially with the company recently raising billions of dollars at a historic valuation. However, it is only a matter of time before the other chatbots on the market either carve out their own niche, get acquired, or cease to exist.

For artificial intelligence (AI) to work lawfully and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, keeping data safe while also guaranteeing its immutability. Check out CoinGeek’s coverage of this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: Alex Ball on the future of tech—AI development and entrepreneurship


Source: https://coingeek.com/this-week-in-ai-us-china-clash-amazon-eyes-in-house-chips/