Taiwan Semiconductor (TSMC) - The Chipmaker That Runs The World
Key Indicators
- Domain: Semiconductors
- Competitors (chip manufacturing) – Samsung, SMIC, GlobalFoundries (GFS), UMC
- Growth Segment – HPC AI Chips (Up)
- Economic Moat – Wide
- Cyclical - Yes
Key Customers (Total 530+)
- Apple accounts for 26% of TSMC's revenue and Nvidia for 7%. Apple has a 10-year partnership with the chipmaker.
- Apple designs chips for iPhones and Mac computers, while Google designs Tensor chips for Pixel smartphones. Qualcomm and MediaTek design processors for Android phones. Nvidia designs gaming and Artificial Intelligence (AI) processors, and both AMD and Nvidia design advanced processors for Tesla.
- TSMC chips are also used by major cloud providers like AWS, MSFT, Google, Oracle, and IBM for data centres, networking, and software. Broadcom designs chips for broadband and wireless markets.
TSMC offers its customers manufacturing capabilities across Smartphones, High-Performance Computing (HPC), Internet of Things (IoT), Automotive, and Digital Consumer Electronics. The company refers to its Technology Leadership, Manufacturing Excellence, and Customer Trust as the TSMC Trinity of Strengths.
TSMC is a major player in three of the top four semiconductor growth sectors: Silicon Carbide (SiC), Gallium Nitride (GaN), AI compute processors, and Generative AI.
AI servers are specialised computers designed for AI training and inference. Training involves adjusting the weights of a neural network based on its errors and can require a month of computational power; inference then applies the trained model to real-world data inputs to produce results.
Specialised chips called accelerators play a crucial role in deep learning. There are two types: training accelerators and inference accelerators. Training accelerators are optimised for training deep learning models, performing intricate calculations over extensive datasets. Inference accelerators, on the other hand, execute trained models on fresh data at great speed, making them ideal for real-time applications such as image recognition in cameras or voice assistants in smartphones.
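To make the training-versus-inference split concrete, here is a minimal sketch in PyTorch (my own illustrative example, not code from TSMC or its customers; the tiny model, dummy data, and sizes are placeholders): training repeatedly updates weights from errors, while inference runs the frozen model on new inputs with no gradient computation.

```python
# Minimal sketch: training vs. inference with a tiny neural network (PyTorch).
# The model, data, and sizes are illustrative placeholders only.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# --- Training: repeatedly adjust weights from errors (compute-heavy) ---
x = torch.randn(64, 16)              # dummy batch of labelled examples
y = torch.randint(0, 2, (64,))
for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)      # forward pass
    loss.backward()                  # backward pass computes gradients
    optimizer.step()                 # weight update

# --- Inference: apply the trained weights to new data (no gradients) ---
model.eval()
with torch.no_grad():
    new_input = torch.randn(1, 16)   # a single "real-world" input
    prediction = model(new_input).argmax(dim=1)
print(prediction.item())
```

The training loop is what can run for weeks on training accelerators; the short block at the end is the kind of work inference accelerators repeat millions of times per day.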
Generative Artificial Intelligence (Gen AI)
Traditional AI relies on structured, labelled data for training and is confined to specific tasks such as image recognition, sentiment analysis, and recommendation systems. Generative AI, on the other hand, aims to simulate human-like creativity and generate content autonomously. It is versatile and capable of producing diverse outputs across various domains, including text, images, music, and even entire applications. The key aspect of Gen AI models is their ability to generate content that goes beyond the scope of their training data.
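To illustrate the distinction, here is a toy sketch (entirely hypothetical; the vocabulary, models, and sizes are made-up placeholders rather than any real trained system): a traditional, discriminative model maps an input to one of a fixed set of labels, while a generative model samples new tokens from a learned distribution, producing output that is not a copy of any single training example.

```python
# Toy contrast between traditional (discriminative) and generative AI.
# Both models are randomly initialised placeholders for illustration only.
import torch
import torch.nn as nn

vocab = ["the", "chip", "runs", "fast", "<end>"]

# Traditional AI: map an input to a fixed label (e.g. positive / negative).
classifier = nn.Linear(8, 2)
sentiment = classifier(torch.randn(1, 8)).argmax(dim=1)

# Generative AI: sample new tokens one at a time from a learned distribution.
generator = nn.Linear(8, len(vocab))
state = torch.randn(1, 8)
output = []
for _ in range(4):
    probs = torch.softmax(generator(state), dim=1)
    token = torch.multinomial(probs, num_samples=1).item()
    output.append(vocab[token])

print("label:", sentiment.item(), "| generated:", " ".join(output))
```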
The main types of Gen AI chips are listed below (a short sketch of how a model is placed on such an accelerator follows the list):
- GPU (Graphics Processing Unit)
- TPU (Tensor Processing Unit)
- FPGA (Field-Programmable Gate Array)
- ASIC (Application-Specific Integrated Circuit)
- Neuromorphic Chips
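As a rough illustration of why the choice of accelerator matters in practice, the sketch below (using standard PyTorch device names; the model itself is a placeholder) places the same model on a GPU when one is available and falls back to the CPU otherwise. The GPUs referenced here are the kind of AI processors TSMC fabricates for customers such as Nvidia and AMD.

```python
# Minimal sketch of device placement: the same model runs on whichever
# accelerator is available. The model and batch sizes are placeholders.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(1024, 1024).to(device)        # move weights to the accelerator
batch = torch.randn(512, 1024, device=device)   # keep data on the same device
with torch.no_grad():
    out = model(batch)                          # matrix multiply runs on `device`
print(f"ran on {device}, output shape {tuple(out.shape)}")
```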
My other posts on Generative AI and Strategic Analysis of Key Players
- AI Value Chain
- Amazon the King of Retail - Why AWS is the Crown Jewel
- Microsoft the King of AI in Software, Salesforce under the AI Cloud
- Tesla - It's not a Car, It's an AI Device on Wheels
- Google the King of Search - What the Future Beholds in the AI World
- Nvidia Godfather of AI - Why the Market is Bullish
- Generative AI can transform Telecoms, Energy and Utilities
Source: Seeking Alpha, Bloomberg, Martinfowler.com, Databricks.com, Nvidia, Google, AWS, FourWeekMBA Blog, Amazon, Aswath Damodaran, TSMC
Image generated by OpenAI's DALL·E 3