Broadcom Inc. (AVGO.US) FY2026 Q1 Earnings Call
Meeting Summary
Broadcom emphasizes strategic customer engagements in custom silicon and AI infrastructure, leveraging long-term relationships and advanced technology choices. Announcing record-breaking Q1 FY2026 revenue of $19.3 billion, with AI semiconductors driving 106% growth to $8.4 billion, the company forecasts Q2 revenue of $22 billion. Key partnerships with tech giants and a focus on AI networking and infrastructure software underpin Broadcom's growth strategy, aiming to surpass $100 billion in AI revenue by 2027.
Meeting Overview
Broadcom's leadership team discusses Q1 FY2026 financial performance, provides Q2 guidance, and offers insights into the business environment, emphasizing non-GAAP financial results.
Achieved record $19.3 billion revenue in Q1 2026, up 29% year-on-year, driven by AI semiconductor growth of 106%. Projected $22 billion revenue for Q2 2026, with 47% year-on-year growth. Custom AI accelerators, especially TPUs, are ramping up, with significant demand expected. Infrastructure software bookings strong, supporting foundational cloud needs. Secured supply chain for components through 2028, ensuring AI revenue surpasses $100 billion in 2027.
Q1 financials reveal a record $19.3 billion in consolidated revenue, a 29% increase year-over-year, with operating income at $12.8 billion, up 31%. The semiconductor segment led growth, accounting for 65% of revenue, while infrastructure software maintained steady performance. Free cash flow hit $8 billion, or 41% of revenue, showcasing strong financial health and operating leverage.
The company spent significantly on capital expenditures, managed inventory to support AI demand, and allocated capital through dividends and share repurchases. Q2 guidance includes a 140% year-on-year increase in AI semiconductor revenue to $10.7 billion, with infrastructure software revenue also projected to rise. The non-GAAP tax rate is expected to be 16.5% for Q2, influenced by global minimum tax and income geographic mix.
A clarification is sought on the distinction between AI chips, ASICs, and networking within the $100 billion revenue context. The dialogue explores the perspective on achieving a return on investment in AI growth, particularly in the face of cloud CapEx expansion, with an emphasis on market share gains and investment timelines.
A surge in demand for compute capacity, driven by customers creating platforms for generative AI and agentic AI, is fueling growth in the silicon industry. This trend is expected to continue, with revenue projections exceeding $100 billion by 2027, largely due to the increasing need for custom accelerators and advanced networking clusters.
A discussion on Broadcom's superior performance and complexity in AI chip design compared to hyperscaler-led customer-owned tooling initiatives, highlighting potential market share implications and strategies to widen the gap.
Discusses the necessity of top-tier silicon design, advanced packaging, and efficient networking for AI companies competing against Nvidia and other platform players, emphasizing the importance of high-volume production and time-to-market.
Broadcom's networking differentiation, driven by high-bandwidth solutions and leadership in AI networking components, is fueling significant demand. The company anticipates maintaining a 33% to 40% share of AI revenue from networking components, with future product launches ensuring sustained growth momentum.
The dialogue explores the evolution of AI accelerator architectures, particularly the shift from general-purpose GPUs to more customized silicon solutions tailored for specific workloads, highlighting the benefits of custom silicon in enhancing performance for tasks like prefill and decode processes.
A discussion reassures that shipping AI racks won't substantially affect overall gross margin, with emphasis on maintaining profitability and consistent models across semiconductor business.
A discussion unfolds around the projected gigawatts of AI computing power to be deployed and the corresponding revenue per gigawatt, with insights on varying costs among customers and an estimate of approaching 10 GW.
A company shares its strategy for securing key components for AI growth through 2028, emphasizing early action and strong partnerships. It highlights its readiness for accelerated growth, particularly in 2027 and 2028, achieved by locking up key component and substrate supply early and maintaining good relationships with suppliers.
A company shares its strategy of building custom silicon for key clients, involving deep, multi-year engagements. This approach allows anticipation of future needs, leading to securing essential elements and capacity from partners. The focus is on long-term investments in technology and capacity, ensuring supply stability for products like those requiring 28nm technology.
Discussed the strategic importance of custom silicon in AI development, clarifying the distinction between chips and rack-scale projects. Emphasized long-term strategic partnerships with a few key customers, ensuring clear visibility and market share despite market fragmentation. Highlighted the non-optional nature of these engagements and the focus on capacity expansion and project roadmaps.
The conversation revolves around confirming financial stability and expressing a preference for system racks over chips, concluding with gratitude and readiness for the next topic.
The dialogue highlights the strategic use of direct attached copper for scalable Ethernet, emphasizing its efficiency in connecting XPU and GPU directly for lower latency and cost. It also discusses the timing of CPO adoption, suggesting it's not yet the optimal choice despite being a leading technology.
The discussion highlights Ethernet's pivotal role as the preferred protocol for achieving scalability in cloud environments, both for scale-out and scale-up needs. Industry leaders, including semiconductor peers and hyperscalers, have aligned on Ethernet's suitability, influencing current and future network design strategies. This consensus is driven by deep industry partnerships and proven performance capabilities, marking a significant shift towards Ethernet-centric solutions in the tech sector.
Discussion highlights custom XPU's efficiency and cost-effectiveness over GPUs in inference and training for LLMs, with mature customers developing specialized chips simultaneously for both processes, indicating a trend towards optimized XPU solutions in the industry.
Broadcom shares insights on increased visibility and confidence in AI chip deployments, emphasizing long-term strategic partnerships over short-term transactions. The dialogue highlights a shift towards more detailed planning and a focus on sustainable, strategic engagements in the generative AI sector, exemplified by specific deals extending into 2029. The company underscores its role in guiding customer roadmaps for advanced AI models and monetization strategies, marking a significant move in the competitive landscape of AI technology.
Key Q&A
Q:What is the forecast for Q2 2026 consolidated revenue?
A:The forecast for Q2 2026 consolidated revenue is approximately $22 billion, representing 47% year-on-year growth.
Q:What is the projected revenue growth for the AI semiconductor business in Q2?
A:The projected revenue growth for the AI semiconductor business in Q2 is an acceleration to 140% year on year, with revenue reaching $10.7 billion.
Q:What is the status of the custom AI accelerator roadmap?
A:The custom AI accelerator roadmap is alive and well, with shipping activities taking place now, and plans to scale to multiple E1s in the future.
Q:What is the revenue forecast for non-AI semiconductors in Q2?
A:The revenue forecast for non-AI semiconductors in Q2 is approximately $4.1 billion, which is up 14% from a year ago.
Q:How is the growth in infrastructure software for Q1 and what is the forecast for Q2?
A:Infrastructure software grew 13% year on year in Q1, with bookings continuing to be strong. For Q2, infrastructure software revenue is forecast at approximately $7.2 billion, up 9% year on year.
Q:What was the consolidated revenue for Q1 and how much was the increase from the previous year?
A:Consolidated revenue for Q1 was $19.3 billion, marking a 29% increase from the previous year.
Q:What were the semiconductor solutions segment's revenue and gross margin for Q1?
A:Revenue for the semiconductor solutions segment in Q1 was $12.5 billion, with a gross margin up 100 basis points year on year to approximately 68%.
Q:What was the quarterly free cash flow and what does it represent of revenue?
A:Free cash flow in the quarter was $8 billion, representing 41% of revenue.
Q:What is the authorization for additional share repurchases?
A:The Board of Directors has authorized an additional $5 billion for the share repurchase program, effective through the end of calendar year 2026.
Q:What is the expected impact of the global minimum tax on the fiscal year 2026 non-GAAP tax rate?
A:The non-GAAP tax rate for Q2 of fiscal year 2026 is expected to be approximately 16.5%, primarily due to the impact of the global minimum tax and the geographic mix of income.
Q:What is the perspective on AI chips and their relevance to networking and data centers?
A:AI chips are critical to creating and productizing platforms across various sectors including enterprise, cloud services, and consumer subscriptions. This demand is driven by the need for constant compute capacity for training and inference to productize these platforms, with many customers, including hyperscalers, creating their own custom accelerators.
Q:How does the company view the performance and complexity of its current generation solutions compared to competitor's offerings?
A:The company's current generation solutions outperform competitor offerings in terms of performance and complexity. They are 12 to 18 months ahead in terms of chip design complexity, packaging complexity, and IP.
Q:What are the critical technology aspects and partner requirements for competitive AI chip development?
A:Competitive AI chip development requires a best-in-class silicon design team, advanced packaging, and expertise in networking clusters. The best technology IP and execution are essential, with Nvidia being a formidable competitor. To compete effectively, companies need to produce chips that are not only good enough but better than their competitors, including Nvidia. The ideal partner would possess this best technology and execute at the highest level.
Q:What is the role of high volume production in the development of AI chips?
A:High volume production is critical in the development of AI chips to ensure a quick time to market. The ability to produce 100,000 chips or more at yields that are cost-effective is crucial, and very few players in the industry can achieve this. The company in question claims to have significant experience in doing just that, which is a key competitive advantage.
Q:What factors drive the demand for networking components in AI revenue?
A:Demand for networking components in AI revenue is driven by the new generation of GPUs and their capabilities, such as running 200 Gb and 400 Gb bandwidth. The Tomahawk 6, introduced a few months before the call, is the only solution offering this level of bandwidth for hyperscalers. Demand is further bolstered by the need for high-bandwidth scale-out optical transceivers. This combination is driving growth in the company's networking components, and continued momentum is expected with the launch of the next generation in 2027.
Q:How is the architecture of AI accelerators evolving with workload specialization?
A:The architecture of AI accelerators is evolving to become more specialized for particular workloads. General-purpose GPUs are being supplemented with more customized designs, such as XPUs, which are tailored for specific tasks like training and inference. These specialized designs are increasingly effective for workloads that do not fit the traditional matrix-multiplication focus of GPUs, such as mixture of experts. This trend is expected to continue, making XPUs a more preferred choice for workload-specific performance optimization.
Q:What is the anticipated impact of the company's product mix on its overall business?
A:The speaker does not consider the impact of the product mix on the overall business to be substantial, indicating there is no need for concern on this point.
Q:How does the speaker's company plan to substantially exceed a $100 billion revenue goal next year?
A:The company plans to substantially exceed the $100 billion revenue goal next year by shipping approximately 10 GW of product, which, by the speaker's own arithmetic, equates to content per gigawatt in the $20 billion range.
Q:What is the significance of looking at gigawatts instead of dollars when assessing the company's chip sales?
A:Looking at gigawatts is significant because that is how the chips are sold. According to the speaker, dollars per gigawatt can vary dramatically by customer, but the implied figures are not far from the dollar amounts being discussed.
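Taking the speaker's round figures at face value gives a quick order-of-magnitude check (actual dollars per gigawatt vary by customer, so this is only illustrative, not company guidance): 10 GW × roughly $20 billion of content per gigawatt ≈ $200 billion, which would indeed substantially exceed the $100 billion goal.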
Q:How does the company ensure supply visibility for the next few years?
A:The company ensures supply visibility for the next few years by anticipating significant growth, locking up key components early, and maintaining deep strategic partnerships with customers and suppliers for the necessary elements. They invest alongside their partners to develop not just more capacity but also the appropriate technology for future demand.
Q:What is the distinction between the revenue from chips versus the revenue from the racks in the company's projects?
A:The distinction between revenue from chips and revenue from racks is not directly answered. The response implies the company does not break out the two, so the split of project revenue, including Anthropic's racks and chips, is not explicitly delineated.
Q:How does the company manage visibility and market share with multiple fragmented customers?
A:The company manages visibility and market share with multiple fragmented customers by focusing on a small number of strategic customers with significant revenue generation. They have a clear understanding of each customer's strategic plan and their needs for custom silicon, which allows for strong visibility into market share across a fragmented set of customers using multiple cloud service providers.
Q:What is the preferred method for connecting GPUs in scale up scenarios as per the speaker?
A:The preferred method for connecting GPUs in scale up scenarios is direct attached copper, as it offers the lowest latency, power, and cost.
Q:How does the speaker view the role of Ethernet in cloud infrastructure?
A:The speaker believes Ethernet is the de facto standard for scale out in the cloud and the right choice for scale up as well. This is supported by engagements with partners and the industry, indicating that Ethernet is the preferred protocol for achieving necessary latency and scale requirements.
Q:What is the anticipated trend for customers using the speaker's custom XPU technology?
A:The anticipated trend is that these customers will each develop two chips per year, one for training and one for inference, as they progress toward complete XPU adoption. This is because they need to invest simultaneously in training, to reach higher levels of intelligence, and in inference, for product testing and deployment.
Q:What does the speaker say about the visibility and strategic partnership with the customers?
A:The speaker indicates that visibility and the strategic partnership with customers have improved over time. They have gained better insight into customers' plans and the long-term strategic deployment of the XPUs they develop together. The speaker emphasizes that these customers' investment is long-term and that the partnership goes beyond the transactional to become part of the strategic roadmap.
