## Prospectus Summary

  

 This summary highlights selected information contained elsewhere in this prospectus. This summary does not contain all of the information that you should consider before deciding to invest in our Class A common stock. You should read this entire prospectus carefully, including the sections titled “Risk Factors,” “Special Note Regarding Forward-Looking Statements,” “Management’s Discussion and Analysis of Financial Condition and Results of Operations,” and “Business,” and our consolidated financial statements and related notes included elsewhere in this prospectus before making an investment decision.
 Our Mission
 We believe AI is the most transformative technology of our generation. Our mission is to accelerate AI by making it faster, easier to use, and more energy efficient, thereby making AI accessible around the world.
 Company Overview
 Cerebras is an AI company. We design processors for AI training and inference. We build AI systems to power, cool, and feed the processors data. We develop software to link these systems together into industry-leading supercomputers that are simple to use, even for the most complicated AI work, using familiar ML frameworks like PyTorch. Customers use our supercomputers to train industry-leading models. We use these supercomputers to run inference at speeds unobtainable on alternative commercial technologies. We deliver these AI capabilities to our customers on premise and via the cloud.
 AI compute comprises training and inference. For training, many of our customers have achieved over 10 times faster training time-to-solution compared to leading 8-way GPU systems of the same generation and have produced their own state-of-the-art models. For inference, we deliver over 10 times faster output generation speeds than GPU-based solutions from top CSPs, as benchmarked on leading open-source models. This enables real-time interactivity for AI applications and the development of smarter, more capable AI agents. The Cerebras solution requires less infrastructure, is simpler to use, and consumes less power than leading GPU architectures. It enables faster development and eliminates the complex distributed compute work required when using thousands of GPUs. Cerebras democratizes AI, enabling organizations that have less in-house AI or distributed computing expertise to leverage the full potential of AI.
 The rise of AI presents a unique set of compute challenges. Unlike other computational workloads, both training and inference require a huge number of relatively simple calculations, the results of which necessitate constant movement to and from memory, and to and from millions or tens of millions of compute cores. This traditionally demands hundreds or thousands of chips, and puts tremendous pressure on memory, memory bandwidth, and the communication fabric linking them all together.
 Cerebras started with a simple question: How can we design a processor, purpose-built to meet these exact challenges? If we were to start with a clean sheet, how would we avoid carrying forward the tradeoffs made for graphics and other workloads, and ensure that every transistor is optimized for the specific challenges presented by AI?
 Our answer is wafer-scale integration. Cerebras solved a problem that was open for the entire 75-year history of the computer industry: building chips the size of full silicon wafers. The third-generation Cerebras Wafer-Scale Engine (the “WSE-3”) is the largest chip ever sold. It is 57 times larger than the leading commercially available GPU. It has 52 times more compute cores, 880 times more on-chip memory (44 gigabytes), and 7,000 times more memory bandwidth (21 petabytes per second). The sheer size of the wafer-scale chip allows us to keep more work on-silicon and minimize the time-consuming, power-hungry movement of data. This enables Cerebras customers to solve problems in less time and using less power. Our AI compute platform combines processors, systems, software, and AI expert services, to deliver massive acceleration on even the largest, most capable AI models. It substantially reduces training times and inference latencies, while reducing programming complexity.

  Our business model is designed to meet the needs of our customers. Organizations seeking control over their data and AI compute infrastructure can purchase Cerebras AI supercomputers for on-premise deployment. Those that want the flexibility of a cloud-based platform can purchase Cerebras high-performance AI compute via a consumption-based model through the Cerebras Cloud, or via our partner’s cloud. We offer customers the flexibility to choose the solution that best aligns with their budgetary, security, and scalability requirements, and some customers choose to use both options simultaneously.
 We have established a growing set of customer engagements spanning CSPs, leading enterprises, Sovereign AI programs, national laboratories, research institutions, and other innovators at the forefront of AI. While a substantial portion of our current business is supported by one primary customer, we are actively seeking to expand our reach and diversify our customer base. We collaborate with our customers to harness the power of AI to tackle their most significant challenges and drive breakthroughs across industries.
 Bloomberg Intelligence estimates that the AI market will grow to $1.3 trillion by 2032. Consumer and enterprise models like Google’s Gemini, Meta’s Llama, and OpenAI’s ChatGPT have driven demand for AI infrastructure training and inference solutions, powering AI applications such as specialized assistants, agents, and services. We believe that our AI compute platform addresses a large and growing AI hardware and software opportunity across training and inference, as well as software and expert services. We believe that further adoption of AI, accelerated by the advent of GenAI, and the widespread integration of AI into business processes, will rapidly expand our total addressable market (“TAM”) from an estimated $131 billion in 2024 to $453 billion by 2027, a compounded annual growth rate (“CAGR”) of 51%.
 We have experienced rapid growth, with revenue of $78.7 million and $24.6 million for the years ended December 31, 2023 and 2022, respectively, representing year-over-year growth of 220%. During the six months ended June 30, 2024 and 2023, we generated $136.4 million and $8.7 million in revenue, respectively. Since our inception, we have incurred operating losses and negative cash flows to develop, market, and expand our product portfolio and to continue our research and development activities. Our net loss for the years ended December 31, 2023 and 2022 was $127.2 million and $177.7 million, respectively, representing a year-over-year reduction of 28%. Our net loss for the six months ended June 30, 2024 and 2023 was $66.6 million and $77.8 million, respectively, representing a year-over-year reduction of 14%.
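The growth and loss-reduction percentages above follow directly from the dollar figures quoted; a minimal arithmetic check (using only the revenue and net-loss figures stated in this paragraph):

```python
# Verify the year-over-year figures quoted above ($ in millions).
revenue_2022, revenue_2023 = 24.6, 78.7
net_loss_2022, net_loss_2023 = 177.7, 127.2

revenue_growth = (revenue_2023 / revenue_2022 - 1) * 100   # growth, in percent
loss_reduction = (1 - net_loss_2023 / net_loss_2022) * 100  # reduction, in percent

print(f"Revenue growth: {revenue_growth:.0f}%")      # ~220%
print(f"Net loss reduction: {loss_reduction:.0f}%")  # ~28%
```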
 Industry Background
 Over the past 40 years, the computer industry has followed a clear pattern: as major new computational workloads with distinct characteristics emerged, so too have new compute architectures. With each new compute paradigm, technologists first attempted to adapt existing compute architectures to these workloads. But in each case, a new purpose-built architecture was ultimately needed to unlock the potential of the new paradigm.
 We believe this pattern is continuing with the rise of AI – the next major compute workload. This growth has been accelerated by the emergence of GenAI, a new class of powerful AI models that can create new content and reason across broad domains and multiple data types. These breakthrough capabilities translate to tremendous potential value creation and have driven rapid GenAI adoption.
 The Computational Demands of Training and Inference
 Both training and inference demand immense compute, each with unique compute, memory, and memory bandwidth requirements. They represent two stages in the continuous lifecycle of AI models. Once trained, a model is “served” in production and used for inference. As part of this cycle, models in production are continuously being optimized to use fewer compute resources, and while that is happening, new and more powerful models are being trained, leading to the eventual obsolescence of the previous model—starting the cycle over.
 For training, the compute required is a function of a model’s size (number of parameters) and the amount of data used (number of tokens). The most capable GenAI models today have trillions of parameters and are trained on trillions of tokens. As the industry has pushed to achieve greater AI capabilities, the size of GenAI models has grown, and we expect this trend to continue.
 The increase in model and data sizes has led to a dramatic surge in computational demand. Today, training for GenAI requires enormous GPU clusters, sophisticated engineering teams, and months of time for a single run. A training run can cost over a hundred million dollars, and improving the model and keeping it fresh with new data requires additional fine-tuning and regular re-training.
 For inference, the required compute is a function of model size, user demand, and time spent on inference. Larger models use more computational resources, and each user request also increases the compute need, contributing to significant and ongoing operational costs. We expect the demand for inference to grow, especially as larger and more capable models become more widely adopted. There is currently a direct tradeoff between model capability and responsiveness of user experience because the largest and most capable models demand more inference compute and therefore run more slowly during inference. We believe that as both inference speed and reasoning capabilities advance, GenAI models will support more demanding applications, thereby expanding the market opportunity for inference.
 Existing Compute Architectures Are Fundamentally Limited for GenAI Training and Inference
 GPUs, though better than CPUs for AI workloads, face fundamental limitations when processing the unique characteristics of large GenAI models. GenAI models are complex, interconnected compute graphs that require the constant communication of intermediate calculations to train. This requires a high amount of data movement to and from memory and across cores. Similarly, during inference, these models generate outputs sequentially – each dependent on the previous output – requiring the full model to be constantly moved in and out of memory to produce successive outputs, and again requiring massive data movement between cores and memory. GPUs face inherent scalability and complexity challenges when faced with the distinct, communication-heavy requirements of GenAI workloads.
 For Training – Individual GPUs Are Too Small, and Scaling to Many GPUs is Highly Inefficient
 Large GenAI models far exceed the memory and processing limits of a single GPU. For example, training GPT-3 on a single NVIDIA H100 running at peak theoretical performance would take more than eight years. Recent models like GPT-4 and Gemini are over 10 times larger in parameter count than GPT-3. Consequently, training a large GenAI model on GPUs in a tractable amount of time requires breaking up the model and calculations, and distributing the pieces across hundreds or thousands of GPUs, creating extreme communication bottlenecks and power inefficiencies.
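The "more than eight years" figure can be reproduced with a standard back-of-envelope estimate. The sketch below uses the commonly cited ~6 × parameters × tokens FLOPs rule and public estimates (175 billion parameters, roughly 300 billion training tokens, roughly 1 PFLOP/s of peak low-precision throughput); these inputs are assumptions, not figures from this prospectus:

```python
# Back-of-envelope training-time estimate using the common ~6*N*D FLOPs rule.
params = 175e9        # GPT-3 parameter count (public estimate)
tokens = 300e9        # approximate training tokens (public estimate)
peak_flops = 1e15     # ~1 PFLOP/s, roughly a single H100's peak throughput

total_flops = 6 * params * tokens        # ~3.15e23 FLOPs for one training run
seconds = total_flops / peak_flops
years = seconds / (3600 * 24 * 365)
print(f"~{years:.1f} years at peak")     # ~10 years, consistent with "more than eight"
```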
 This distributed compute problem also creates a high level of complexity for developers, who are responsible for partitioning and coordinating the compute, memory, and communication across GPUs, so that they can work together in a complex choreography. This is an ongoing cost and slows down time-to-solution, as the delicate balance of bottlenecks needs to be reconfigured every time the ML developer wants to change the model architecture, model size, or run on a different number of GPUs.
 For Inference – GPU Efficiency is Low and Limited by Memory Bandwidth
 During generative inference, the full model must be run for each word that is generated. Since large models exceed on-chip GPU memory, this requires frequent data movement to and from off-chip memory. GPUs have relatively low memory bandwidth, meaning that the rate at which they can move information from off-chip HBM to on-chip SRAM, where the computation is done, is severely limited. This leads to low performance as GPU cores sit idle while waiting for data – they can run at less than 5% utilization on interactive generative inference tasks. Low utilization and limited memory bandwidth impact the responsiveness and throughput of GPU-based systems and hinder real-time applications for larger models. This inefficiency also necessitates larger GPU deployments and dramatically drives up the cost of inference.
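The bandwidth bottleneck described above can be illustrated with a simple roofline-style estimate: at batch size one, every generated token requires streaming the full model weights from off-chip memory, so token throughput is capped by memory bandwidth. The figures below (a 70-billion-parameter model in 16-bit precision, ~3.35 TB/s of HBM bandwidth) are hypothetical illustrations, not benchmarks from this prospectus:

```python
# Roofline estimate: batch-1 generation throughput is bandwidth-bound because
# the full set of weights must be read from HBM for every output token.
params = 70e9                 # hypothetical model size
bytes_per_weight = 2          # fp16/bf16
hbm_bandwidth = 3.35e12       # bytes/s, roughly one high-end GPU's HBM bandwidth

model_bytes = params * bytes_per_weight          # 140 GB per token generated
max_tokens_per_s = hbm_bandwidth / model_bytes   # upper bound, ignoring compute
print(f"~{max_tokens_per_s:.0f} tokens/s upper bound")  # ~24
```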

  GPU companies have tried to address these challenges, but the issues of small core count, memory size, and memory bandwidth are fundamental hardware limitations that persist. Interconnect technologies like InfiniBand, PCIe, and NVLink are limited by their physical interfaces, and moving data across them is thousands of times slower and more power-hungry than keeping the data on silicon. Software libraries intended to simplify distributed computing still require developers to manage complex parallelism strategies and extensive codebases. Realizing the physical challenges of repurposing small chips for a large compute problem, the GPU industry has recently announced new multi-chip packaging techniques, but these also yield only minimal expansions of GPU chip size.
 Accelerating GenAI requires a dedicated compute solution, designed for the unique requirements of GenAI, that can deliver faster training times, real-time inference speeds, and simple developer workflows, at reasonable cost.
 Our Solution
 We believe Cerebras has built the world’s fastest commercially available AI training and inference solution. Our dedicated AI hardware and software platform is powered by the Cerebras Wafer-Scale Engine – a processor the size of an entire silicon wafer that brings more on-chip compute, memory, and bandwidth resources together than any other commercially available processor in the semiconductor industry. A single WSE replaces a cluster of GPUs, reducing the time-consuming, power-hungry movement of data, removing the need for complex distributed programming, and providing exceptional compute speed. Our solution consists of the following elements:
 •Cerebras Wafer-Scale Engine (WSE). Our third generation WSE, the WSE-3, is 57 times larger than the leading commercially available GPU and has 52 times more compute cores, totaling 900,000. It features 880 times more on-chip memory (44 gigabytes SRAM) and 7,000 times more memory bandwidth (21 petabytes per second) than the leading commercially available GPU. The immense scale of the WSE delivers significant acceleration, efficiency, and simplicity for AI training and inference.
 For Training. Each Cerebras WSE has enough compute and on-chip memory to run even the largest, multi-trillion parameter GenAI models on a single chip, thus avoiding the complexities of chip-to-chip data movement. To further speed up training time-to-solution, Cerebras users can simply add more WSEs to the problem, scaling to near-linear performance.
 For Inference. Wafer-scale integration keeps all critical data on-chip and close to compute cores, resulting in 7,000 times more memory bandwidth than the leading GPU solution. This allows the WSE-3 to deliver over 10 times lower latency for real-time GenAI inference at vastly lower power consumption compared to GPU-based solutions from top CSPs.
 •Cerebras System (CS). The CS AI computer system houses the WSE and delivers innovative power and cooling to the chip. Our third generation CS (“CS-3”) delivers three times more compute per unit power than the leading 8-way GPU system. This compact AI powerhouse is designed to easily integrate into standard data centers, and connects into the network via standards-based 100G Ethernet.
 •Cerebras AI Supercomputer. The Cerebras AI Supercomputer is designed to streamline scaling up to 2,048 CS-3 systems for maximum AI acceleration, with more efficiency and simplicity than scaling up to large GPU clusters. Our AI Supercomputer delivers near-linear performance increase as CS systems are added to a problem, takes only seconds to configure, and does not incur the overhead or complexity of heavy inter-chip, inter-system communication.
 Our scalable execution model is designed to simplify the development workflow for large GenAI training. We designed our AI Supercomputer to enable users to elastically scale workload computing resources up to 256 exaFLOPS (2,048 CS-3 systems) just by changing a single number in their code, allowing users to program as if for a single powerful device. Training a GPT-3 sized model on Cerebras, for example, uses 97% fewer lines of code compared to clusters of GPUs, greatly accelerating the speed of AI model development for larger-scale models.

  •Cerebras Software Platform (CSoft). Our proprietary software platform, CSoft, is core to our solution and provides intuitive usability and improved developer productivity.
 CSoft seamlessly integrates with industry-standard ML frameworks like PyTorch and with popular software development tools, allowing developers to easily migrate to the Cerebras platform.
 CSoft eliminates the need for low-level programming in CUDA, or other hardware-specific languages. Starting from a user’s PyTorch model, the CSoft graph compiler automatically maps model operations to the WSE, creating an optimized executable without user-level intervention.
 CSoft allows ML users to accelerate training and inference on models of any size, scaled across any configuration of the Cerebras AI Supercomputer, just by changing one number in a configuration file, simulating a single-device programming experience without the complexities of distributed programming. This drastically reduces operational overhead and speeds up developer iteration time and business impact.
 •Cerebras Inference Serving Stack. The Cerebras Inference Stack is built on top of CSoft and is designed to allow customers to easily deploy even the largest GenAI models at industry-leading inference speeds. The Cerebras Inference API is designed to facilitate rapid developer adoption and ease of use. Our serving software automatically handles system-level optimizations for our inference solution and is designed to enable low latency and high cost effectiveness.
 •AI Model Services. Our AI model services further amplify speed to solution. Our team of AI practitioners helps customers design research experiments, train models, and optimize processes designed to achieve the fastest time-to-solution. These services complement our advanced hardware and software platform, providing an end-to-end solution for rapid and efficient AI development and deployment that helps our customers translate AI potential to business impact.
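The "change a single number" scaling model described above can be sketched as follows. This is a hypothetical illustration of the single-knob idea, not Cerebras's actual configuration schema or API; the per-system compute figure is derived from the 256 exaFLOPS / 2,048 systems ratio stated earlier:

```python
# Hypothetical sketch: scaling aggregate compute by changing one config value.
config = {
    "model": "my_pytorch_model",   # placeholder model name
    "num_systems": 4,              # the single number a user would change
}

def effective_compute(num_systems, exaflops_per_system=0.125):
    """Near-linear scaling: 256 exaFLOPS / 2,048 systems = 0.125 per CS-3."""
    return exaflops_per_system * num_systems

print(effective_compute(config["num_systems"]))  # 0.5 exaFLOPS for 4 systems
```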
 Key customer benefits include:
 •Enables over 10 times faster training time-to-solution compared to leading 8-way GPU systems of the same generation, as reported by many of our customers. This dramatically accelerates AI model time-to-solution, enabling businesses to test ideas faster, iterate more quickly, and bring next-generation GenAI-powered products and services to market, faster and more cost effectively.
 •Delivers over 10 times faster GenAI inference compared to GPU-based solutions from top CSPs, as benchmarked on leading open-source models. The WSE’s massive on-chip SRAM capacity keeps the vast majority of memory-to-compute communication on-silicon, thereby avoiding the memory bandwidth bottleneck faced by GPU-based solutions. Our resulting ultra-low latency delivers industry-leading inference speeds and real-time responses back to users, even on large, cutting-edge GenAI models. Ten times more speed compared to GPU-based solutions from top CSPs also allows developers to make ten times more inference calls in the same amount of time. This supports a new level of model capability delivered by techniques like multi-step inference and agentic AI flows, which leverage more inference calls to produce higher reasoning capability for more complex tasks in domains such as coding, math, and science applications.
 •End-to-end solution. Cerebras offers a unified platform purpose-built to accelerate fundamental compute characteristics of both AI training and inference. Cerebras excels across the axes of compute, memory, memory bandwidth, and simplicity of use, made possible by wafer-scale integration. This allows customers to swiftly transition from training to fine-tuning to deploying high-quality GenAI models on the same platform, eliminating the need for investing in and maintaining separate hardware infrastructure.
 •Zero distribution complexity. Users can effortlessly run a GenAI model of any size, and it takes no additional code to achieve automatic near-linear performance scaling across the nodes of a Cerebras Supercomputer.

  •Low migration cost. Our proprietary CSoft platform integrates seamlessly with familiar ML frameworks and tools, like PyTorch, eliminating the need for AI teams to learn new languages or adapt to new development environments and easing the transition from other hardware platforms.
 •Power and operational efficiency. Cerebras outperforms GPU systems in power efficiency due to both hardware and architectural advantages. Wafer-scale integration allows the majority of AI workload communication to remain on-chip, significantly reducing data movement distance and power consumption; moving a bit of data on the WSE-3 takes less than 1% of the energy needed to do the same over off-chip GPU interconnects. This drives significant operational cost savings and streamlined management for organizations deploying AI at scale.
 •Expert-led model training and AI integration services. We offer expert-led foundation model training, fine-tuning, and retrieval-augmented generation services to customers. Our team provides guidance on cutting-edge AI methods that work on top of our hardware, assisting customers to derive the maximum value from their AI investment.
 Our Customers
 We have an expanding customer base that includes leading enterprises, Sovereign AI initiatives, cloud service providers, government agencies, and research institutions at the forefront of AI and at the intersection of AI and HPC. These customers leverage Cerebras to tackle complex AI challenges and achieve business outcomes previously unattainable even with the most advanced GPU systems. Our customers find fundamental value in the simplicity of the Cerebras solution. It unlocks breakthrough business and scientific use cases by removing limitations in development time, programming complexity, and runtime speed.
 Our Business Model
 We use a combination of direct sales and partnerships to address the rapidly expanding AI market. We offer both on-premise solutions and cloud-based solutions to provide maximum flexibility to our customers. We offer a collection of services from data center deployment to AI expert services and AI Supercomputer operation and management, to provide our customers with the support they need to train, deploy, and accelerate GenAI time-to-value.
 •On Premise. We sell our AI Supercomputers to leading organizations who seek maximum control over their data and their AI infrastructure, fulfilling their needs for high-performance AI compute on premise. Our on-premise AI Supercomputers support both training and inference. They can be configured either for both workloads, or to be further optimized for only one, depending on our customers’ needs.
 •Cloud-Based Computing Services. We sell Cerebras solutions via our cloud offering as well as via the Condor Galaxy Cloud owned by Group 42 Holding Ltd (together with its affiliates, “G42”), our partner CSP. Our cloud solutions provide customers fast and flexible access to our powerful AI acceleration hardware. This offering gives our customers the ability to train LLMs with extraordinary speed and deploy them for inference at ultra-low latencies, all without the complexity or time needed to build and manage on-premise infrastructure.
 •Cerebras Inference Cloud. Our real-time inference solution is also available via a dedicated inference cloud service. Leveraging our Cerebras Inference Serving Stack, this cloud API offering allows developers to directly point their applications towards efficient and reliable model serving endpoints. On Cerebras Inference Cloud, we host both popular open-source models and proprietary customer models. For customers who do not need direct compute access and are not interested in managing their own inference serving software stack, our inference cloud offering is the quickest and simplest way to leverage our fast model inference services.

  We provide a combination of these offerings to customers who may benefit from leveraging both on-premise and cloud-based options. This flexibility allows customers to choose the solution that best aligns with their budgetary, security, and scalability requirements.
 Additionally, customers can train models on-premise and then leverage our inference cloud for production, benefiting from flexible serving resources that can adapt to fluctuating demand. This end-to-end solution allows seamless integration from training to production, serving the entire lifecycle of a customer’s AI needs.
 We also provide professional services to assist customers throughout the AI workflow.
 Our Market Opportunity
 We participate in a large and growing AI market. Our full suite of AI computing solutions addresses use cases for training, inference, software, and expert services. We estimate the TAM for our AI computing platform to be approximately $131 billion in 2024, growing to $453 billion in 2027, a 51% CAGR. This TAM comprises the following core markets:
 •AI Compute for Training. As businesses continue to evaluate and deploy solutions, we believe the market for training new models will continue to grow. Based on market estimates in Bloomberg Intelligence research, our estimate of the TAM for AI Training Infrastructure is $72 billion in 2024, growing to $192 billion in 2027, a 39% CAGR.
 •AI Compute for Inference. While GenAI training is essential to developing models that are powerful and accurate, we believe inference is the next phase of the ongoing wave of AI disruption. As more enterprises develop models and start to deploy their models in applications at scale, the need for high performance and efficient inference is becoming critical to fully realize the commercial potential of ML. We believe that the inference opportunity is enormous, as the market is in the early phases in its adoption cycle. Our estimate of the TAM for AI Inference is $43 billion in 2024, growing at an estimated 63% CAGR to $186 billion in 2027.
 •Software and Expert Services. Based on market estimates in Bloomberg Intelligence research, our estimate of the TAM for GenAI Software and Services is $16 billion in 2024, growing to $75 billion in 2027, a 67% CAGR.
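The CAGR figures quoted for each market segment can be checked against the 2024 and 2027 estimates using the standard three-year compound-growth formula:

```python
# Check the three-year CAGRs implied by the 2024 and 2027 TAM estimates above.
def cagr(start, end, years=3):
    """Compound annual growth rate, in percent."""
    return ((end / start) ** (1 / years) - 1) * 100

for name, start, end in [
    ("Total TAM", 131, 453),         # ~51%
    ("Training", 72, 192),           # ~39%
    ("Inference", 43, 186),          # ~63%
    ("Software/Services", 16, 75),   # ~67%
]:
    print(f"{name}: {cagr(start, end):.0f}% CAGR")
```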
 We believe we are at the very early stages of a large and fast-growing market opportunity. As adoption of AI continues to accelerate, we expect numerous new applications will be identified, and we believe our solutions are well-positioned to capitalize on the wave of disruption in the coming years.
 Competitive Strengths
 We believe our design is capable of meeting today’s needs and is scalable to address tomorrow’s challenges. Our competitive strengths include:
 •The world’s first and only wafer-scale chip in the market. Our wafer-scale chip architecture eliminates the need for distributed computing. This enables AI developers to use up to 97% less code when working with large models on our platform compared to on clusters of GPUs and greatly accelerates the speed of AI model development for larger-scale models.
 •Full system solution that is easy to deploy and efficient to operate. Our system is co-designed with the wafer-scale processor, leveraging proprietary technology to address thermal and power delivery challenges. This allows us to keep the system operating efficiently, optimizing the energy consumption and underlying operating costs for our customers.

  •Comprehensive software suite that leads to ease of adoption and shortens time to deployment. The Cerebras Software platform allows for seamless programmability of AI models through our integration with PyTorch. Our tools allow users to bring models that have been trained on other hardware onto our platform for training or for inference. Likewise, users can train models on our platform and deploy those models for inference elsewhere.
 •Our AI platform addresses both training and inferencing markets. For training, our AI platform enables customers to effortlessly and swiftly use the most advanced GenAI models available on the market, without the need for specialized software frameworks or help from distributed computing experts. For inferencing, our AI platform delivers ultra-low latencies and high generation throughput. This helps our customers to unlock cutting edge performance for ultra-low-latency use cases leveraging GenAI.
 •Scalable architecture. We have developed our solutions to support models and data sets of large and varying sizes. Our current generation CS-3 is designed to support models with up to 24 trillion parameters, much larger than even today’s state-of-the-art GenAI models. Our platform is designed to seamlessly scale from 1 to 2,048 CS-3 systems, forming an AI supercomputer that is even further differentiated by our proprietary interconnect and memory technology. This enables customers to seamlessly increase compute resources from small-scale experiments to large-scale deployments.
 •AI model services help customers translate AI potential to business impact. We provide customers with AI model services to help them develop solutions that are customized to meet their needs and help them realize the full value of their AI investments. These services include model selection, data preparation, training, and solutions integration.
 •World-class AI talent with a proven track record of innovation and execution. We have assembled a world-class team of industry leaders in integrated circuit design, processor architecture, power delivery, cooling, system engineering and software. Over the last five years, we have introduced three generations of our WSE, each time achieving two times the performance of its predecessor, and bringing new IC, power, and cooling technologies to bear.
 Growth Strategies
 We believe we are positioned for sustained growth in the rapidly expanding market for AI acceleration solutions. We have designed our focused strategies to drive continued success and establish ourselves as a long-term leader:
 •Increase sales to our existing customers. We have established a strong land-and-expand track record with our existing customer base and we intend to deepen these relationships by expanding our product and service offerings tailored to their evolving needs. Our strategy focuses on demonstrating the value proposition of our solutions through initial engagements, cross-selling complementary products, and identifying new use cases within existing customers.
 •Expand our customer base. We plan to aggressively pursue opportunities in relevant sectors such as healthcare, pharmaceutical, biotechnology, government, financial services, sovereign, and energy, to name a few, where our AI acceleration capabilities can address critical computational bottlenecks. We will seek to drive this expansion by focused sales and marketing initiatives, highlighting the transformative potential of our technology with targeted use cases. We intend to leverage our existing success stories and strategic partnerships to both bolster our credibility within new markets and establish key channels for customer acquisition.
 •Further penetrate the rapidly growing inference market. We recently launched our inference solution. The immense memory bandwidth and capacity on our chip allow us to deliver significantly lower generation latency and higher generation throughput than GPUs. By making API-based inferencing available through our cloud offering, we could significantly accelerate our adoption in inferencing use cases.
 •Benefit from opportunities in large adjacent AI and compute-intensive markets. We are actively enabling applications in fields like life sciences, materials science, and financial modeling, where our cutting-edge AI computing solutions can unlock new discoveries and solve complex problems. Geographically, our strategy includes deepening our investment in partnerships with large Sovereign AI initiatives and in new markets to accelerate AI adoption.
 •Accelerate our existing product roadmap as well as develop new products for emerging use cases. As AI infrastructure requirements scale, we expect emerging use cases to require new products with added functionalities to solve data, networking, and memory bottlenecks. With our continued focus on innovation, we intend to develop and introduce new products and form factors that will enable us to service a larger portion of our market opportunity.
 •Advance product adoption by proliferating cloud deployment of our AI solutions. We intend to accelerate our growth by expanding access to our revolutionary AI systems through cloud deployment, which we believe will significantly broaden our customer base.
 Risk Factors Summary
 Our business is subject to a number of risks and uncertainties of which you should be aware before making a decision to invest in our Class A common stock. These risks are more fully described in the section titled “Risk Factors.” These risks include, among others, the following:
 •We may not sustain our growth rate, and we may not be able to manage future growth effectively.
 •We have a history of generating net losses, and if we are unable to achieve adequate revenue growth while our expenses increase, we may not achieve and maintain profitability in the future.
 •We have a limited operating history, and we may have difficulty accurately predicting our future revenue for the purpose of appropriately budgeting and adjusting our expenses.
 •We currently generate a significant majority of our revenue from one customer, G42, and a significant portion of our revenue from a limited number of customers. A reduction in demand from, or a material adverse development in our relationship with, G42 or any of our other significant customers may harm our business, financial condition, results of operations, and prospects.
 •Our business and our products and services are subject to various governmental regulations, and compliance with these regulations may cause us to incur significant expense. Similarly, we are required to obtain export licensing to sell our products to various jurisdictions where we have customers, and we cannot guarantee that we will be successful in obtaining all required licenses in the future. If we fail to comply with applicable regulations, we could be subject to civil or criminal penalties, and if we are unable to obtain licenses to export our products, our business, financial condition, results of operations, and prospects may be harmed.
 •If the market does not adopt our products, we will be unable to grow our business.
 •If we are unable to expand the application of our products, or if the new products we develop and introduce into the market are not successful, our business, financial condition, results of operations, and prospects may be harmed.
 •The market for AI computing solutions is competitive and evolving, and if we do not compete effectively, our business, financial condition, results of operations, and prospects may be harmed.


  

  •We depend on third-party suppliers, and substantially all of our manufacturing services and components are procured on a purchase order basis without capacity commitments, which may harm our ability to bring products to market and our reputation, business, financial condition, results of operations, and prospects.
 •Our supply chain is long, complex, and global, with many interdependencies. Any significant fluctuations of supply and demand or disruption to our supply chain may harm our ability to manufacture and deliver our products to our customers.
 •No public market for our Class A common stock currently exists and an active liquid market may not develop or be sustained following this offering.
 •We identified material weaknesses in our internal control over financial reporting. If we are unable to remediate these material weaknesses, or if we identify additional material weaknesses in the future or otherwise fail to maintain an effective system of internal controls, we may not be able to accurately or timely report our financial condition or results of operations, which may adversely affect investor confidence in us and, as a result, the value of our Class A common stock.
 Corporate Information
 We were incorporated in April 2016 as a Delaware corporation. Our principal executive offices are located at 1237 E. Arques Avenue, Sunnyvale, California 94085, and our telephone number is (650) 933-4980. Our website address is www.cerebras.ai. Information contained on, or that can be accessed through, our website does not constitute part of this prospectus, and the inclusion of our website address in this prospectus is an inactive textual reference only.
 Implications of Being an Emerging Growth Company
 We are an emerging growth company as defined in the Jumpstart Our Business Startups Act of 2012 (the “JOBS Act”). We will remain an emerging growth company until the earliest of: (i) the last day of the fiscal year following the fifth anniversary of the completion of this offering; (ii) the last day of the fiscal year in which we have total annual gross revenue of at least $1.235 billion; (iii) the last day of the fiscal year in which we are deemed to be a “large accelerated filer” as defined in Rule 12b-2 under the Securities Exchange Act of 1934, as amended (the “Exchange Act”), which would occur if the market value of our Class A common stock held by non-affiliates exceeded $700.0 million as of the last business day of the second fiscal quarter of such year; or (iv) the date on which we have issued more than $1.0 billion in non-convertible debt securities during the prior three-year period. An emerging growth company may take advantage of specified reduced reporting requirements and is relieved of certain other significant requirements that are otherwise generally applicable to public companies. As an emerging growth company:
 •we will present in this prospectus only two years of audited annual financial statements, plus any required unaudited condensed consolidated financial statements, and related management’s discussion and analysis of financial condition and results of operations;
 •we will avail ourselves of the exemption from the requirement to obtain an attestation and report from our independent registered public accounting firm on the assessment of our internal control over financial reporting pursuant to the Sarbanes-Oxley Act of 2002;
 •we will provide less extensive disclosure about our executive compensation arrangements; and
 •we will not require stockholder non-binding advisory votes on executive compensation or golden parachute arrangements.
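The four sunset conditions for emerging growth company status described above can be sketched as a small rule check. The dollar thresholds come from the text; the function name, parameters, and simplified fiscal-year handling are illustrative assumptions, and the actual determination involves fiscal-year boundaries and SEC rules not modeled here.

```python
# Sketch of the JOBS Act emerging-growth-company sunset tests summarized
# above. Thresholds are from the prospectus text; all identifiers and the
# simplified year count are illustrative assumptions, not SEC definitions.
REVENUE_CAP = 1_235_000_000      # (ii) total annual gross revenue
PUBLIC_FLOAT_CAP = 700_000_000   # (iii) large accelerated filer float test
DEBT_CAP = 1_000_000_000         # (iv) non-convertible debt over 3 years

def still_emerging_growth_company(
    fiscal_years_since_ipo: int,
    annual_gross_revenue: float,
    non_affiliate_float: float,
    debt_issued_past_3y: float,
) -> bool:
    """Return True while none of the four sunset conditions has triggered."""
    if fiscal_years_since_ipo > 5:              # (i) fifth anniversary of the offering
        return False
    if annual_gross_revenue >= REVENUE_CAP:     # (ii) revenue threshold
        return False
    if non_affiliate_float > PUBLIC_FLOAT_CAP:  # (iii) "large accelerated filer" float
        return False
    if debt_issued_past_3y > DEBT_CAP:          # (iv) non-convertible debt issuance
        return False
    return True
```

Status lapses as soon as the earliest condition triggers; for example, crossing the $1.235 billion revenue threshold ends the status even well before the fifth anniversary.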
 In addition, the JOBS Act provides that an emerging growth company can take advantage of an extended transition period for complying with new or revised accounting standards. This provision allows an emerging growth company to delay the adoption of some accounting standards until those standards would otherwise apply to private companies. We have elected to use the extended transition period for new or revised accounting standards until the date that we are no longer an emerging growth company or affirmatively and irrevocably opt out of the extended transition period. As a result, our financial statements may not be comparable to those of companies that comply with new or revised accounting pronouncements as of public company effective dates.


## The Offering

  

  

> **Class A common stock offered by us:**              shares.
>
> **Class A common stock offered by the selling stockholders:**              shares.
>
> **Option to purchase additional shares of Class A common stock from us:**              shares.
>
> **Class A common stock to be outstanding immediately after this offering:**              shares (or              shares if the underwriters exercise their option to purchase additional shares of Class A common stock from us in full).
>
> **Class N common stock to be outstanding immediately after this offering:** None.
>
> **Total Class A common stock and Class N common stock to be outstanding after this offering:**              shares (or              shares if the underwriters exercise their option to purchase additional shares of Class A common stock from us in full).
>
> **Use of proceeds:** We estimate that we will receive net proceeds from this offering of approximately $                (or $                if the underwriters exercise their option to purchase additional shares of Class A common stock in full), based upon the assumed initial public offering price of $           per share of Class A common stock, which is the midpoint of the estimated price range set forth on the cover page of this prospectus, and after deducting estimated underwriting discounts and commissions and estimated offering expenses payable by us. The principal purposes of this offering are to obtain additional capital to fund our operations, create a public market for our Class A common stock, facilitate our future access to the public equity markets, and increase awareness of our company among potential partners. We currently intend to use the net proceeds from this offering, together with our existing cash, cash equivalents, and investments, for general corporate purposes, including working capital, operating expenses, and capital expenditures. We may also use a portion of the net proceeds to in-license, acquire, or invest in complementary technologies, assets, businesses, or intellectual property. We periodically evaluate strategic opportunities; however, we have no current commitments to enter into any such acquisitions or make any such investments. We intend to use approximately $                of the net proceeds to satisfy tax withholding and remittance obligations related to the RSU Net Settlement (as defined below) for restricted stock units (“RSUs”) that will vest in connection with this offering. We will have broad discretion in the way that we use the net proceeds of this offering. See the section titled “Use of Proceeds” for additional information. We will not receive any proceeds from the sale of Class A common stock in this offering by the selling stockholders.
>
> **Voting rights:** We have two classes of common stock: Class A common stock and Class N common stock. Class A common stock is entitled to one vote per share, and Class N common stock is non-voting and convertible into one share of Class A common stock. See the section titled “Description of Capital Stock” for additional information.
>
> **Risk factors:** See the section titled “Risk Factors” and other information included in this prospectus for a discussion of factors you should carefully consider before deciding whether to invest in our Class A common stock.
>
> **Proposed Nasdaq Global Market trading symbol:** “CBRS”

 In this prospectus, the number of shares of our common stock to be outstanding after this offering is based on                 shares of our Class A common stock and no shares of our Class N common stock outstanding as of June 30, 2024, after giving effect to the Preferred Stock Conversion, the Option Exercise, and the RSU Net Settlement (each as defined below), and excludes:
 •                shares of our Class A common stock issuable upon the exercise of outstanding stock options as of June 30, 2024, with a weighted-average exercise price of $           per share, after giving effect to the Option Exercise;
 •                shares of our Class A common stock issuable upon the exercise of stock options granted after June 30, 2024, with a weighted-average exercise price of $           per share;
 •                shares of our Class A common stock issuable upon the vesting and settlement of RSUs subject to service-based and liquidity-based vesting conditions outstanding as of June 30, 2024, for which the service-based vesting condition was not yet satisfied as of June 30, 2024 and for which the liquidity-based vesting condition will be satisfied in connection with this offering, after giving effect to the RSU Net Settlement;
 •                shares of Class A common stock issuable upon the vesting and settlement of RSUs subject to service-based and liquidity-based vesting conditions granted after June 30, 2024, for which the service-based vesting condition was not yet satisfied as of June 30, 2024 and for which the liquidity-based vesting condition will be satisfied in connection with this offering, after giving effect to the RSU Net Settlement;
 •22,851,296 shares of our Class N common stock reserved for future purchase pursuant to the G42 Primary Purchase (see the section titled “Certain Relationships and Related Party Transactions” for additional information);
 •a variable number of shares of our Class N common stock that may be issued pursuant to the G42 Option (see the sections titled “Dilution—G42 Option” and “Certain Relationships and Related Party Transactions” for additional information);
 •                shares of our Class A common stock reserved for future issuance under our 2024 Incentive Award Plan (the “2024 Plan”), which will become effective on the business day immediately prior to the date of effectiveness of the registration statement of which this prospectus forms a part, including                 new shares and the number of shares (i) that remain available for grant of future awards under our 2016 Equity Incentive Plan (as amended, the “2016 Plan”) at the time the 2024 Plan becomes effective, which shares will cease to be available for issuance under the 2016 Plan at such time and (ii) underlying outstanding stock-based compensation awards granted under the 2016 Plan (such awards outstanding under such plans, the “Prior Plan Awards”) that expire, or are cancelled, forfeited, reacquired, or withheld; and

   13

---

  

  •                shares of our Class A common stock reserved for future issuance under our 2024 Employee Stock Purchase Plan (the “ESPP”), which will become effective on the business day immediately prior to the date of effectiveness of the registration statement of which this prospectus forms a part.
 The 2024 Plan and the ESPP also provide for automatic annual increases in the number of shares reserved thereunder. See the section titled “Executive and Director Compensation—Equity Compensation Plans” for additional information.
 Except as otherwise indicated, all information in this prospectus assumes or gives effect to:
 •the adoption, filing, and effectiveness of our amended and restated certificate of incorporation and the adoption of our amended and restated bylaws, each of which will occur immediately prior to the completion of this offering;
 •the automatic conversion of all outstanding shares of our redeemable convertible preferred stock into an aggregate of 82,899,159 shares of our Class A common stock, which will occur prior to the completion of this offering, including 68,213 shares issued upon the exercise of a warrant (the “Preferred Stock Conversion”);
 •the cash exercise by certain selling stockholders of stock options to purchase                 shares of our Class A common stock, with a weighted-average exercise price of $           per share, for total gross proceeds to us of approximately $               , in connection with the sale of all or a portion of such shares by such selling stockholders in this offering (the “Option Exercise”), as described in the section titled “Principal and Selling Stockholders”;
 •the net issuance of                shares of our Class A common stock issuable upon the vesting and settlement of RSUs subject to service-based and liquidity-based vesting conditions outstanding as of                , 2024, for which the service-based vesting condition was satisfied as of                , 2024 and for which the liquidity-based vesting condition will be satisfied in connection with this offering, after giving effect to the withholding of an estimated                 shares to satisfy estimated tax withholding and remittance obligations (based on an assumed           % tax withholding rate) (the “RSU Net Settlement”);
 •no repurchase of outstanding shares of our capital stock after June 30, 2024;
 •no exercise of outstanding stock options or settlement of outstanding RSUs after June 30, 2024, except for the Option Exercise and the RSU Net Settlement;
 •no issuance of shares of our capital stock pursuant to the G42 Primary Purchase or the G42 Option (see the sections titled “Dilution—G42 Option” and “Certain Relationships and Related Party Transactions” for additional information); and
 •no exercise by the underwriters of their option to purchase                      additional shares of our Class A common stock from us in this offering.


## Summary Consolidated Financial Data

  

 The following tables set forth our summary consolidated financial data. The summary consolidated statements of operations data for the years ended December 31, 2023 and 2022 have been derived from our audited consolidated financial statements included elsewhere in this prospectus. The summary consolidated statements of operations data for the six months ended June 30, 2024 and 2023 and the summary consolidated balance sheet data as of June 30, 2024 have been derived from our unaudited interim consolidated financial statements included elsewhere in this prospectus. In our opinion, the unaudited interim consolidated financial statements have been prepared on a basis consistent with our audited consolidated financial statements and contain all adjustments, consisting only of normal and recurring adjustments, necessary for a fair presentation of such interim financial statements. Our historical results are not necessarily indicative of results that may be expected in the future, and our results for the six months ended June 30, 2024 are not necessarily indicative of results that may be expected for the year ending December 31, 2024 or any future period.
 You should read the following summary consolidated financial data in conjunction with the section titled “Management’s Discussion and Analysis of Financial Condition and Results of Operations” and our consolidated financial statements and related notes included elsewhere in this prospectus. The summary consolidated financial data in this section are not intended to replace, and are qualified in their entirety by, the consolidated financial statements and related notes.
  

| (in thousands, except per share amounts) | Year Ended December 31, 2023 | Year Ended December 31, 2022 | Six Months Ended June 30, 2024 | Six Months Ended June 30, 2023 |
|---|---:|---:|---:|---:|
| **Consolidated Statement of Operations:** | | | | |
| Revenue | | | | |
| Hardware | $57,114 | $15,599 | $104,269 | $1,559 |
| Services and other | 21,630 | 9,020 | 32,133 | 7,105 |
| Total revenue | 78,744 | 24,619 | 136,402 | 8,664 |
| Cost of sales(1) | | | | |
| Hardware | 45,559 | 19,195 | 66,442 | 1,980 |
| Services and other | 6,827 | 2,534 | 13,941 | 2,306 |
| Total cost of sales | 52,386 | 21,729 | 80,383 | 4,286 |
| Gross profit | 26,358 | 2,890 | 56,019 | 4,378 |
| Operating expenses | | | | |
| Research and development(1) | 140,057 | 155,408 | 77,742 | 76,295 |
| Sales and marketing(1) | 9,642 | 9,401 | 7,237 | 4,176 |
| General and administrative(1) | 10,593 | 16,902 | 12,851 | 4,922 |
| Total operating expenses | 160,292 | 181,711 | 97,830 | 85,393 |
| Loss from operations | (133,934) | (178,821) | (41,811) | (81,015) |
| Interest income | 5,683 | 1,076 | 3,809 | 2,349 |
| Other income (expense), net | 1,228 | 230 | (28,284) | 918 |
| Loss before income taxes | (127,023) | (177,515) | (66,286) | (77,748) |
| Income tax expense | 132 | 204 | 319 | 72 |
| Net loss | $(127,155) | $(177,719) | $(66,605) | $(77,820) |
| Net loss per share – basic and diluted(2) | $(2.92) | $(4.28) | $(1.42) | $(1.82) |
| Weighted-average number of common shares outstanding, basic and diluted | 43,552 | 41,485 | 46,945 | 42,857 |
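As a quick arithmetic cross-check (not part of the prospectus), the fiscal 2023 column ties out line by line. The variable names below are illustrative; amounts are in thousands as reported, and the per-share figure is rounded to cents.

```python
# Consistency check of the fiscal 2023 column reported above
# (amounts in thousands, except per-share data). Variable names are
# illustrative, not taken from the filing.
hardware_rev, services_rev = 57_114, 21_630
total_revenue = hardware_rev + services_rev           # total revenue
total_cost_of_sales = 45_559 + 6_827                  # hardware + services cost
gross_profit = total_revenue - total_cost_of_sales
total_opex = 140_057 + 9_642 + 10_593                 # R&D + S&M + G&A
loss_from_operations = gross_profit - total_opex
interest_income, other_income_net = 5_683, 1_228
loss_before_taxes = loss_from_operations + interest_income + other_income_net
net_loss = loss_before_taxes - 132                    # less income tax expense
weighted_shares = 43_552
loss_per_share = round(net_loss / weighted_shares, 2)

assert total_revenue == 78_744
assert gross_profit == 26_358
assert loss_from_operations == -133_934
assert net_loss == -127_155
assert loss_per_share == -2.92
```

The same chain (revenue less cost of sales, less operating expenses, plus interest and other income, less tax) reconciles each of the other three columns as well.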


  

  

| (in thousands, except per share amounts) | Year Ended December 31, 2023 | Year Ended December 31, 2022 | Six Months Ended June 30, 2024 | Six Months Ended June 30, 2023 |
|---|---:|---:|---:|---:|
| Pro forma net loss per share attributable to common stockholders, basic and diluted(3) | | | | |
| Pro forma weighted-average shares used in calculating pro forma net loss per share attributable to common stockholders, basic and diluted(3) | | | | |
| **Other Financial Information:** | | | | |
| Non-GAAP operating loss(4) | $(107,303) | $(155,777) | $(9,482) | $(71,755) |
| Non-GAAP net loss(5) | $(100,524) | $(154,675) | $(3,949) | $(68,560) |
| Net cash (used in) provided by operating activities | $(78,977) | $(164,402) | $311,813 | $(70,185) |

 _______________
 (1)Includes stock-based compensation expense as follows:
  

| (in thousands) | Year Ended December 31, 2023 | Year Ended December 31, 2022 | Six Months Ended June 30, 2024 | Six Months Ended June 30, 2023 |
| --- | --- | --- | --- | --- |
| Cost of sales | $ 309 | $ 223 | $ 420 | $ 51 |
| Research and development | 21,187 | 17,732 | 23,905 | 7,367 |
| Sales and marketing | 3,563 | 832 | 3,195 | 1,075 |
| General and administrative | 1,572 | 4,257 | 4,809 | 767 |
| Total stock-based compensation expense | $ 26,631 | $ 23,044 | $ 32,329 | $ 9,260 |

 Stock-based compensation expense included $9.0 million, $8.6 million, $18.5 million, and $0.9 million for the years ended December 31, 2023 and 2022 and for the six months ended June 30, 2024 and 2023, respectively, related to secondary transactions in each period and to a common stock repurchase from employees during the year ended December 31, 2022. See Note 12 to our audited consolidated financial statements and our unaudited interim condensed consolidated financial statements included elsewhere in this prospectus for additional details on the secondary transactions.
 (2)See Note 7 to our audited consolidated financial statements included elsewhere in this prospectus for an explanation of the method used to calculate our basic and diluted net loss per share and the weighted-average number of shares used in the computation of per share amounts.
 (3)The pro forma weighted-average shares used in computing pro forma net loss per share gives effect to (i) the Preferred Stock Conversion, (ii) the Option Exercise, and (iii) the RSU Net Settlement. The pro forma net loss used to calculate pro forma net loss per share reflects stock-based compensation expense of approximately $               that we will recognize upon the completion of this offering related to RSUs subject to service-based and liquidity-based vesting conditions for which the service-based vesting condition was satisfied as of June 30, 2024 and for which the liquidity-based vesting condition will be satisfied in connection with this offering.
 (4)See “Non-GAAP Operating Loss” below for more information and for a reconciliation of Non-GAAP operating loss to loss from operations, the most directly comparable financial measure calculated and presented in accordance with U.S. generally accepted accounting principles (“U.S. GAAP”).
 (5)See “Non-GAAP Net Loss” below for more information and for a reconciliation of Non-GAAP net loss to net loss, the most directly comparable financial measure calculated and presented in accordance with U.S. GAAP.

   16

---

  

  

**Consolidated Balance Sheet Data (as of June 30, 2024):**

| (in thousands) | Actual | Pro Forma(1) | Pro Forma As Adjusted(2)(3) |
| --- | --- | --- | --- |
| Cash and cash equivalents | $ 90,931 | $ | $ |
| Working capital(4) | 129,089 | | |
| Total assets | 622,862 | | |
| Total liabilities | 479,394 | | |
| Redeemable convertible preferred stock | 722,780 | | |
| Stockholders’ deficit | (579,312) | | |

 _______________
 (1)The pro forma column above gives effect to (i) the filing and effectiveness of our amended and restated certificate of incorporation, which will occur immediately prior to the completion of this offering; (ii) the Preferred Stock Conversion; (iii) the Option Exercise; (iv) the RSU Net Settlement; (v) the increase in accrued expenses and other current liabilities and an equivalent decrease in additional paid-in capital of $                in connection with the estimated tax withholding and remittance obligations related to the RSU Net Settlement; and (vi) stock-based compensation expense of approximately $               that we will recognize upon the completion of this offering related to RSUs subject to service-based and liquidity-based vesting conditions for which the service-based vesting condition was satisfied as of June 30, 2024 and for which the liquidity-based vesting condition will be satisfied in connection with this offering.
 (2)The pro forma as adjusted column above gives further effect to (i) the pro forma adjustments set forth above; (ii) the issuance and sale of                 shares of Class A common stock by us in this offering at an assumed initial public offering price of $           per share, which is the midpoint of the estimated price range set forth on the cover page of this prospectus, after deducting estimated underwriting discounts and commissions and estimated offering expenses payable by us; (iii) the receipt by us of gross proceeds of approximately $                in connection with the Option Exercise; and (iv) the use of a portion of the net proceeds from this offering to satisfy the estimated tax withholding and remittance obligations related to the RSU Net Settlement.
 (3)Each $1.00 increase or decrease in the assumed initial public offering price of $             per share, which is the midpoint of the estimated price range set forth on the cover page of this prospectus, would increase or decrease, as applicable, each of cash and cash equivalents, working capital, total assets, and stockholders’ deficit by $            , assuming that the number of shares of Class A common stock offered by us, as set forth on the cover page of this prospectus, remains the same, and after deducting estimated underwriting discounts and commissions and estimated offering expenses payable by us. Similarly, each increase or decrease of 1.0 million shares in the number of shares of Class A common stock offered by us would increase or decrease, as applicable, each of cash and cash equivalents, working capital, total assets, and stockholders’ deficit by $            , assuming the assumed initial public offering price of $             per share remains the same, and after deducting estimated underwriting discounts and commissions and estimated offering expenses payable by us. In addition, each 1.0% increase or decrease in the assumed tax withholding rate would increase or decrease, as applicable, the amount of estimated tax withholding and remittance obligations related to the RSU Net Settlement by $            . Pro forma adjustments in the footnotes above and the related information in the consolidated balance sheet data are illustrative only and will be adjusted based on the actual initial public offering price and other terms of this offering determined at pricing, the actual tax withholding rate, as well as the actual amount of RSUs settled in connection with this offering (including after accounting for forfeitures prior to the settlement date).
 (4)Working capital is defined as total current assets less total current liabilities. See our unaudited interim consolidated financial statements and the related notes thereto included elsewhere in this prospectus for further details regarding our current assets and current liabilities.

   17

---

  

  Non-GAAP Financial Measures
 We use certain non-GAAP financial measures to supplement the performance measures in our consolidated financial statements, which are presented in accordance with GAAP. These non-GAAP financial measures include non-GAAP operating loss and non-GAAP net loss. We use these non-GAAP financial measures for financial and operational decision-making and as a means to assist us in evaluating period-to-period comparisons. By excluding certain items that may not be indicative of our recurring core operating results, we believe that non-GAAP operating loss and non-GAAP net loss provide meaningful supplemental information regarding our performance. Accordingly, we believe these non-GAAP financial measures are useful to investors and others because they allow for additional information with respect to financial measures used by management in its financial and operational decision-making and they may be used by our institutional investors and the analyst community to help them analyze the health of our business. However, there are a number of limitations related to the use of non-GAAP financial measures, and these non-GAAP measures should be considered in addition to, not as a substitute for or in isolation from, our financial results prepared in accordance with GAAP. Other companies, including companies in our industry, may calculate these non-GAAP financial measures differently or not at all, which reduces their usefulness as comparative measures.
 Non-GAAP Operating Loss
 We define non-GAAP operating loss as operating loss presented in accordance with GAAP, adjusted to exclude stock-based compensation expenses. We have presented non-GAAP operating loss because we consider non-GAAP operating loss to be a useful metric for investors and other users of our financial information in evaluating our operating performance because it excludes the impact of stock-based compensation, a non-cash charge that can vary from period to period for reasons that are unrelated to our core operating performance. This metric also provides investors and other users of our financial information with an additional tool to compare business performance across companies and periods, while eliminating the effects of items that may vary for different companies for reasons unrelated to core operating performance.
 A reconciliation of our GAAP operating loss, the most directly comparable GAAP financial measure, to non-GAAP operating loss is presented below:
  

| (in thousands) | Year Ended December 31, 2023 | Year Ended December 31, 2022 | Six Months Ended June 30, 2024 | Six Months Ended June 30, 2023 |
| --- | --- | --- | --- | --- |
| GAAP operating loss | $ (133,934) | $ (178,821) | $ (41,811) | $ (81,015) |
| Add: Stock-based compensation expense | 26,631 | 23,044 | 32,329 | 9,260 |
| Non-GAAP operating loss | $ (107,303) | $ (155,777) | $ (9,482) | $ (71,755) |
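The reconciliation is simple arithmetic: non-GAAP operating loss equals GAAP operating loss with stock-based compensation expense added back. As an illustrative check (a sketch for the reader, not part of the prospectus), the figures in the table above can be verified as follows; column order is FY 2023, FY 2022, six months 2024, six months 2023, in thousands:

```python
# Illustrative check of the non-GAAP operating loss reconciliation.
# All figures are in thousands and taken directly from the table above.
gaap_operating_loss = [-133_934, -178_821, -41_811, -81_015]
stock_based_comp = [26_631, 23_044, 32_329, 9_260]

# Add back stock-based compensation, the only excluded item.
non_gaap_operating_loss = [
    gaap + sbc for gaap, sbc in zip(gaap_operating_loss, stock_based_comp)
]
print(non_gaap_operating_loss)
# [-107303, -155777, -9482, -71755], matching the bottom row of the table
```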

 Non-GAAP Net Loss
 We monitor non-GAAP net loss for planning and performance measurement purposes. We define non-GAAP net loss as net loss reported on our consolidated statements of operations, excluding the impact of stock-based compensation expenses and change in fair value of forward contract liability. We have presented non-GAAP net loss because we believe that the exclusion of these charges allows for a more relevant comparison of our results of operations to other companies in our industry and facilitates period-to-period comparisons as it eliminates the effect of certain factors unrelated to our overall operating performance. Our calculation of non-GAAP net loss does not currently include the tax effects of the stock-based compensation expense adjustment because such tax effects have not been material to date.

   18

---

  

  A reconciliation of our GAAP net loss, the most directly comparable GAAP financial measure, to our non-GAAP net loss is presented below:
  

| (in thousands) | Year Ended December 31, 2023 | Year Ended December 31, 2022 | Six Months Ended June 30, 2024 | Six Months Ended June 30, 2023 |
| --- | --- | --- | --- | --- |
| GAAP net loss | $ (127,155) | $ (177,719) | $ (66,605) | $ (77,820) |
| Add: Stock-based compensation expense(1) | 26,631 | 23,044 | 32,329 | 9,260 |
| Add: Change in fair value of forward contract liability | — | — | 30,327 | — |
| Non-GAAP net loss | $ (100,524) | $ (154,675) | $ (3,949) | $ (68,560) |
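As with the operating loss reconciliation, non-GAAP net loss is arithmetic: GAAP net loss plus the two excluded items. An illustrative check (a sketch for the reader, not part of the prospectus), with columns FY 2023, FY 2022, six months 2024, six months 2023, in thousands:

```python
# Illustrative check of the non-GAAP net loss reconciliation.
# All figures are in thousands and taken directly from the table above.
gaap_net_loss = [-127_155, -177_719, -66_605, -77_820]
stock_based_comp = [26_631, 23_044, 32_329, 9_260]
# Change in fair value of forward contract liability; "—" in the table is zero.
forward_contract = [0, 0, 30_327, 0]

non_gaap_net_loss = [
    nl + sbc + fwd
    for nl, sbc, fwd in zip(gaap_net_loss, stock_based_comp, forward_contract)
]
print(non_gaap_net_loss)
# [-100524, -154675, -3949, -68560], matching the bottom row of the table
```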

 _______________
 (1)Non-GAAP net loss does not include the tax effects of the stock-based compensation expense adjustment because such tax effects were not material during the periods presented.

   19