With this new product, we’re expecting something truly monstrous in the field of AI. In fact, a leaked slide shows the performance index pointing “to the moon”. The H100 and H200, currently the best-performing cards in the industry, are set to become a thing of the past.
Another scary point is the price. Currently, an H100 costs at least $20,000. With this B100, we can reasonably expect an even higher price tag.
The board will pair two dies using TSMC’s CoWoS-L (Chip-on-Wafer-on-Substrate) packaging. Fabrication of the chips themselves has also been entrusted to the Taiwanese company, whose electricity bills have been rising as well.
Data centers are likely to see theirs climb too. Indeed, this model is expected to draw an inordinate 1,000 W per GPU. At least, that’s what a Dell executive told us.
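To put that 1,000 W figure in perspective, here is a rough back-of-the-envelope sketch. The GPUs-per-server count, PUE, and electricity price are purely illustrative assumptions, not figures from the leak.

```python
# Back-of-the-envelope estimate of what 1,000 W per GPU means for a data center.
# Every figure below except the rumored 1,000 W draw is an illustrative assumption.

GPU_POWER_W = 1000        # rumored B100 power draw per GPU
GPUS_PER_SERVER = 8       # typical HGX-style server layout (assumption)
PUE = 1.4                 # assumed power usage effectiveness (cooling, losses)
PRICE_PER_KWH = 0.12      # assumed electricity price in $/kWh
HOURS_PER_YEAR = 24 * 365

server_it_load_kw = GPU_POWER_W * GPUS_PER_SERVER / 1000      # GPU load only
server_facility_kw = server_it_load_kw * PUE                   # with facility overhead
annual_cost = server_facility_kw * HOURS_PER_YEAR * PRICE_PER_KWH

print(f"GPU load per server: {server_it_load_kw:.1f} kW")
print(f"Facility load per server: {server_facility_kw:.1f} kW")
print(f"Annual electricity cost per server: ${annual_cost:,.0f}")
```

Under these assumptions, a single 8-GPU server lands around $12,000 a year in electricity alone, which is why the bill question keeps coming up.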
The card should also come with a very large amount of VRAM: we’re talking about 192 GB of HBM3e.
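For a rough sense of scale, the sketch below converts that capacity into the number of model parameters that could sit on a single card at common precisions. It counts weights only (no activations, KV cache, or optimizer state) and treats GB as GiB, both simplifying assumptions.

```python
# Rough sense of what 192 GB of HBM3e can hold, weights only.
# The 192 GB capacity is the rumored spec; bytes-per-parameter values are standard.

VRAM_GB = 192
BYTES_PER_GB = 1024**3  # treating GB as GiB for simplicity

for precision, bytes_per_param in [("FP16/BF16", 2), ("FP8", 1)]:
    max_params = VRAM_GB * BYTES_PER_GB / bytes_per_param
    print(f"{precision}: ~{max_params / 1e9:.0f} billion parameters fit in {VRAM_GB} GB")
```

That works out to roughly 100 billion parameters at FP16, and around double that at FP8, on one card, before any working memory is accounted for.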