The GPUs and other chips used to train AI models communicate with one another inside datacenters through "interconnects." But those interconnects have limited bandwidth, which caps AI training performance: a 2022 survey found that AI developers typically struggle to use more than 25% of a GPU's capacity. One solution could be new interconnects with much higher […]

© 2024 TechCrunch. All rights reserved.
