Deep Learning Inference on PowerEdge R7425


The Dell EMC PowerEdge R7425 is based on AMD’s EPYC™ architecture, and because the EPYC™ architecture supports a higher number of PCIe Gen3 x16 lanes, the server can be used as a scale-up inference server. This makes it well suited to large production AI workloads where both throughput and latency matter.

This whitepaper examines the performance and efficiency of deep learning inference on the Dell EMC PowerEdge R7425 server with an NVIDIA T4 16 GB GPU.
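As a rough illustration of what such an evaluation involves, the sketch below measures per-batch latency and images-per-second throughput for a convolutional model on a single GPU. It is a minimal example only: the model (ResNet-50), framework (PyTorch), batch size, and iteration counts are assumptions for illustration and are not taken from the whitepaper, which may use a different inference stack such as TensorRT.

```python
# Hypothetical sketch: timing inference latency and throughput on one GPU
# (e.g., an NVIDIA T4). Model, batch size, and loop counts are illustrative.
import time
import torch
import torchvision.models as models

device = torch.device("cuda")
model = models.resnet50(weights=None).eval().to(device)

batch_size = 32
dummy_input = torch.randn(batch_size, 3, 224, 224, device=device)

with torch.no_grad():
    # Warm-up passes so kernel launches and caching do not skew the timing.
    for _ in range(10):
        model(dummy_input)
    torch.cuda.synchronize()

    # Timed loop: average batch latency and overall throughput.
    iterations = 100
    start = time.perf_counter()
    for _ in range(iterations):
        model(dummy_input)
    torch.cuda.synchronize()
    elapsed = time.perf_counter() - start

latency_ms = elapsed / iterations * 1000
throughput = batch_size * iterations / elapsed
print(f"Avg batch latency: {latency_ms:.2f} ms, throughput: {throughput:.1f} images/s")
```

In practice, the trade-off between throughput and latency is explored by sweeping the batch size: larger batches raise images per second but also increase per-request latency.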

 

