
memory bandwidth

WEKA Keeps GPUs Fed with Speedy New Appliances

GPUs have an insatiable appetite for data, and keeping those processors fed can be a challenge. That’s one of the big reasons that WEKA launched a new line of ...
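For context on why feeding GPUs is a data-pipeline problem, here is a minimal sketch (not WEKA's product) of overlapping storage reads with compute so the accelerator is not left idle; load_batch() and run_on_gpu() are hypothetical stand-ins for an I/O-bound reader and a GPU kernel launch.

# Sketch: a bounded prefetch queue keeps batches ready ahead of compute.
import queue
import threading

def load_batch(i):
    # Placeholder for reading/decoding one batch from storage.
    return f"batch-{i}"

def run_on_gpu(batch):
    # Placeholder for the accelerator-side work on one batch.
    print("processing", batch)

def prefetcher(n_batches, q):
    for i in range(n_batches):
        q.put(load_batch(i))   # storage reads run ahead of compute
    q.put(None)                # sentinel: no more data

q = queue.Queue(maxsize=4)     # bounded buffer of ready batches
threading.Thread(target=prefetcher, args=(8, q), daemon=True).start()

while (batch := q.get()) is not None:
    run_on_gpu(batch)          # compute overlaps with the next reads

If the storage tier cannot refill the queue as fast as the GPU drains it, utilization drops, which is the bottleneck fast appliances aim to remove.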

Micron Deep Learning Accelerator Gets Memory Boost

Deep learning accelerators that pair custom chip architectures with high-bandwidth memory are emerging to enable near real-time processing of machine learning algorithms. Memory chip specialist Micron Technology argues that ...
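A back-of-the-envelope roofline check illustrates why the memory pairing matters; the peak compute, bandwidth, and arithmetic-intensity figures below are assumed for illustration, not Micron's numbers.

# Roofline sketch: low arithmetic intensity makes the kernel bandwidth-bound.
peak_flops = 100e12        # 100 TFLOP/s compute peak (assumed)
mem_bw     = 1.0e12        # 1 TB/s memory bandwidth (assumed, HBM-class)
intensity  = 10.0          # FLOPs performed per byte moved (assumed)

attainable = min(peak_flops, intensity * mem_bw)
bound = "memory-bandwidth" if intensity * mem_bw < peak_flops else "compute"
print(f"attainable: {attainable / 1e12:.1f} TFLOP/s ({bound} bound)")

With these numbers the accelerator can only sustain 10 of its 100 TFLOP/s, so raising memory bandwidth, not adding compute, is what lifts real throughput.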

Baidu Embraces Intel Optane for In-Memory Databases

Chinese search giant Baidu is building a new platform based on Intel Corp.’s Optane DC persistent memory as a means of upgrading search engine results delivered by its in-memory ...
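As a rough illustration only (this is not Baidu's platform or Intel's PMDK API), memory-mapping a file mimics the byte-addressable, load/store access pattern that persistent memory offers an in-memory database: data is read and written in place, with no serialize/deserialize step.

# Sketch: in-place load/store access over a memory-mapped backing file.
import mmap

path = "kv_store.bin"          # hypothetical data file
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)    # reserve one page

with open(path, "r+b") as f:
    buf = mmap.mmap(f.fileno(), 0)
    buf[0:5] = b"hello"        # store: write bytes directly in place
    print(buf[0:5])            # load: read them back without a copy step
    buf.flush()                # push the mapped page back to durable storage
    buf.close()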

Inference Engine Uses SRAM for Edge AI Apps

Flex Logix, the embedded FPGA specialist, has shifted gears by applying its proprietary interconnect technology to launch an inference engine that boosts neural inferencing capacity at the network edge ...
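A rough sizing sketch shows the appeal of on-chip SRAM at the edge: if a layer's weights fit in SRAM, the engine avoids off-chip DRAM traffic for those weights on every inference. The capacity and layer dimensions below are assumed for illustration and are not Flex Logix's design parameters.

# Sketch: check whether a quantized conv layer's weights fit in on-chip SRAM.
sram_bytes   = 8 * 1024 * 1024        # 8 MB on-chip SRAM (assumed)
weights      = 3 * 3 * 256 * 256      # 3x3 conv, 256 input and 256 output channels
weight_bytes = weights * 1            # INT8 quantized weights, 1 byte each

fits = weight_bytes <= sram_bytes
print(f"layer weights: {weight_bytes / 1e6:.2f} MB, fit in SRAM: {fits}")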