Abstract: In applications such as social, energy, transportation, sensor, and neuronal networks, high-dimensional data naturally reside on the vertices of weighted graphs. The emerging field of signal ...
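To make the notion of data residing on graph vertices concrete, here is a minimal sketch (assuming Python with NumPy; neither is specified in the snippet): a weighted adjacency matrix W, a graph signal f holding one value per vertex, and the combinatorial Laplacian L = D - W commonly used in graph signal processing.

```python
import numpy as np

# Hypothetical toy example: a 4-vertex weighted graph (e.g., 4 sensors).
# W[i, j] is the edge weight between vertices i and j (0 = no edge).
W = np.array([
    [0.0, 0.5, 0.0, 1.0],
    [0.5, 0.0, 2.0, 0.0],
    [0.0, 2.0, 0.0, 0.3],
    [1.0, 0.0, 0.3, 0.0],
])

# A graph signal: one scalar measurement per vertex.
f = np.array([1.2, -0.7, 0.4, 2.1])

# Combinatorial graph Laplacian L = D - W, where D is the diagonal degree matrix.
D = np.diag(W.sum(axis=1))
L = D - W

# The quadratic form f^T L f measures how smoothly the signal varies
# across the weighted edges of the graph (smaller = smoother).
smoothness = f @ L @ f
print(smoothness)
```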
Abstract: Performing training and inference for Graph Neural Networks (GNNs) under tight latency constraints has become increasingly difficult as real-world input graphs continue to grow. Compared to ...
Why do we consider ViTCoD given existing NLP Transformer accelerators? Because there is a large difference between ViTs and Transformers for natural language processing (NLP) tasks: ViTs have a relatively ...