Transformers4Rec: Bridging the Gap between NLP and Sequential / Session-Based Recommendation

This demo covers Transformers4Rec, an open source library that bridges NLP and recommender systems. Inspired by HuggingFace Transformers, it makes the state-of-the-art Transformer architectures and training methods used in NLP available to RecSys researchers and industry practitioners. The library is flexible and customizable, supports multiple input features, and provides APIs for PyTorch and TensorFlow. Together with other NVIDIA libraries, NVTabular and Triton Inference Server, Transformers4Rec enables an end-to-end pipeline on GPUs, from data preprocessing through model training to deployment and inference. The demo is based on our paper accepted at the ACM RecSys '21 conference.
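To illustrate what the PyTorch API looks like, the sketch below defines a session-based next-item prediction model with an XLNet-style Transformer body. It follows the library's documented quick-start, but exact module paths, argument names, and defaults may differ between releases, and the synthetic testing schema is used only as a stand-in for a schema exported by NVTabular during preprocessing.

```python
import transformers4rec.torch as tr

# Schema of the preprocessed sequential features; here the synthetic testing
# dataset bundled with the library is used as a placeholder. In a real pipeline,
# NVTabular exports this schema as part of preprocessing.
schema = tr.data.tabular_sequence_testing_data.schema

max_sequence_length, d_model = 20, 64

# Input block: embeds and aggregates the tabular sequence features and applies
# causal (left-to-right) masking for next-item prediction.
input_module = tr.TabularSequenceFeatures.from_schema(
    schema,
    max_sequence_length=max_sequence_length,
    continuous_projection=d_model,
    aggregation="concat",
    masking="causal",
)

# Transformer body configured through a HuggingFace-style config (XLNet here).
transformer_config = tr.XLNetConfig.build(
    d_model=d_model, n_head=4, n_layer=2, total_seq_length=max_sequence_length
)
body = tr.SequentialBlock(
    input_module,
    tr.MLPBlock([d_model]),
    tr.TransformerBlock(transformer_config, masking=input_module.masking),
)

# Prediction head: next-item prediction with weight tying between the item
# embedding table and the output layer.
head = tr.Head(
    body,
    tr.NextItemPredictionTask(weight_tying=True),
    inputs=input_module,
)

# End-to-end model, ready to be trained (e.g., with the library's Trainer).
model = tr.Model(head)
```

Swapping the Transformer architecture amounts to replacing the config class (for example, a GPT-2- or BERT-style config) while the input module, masking, and prediction head stay the same.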