KDD 2026 Paper Acceptance
Our paper "TREASURE: A Transformer-Based Foundation Model for High-Volume Transaction Understanding" was accepted at KDD 2026 Applied Data Science Track. #KDD #FoundationModels #Transformers #TransactionAnalytics
Senior Machine Learning Engineer, Adobe
Vineeth Rakesh Mohan is a Senior Machine Learning Engineer at Adobe, with deep expertise in building large-scale, production-grade AI systems for decisioning, personalization, and intelligent user experiences.
He currently works on Adobe Experience Platform, where he develops agent orchestration frameworks that leverage generative AI and large language models to power personalization and experience orchestration at enterprise scale. His work focuses on designing robust, scalable AI systems that coordinate reasoning across data, knowledge, and actions to deliver intelligent, context-aware customer experiences.
Prior to joining Adobe, Vineeth served as a Senior Staff Research Scientist at Visa, where he led multiple initiatives in risk-based authentication and next-generation AI systems for secure digital commerce. His work supported systems handling billion-scale transaction volumes and optimized for low-latency, high-throughput decisioning, and it informed the development of an emerging agentic commerce framework for intelligent post-purchase experiences.
Before Visa, Vineeth worked as a Research Scientist at InterDigital, where he built patent-focused information retrieval systems and applied language models to intellectual property documents, enabling document-to-document comparison and localized similarity matching. He also conducted research at Technicolor, focusing on user-behavior modeling, personalization, and recommendation systems.
Vineeth earned his Ph.D. in Computer Engineering from Wayne State University and later completed postdoctoral research at Arizona State University.
His technical interests span large language models, probabilistic machine learning, generative models, natural language processing, recommendation systems, and graph neural networks, with a strong focus on scalable engineering, model optimization, and deploying ML systems in high-stakes, production environments. He has published more than 30 research papers in top-tier venues including WWW, WSDM, KDD, ICML, and CVPR.
Our paper "TREASURE: A Transformer-Based Foundation Model for High-Volume Transaction Understanding" was accepted at KDD 2026 Applied Data Science Track. #KDD #FoundationModels #Transformers #TransactionAnalytics
Our paper "MAIN-RAG: Multi-Agent Filtering Retrieval-Augmented Generation" is accepted in Association of the Computational Linguistics (ACL) 2025. #LargeLanguageModels #RAG #MultiAgentSystems #LLM
Our paper "Personalized Layer Selection for Graph Neural Networks" was accepted at Transactions on Machine Learning Research (TMLR). #GNN #GraphNeuralNetworks
Our paper "SARA: Selective and Adaptive Retrieval-augmented Generation with Context Compression" was accepted at the Efficient Systems for Foundation Models workshop at ICML 2025. #RAG #RetrievalAugmentedGeneration #LargeLanguageModels #LLMs #ContextCompression