    Sparse-Matrix Factorization-based Method: Efficient Computation of Latent Query and Item Representations to Approximate CE Scores

    May 10, 2024

Cross-encoder (CE) models evaluate similarity by jointly encoding a query-item pair, and they outperform embedding-based models that score relevance with a dot product. Current methods perform k-NN search with a CE by approximating the CE similarity with a vector embedding space fit with dual encoders (DE) or with CUR matrix factorization. However, DE-based methods suffer from poor recall because they generalize poorly to new domains and because test-time retrieval with a DE is decoupled from the CE. As a result, both DE-based and CUR-based methods fall short in certain k-NN search settings.
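The trade-off described above can be sketched with toy numbers: a DE scores every item with a single matrix-vector product, while a CE needs one forward pass per query-item pair. All sizes below are illustrative, not from the paper, and `ce_score` is a placeholder for a real cross-encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dual-encoder (DE) setup: queries and items are embedded
# independently, so relevance reduces to a dot product.
d = 8                                  # embedding dimension (toy value)
item_emb = rng.normal(size=(100, d))   # 100 pre-computed item embeddings
query_emb = rng.normal(size=d)         # query embedded once at test time

scores = item_emb @ query_emb          # one matmul scores every item
top_k = np.argsort(-scores)[:5]        # k-NN retrieval, k = 5

# A cross-encoder, by contrast, must run the full model on each
# (query, item) pair, so scoring all 100 items costs 100 forward passes.
def ce_score(query_text, item_text):
    ...  # jointly encode the pair: far more accurate, far slower
```

This is why CE similarity is usually approximated by some embedding space at retrieval time rather than evaluated exhaustively.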

Matrix factorization is widely used for computing low-rank approximations of dense distance matrices, non-PSD matrices, and sparse matrices with missing entries. In this paper, researchers explored methods for factorizing sparse matrices instead of dense ones. A key assumption in matrix completion is that the underlying matrix M is low-rank, which makes it possible to recover missing entries from a small fraction of observed entries in M. Moreover, when features describing the rows and columns of the matrix are available, the sample complexity of recovering an m × n matrix of rank r (with m ≤ n) can be improved.

Researchers from the University of Massachusetts Amherst and Google DeepMind have introduced a novel sparse-matrix factorization-based method. It computes latent query and item representations that approximate CE scores and then performs k-NN search using the approximate CE similarity. Compared with CUR-based methods, the proposed method produces a high-quality approximation using only a fraction of the CE similarity calls. Item embeddings are obtained by factorizing a sparse matrix of query-item CE scores, and the embedding space is initialized using DE models.
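A rough sketch of the general idea, under the simplifying (and unrealistic) assumption that CE scores happen to be exactly linear in fixed item embeddings: call the CE on a small anchor set, fit a latent query embedding to those scores, then score all items with one matmul. The paper's actual method jointly factorizes a sparse CE-score matrix; this toy least-squares fit only conveys the flavor.

```python
import numpy as np

rng = np.random.default_rng(2)

d, n_items = 16, 500
item_emb = rng.normal(size=(n_items, d))  # stand-in for DE-initialized item embeddings

# Pretend the CE score is a hidden linear function of the item embedding;
# in reality ce(q, i) comes from a trained cross-encoder forward pass.
q_true = rng.normal(size=d)
def ce(i):
    return item_emb[i] @ q_true

# Call the CE only on a small anchor set (a fraction of all items)...
anchors = rng.choice(n_items, size=50, replace=False)
y = np.array([ce(i) for i in anchors])

# ...and solve least squares for a latent query embedding q_hat that
# reproduces those scores: min_q ||item_emb[anchors] @ q - y||^2.
q_hat, *_ = np.linalg.lstsq(item_emb[anchors], y, rcond=None)

# Approximate CE scores for ALL items with one matmul, then take k-NN.
approx = item_emb @ q_hat
top_k = np.argsort(-approx)[:10]
exact_top = np.argsort(-(item_emb @ q_true))[:10]
```

Here 50 CE calls stand in for 500, which mirrors the paper's claim of reaching a good approximation with a fraction of the CE similarity calls.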

The method and baselines are rigorously evaluated on the task of finding k-nearest neighbors for CE models as well as on downstream tasks. Notably, the CE models are used for zero-shot entity linking and zero-shot information retrieval, demonstrating how different design decisions affect indexing time and test-time retrieval accuracy. Experiments are conducted on two datasets, ZESHEL and BEIR, using separate CE models trained on ground-truth labeled data for each. Two test domains from ZESHEL with 10K and 34K items (entities) and two test domains from BEIR with 25K and 5M items (documents) are used.

The researchers also propose a k-NN search method that works with dense item embeddings produced by approaches such as baseline dual-encoder models. It yields up to 5% and 54% improvements in k-NN recall for k = 1 and k = 100, respectively, over retrieve-and-rerank style inference with the same DE. Moreover, this approach to aligning item embeddings with the cross-encoder achieves up to 100× and 5× speedups over CUR-based methods and distillation-based DE training, respectively, while matching or improving test-time k-NN search recall over baseline methods.
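For contrast, the retrieve-and-rerank baseline mentioned above can be sketched as follows: a cheap DE pass retrieves a candidate pool, and the expensive CE re-scores only that pool. The `ce` function and all sizes here are stand-ins, not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(3)

n_items, d = 1000, 16
item_emb = rng.normal(size=(n_items, d))
q = rng.normal(size=d)

# Placeholder cross-encoder: one "forward pass" per item, and it
# deliberately disagrees a little with the DE dot product.
def ce(i):
    return float(item_emb[i] @ q + 0.1 * np.sin(i))

# Retrieve-and-rerank: DE fetches a candidate pool, CE re-scores the pool.
pool = np.argsort(-(item_emb @ q))[:100]            # cheap DE retrieval
reranked = sorted(pool, key=ce, reverse=True)[:10]  # 100 CE calls, not 1000
```

The weakness the paper targets is visible in the structure: any item the DE misses in the pool can never be recovered by the CE rerank, which is why recall is capped by DE quality.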

In conclusion, researchers from the University of Massachusetts Amherst and Google DeepMind introduced a sparse-matrix factorization-based method that efficiently computes latent query and item representations. The method performs k-NN search with cross-encoders by approximating CE scores with the dot product of learned test-query and item embeddings. Experiments used two datasets, ZESHEL and BEIR, each with a separate CE model trained on ground-truth labeled data.

Check out the Paper. All credit for this research goes to the researchers of this project.

    The post Sparse-Matrix Factorization-based Method: Efficient Computation of Latent Query and Item Representations to Approximate CE Scores appeared first on MarkTechPost.
