
    Gradformer: A Machine Learning Method that Integrates Graph Transformers (GTs) with the Intrinsic Inductive Bias by Applying an Exponential Decay Mask to the Attention Matrix

    April 30, 2024

Graph Transformers (GTs) have achieved state-of-the-art performance on a variety of graph learning tasks. Unlike the local message passing in graph neural networks (GNNs), GTs can capture long-range information from distant nodes: the self-attention mechanism permits each node to attend to every other node in the graph directly, giving the model the flexibility and capacity to collect information globally and adaptively.
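The contrast between local message passing and global self-attention can be illustrated with a toy example. This is a minimal sketch (not from the paper): on a 4-node path graph, one round of neighbor aggregation only mixes adjacent nodes, while a dense attention matrix lets every node receive information from every other node in a single layer.

```python
import numpy as np

# Toy 4-node path graph: 0-1-2-3
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = np.random.rand(4, 2)  # node features

# GNN message passing: each node aggregates only its direct neighbors,
# so node 0 needs three layers to hear from node 3.
gnn_out = adj @ x

# Self-attention: softmax over dot-product scores yields a dense matrix,
# so every node attends to every other node in one layer.
scores = x @ x.T
attn = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
attn_out = attn @ x
```

Because the softmax output is strictly positive, `attn` has no zero entries: node 0 receives a (possibly tiny) contribution from node 3 immediately, which is exactly the long-range capability described above.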

Despite these advantages across a wide variety of tasks, the self-attention mechanism in GTs does not account for distinctive properties of graphs, such as structural biases. Although some methods model inductive biases through positional encodings and attention biases, they do not fully solve this problem. The self-attention mechanism also fails to take full advantage of the intrinsic feature biases in graphs, which creates critical challenges in capturing essential graph structural information. When structural correlation is neglected, the mechanism attends to every node equally, diluting focus on key information and aggregating redundant information.

Researchers from Wuhan University, JD Explore Academy, The University of Melbourne, and Griffith University proposed Gradformer, a novel method that integrates GTs with inductive bias. Gradformer introduces an exponential decay mask into the GT self-attention architecture: the mask is multiplied with the attention scores, controlling each node’s attention weights relative to other nodes. Because the mask decays exponentially with graph distance, it effectively guides the learning process within the self-attention framework toward structurally close nodes.
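The core idea can be sketched in a few lines of PyTorch. This is an illustrative single-head implementation under assumptions of ours, not the authors’ code: attention between nodes i and j is scaled by `gamma ** spd[i, j]`, where `spd` is a shortest-path-distance matrix supplied by the caller and `gamma` in (0, 1) is a learnable decay parameter (all names here are our own).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecayMaskedAttention(nn.Module):
    """Single-head self-attention with an exponential decay mask (sketch)."""

    def __init__(self, dim: int):
        super().__init__()
        self.qkv = nn.Linear(dim, 3 * dim)
        # Unconstrained parameter squashed into (0, 1) via sigmoid,
        # so the decay rate gamma is learnable.
        self.gamma_raw = nn.Parameter(torch.tensor(0.0))

    def forward(self, x: torch.Tensor, spd: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim) node features
        # spd: (num_nodes, num_nodes) shortest-path hop distances
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        scores = q @ k.T / (x.shape[-1] ** 0.5)
        attn = F.softmax(scores, dim=-1)
        gamma = torch.sigmoid(self.gamma_raw)
        mask = gamma ** spd                 # decays with graph distance
        attn = attn * mask                  # multiply mask with attention
        attn = attn / attn.sum(-1, keepdim=True)  # renormalize rows
        return attn @ v
```

Nearby nodes (small `spd`) keep most of their attention weight, while distant nodes are progressively suppressed, which is how the decay mask injects structural inductive bias without removing global connectivity.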

Gradformer achieves state-of-the-art results on five datasets, highlighting the effectiveness of the proposed method. On small datasets such as NCI1 and PROTEINS, it outperforms all 14 baseline methods, with improvements of 2.13% and 2.28%, respectively. This shows that Gradformer effectively incorporates inductive biases into the GT model, which is particularly important when available data is limited. It also performs well on larger datasets such as ZINC, demonstrating that the method applies to datasets of different sizes.

The researchers also performed an efficiency analysis, comparing Gradformer’s training cost with notable methods such as SAN, Graphormer, and GraphGPS, focusing on GPU memory usage and runtime. The comparison showed that Gradformer strikes a strong balance between efficiency and accuracy, outperforming SAN and GraphGPS in both computational efficiency and accuracy. Although it has a longer runtime than Graphormer, it surpasses Graphormer in accuracy, demonstrating good performance at reasonable resource cost.

In conclusion, the researchers proposed Gradformer, a novel integration of GTs with intrinsic inductive biases, achieved by applying an exponential decay mask with learnable parameters to the attention matrix. Gradformer outperforms 14 GT and GNN baselines, with improvements of 2.13% and 2.28% on NCI1 and PROTEINS, respectively. It excels at maintaining, or even exceeding, the accuracy of shallow models while incorporating deeper network architectures. Future work on Gradformer includes (a) exploring the feasibility of achieving a state-of-the-art architecture without using MPNNs and (b) investigating the capability of the decay mask operation to improve GT efficiency.


    The post Gradformer: A Machine Learning Method that Integrates Graph Transformers (GTs) with the Intrinsic Inductive Bias by Applying an Exponential Decay Mask to the Attention Matrix appeared first on MarkTechPost.
