
    Hierarchical Graph Masked AutoEncoders (Hi-GMAE): A Novel Multi-Scale GMAE Framework Designed to Handle the Hierarchical Structures within Graph

    May 29, 2024

    In graph analysis, the need for labeled data presents a significant hurdle for traditional supervised learning methods, particularly within academic, social, and biological networks. To overcome this limitation, Graph Self-supervised Pre-training (GSP) techniques have emerged, leveraging the intrinsic structures and properties of graph data to extract meaningful representations without requiring labeled examples. GSP methods are broadly classified into two categories: contrastive and generative.

    Contrastive methods, like GraphCL and SimGRACE, create multiple graph views through augmentation and learn representations by contrasting positive and negative samples. Generative methods like GraphMAE and MaskGAE focus on learning node representations via a reconstruction objective. Notably, generative GSP approaches are often simpler and more effective than their contrastive counterparts, which rely on meticulously designed augmentation and sampling strategies.

    Current generative graph masked autoencoder (GMAE) models concentrate primarily on reconstructing node features, thereby capturing predominantly node-level information. This single-scale approach, however, fails to address the multi-scale nature inherent in many graphs, such as social networks, recommendation systems, and molecular structures. These graphs contain both node-level details and subgraph-level information, exemplified by functional groups in molecular graphs. Because current GMAE models cannot effectively learn this complex, higher-level structural information, their performance suffers.
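    The node-feature reconstruction objective these models share can be sketched in a few lines of NumPy. This is an illustrative simplification, not the papers' code: GraphMAE actually uses a scaled cosine error, so the plain mean-squared error and the function name below are assumptions.

```python
import numpy as np

def masked_recon_loss(feats, recon, mask):
    """Mean-squared reconstruction error over masked nodes only.

    Simplified sketch of a generative GMAE objective: `recon` would come
    from the decoder. GraphMAE itself uses a scaled cosine error, so the
    plain MSE here is an illustrative assumption.
    """
    diff = feats[mask] - recon[mask]
    return float((diff ** 2).mean())

# Toy example: 4 nodes, 2-dim features, nodes 0 and 2 masked
feats = np.zeros((4, 2))
recon = feats.copy()
recon[0] = [1.0, 1.0]                  # imperfect reconstruction of node 0
mask = np.array([True, False, True, False])
loss = masked_recon_loss(feats, recon, mask)
```

    Only the masked positions contribute to the loss, which is what pushes the encoder to infer missing node features from the visible context.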

    To address these limitations, a team of researchers from various institutions, including Wuhan University, introduced the Hierarchical Graph Masked AutoEncoders (Hi-GMAE) framework. Hi-GMAE comprises three main components designed to capture hierarchical information in graphs. The first component, multi-scale coarsening, constructs coarse graphs at multiple scales using graph pooling methods that cluster nodes into super-nodes progressively. 
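    As a rough illustration of a single coarsening step, the sketch below collapses clusters of nodes into super-nodes. The framework itself uses graph pooling methods to form the clusters, so the fixed hard assignment and mean-pooled super-node features here are simplifying assumptions.

```python
import numpy as np

def coarsen(adj, feats, assign):
    """One coarsening step: `assign[i]` names node i's super-node.

    Hypothetical simplification of a pooling step: two super-nodes are
    connected if any of their member nodes were, and a super-node's
    features are the mean of its members' features.
    """
    n_super = assign.max() + 1
    # One-hot assignment matrix S (n_nodes x n_super)
    S = np.zeros((len(assign), n_super))
    S[np.arange(len(assign)), assign] = 1.0
    # Coarse adjacency: S^T A S counts inter-cluster edges; binarize it
    adj_c = (S.T @ adj @ S > 0).astype(float)
    np.fill_diagonal(adj_c, 0.0)
    # Super-node features: mean of member features
    feats_c = (S.T @ feats) / S.sum(axis=0, keepdims=True).T
    return adj_c, feats_c

# 4-node path graph 0-1-2-3, clustered into super-nodes {0,1} and {2,3}
adj = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
feats = np.arange(8, dtype=float).reshape(4, 2)
adj_c, feats_c = coarsen(adj, feats, np.array([0, 0, 1, 1]))
```

    Applying the step repeatedly yields the progressively coarser graph scales the first component operates on.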

    The second component, Coarse-to-Fine (CoFi) masking with recovery, introduces a novel masking strategy that keeps masked subgraphs consistent across all scales. It starts by randomly masking the coarsest graph and then back-projects the mask to finer scales through an unpooling operation. A gradual recovery process then selectively unmasks certain nodes to help the model learn from subgraphs that are initially fully masked.
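    A minimal sketch of this masking scheme, assuming a single coarsening level; the helper names and the uniform random choices are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def cofi_mask(assign, mask_ratio=0.5):
    """Coarse-to-Fine masking sketch (one coarsening level assumed).

    Mask super-nodes at the coarse scale, then back-project the mask to
    the fine scale so every member of a masked super-node is masked too,
    keeping masked subgraphs consistent across scales.
    """
    n_super = assign.max() + 1
    n_masked = int(round(mask_ratio * n_super))
    masked_super = rng.choice(n_super, size=n_masked, replace=False)
    coarse_mask = np.isin(np.arange(n_super), masked_super)
    fine_mask = coarse_mask[assign]     # unpooling: broadcast to members
    return coarse_mask, fine_mask

def gradual_recovery(fine_mask, unmask_frac=0.25):
    """Selectively unmask a fraction of the masked fine-scale nodes."""
    masked_idx = np.flatnonzero(fine_mask)
    n_recover = int(round(unmask_frac * len(masked_idx)))
    recovered = rng.choice(masked_idx, size=n_recover, replace=False)
    out = fine_mask.copy()
    out[recovered] = False
    return out

# Example: 8 fine nodes grouped pairwise into 4 super-nodes
assign = np.array([0, 0, 1, 1, 2, 2, 3, 3])
coarse_mask, fine_mask = cofi_mask(assign, mask_ratio=0.5)
partially_recovered = gradual_recovery(fine_mask, unmask_frac=0.25)
```

    The consistency property is that a fine node is masked exactly when its super-node is; gradual recovery then relaxes this by revealing a few nodes inside the masked subgraphs.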

    The third key component of Hi-GMAE is the Fine- and Coarse-Grained (Fi-Co) encoder and decoder. The hierarchical encoder integrates fine-grained graph convolution modules to capture local information at lower graph scales and coarse-grained graph transformer (GT) modules to focus on global information at higher graph scales. The corresponding lightweight decoder gradually reconstructs and projects the learned representations to the original graph scale, ensuring comprehensive capture and representation of multi-level structural information.
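    The division of labor between the two module types can be sketched as follows; this is a bare NumPy illustration under stated assumptions (mean-aggregation convolution, single-head attention), not the paper's architecture.

```python
import numpy as np

def gcn_layer(adj, x, w):
    """Fine-grained module sketch: one mean-aggregation graph convolution,
    standing in for the local message passing used at lower scales."""
    deg = adj.sum(axis=1, keepdims=True) + 1.0   # +1 counts the self-loop
    h = (x + adj @ x) / deg                      # average self + neighbors
    return np.maximum(h @ w, 0.0)                # linear map + ReLU

def attention_layer(x, wq, wk, wv):
    """Coarse-grained module sketch: single-head self-attention over
    super-nodes, standing in for the graph transformer's global mixing."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)    # row-wise softmax
    return weights @ v

# 4-node path graph 0-1-2-3 at the fine scale
adj = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
```

    The convolution only mixes a node with its neighbors, so it captures local structure; the attention layer mixes every super-node with every other, which is what provides the global view at coarser scales.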

    To validate the effectiveness of Hi-GMAE, extensive experiments were conducted on widely used datasets covering unsupervised and transfer learning tasks. The results show that Hi-GMAE outperforms existing state-of-the-art models from both the contrastive and generative pre-training domains. These findings underscore the advantages of the multi-scale GMAE approach over traditional single-scale models, highlighting its superior ability to capture and leverage hierarchical graph information.

    In conclusion, Hi-GMAE represents a significant advancement in self-supervised graph pre-training. By integrating multi-scale coarsening, an innovative masking strategy, and a hierarchical encoder-decoder architecture, Hi-GMAE effectively captures the complexities of graph structures at various levels. The framework’s superior performance in experimental evaluations solidifies its potential as a powerful tool for graph learning tasks, setting a new benchmark in graph analysis.

    Check out the Paper. All credit for this research goes to the researchers of this project.


    The post Hierarchical Graph Masked AutoEncoders (Hi-GMAE): A Novel Multi-Scale GMAE Framework Designed to Handle the Hierarchical Structures within Graph appeared first on MarkTechPost.

