
    Large Generative Graph Models (LGGMs): A New Class of Graph Generative Model Trained on a Large Corpus of Graphs

    June 13, 2024

Large Generative Models (LGMs) like GPT, Stable Diffusion, Sora, and Suno have recently made remarkable strides in generating creative and meaningful content, greatly boosting the efficiency of real-world applications. Unlike earlier models such as BERT/BART in Natural Language Processing (NLP) and U-Net in image segmentation, which were trained on small datasets from specific areas for narrow tasks, the success of these LGMs comes from extensive training on well-curated data from a wide range of fields. Given the tremendous success of LGMs in other domains and the potential practical uses of graph generative models, a natural question arises: can we develop large generative models for graph-structured data?

    This paper discusses two existing methods for generating content. First, Large Generative Models (LGMs) have recently achieved great success in generating meaningful content for various tasks across multiple fields. For example, in Natural Language Processing (NLP), large language models trained to predict the next word can generate human-like text for tasks such as question answering and language translation. Second, Graph Generative Models focus on creating realistic graphs to model relationships in real-world data. These models are used in applications like generating molecular structures with desirable properties and creating subtle adversarial attacks. 

Researchers from Vanderbilt University, the University of Michigan, Adobe Research, and Intel Labs have introduced the Large Graph Generative Model (LGGM), a new class of graph generative model trained on a large corpus of graphs from 13 distinct domains. The pre-trained LGGM outperforms other graph generative models in zero-shot generative capability and can be easily fine-tuned with graphs from specific fields, outperforming models trained on those fields from scratch. LGGM can also generate graphs from text prompts, such as descriptions of the network name and domain, or desired network statistics.
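A conditioning prompt of this kind can be assembled from a graph's metadata and basic statistics. Below is a minimal sketch using NetworkX; the prompt template and the `graph_prompt` helper are illustrative assumptions, not the paper's actual format:

```python
import networkx as nx

def graph_prompt(graph, name, domain):
    """Build a text description of a graph (name, domain, basic statistics)
    of the kind a text-conditioned graph generator could consume.
    The template here is hypothetical, not the one used in the paper."""
    n, m = graph.number_of_nodes(), graph.number_of_edges()
    avg_deg = 2 * m / n if n else 0.0
    clustering = nx.average_clustering(graph)
    return (f"Network name: {name}. Domain: {domain}. "
            f"Nodes: {n}, edges: {m}, average degree: {avg_deg:.2f}, "
            f"average clustering coefficient: {clustering:.3f}.")

# Example on a small well-known network
g = nx.karate_club_graph()
print(graph_prompt(g, "Zachary Karate Club", "Animal Social"))
```

At generation time, such a description would be embedded (e.g., with a text encoder) and used to condition the graph generator.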

Text-to-Graph generation gives users fine-grained control over the created graphs. Training LGGM also requires a large, well-organized corpus of graphs from diverse fields. The graphs are drawn from the Network Repository across 13 domains covering a wide variety of real-world situations, including Facebook (FB), Animal Social (ASN), Email, Web, Road, Power, and Chemical (CHEM). Many real-world graphs contain thousands or even millions of nodes and edges, whereas advanced diffusion models like DiGress and GDSS can only handle networks with a few hundred nodes. To address this scalability challenge, subgraphs are sampled from the larger domains.
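The subgraph-sampling step can be sketched as follows. This is a hedged illustration under assumed details, not the paper's actual sampler: it grows a connected neighborhood from a random seed node in BFS order, capped at a node budget that diffusion models like DiGress can handle:

```python
import random
import networkx as nx

def sample_subgraphs(graph, num_samples=5, max_nodes=200, seed=0):
    """Sample bounded-size connected subgraphs from a large graph.

    Diffusion-based generators such as DiGress/GDSS scale only to a few
    hundred nodes, so each sample is capped at `max_nodes` nodes.
    """
    rng = random.Random(seed)
    nodes = list(graph.nodes)
    samples = []
    for _ in range(num_samples):
        # Grow a connected neighborhood from a random seed node (BFS order)
        start = rng.choice(nodes)
        visited = [start]
        frontier = list(graph.neighbors(start))
        while frontier and len(visited) < max_nodes:
            nxt = frontier.pop(0)
            if nxt not in visited:
                visited.append(nxt)
                frontier.extend(graph.neighbors(nxt))
        # Induced subgraph on the visited nodes stays connected
        samples.append(graph.subgraph(visited).copy())
    return samples

# Example: a synthetic "large" graph standing in for a Network Repository graph
big = nx.barabasi_albert_graph(5000, 3, seed=42)
subs = sample_subgraphs(big, num_samples=3, max_nodes=200)
```

Each sampled subgraph then fits within the node budget of the underlying diffusion model while preserving local structure of the source network.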

To demonstrate LGGM's practical value for real-world deployment, the fine-tuned LGGM is compared with DiGress trained directly on each domain. In most domains, LGGM achieves better generative performance on the same training graphs, thanks to the knowledge acquired during pre-training. This advantage is even more pronounced when fewer training graphs are available, which is especially useful for graph-generative applications in semi-supervised settings, such as anomaly detection and drug design; in these cases, the relevant graphs make up only 0.05%-0.5% and 0.01% of all potential candidates, respectively.

In conclusion, the researchers have proposed LGGM, a new class of graph generative model trained on over 5,000 graphs sourced from 13 distinct domains in the well-known Network Repository. LGGM outperforms other graph generative models in zero-shot generative capability and can be easily fine-tuned with graphs from specific fields. It also supports Text-to-Graph generation. Like LGMs in other fields, LGGMs do not specialize in generating graphs for specific domains, so a future direction is to evaluate their practical usefulness in application-oriented ways, such as producing higher-quality generated graphs for better data augmentation.

Check out the Paper. All credit for this research goes to the researchers of this project.


    The post Large Generative Graph Models (LGGMs): A New Class of Graph Generative Model Trained on a Large Corpus of Graphs appeared first on MarkTechPost.
