
    Transforming Partial Differential Equation (PDE) Solutions with ‘TENG’: Harnessing Machine Learning for Enhanced Accuracy and Efficiency

    April 21, 2024

    Partial differential equations (PDEs) are essential for modeling dynamic systems in science and engineering, but solving them accurately, especially for initial value problems, remains challenging. Integrating machine learning into PDE research has opened new avenues for tackling this complexity: ML’s ability to approximate complex functions has produced algorithms that can solve, simulate, and even discover PDEs from data. However, maintaining high accuracy, especially with intricate initial conditions, remains a significant hurdle, because errors made by the solver at one time step propagate into later steps. Various training strategies have been proposed, but achieving precise solutions at each time step is still a critical challenge.

    Researchers from MIT, the NSF AI Institute, and Harvard University have developed the Time-Evolving Natural Gradient (TENG) method, which combines time-dependent variational principles and optimization-based time integration with natural gradient optimization. TENG, including variants such as TENG-Euler and TENG-Heun, achieves remarkable accuracy and efficiency in neural-network-based PDE solutions. It surpasses current methods, attaining machine precision in the step-by-step optimization for PDEs such as the heat, Allen-Cahn, and Burgers’ equations. Key contributions include proposing TENG, developing efficient algorithms with sparse updates, demonstrating superior performance compared to state-of-the-art methods, and showcasing its potential for advancing PDE solutions.
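
    The paper’s implementation is not reproduced here, but the overall shape of a sequential-in-time, step-then-refit solver of this kind can be sketched in a few lines of Python. The sketch below advances the 1D heat equation with an explicit-Euler target at every step and then refits a small linear-in-parameters ansatz to that target; the Fourier-feature ansatz, grid, and step size are illustrative assumptions, and for a linear model the natural-gradient refit reduces to an exact least-squares solve.

    # Minimal sketch of a step-then-refit (sequential-in-time) PDE solver.
    # Not the authors' TENG code: the Fourier-feature ansatz, grid, and
    # time step are assumptions chosen to keep the example self-contained.
    import numpy as np

    nu, dt, n_steps = 0.1, 1e-3, 100
    x = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
    dx = x[1] - x[0]

    # Linear-in-parameters ansatz u(x; theta) = Phi(x) @ theta.
    K = 8
    Phi = np.column_stack([np.ones_like(x)] +
                          [f(k * x) for k in range(1, K + 1) for f in (np.sin, np.cos)])
    theta = np.linalg.lstsq(Phi, np.exp(-np.cos(x)), rcond=None)[0]  # fit the initial condition

    def laplacian(u):
        # Second-order periodic finite-difference Laplacian.
        return (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2

    for _ in range(n_steps):
        u_now = Phi @ theta
        target = u_now + dt * nu * laplacian(u_now)   # explicit-Euler target for this step
        # Refit the ansatz to the target; for a linear model the natural-gradient
        # (Gauss-Newton) step is exact and amounts to a least-squares solve.
        theta = np.linalg.lstsq(Phi, target, rcond=None)[0]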

    Machine learning for PDEs uses neural networks to approximate solutions, following two main strategies: global-in-time optimization, as in PINNs and the deep Ritz method, and sequential-in-time optimization, also known as the neural Galerkin method, which updates the network representation step by step using techniques such as TDVP and OBTI. ML is also used to model PDEs from data, with approaches such as neural ODEs, graph neural networks, the Fourier neural operator, and DeepONet. Natural gradient optimization, rooted in Amari’s work, enhances gradient-based optimization by accounting for the geometry of the underlying parameter space, leading to faster convergence. Natural gradient methods are widely used across fields, including neural network optimization, reinforcement learning, and PINN training.
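
    For a least-squares loss under a Gaussian observation model, the Fisher information matrix is J^T J (the Jacobian of the residuals), so a natural-gradient step coincides with a Gauss-Newton step. The toy model and data below are illustrative assumptions, not taken from the paper; the sketch only shows how preconditioning the gradient with this metric changes the update.

    # Hedged sketch of natural-gradient steps for L(theta) = 0.5 * ||model(theta) - y||^2.
    # With a Gaussian observation model the Fisher matrix is J^T J, so the
    # natural-gradient step coincides with Gauss-Newton. Toy problem only.
    import numpy as np

    x = np.linspace(-1.0, 1.0, 50)
    y = np.tanh(3.0 * x) + 0.5           # data generated by theta* = (1.0, 3.0, 0.5)

    def residual(theta):
        a, b, c = theta
        return a * np.tanh(b * x) + c - y

    def jacobian(theta, eps=1e-6):
        # Finite-difference Jacobian of the residual vector w.r.t. theta.
        base = residual(theta)
        cols = []
        for i in range(len(theta)):
            t = theta.copy()
            t[i] += eps
            cols.append((residual(t) - base) / eps)
        return np.column_stack(cols)

    theta = np.array([0.8, 2.0, 0.3])
    for _ in range(20):
        r = residual(theta)
        J = jacobian(theta)
        grad = J.T @ r                                  # plain gradient of the loss
        F = J.T @ J + 1e-6 * np.eye(len(theta))         # Fisher / Gram metric (lightly damped)
        theta -= np.linalg.solve(F, grad)               # natural-gradient step
    print(theta)                                        # approaches (1.0, 3.0, 0.5)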

    The TENG method builds on the Time-Dependent Variational Principle (TDVP) and Optimization-Based Time Integration (OBTI). TENG optimizes the loss function at each time step using repeated tangent-space approximations, enhancing the accuracy of the PDE solution. Unlike TDVP, TENG minimizes the inaccuracies that tangent-space approximations accumulate over time steps, and it overcomes the optimization difficulties of OBTI, reaching high accuracy in far fewer iterations. Its computational cost is lower than that of TDVP and OBTI thanks to a sparse update scheme and efficient convergence, making it a promising approach for PDE solutions. Higher-order time-integration methods can also be incorporated seamlessly into TENG, further improving accuracy.
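
    As an illustration of how a higher-order integrator changes the per-step target, the sketch below contrasts a first-order explicit-Euler target with a second-order Heun (predictor-corrector) target for a generic right-hand side u_t = F(u); the function names are hypothetical and the heat-equation right-hand side is used only for concreteness.

    # Hedged sketch: first-order (Euler) vs. second-order (Heun) per-step targets
    # for a PDE written as u_t = F(u). Function names are illustrative only.
    import numpy as np

    def euler_target(u, F, dt):
        return u + dt * F(u)

    def heun_target(u, F, dt):
        u_pred = u + dt * F(u)                       # Euler predictor
        return u + 0.5 * dt * (F(u) + F(u_pred))     # trapezoidal corrector

    # Example right-hand side: 1D heat equation on a periodic grid.
    def make_heat_rhs(dx, nu=0.1):
        def F(u):
            return nu * (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
        return F

    x = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
    F = make_heat_rhs(x[1] - x[0])
    u0 = np.sin(x)
    u_euler, u_heun = euler_target(u0, F, 1e-3), heun_target(u0, F, 1e-3)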

    Benchmarks of TENG against a range of baselines show its superiority in relative L2 error, both at individual time steps and integrated globally over time. TENG-Heun outperforms the other methods by orders of magnitude, and TENG-Euler is already comparable to or better than TDVP with RK4 integration. TENG-Euler also surpasses OBTI with either the Adam or the L-BFGS optimizer, achieving higher accuracy in fewer iterations, and it converges to machine precision far faster than OBTI. Higher-order integration schemes such as TENG-Heun significantly reduce errors, especially at larger time-step sizes, underscoring TENG’s ability to achieve high accuracy.
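
    The relative L2 error referred to here is the standard metric ||u_approx - u_ref||_2 / ||u_ref||_2, evaluated either at a single time step or over the whole space-time solution; a minimal helper for gridded solutions might look like the following (variable names are illustrative).

    # Relative L2 error of an approximate solution against a reference,
    # either at one time step or stacked over all of space-time.
    import numpy as np

    def relative_l2_error(u_approx, u_ref):
        return np.linalg.norm(u_approx - u_ref) / np.linalg.norm(u_ref)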

    In conclusion, TENG is an approach for highly accurate and efficient PDE solving based on natural gradient optimization. TENG, including variants such as TENG-Euler and TENG-Heun, outperforms existing methods, achieving machine precision on a range of PDEs. Future work involves exploring TENG’s applicability to diverse real-world scenarios and extending it to broader classes of PDEs. Its wider impact spans fields such as climate modeling and biomedical engineering, with potential societal benefits in environmental forecasting, engineering design, and medical advances.

    Check out the paper for full details. All credit for this research goes to the researchers of this project.
