
    Transformers Can Now Predict Spreadsheet Cells without Fine-Tuning: Researchers Introduce TabPFN Trained on 100 Million Synthetic Datasets

    April 15, 2025

    Tabular data is widely used across fields such as scientific research, finance, and healthcare. Traditionally, machine learning models such as gradient-boosted decision trees have been preferred for analyzing tabular data because they handle heterogeneous, structured datasets well. Despite their popularity, these methods have notable limitations: they generalize poorly to unseen data distributions, struggle to transfer learned knowledge between datasets, and are hard to integrate with neural network-based models because they are not differentiable.

    Researchers from the University of Freiburg, Berlin Institute of Health, Prior Labs, and the ELLIS Institute have introduced a novel approach named Tabular Prior-data Fitted Network (TabPFN). TabPFN leverages transformer architectures to address common limitations of traditional tabular data methods. The model significantly surpasses gradient-boosted decision trees in both classification and regression tasks, especially on datasets with fewer than 10,000 samples. Notably, TabPFN is remarkably efficient, producing better results in a few seconds than ensemble-based tree models achieve after hours of extensive hyperparameter tuning.

    TabPFN utilizes in-context learning (ICL), a technique popularized by large language models, in which a model learns to solve tasks from contextual examples provided at inference time. The researchers adapted this concept to tabular data by pre-training TabPFN on millions of synthetically generated datasets. This training regime lets the model implicitly learn a broad spectrum of predictive algorithms, reducing the need for dataset-specific training. Unlike traditional deep learning models, TabPFN processes an entire dataset in a single forward pass through the network, which substantially enhances computational efficiency.
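The single-forward-pass idea can be illustrated with a toy, attention-style predictor. This is a hand-rolled sketch, not TabPFN's actual network: test points attend to labeled training points, so "fitting" and "predicting" happen in one pass with no gradient updates.

```python
import math

def attention_predict(train_X, train_y, test_X, temperature=1.0):
    """Toy in-context learner: each test row attends to all labeled
    training rows; the prediction is the attention-weighted label vote.
    No parameters are updated -- the training set is pure context."""
    preds = []
    for q in test_X:
        # Similarity of the query to every context row (negative squared distance).
        scores = [-sum((a - b) ** 2 for a, b in zip(q, x)) / temperature
                  for x in train_X]
        m = max(scores)
        weights = [math.exp(s - m) for s in scores]  # numerically stable softmax
        total = sum(weights)
        weights = [w / total for w in weights]
        # Attention-weighted vote over binary labels.
        score = sum(w * y for w, y in zip(weights, train_y))
        preds.append(1 if score >= 0.5 else 0)
    return preds

# Two well-separated clusters; one "forward pass" classifies new points.
train_X = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.1, 4.9)]
train_y = [0, 0, 1, 1]
print(attention_predict(train_X, train_y, [(0.2, 0.1), (4.8, 5.2)]))  # -> [0, 1]
```

The real model replaces the fixed distance kernel with learned transformer attention, which is what lets pre-training on synthetic datasets bake in far richer predictive behavior.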

    The architecture of TabPFN is specifically designed for tabular data, employing a two-dimensional attention mechanism tailored to effectively utilize the inherent structure of tables. This mechanism allows each data cell to interact with others across rows and columns, effectively managing different data types and conditions such as categorical variables, missing data, and outliers. Furthermore, TabPFN optimizes computational efficiency by caching intermediate representations from the training set, significantly accelerating inference on subsequent test samples.
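The alternating row-and-column attention can be sketched with a minimal toy (scalar "cells" instead of learned embeddings, single-head dot-product attention; the real architecture operates on cell representations):

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attend_1d(vec):
    """Self-attention over one axis with scalar cells: each cell's new
    value is a similarity-weighted mix of all cells along that axis."""
    out = []
    for q in vec:
        w = softmax([q * k for k in vec])  # scores = query * key
        out.append(sum(wi * v for wi, v in zip(w, vec)))
    return out

def two_d_attention(table):
    """One 2-D attention step: attend across each row, then across each
    column, so every cell mixes information along both table axes."""
    rows = [attend_1d(row) for row in table]       # row-wise attention
    cols = [attend_1d(list(c)) for c in zip(*rows)]  # column-wise, via transpose
    return [list(r) for r in zip(*cols)]           # transpose back

table = [[1.0, 0.0, 2.0],
         [0.5, 1.5, 0.0]]
mixed = two_d_attention(table)
# Shape is preserved: still 2 rows x 3 columns of mixed cell values.
```

Because attention runs separately along each axis, the mechanism naturally handles tables of varying width and lets column-wise passes compare the same feature across samples.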

    Empirical evaluations highlight TabPFN’s substantial improvements over established models. Across various benchmark datasets, including the AutoML Benchmark and OpenML-CTR23, TabPFN consistently achieves higher performance than widely used models like XGBoost, CatBoost, and LightGBM. For classification problems, TabPFN showed notable gains in normalized ROC AUC scores relative to extensively tuned baseline methods. Similarly, in regression contexts, it outperformed these established approaches, showcasing improved normalized RMSE scores.
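The metrics above can be reproduced with small hand-rolled helpers. Note the exact normalization protocol used in the paper is not specified here; min-max scaling across compared models is one common convention and is only an assumption:

```python
def roc_auc(y_true, scores):
    """ROC AUC via its rank interpretation: the probability that a
    randomly chosen positive is scored above a randomly chosen negative."""
    pos = [s for s, y in zip(scores, y_true) if y == 1]
    neg = [s for s, y in zip(scores, y_true) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def normalize(scores_by_model):
    """Min-max normalize per-dataset scores so 0 is the worst compared
    model and 1 the best, making gains comparable across datasets."""
    lo, hi = min(scores_by_model.values()), max(scores_by_model.values())
    return {m: (s - lo) / (hi - lo) for m, s in scores_by_model.items()}

print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # -> 0.75
# Hypothetical per-dataset scores for illustration only:
print(normalize({"xgboost": 0.91, "catboost": 0.92, "tabpfn": 0.95}))
```

Normalizing per dataset before averaging prevents easy datasets (where every model scores near 1.0) from washing out differences on hard ones.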

    TabPFN’s robustness was also evaluated extensively on datasets with challenging characteristics, such as numerous irrelevant features, outliers, and substantial missing data. In contrast to typical neural network models, TabPFN maintained consistent, stable performance under these conditions, demonstrating its suitability for practical, real-world applications.

    Beyond its predictive strengths, TabPFN also exhibits fundamental capabilities typical of foundation models. It effectively generates realistic synthetic tabular datasets and accurately estimates probability distributions of individual data points, making it suitable for tasks such as anomaly detection and data augmentation. Additionally, the embeddings produced by TabPFN are meaningful and reusable, providing practical value for downstream tasks including clustering and imputation.
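To make the density-estimation use case concrete, here is a hand-rolled kernel density sketch (not TabPFN's learned estimator) showing how a per-point density turns directly into an anomaly score:

```python
import math

def log_density(point, data, bandwidth=1.0):
    """Toy kernel density estimate: average of Gaussian kernels centered
    on the data points. TabPFN estimates densities with a learned network;
    this sketch only shows how per-point density enables anomaly scoring."""
    k = len(point)
    norm = (2 * math.pi * bandwidth ** 2) ** (-k / 2)
    dens = sum(norm * math.exp(-sum((a - b) ** 2 for a, b in zip(point, x))
                               / (2 * bandwidth ** 2))
               for x in data) / len(data)
    return math.log(dens)

data = [(0.0, 0.0), (0.2, -0.1), (-0.1, 0.1), (0.1, 0.2)]
# A point far from the data mass gets a much lower log-density:
inlier = log_density((0.0, 0.1), data)
outlier = log_density((6.0, 6.0), data)
```

Flagging points below a log-density threshold is the standard recipe; a model that also *generates* from the same distribution can supply synthetic rows for augmentation.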

    In summary, the development of TabPFN signifies an important advancement in modeling tabular data. By integrating the strengths of transformer-based models with the practical requirements of structured data analysis, TabPFN offers enhanced accuracy, computational efficiency, and robustness, potentially facilitating substantial improvements across various scientific and business domains.


    Here is the Paper.

    The post Transformers Can Now Predict Spreadsheet Cells without Fine-Tuning: Researchers Introduce TabPFN Trained on 100 Million Synthetic Datasets appeared first on MarkTechPost.
