Foundation models are central to AI’s influence on the economy and society. Transparency is crucial for accountability, competition, and understanding, particularly regarding the data used in these models. Governments are enacting regulations like the EU AI Act and the US AI Foundation Model Transparency Act to enhance transparency. The Foundation Model Transparency Index (FMTI), introduced in 2023, evaluates transparency across 10 major developers (e.g., OpenAI, Google, Meta) using 100 indicators. The initial FMTI v1.0 revealed significant opacity, with an average score of 37 out of 100, but also noted variability in disclosures.
The Foundation Model Transparency Index (FMTI), introduced in October 2023, conceptualizes transparency through a hierarchical taxonomy aligned with the foundation model supply chain. This taxonomy includes three top-level domains: upstream resources, the model itself, and its downstream use, encompassing 23 subdomains and 100 binary transparency indicators. FMTI v1.0 revealed widespread opacity among 10 evaluated companies, with top scores reaching only 54 out of 100. Open model developers performed better than closed ones. The index aims to track changes over time, encouraging transparency through public and stakeholder pressure, as demonstrated by historical indices like the HDI and the 2018 Ranking Digital Rights Index.
Researchers from Stanford University, MIT, and Princeton University presented FMTI v1.1, a follow-up to FMTI v1.0, to evaluate how transparency in foundation models evolved over six months while retaining the original 100 transparency indicators. Developers were asked to self-report information, improving the completeness, clarity, and scalability of the process. Fourteen developers participated, disclosing new information for an average of 16.6 indicators each.
FMTI v1.1 involves four steps: indicator selection, developer selection, information gathering, and scoring. The 100 indicators from FMTI v1.0 span three domains: upstream resources, the model itself, and downstream use. Fourteen developers, including eight from v1.0, submitted transparency reports for their flagship models. Information gathering shifted from public searches to direct developer submissions, improving completeness and clarity. For scoring, two researchers independently assessed each developer’s disclosures, followed by an iterative rebuttal process. This approach surfaced additional information from developers while reducing the researchers’ effort.
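Conceptually, scoring reduces to tallying 100 binary indicators grouped into the three top-level domains. The sketch below is a minimal illustration of that aggregation, assuming a simple report structure; the domain names mirror the paper, but the indicator names, data structures, and function are hypothetical and not taken from the FMTI codebase.

```python
from typing import Dict

# Hypothetical illustration of FMTI-style scoring: each of the 100 indicators
# is binary (1 = the developer's disclosure satisfies it, 0 = it does not),
# and indicators are grouped under the three top-level domains of the taxonomy.

def score_report(report: Dict[str, Dict[str, int]]) -> Dict[str, float]:
    """Compute per-domain subscores and an overall score out of 100.

    `report` maps a domain name ("upstream", "model", "downstream") to a
    mapping of indicator name -> 0/1 award. All names here are assumptions.
    """
    scores = {}
    total_awarded = 0
    total_indicators = 0
    for domain, indicators in report.items():
        awarded = sum(indicators.values())
        scores[domain] = awarded / len(indicators) * 100  # domain subscore (%)
        total_awarded += awarded
        total_indicators += len(indicators)
    # With exactly 100 indicators, the overall score equals the raw tally.
    scores["overall"] = total_awarded / total_indicators * 100
    return scores

# Toy example with three indicators per domain (the real index uses 100).
example = {
    "upstream":   {"data_sources": 1, "data_labor": 0, "compute": 1},
    "model":      {"model_size": 1, "capabilities": 1, "risks": 0},
    "downstream": {"usage_policy": 1, "feedback": 1, "impact": 0},
}
print(score_report(example))
```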
The execution of FMTI v1.1 is summarized by the researchers as follows:
Developer solicitation: Leadership at 19 companies developing foundation models was contacted and asked to submit transparency reports.
Developer reporting: Fourteen developers designated their flagship foundation models and submitted transparency reports addressing each of the 100 transparency indicators for their models.
Initial scoring: The research team scored each developer’s report, applying consistent standards across all developers for each indicator.
Developer response: Scored reports were returned to developers, who could contest specific scores and provide additional information. The finalized transparency reports, validated by the developers, were released publicly.
For evaluation, 14 developers submitted transparency reports on 100 indicators for their flagship models. Scores varied significantly, with 11 of the 14 developers scoring below 65, indicating substantial room for improvement. The mean and median scores were 57.93 and 57, respectively, with a standard deviation of 13.98. The highest-scoring developer earned points for 85 of the 100 indicators, while the lowest-scoring developer earned 33. Developers disclosed significant new information, improving transparency scores by an average of 14.2 points. Transparency was highest in the downstream domain and lowest in the upstream domain, with open developers generally outperforming closed developers. Transparency improved across all domains compared to the previous iteration.
The societal impact of foundation models is growing, drawing attention from various stakeholders. The Foundation Model Transparency Index shows that transparency in this ecosystem still needs improvement, though there have been positive changes since October 2023. By analyzing developer disclosures, the Index helps stakeholders make informed decisions. By establishing a practice of transparency reporting for foundation models, the Index provides a valuable resource for downstream developers, researchers, and journalists to enhance collective understanding.
Check out the Paper, GitHub, and Project. All credit for this research goes to the researchers of this project.