Focused Forecasts: Attention maps for large-scale model understanding in the AtmoRep project
Main author:
Language: eng
Published: 2023
Subjects:
Online access: http://cds.cern.ch/record/2867767
Summary: The interpretability of machine learning models remains a critical yet elusive aspect of contemporary computational science. In this presentation, I explore the interpretability of machine learning algorithms by applying self-attention maps to the AtmoRep large-scale weather prediction model. By leveraging self-attention mechanisms, I present a method to analyze the internal structure and dependencies within the model's layers. This technique enables me to interpret the intricate relationships between meteorological variables and the resulting predictions. The application of self-attention maps is an essential step towards a more transparent and scientifically rigorous approach to interpreting large-scale weather models, with potential implications for advances in climate science and meteorological forecasting.
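The analysis described in the summary relies on the attention weights a transformer computes internally. As a minimal sketch (not AtmoRep's actual code; the token count and embedding size below are arbitrary), the attention map of one head is softmax(QKᵀ/√d) over the token embeddings, giving a row-stochastic matrix whose entry (i, j) measures how much token i attends to token j:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_map(Q, K):
    # Scaled dot-product attention weights: softmax(Q K^T / sqrt(d)).
    d = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d))

rng = np.random.default_rng(0)
tokens, d = 6, 8  # hypothetical: 6 spatio-temporal tokens, 8-dim embeddings
Q = rng.standard_normal((tokens, d))
K = rng.standard_normal((tokens, d))

A = attention_map(Q, K)  # A[i, j]: attention of token i on token j
```

In an interpretability study, such maps would be extracted per layer and per head from the trained model and visualized against the spatial and temporal layout of the input fields.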
Relevant buzzwords: AI, ML, HPC, Cloud Computing, Transformer, All You Need, Digital Twin, Computer Vision, Big Data
Irrelevant buzzwords: Blockchain, IOT, VR, AR, Quantum Computing, QML, FCC, Exascale, NLP, Beyond the Standard Model