LHC scientists prototype data analysis solution on Azure Machine Learning

The Large Hadron Collider (LHC) at CERN is the largest physics machine ever built, and experiments using the collider generate close to an exabyte (one billion gigabytes) of data in the quest to understand the mysteries of the universe. Machine learning deployed at global scale, combined with the speed of FPGAs, has the potential to improve data analysis for particle physics. LHC scientists from Fermilab, CERN, MIT, the University of Washington, and other institutions worked with Microsoft to prototype a solution to their exabyte-scale LHC data challenge.