GPU-based regression analysis on sparse grids
Prediction and forecasting have become very important in modern society, and regression analysis enables predictions to be made from given data. This paper focuses on regression analysis on sparse grids using the existing toolbox SG++. The core workload of the regression analysis is implemented on graphics cards using NVIDIA's Compute Unified Device Architecture (CUDA). We give guidance on achieving high performance for this particular problem on CUDA-enabled graphics cards, and we also address problems where the datasets are larger than the available device memory. Finally, we present test results for real-world and artificial datasets.