SHAP Importance represents how a feature influences the prediction for a single row, relative to the other features in that row and to the average outcome in the dataset. The SHAP Importance card in Kraken aggregates these per-row values to provide a general indication of the relative influence of the features across the dataset.
The goal of SHAP is to explain the prediction of an instance x by computing the contribution of each feature to the prediction. The SHAP explanation method computes Shapley values from coalitional game theory. The feature values of a data instance act as players in a coalition. Shapley values tell us how to fairly distribute the "payout" (the prediction) among the features. A player can be an individual feature value or a group of feature values.
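The "fair payout" idea above can be sketched directly: for a model with only a few features, exact Shapley values can be computed by averaging each feature's marginal contribution over every ordering in which features join the coalition. The model, feature names, and background values below are illustrative assumptions, not Kraken internals:

```python
import itertools
from statistics import mean

# Hypothetical 3-feature model with an interaction between "age" and "income".
# Names and weights are purely illustrative.
def predict(x):
    return 2.0 * x["age"] + 1.0 * x["income"] + 0.5 * x["age"] * x["income"] + 3.0 * x["tenure"]

# Background values stand in for the dataset average; the instance is the row
# whose prediction we want to explain.
background = {"age": 1.0, "income": 2.0, "tenure": 0.0}
instance = {"age": 3.0, "income": 1.0, "tenure": 2.0}

def shapley_values(predict, instance, background):
    """Exact Shapley values: average each feature's marginal contribution
    over all orderings of the features (feasible only for a handful of them)."""
    features = list(instance)
    contributions = {f: [] for f in features}
    for order in itertools.permutations(features):
        x = dict(background)            # start from the average input
        prev = predict(x)
        for f in order:                 # switch features to the instance's values one at a time
            x[f] = instance[f]
            cur = predict(x)
            contributions[f].append(cur - prev)
            prev = cur
    return {f: mean(c) for f, c in contributions.items()}

phi = shapley_values(predict, instance, background)
print(phi)
# Efficiency property: the attributions sum to (prediction - base value).
print(sum(phi.values()), predict(instance) - predict(background))
```

The final print illustrates the "payout" framing: the feature attributions exactly account for how far this row's prediction sits from the average prediction. Real SHAP implementations approximate this computation, since enumerating all orderings is exponential in the number of features.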
SHAP Importance is an alternative to Permutation Importance. There is a key difference between the two measures: Permutation Importance is based on the decrease in model performance when a feature is shuffled, while SHAP Importance is based on the magnitude of the feature attributions.
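A minimal sketch of the two aggregation styles, using an assumed additive toy model (for an additive model, a feature's SHAP value reduces to its weight times the feature's deviation from its mean, which keeps the example self-contained):

```python
import random
from statistics import mean

random.seed(0)

# Toy data: the target depends strongly on x0 and weakly on x1 (illustrative).
X = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
y = [3.0 * x0 + 0.5 * x1 for x0, x1 in X]

def model(x0, x1):
    # Assume a fitted model that recovered the true weights exactly.
    return 3.0 * x0 + 0.5 * x1

def mse(y_true, y_pred):
    return mean((t - p) ** 2 for t, p in zip(y_true, y_pred))

# SHAP Importance: mean absolute attribution per feature. For this additive
# model, each row's SHAP value is weight * (value - feature mean).
mean_x0 = mean(x[0] for x in X)
mean_x1 = mean(x[1] for x in X)
shap_imp = {
    "x0": mean(abs(3.0 * (x0 - mean_x0)) for x0, _ in X),
    "x1": mean(abs(0.5 * (x1 - mean_x1)) for _, x1 in X),
}

# Permutation Importance: increase in error after shuffling one column.
base = mse(y, [model(*x) for x in X])
perm_imp = {}
for i, name in enumerate(["x0", "x1"]):
    col = [x[i] for x in X]
    random.shuffle(col)
    X_perm = [(c, x1) if i == 0 else (x0, c) for (x0, x1), c in zip(X, col)]
    perm_imp[name] = mse(y, [model(*x) for x in X_perm]) - base

print(shap_imp)   # based on attribution magnitude
print(perm_imp)   # based on performance decrease
```

Both measures rank x0 above x1 here, but they answer different questions: Permutation Importance asks how much the model's accuracy depends on a feature, while SHAP Importance asks how strongly the feature moves individual predictions, regardless of whether those moves help accuracy.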
A more detailed dive into SHAP Importance is available here.