peggy30 committed
Commit 40976c7 · 1 Parent(s): 3303745

add permutation
pages/ALE.py CHANGED
@@ -66,6 +66,7 @@ def main():
 
     st.title("ALE (Accumulated Local Effects)")
     st.write(prompt_params.ALE_INTRODUCTION)
+    st.write("now has bug, waiting for fix")
     # Explain the selected sample
     if st.button("Explain Sample"):
        explain_example()
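
The ALE page delegates the actual computation to explain_example(), whose body is not part of this commit, and the new note flags it as currently broken. For orientation only, here is a minimal, hedged sketch of a first-order ALE curve for one numeric feature; model, X, and feature are illustrative placeholders, not the app's real implementation.

```python
import numpy as np

def first_order_ale(model, X, feature, n_bins=20):
    """Centered first-order ALE curve for one numeric feature of a fitted regressor."""
    # 1. Quantile bin edges, so each interval holds roughly the same number of rows.
    edges = np.unique(np.quantile(X[feature], np.linspace(0, 1, n_bins + 1)))
    # Assign each row to the interval its feature value falls into.
    idx = np.clip(np.digitize(X[feature], edges[1:-1], right=True), 0, len(edges) - 2)

    local_effects = np.zeros(len(edges) - 1)
    counts = np.zeros(len(edges) - 1)
    for k in range(len(edges) - 1):
        rows = X[idx == k]
        if len(rows) == 0:
            continue
        lo, hi = rows.copy(), rows.copy()
        # 2. Move only this feature to the interval's lower/upper edge and diff the predictions.
        lo[feature], hi[feature] = edges[k], edges[k + 1]
        local_effects[k] = np.mean(model.predict(hi) - model.predict(lo))
        counts[k] = len(rows)

    # 3. Accumulate the local effects, then center so the average effect over the data is zero.
    ale = np.cumsum(local_effects)
    ale -= np.sum(ale * counts) / counts.sum()
    return edges[1:], ale  # interval upper edges and the centered ALE values
```

Because only the points that actually fall in each interval are perturbed, the curve captures localized effects without extrapolating far outside the data, which is the property the introduction text describes.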
pages/PermutationFeatureImportance.py CHANGED
@@ -50,7 +50,6 @@ def explain_example():
     st.pyplot(fig)
 
     fig, ax = plt.subplots(figsize=(10, 5))
-    st.write("2D Second-Order ALE Plot")
     ax.boxplot(perm_imp.importances[sorted_idx].T,
                vert=False, labels=X_test.columns[sorted_idx])
     ax.set_title("Permutation Importances")
@@ -65,7 +64,7 @@ def main():
     train_model()
 
     st.title("ALE (Accumulated Local Effects)")
-    st.write(prompt_params.ALE_INTRODUCTION)
+    st.write(prompt_params.PERMUTATION_INTRODUCTION)
     # Explain the selected sample
     if st.button("Explain Sample"):
        explain_example()
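
The plotting code above indexes perm_imp.importances[sorted_idx].T, which matches the result object returned by scikit-learn's permutation_importance. The commit does not include train_model() or the line that builds perm_imp, so the following is a hedged, self-contained sketch of how those pieces presumably fit together, using a stand-in dataset and model rather than the app's own.

```python
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
import matplotlib.pyplot as plt

# Stand-in data and model; the real app builds these inside train_model(), not shown in this diff.
X, y = load_diabetes(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
model = RandomForestRegressor(random_state=42).fit(X_train, y_train)

# Shuffle each feature n_repeats times and record the score drop each time;
# importances has shape (n_features, n_repeats), which the page's boxplot transposes.
perm_imp = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=42)
sorted_idx = perm_imp.importances_mean.argsort()

fig, ax = plt.subplots(figsize=(10, 5))
ax.boxplot(perm_imp.importances[sorted_idx].T,
           vert=False, labels=X_test.columns[sorted_idx])
ax.set_title("Permutation Importances")
plt.show()  # the Streamlit page would call st.pyplot(fig) instead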
src/prompt_config.py CHANGED
@@ -118,3 +118,15 @@ The process of ALE includes the following steps:
 
 By using ALE, **interpretability** improves by capturing localized effects while mitigating bias from correlated features, making model explanations more reliable.
 """
+PERMUTATION_INTRODUCTION = """
+Permutation Feature Importance is an interpretable machine learning technique that evaluates feature importance by measuring the impact of shuffling a feature’s values on model error.
+
+The process of Permutation Feature Importance includes the following steps:
+1. **Compute Initial Model Error**: Measure the model’s baseline performance using a metric like Mean Squared Error (MSE).
+2. **Permute a Feature**: Randomly shuffle the values of a selected feature while keeping all other features unchanged.
+3. **Compute New Model Error**: Re-evaluate the model on the dataset with the shuffled feature to obtain a new error score.
+4. **Calculate Feature Importance**: Compute the difference between the new error and the original error. A larger difference indicates higher importance.
+5. **Repeat for Stability**: Perform multiple repetitions of permutation and calculate the mean and standard deviation of feature importance scores for reliability.
+
+By using Permutation Feature Importance, **interpretability** improves by providing a direct measure of a feature’s contribution to predictive performance, making model explanations more intuitive and robust.
+"""