@@ -23,17 +23,18 @@ class PCA() :
     transform(X) :
         ->> give your data in X, which must be a 2D numpy array or python list. 2D meaning [[]] (not 1D [].)
         ->> we will return each data point of X in new_ndims (passed in __init__) space
-        ->> all code (5 lines) for this method comes from Hands-On Machine Learning (Edition 2) by Aurélien Géron
+        ->> all code (5 lines) for this method comes from Hands-On Machine Learning (Edition 2) by Aurelien Geron
     inverse_transform(X) :
         ->> reverts each point in X to its original dimensionality
         ->> keep in mind that it is very hard to recover the exact same X you originally had, as PCA naturally loses
         some of the variance. However, the structure and the shape will be preserved.
     visualize_variance(X, representation_dims) :
         ->> X is the data you want transformed into new_ndims space
         ->> representation_dims is a list of all the dimensions you want to try your data in. For example, if you give
-        [3, 4, 5, 6] your data will be tried in 3, 4, 5, and 6 dimensions. The data dimension will be plotted on the
-        x-axis and the variance will be on the y-axis. This is to help you find which dimension you should turn your
-        data into (probably the one which has good variance and is the lowest dimension.)
+        [3, 4, 5, 6], your data will be tried in 3, 4, 5, and 6 dimensions. The dimension (number) will be plotted
+        on the x-axis and the variance (of the projection to that dimension) will be on the y-axis.
+        This is to help you find which dimension you should turn your data into (probably the lowest
+        dimension that still has good variance.)
 
     """
     def __init__(self, new_ndims):
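The docstring above credits the five-line `transform` implementation to Hands-On Machine Learning, which uses an SVD-based projection. A minimal sketch of that approach, plus the reconstruction that `inverse_transform` describes (function names, sample data, and the exact return values here are illustrative assumptions, not the repo's code):

```python
import numpy as np

def pca_fit_transform(X, new_ndims):
    # Sketch of SVD-based PCA: center the data, decompose, and project
    # onto the top new_ndims principal components.
    X = np.asarray(X, dtype=float)
    mean = X.mean(axis=0)
    X_centered = X - mean
    U, s, Vt = np.linalg.svd(X_centered)
    W = Vt.T[:, :new_ndims]          # top principal components (columns)
    return X_centered @ W, W, mean

def pca_inverse_transform(X_reduced, W, mean):
    # Map reduced points back to the original space. Variance carried by
    # the dropped components is lost, so this is only approximate.
    return X_reduced @ W.T + mean

# Example: project 3-D points down to 2-D, then approximately reconstruct.
X = [[2.5, 2.4, 0.5], [0.5, 0.7, 1.5], [2.2, 2.9, 0.1], [1.9, 2.2, 0.3]]
X2d, W, mean = pca_fit_transform(X, 2)
X_back = pca_inverse_transform(X2d, W, mean)
print(X2d.shape)    # (4, 2)
print(X_back.shape) # (4, 3)
```

The reconstruction `X_back` preserves the overall structure of `X` but not the exact values, matching the docstring's caveat about lost variance.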
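The `visualize_variance` docstring describes plotting retained variance against candidate dimensions. A hedged sketch of the underlying computation, returning the cumulative explained-variance ratio per candidate dimension instead of plotting it (the actual method presumably plots with matplotlib; `explained_variance` and its return type are assumptions for illustration):

```python
import numpy as np

def explained_variance(X, representation_dims):
    # For each candidate dimension d, compute the fraction of total
    # variance kept by projecting onto the top d principal components.
    X = np.asarray(X, dtype=float)
    X_centered = X - X.mean(axis=0)
    s = np.linalg.svd(X_centered, compute_uv=False)  # singular values
    ratios = (s ** 2) / np.sum(s ** 2)               # variance share per component
    return {d: float(np.sum(ratios[:d])) for d in representation_dims}

# Example with the docstring's [3, 4, 5, 6] on 6-D data: the cumulative
# variance grows with dimension and reaches 1.0 at full rank.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
var = explained_variance(X, [3, 4, 5, 6])
```

Plotting `representation_dims` on the x-axis against these values on the y-axis gives the curve the docstring describes; the elbow of that curve suggests the lowest dimension that still keeps good variance.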