MachineLearning-DataScience100-Days

#100Days of Machine Learning / Data Science
Day 1:
Today I read about and practised working with CSV files: how to import a CSV and use parameters such as skiprows, index_col, usecols, na_values, converters, etc.
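A minimal sketch of those read_csv parameters (the file name and columns here are hypothetical):

```python
import pandas as pd

# Hypothetical file data.csv: first line is a comment, then a header row
# with columns id, name, salary.
df = pd.read_csv(
    "data.csv",
    skiprows=1,                        # skip the leading comment line
    index_col="id",                    # use the id column as the index
    usecols=["id", "name", "salary"],  # read only these columns
    na_values=["NA", "missing"],       # treat these strings as NaN
    converters={"name": str.strip},    # apply a function while parsing
)
print(df.head())
```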
Day 2:
Today I read about and practised importing JSON files and working with SQL data.
Day 3:
Today I read about and practised how to understand a dataset (basic data exploration).
Day 4:
Today I read about univariate data analysis and solved some practice questions.
Day 5:
Today I read about bivariate and multivariate data analysis and worked through a practice dataset.
Day 6:
Today I read about data profiling: how to analyse a dataset through a generated HTML report, etc.
Day 7:
Today I read about StandardScaler from feature engineering.
Day 8:
Today I read about normalization from feature engineering.
Day 9:
Today I read about ordinal encoding from feature engineering.
Day 11:
Today I read about ColumnTransformer, which makes it quite easy to do the same tasks we otherwise do column by column with LabelEncoder and OrdinalEncoder.
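A minimal ColumnTransformer sketch, assuming a toy frame with one numeric and one ordered categorical column:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OrdinalEncoder, StandardScaler

df = pd.DataFrame({
    "age": [22, 35, 58, 41],
    "size": ["S", "M", "L", "M"],  # ordered categories
})

ct = ColumnTransformer(
    transformers=[  # (name, transformer, columns) triples
        ("scale", StandardScaler(), ["age"]),
        ("ord", OrdinalEncoder(categories=[["S", "M", "L"]]), ["size"]),
    ],
    remainder="drop",  # drop any column not listed above
)

X = ct.fit_transform(df)
print(X)
```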
Day 12:
Today I read about sklearn pipelines and practised on the Titanic dataset.
Day 13:
Today I read about sklearn's FunctionTransformer and practised on the Titanic dataset.
Day 14:
Today I read about sklearn's PowerTransformer.
Day 15:
Today I made a project using all the concepts covered so far.
Day 16:
Today I covered mixed data: how to handle columns that mix numbers and categories.
Day 17:
Today I covered how to handle date and time values in the data.
Day 18:
Today I covered CCA (complete case analysis), i.e. removing or dropping rows that contain missing values.
Day 19:
Today I covered handling missing values with univariate techniques.
Day 20:
Today I covered handling missing categorical values.
Day 21:
Today I covered handling missing values with the fill-with-random and missing-indicator techniques.
Day 22:
Today I covered KNNImputer and practised a side-by-side comparison of SimpleImputer, KNNImputer, MissingIndicator, random filling, etc.
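A minimal sketch of that comparison on a toy array, assuming the mean strategy for SimpleImputer and two neighbours for KNNImputer:

```python
import numpy as np
from sklearn.impute import KNNImputer, SimpleImputer

X = np.array([[1.0, 2.0],
              [2.0, np.nan],
              [3.0, 6.0],
              [np.nan, 8.0]])

simple = SimpleImputer(strategy="mean")  # fill with the column mean
knn = KNNImputer(n_neighbors=2)          # fill with the mean of the 2 nearest rows

print(simple.fit_transform(X))
print(knn.fit_transform(X))
```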
Day 23:
Today I understood the concept of multivariate missing-value imputation.
Day 24:
Today I covered outliers, taking up the concept of handling them with the z-score technique: when the data is normally distributed, we compute z = (x - mean) / std and treat any point with |z| > 3 as an outlier, i.e. anything above mean + 3·std on the positive side or below mean - 3·std on the negative side. Then I read about capping and trimming the data.
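A minimal sketch of z-score trimming and capping on synthetic normal data:

```python
import numpy as np

data = np.random.normal(loc=50, scale=10, size=1000)

mean, std = data.mean(), data.std()
z = (data - mean) / std

# Trimming: drop points with |z| > 3.
trimmed = data[np.abs(z) <= 3]

# Capping: clip points to the mean +/- 3*std fences instead of dropping them.
capped = np.clip(data, mean - 3 * std, mean + 3 * std)

print(len(data) - len(trimmed), "points trimmed")
```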
Day 25:
Today I read about another outlier-handling technique, the IQR method, and solved some questions on it. The fences used in the IQR method are Q1 - 1.5·IQR and Q3 + 1.5·IQR, where Q1 is the 25th percentile of the values, Q3 is the 75th percentile, and IQR is the difference between them.
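A minimal sketch of those IQR fences on a toy array:

```python
import numpy as np

data = np.array([3, 5, 7, 8, 9, 10, 12, 13, 14, 95])  # 95 is an outlier

q1, q3 = np.percentile(data, [25, 75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = data[(data < lower) | (data > upper)]
print(lower, upper, outliers)  # 95 falls outside the fences
```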
Day 25:
Today I read about the percentile technique: how to detect and remove outliers using percentiles.
Day 26:
Today I read about feature construction, i.e. how to construct new columns from existing data, and I learned how to do feature splitting.
Day 27:
Today I read about the curse of dimensionality, where "dimension" means feature and feature means column: if we give an algorithm too many features, its effective capacity can decrease, the extra features stop being beneficial, and the data becomes sparser, so the data points end up far from the mean. The curse of dimensionality motivates both FEATURE SELECTION and FEATURE EXTRACTION.
Day 28:
Today I read about principal component analysis (PCA), a feature-extraction technique: what it is and how it works.
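A minimal PCA sketch on the iris dataset, reducing four features to two components (scaling first, since PCA is variance-based):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = load_iris().data                          # 150 samples, 4 features
X_scaled = StandardScaler().fit_transform(X)  # PCA is sensitive to scale

pca = PCA(n_components=2)                     # keep 2 principal components
X_2d = pca.fit_transform(X_scaled)

print(X_2d.shape)                             # (150, 2)
print(pca.explained_variance_ratio_)          # variance captured per component
```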
Day 29:
Today I practised old topics that I had done in the past, like feature construction and handling mixed data.
Day 30:
Today I read about simple linear regression and how it works, without the mathematical intuition.
Day 31:
Today I read about simple linear regression and how it works, with the mathematical intuition.
Day 32:
Today I read about multiple linear regression and how it works, without the mathematical intuition.
Day 33:
Today I read about multiple linear regression and how it works, with the mathematical intuition.
Day 34:
Today I read about mean absolute error, mean squared error, root mean squared error, R² score, and adjusted R² score.
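A minimal sketch computing all five metrics on toy predictions (RMSE and adjusted R² are derived by hand from the sklearn metrics; the feature count k here is illustrative):

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_true = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.5, 5.5, 6.0, 9.5])

mae = mean_absolute_error(y_true, y_pred)
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
r2 = r2_score(y_true, y_pred)

# Adjusted R2 penalizes extra features: n samples, k features (k = 1 here).
n, k = len(y_true), 1
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)

print(mae, mse, rmse, r2, adj_r2)
```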
Day 35:
Today I read about gradient descent.
Day 36:
Today I read about the batch gradient descent variant of gradient descent, its mathematical intuition, and coded it from scratch.
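A minimal from-scratch sketch of batch gradient descent fitting y = m·x + b on a toy dataset:

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * X + 1.0            # true line: m = 2, b = 1

m, b, lr = 0.0, 0.0, 0.05    # start at zero, small learning rate
for _ in range(1000):
    y_pred = m * X + b
    # Gradients of the MSE loss, computed over the whole batch each step.
    dm = (-2 / len(X)) * np.sum(X * (y - y_pred))
    db = (-2 / len(X)) * np.sum(y - y_pred)
    m -= lr * dm
    b -= lr * db

print(m, b)                  # approaches 2.0 and 1.0
```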
Day 37:
Today I read about the stochastic gradient descent variant, its mathematical intuition, and coded it from scratch. I also started practising and revising all the topics from the start.
Day 38:
Today I read about the mini-batch gradient descent variant, its mathematical intuition, and coded it from scratch.
Day 39:
First I revised multiple linear regression and did some practice.
Then I read about polynomial regression, which is used when the data is non-linear.
Day 40:
First I revised gradient descent and did some practice.
Then I read about the bias-variance trade-off, i.e. the concepts of underfitting and overfitting.
Then I read about regularization, the ridge technique, which is used against overfitting.
Day 41:
I read about ridge regression, solved problems on it, and wrote my own class from scratch.
I also solved ridge regression with gradient descent by writing my own class; there was a dimension error, but I will check it later.
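A minimal from-scratch sketch of the ridge closed-form solution, w = (XᵀX + αI)⁻¹Xᵀy, checked against sklearn's Ridge (intercept omitted for brevity; the data is synthetic):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

alpha = 1.0
# Closed form: w = (X^T X + alpha * I)^-1 X^T y
w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

sk = Ridge(alpha=alpha, fit_intercept=False).fit(X, y)
print(w)
print(sk.coef_)  # should match the closed form
```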
Day 42:
I practised the stochastic gradient descent variant, which updates the weights after every row.
Day 43:
I practised the mini-batch gradient descent variant, which updates the weights once per batch.
I also practised polynomial regression, why it is used and how it works, then practised a polynomial regression problem.
Day 44:
I practised ridge regression, a technique against overfitting: I used the sklearn implementation and my own class, and compared the accuracy of both.
Day 45:
Today I read about five key aspects of ridge regression:
1: How are the coefficients affected by lambda?
2: Are higher coefficient values impacted more?
3: What is the regularization effect on bias and variance?
4: What is lambda's impact on the loss function?
5: Why is it called ridge?
Day 46:
Today I read about and practised the key features of lasso regression and the intuition behind them.
Day 47:
Today I read about and practised the mathematics behind lasso regression, e.g. why lasso creates sparsity, i.e. why coef_ goes to zero as we increase lambda/alpha.
Then I read about and practised elastic net regression, another technique to reduce overfitting, which combines both ridge and lasso; see the sketch below.
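A minimal sketch comparing the sparsity of the three regularizers on synthetic data (the alpha values are arbitrary):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet, Lasso, Ridge

X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)

for model in (Ridge(alpha=10.0), Lasso(alpha=10.0),
              ElasticNet(alpha=10.0, l1_ratio=0.5)):
    model.fit(X, y)
    print(type(model).__name__, "zero coefficients:",
          int(np.sum(model.coef_ == 0)))
# Lasso and ElasticNet zero out uninformative features; Ridge only shrinks them.
```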
Day 48:
Today I read about logistic regression and the basic perceptron concept used in logistic regression.
Day 49:
Today I read about logistic regression with the sigmoid function.
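A minimal sketch of how the sigmoid turns a linear score into a probability (the weights here are hypothetical, not learned):

```python
import numpy as np

def sigmoid(z):
    """Squash any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

w, b = np.array([0.8, -0.4]), 0.1  # hypothetical weights and bias
x = np.array([2.0, 1.0])

p = sigmoid(w @ x + b)             # P(y = 1 | x)
print(p, "-> class", int(p >= 0.5))
```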
Day 50:
Today I read about the loss function of logistic regression, the function sklearn uses inside its logistic regression implementation.
Then I read about classification metrics: the accuracy score and the confusion matrix.
Day 51:
Today I read about the precision, recall, and F1-score metrics and did some practice with them.
I practised precision, recall, F1-score, and the classification report.
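A minimal sketch of those metrics on toy labels:

```python
from sklearn.metrics import (accuracy_score, classification_report,
                             confusion_matrix, f1_score, precision_score,
                             recall_score)

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

print(confusion_matrix(y_true, y_pred))
print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))  # TP / (TP + FP)
print("recall   :", recall_score(y_true, y_pred))     # TP / (TP + FN)
print("f1       :", f1_score(y_true, y_pred))
print(classification_report(y_true, y_pred))
```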
Day 52:
Today I read about softmax regression, which is used when we have more than two classes.
Then I read about polynomial logistic regression, and then about the hyperparameters of logistic regression.
Day 53:
Today I read about the decision tree model: what a decision tree is, how it works, how entropy works, etc.
I also read about decision tree hyperparameters.
Then I read about decision tree regression and explored the dtreeviz library.
Day 54:
Today I read about ensemble techniques and their types, like bagging, voting, etc.
Then I read about the voting ensemble and the assumptions behind it.
Then I read about classification with voting ensembles, hard voting versus soft voting, and practised a voting ensemble on the iris dataset.
I coded a voting ensemble using different base models: logistic regression, SVM, and decision tree (see the sketch below).
Then I read about the voting regressor.
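A minimal sketch of that base-model combination on iris, assuming soft voting (SVC needs probability=True for that):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

voter = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("svm", SVC(probability=True)),  # probabilities enable soft voting
        ("dt", DecisionTreeClassifier()),
    ],
    voting="soft",  # "hard" = majority class vote, "soft" = averaged probabilities
)

print(cross_val_score(voter, X, y, cv=5).mean())
```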
Day 55:
Today I read about the bagging ensemble technique, the core idea and intuition of bagging, and then applied that knowledge in code.
I read about BaggingClassifier and practised a code example of a bagging ensemble with bootstrapping, then read about the bagging variants: pasting (sampling without replacement), random subspaces (column sampling), and random patches (both row and column sampling); see the sketch below.
I also read about BaggingRegressor and did a code example using linear regression, decision tree regression, and k-nearest neighbours, with GridSearchCV to find the best parameters.
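A minimal sketch of the four row/column sampling variants via BaggingClassifier on synthetic data (the 0.5 sampling fractions are arbitrary):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

variants = {
    # bootstrap=True -> rows sampled with replacement (classic bagging)
    "bagging": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                                 max_samples=0.5, bootstrap=True, random_state=0),
    # bootstrap=False -> pasting (rows sampled without replacement)
    "pasting": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                                 max_samples=0.5, bootstrap=False, random_state=0),
    # column sampling only -> random subspaces
    "subspaces": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                                   max_features=0.5, bootstrap=False, random_state=0),
    # both rows and columns -> random patches
    "patches": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                                 max_samples=0.5, max_features=0.5, random_state=0),
}

for name, model in variants.items():
    print(name, cross_val_score(model, X, y, cv=5).mean())
```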
Day 56:
Today I read the basics of the random forest algorithm.
On leave until 11 June: my fifth-semester final exams are ongoing.