R-squared is a statistical measure of the goodness of fit of a regression model. It measures the proportion of the variation in the dependent variable that is explained by all of the independent variables in the model. It assumes that every independent variable in the model helps to explain variation in the dependent variable; in reality, some independent variables do not.
Mathematically, R-squared is calculated by dividing the sum of squared residuals (SSres) by the total sum of squares (SStot) and subtracting the result from 1:

R^2 = 1 - SSres / SStot

Since SSres + SSreg = SStot, this is equivalent to R^2 = SSreg / SStot = Explained variation / Total variation.
R-squared is also called the coefficient of determination. It lies between 0% and 100%. A value of 100% means the model explains all the variation in the target variable, while a value of 0% means the model has no predictive power. So the higher the R-squared, the better the model fits the data.
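The calculation above can be sketched in a few lines of numpy. The observed values and predictions below are toy numbers chosen for illustration, not data from the text:

```python
import numpy as np

# Toy data: observed values and the predictions of a hypothetical fitted model
y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])       # observed values
y_pred = np.array([2.8, 5.1, 7.2, 8.9, 11.0])  # model predictions

ss_res = np.sum((y - y_pred) ** 2)       # sum of squared residuals (SSres)
ss_tot = np.sum((y - y.mean()) ** 2)     # total sum of squares (SStot)
r_squared = 1 - ss_res / ss_tot          # R^2 = 1 - SSres / SStot

print(round(r_squared, 4))  # -> 0.9975
```

Libraries such as scikit-learn expose the same computation as `sklearn.metrics.r2_score`, but writing it out makes the SSres/SStot decomposition explicit.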
Adjusted R-Squared:
Adjusted R-squared is a better measure of the goodness of fit of a model than R-squared. Adjusted R-squared improves only if a newly added independent variable is significant. It measures the proportion of variation explained by only those independent variables that actually help in explaining the dependent variable, and it penalizes you for adding independent variables that do not help in predicting the dependent variable.
Adjusted R-squared can be calculated in terms of the same sums of squares; the degrees-of-freedom adjustment is the only difference between R-squared and adjusted R-squared:

Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)

where n is the number of observations and p is the number of independent variables.
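The degrees-of-freedom adjustment can be sketched as a small helper. The R-squared value, sample size, and predictor count below are hypothetical inputs for illustration:

```python
def adjusted_r_squared(r_squared, n, p):
    """Adjusted R^2 for n observations and p independent variables.

    Applies the degrees-of-freedom correction:
    1 - (1 - R^2) * (n - 1) / (n - p - 1)
    """
    return 1 - (1 - r_squared) * (n - 1) / (n - p - 1)

# Hypothetical example: R^2 of 0.9975 from 5 observations and 1 predictor
print(round(adjusted_r_squared(0.9975, n=5, p=1), 4))  # -> 0.9967
```

Note that the adjusted value is always at or below the plain R-squared, and adding a useless predictor (increasing p without raising R-squared) pushes it down rather than up.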