The OCF and MV variables explain most of the variance in your dependent variable. R squared is known as the coefficient of determination. Your second output, with an R squared of 0.0139..., shows that the remaining variables have effectively zero influence on your dependent variable.
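As a quick sketch (toy numbers, not your dataset), the coefficient of determination is just one minus the ratio of unexplained to total variation:

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination: the share of the dependent
    variable's variance explained by the fitted values."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    ss_res = np.sum((y - y_hat) ** 2)         # unexplained variation
    ss_tot = np.sum((y - np.mean(y)) ** 2)    # total variation
    return 1.0 - ss_res / ss_tot

# Illustrative values only:
print(round(r_squared([2, 4, 6, 8], [2.1, 3.9, 6.2, 7.8]), 3))  # → 0.995
```

An R squared of 0.0139 means the model explains barely 1.4% of that variation.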
The t stat tells you how many of its own standard errors the coefficient lies to the right (positive) or the left (negative) of zero. Generally, the smaller the t stat in absolute value, the closer the coefficient's contribution to the regression solution is to zero.
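In other words, the t stat is just the coefficient divided by its standard error (made-up numbers below, not taken from your output):

```python
def t_stat(coef, std_err):
    """t statistic: how many standard errors the coefficient lies
    from zero (positive = right of zero, negative = left of zero)."""
    return coef / std_err

# Illustrative values only:
print(t_stat(1.5, 0.5))   # → 3.0  (three standard errors above zero)
print(t_stat(-0.2, 0.4))  # → -0.5 (close to zero, likely insignificant)
```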
The p-value is the probability of an F ratio at least that large appearing in the right tail of the F distribution (it tests whether the regression sum of squares is significant relative to the residual sum of squares of the dependent variable). Thus the larger the F ratio of the regression, the more significant the model; and the smaller the p-value for a variable, the more significant that variable is likely to be to the regression.
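The overall F ratio can be computed straight from R squared. Your post gives R squared = 0.0139 but not the sample size, so n = 100 observations and k = 5 regressors below are purely assumptions for illustration (and getting the actual p-value from the right tail would need an F-distribution table or scipy.stats.f.sf):

```python
def f_ratio(r2, n, k):
    """Overall F statistic: explained variance per regressor relative
    to unexplained variance per residual degree of freedom. The model
    p-value is the right-tail area of F(k, n - k - 1) beyond this value."""
    return (r2 / k) / ((1.0 - r2) / (n - k - 1))

# Assumed n = 100, k = 5; R^2 = 0.0139 from the second output:
print(round(f_ratio(0.0139, 100, 5), 3))  # → 0.265, nowhere near significant
```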
However, the guidelines for the p-value, F ratio and t statistic don't always produce the best regression coefficients or the best predictive value for the model. Drop the weakest variables and retest the rest in combination to produce the highest R squared. Compare models of similar R squared by F ratio, and if, however unlikely, you end up with models of similar R squared and similar F ratio, look at the p-value and t statistic of each variable to decide which to exclude for further testing.
By itself, the second regression output appears to show all of the other variables as insignificant except IPR. However, looking at your output, I would run and evaluate a fresh regression using IPR, OCF, change in EMP, MV and LEV.
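A minimal sketch of that refit, using ordinary least squares via numpy. The data here is synthetic stand-in noise; you would replace X with your actual columns for IPR, OCF, change in EMP, MV and LEV, and y with your dependent variable:

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares: returns the coefficient vector
    (intercept first) and the R^2 of the fit."""
    X1 = np.column_stack([np.ones(len(y)), X])    # prepend intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    y_hat = X1 @ beta
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return beta, 1.0 - ss_res / ss_tot

# Synthetic stand-in for the 5 retained regressors and the response:
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([0.8, 1.2, 0.1, 0.9, 0.3]) + rng.normal(scale=0.5, size=100)

beta, r2 = fit_ols(X, y)
print(len(beta))            # 6 coefficients: intercept + 5 variables
print(0.0 < r2 <= 1.0)      # R^2 to compare against the earlier models
```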
Hank