M

Main effect
This is the effect that a given explanatory variable has on an outcome variable. In a main effects model there are no terms for interactions between explanatory variables, so the main effects represent the unique effect of each explanatory variable on the outcome. While interpretation of the model is much simpler when only main effects are specified, this can oversimplify the situation where there are strong interactions between explanatory variables. See for example MLR module 3.113.13.
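A minimal sketch of the difference, using simulated data (the variable names and true coefficient values are illustrative assumptions, not from any real dataset): when the data genuinely contain an interaction, a main-effects-only model cannot capture it, while a model with the interaction term recovers all four coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
# Simulated outcome with a genuine interaction between x1 and x2
y = 1.0 + 2.0 * x1 + 3.0 * x2 + 1.5 * x1 * x2 + rng.normal(scale=0.5, size=n)

# Main-effects-only model: y ~ b0 + b1*x1 + b2*x2
X_main = np.column_stack([np.ones(n), x1, x2])
b_main, *_ = np.linalg.lstsq(X_main, y, rcond=None)

# Same model with the interaction term x1*x2 added
X_int = np.column_stack([np.ones(n), x1, x2, x1 * x2])
b_int, *_ = np.linalg.lstsq(X_int, y, rcond=None)

print(b_main)  # the interaction effect is pushed into the error term
print(b_int)   # recovers approximately [1.0, 2.0, 3.0, 1.5]
```

In practice you would fit both models in SPSS (or any regression software) and compare them; the point here is only that the main-effects model is simpler to read but misses the interaction.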

Maximum Likelihood Estimation
Maximum likelihood estimation is a statistical method for fitting a model to the data. The process itself is very technical but luckily SPSS will do it for you! Basically, maximum likelihood estimation selects values for the parameters of the explanatory variables that most closely predict the actual outcome, doing this through an iterative process of successive approximation. The process calculates the probability that the data could have been produced by various parameter values and continues until it settles on the combination of parameters that gives the highest probability, that is, the most likely!
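To make the idea of successive approximation concrete, here is a toy sketch (an illustrative assumption, not how SPSS works internally): estimating the mean of normally distributed data by repeatedly nudging the parameter uphill on the log-likelihood until it stops improving. For this simple case the answer should converge to the sample mean.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=5.0, scale=2.0, size=1000)
sigma = 2.0  # assume the spread is known, so only the mean is estimated

def log_likelihood(mu, x):
    # Log of the probability of observing the data given parameter mu
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (x - mu) ** 2 / (2 * sigma**2))

# Successive approximation: climb the likelihood surface step by step
mu = 0.0
step = 1e-3
for _ in range(5000):
    grad = np.sum(data - mu) / sigma**2  # slope of the log-likelihood at mu
    mu += step * grad

print(mu)  # settles on the sample mean, the maximum likelihood estimate
```

Real software uses cleverer iterative schemes, but the logic is the same: keep adjusting the parameters until no adjustment makes the data more probable.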

Multilevel regression models
Multilevel regression models can account for clustering in data sets and so model complex multilevel social datasets more accurately.
They are rather complex and we don't discuss them on this site. However, if you are ready to take up the challenge and learn about them then we can highly recommend an excellent site called LEMMA.

Multicollinearity
This occurs when two or more explanatory variables are very strongly correlated (usually above 0.80). It can be problematic in regression analysis as it implies the two explanatory variables may actually be measuring the same phenomenon. In such cases it may be best to use only one of the two variables, or to create a new variable that is a weighted combination of the two. For example, a measure of Socio-Economic Status (SES) could be derived as a weighted combination of variables such as parental education, occupation and income.
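A small sketch of the check and the remedy, with simulated variables (the names and weights are illustrative assumptions): compute the correlation between two explanatory variables, and if it exceeds the rule of thumb, combine them into a single index rather than entering both.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
education = rng.normal(size=n)
# Simulate an income measure that largely tracks education
income = 0.95 * education + 0.2 * rng.normal(size=n)

# Check the pairwise correlation between the two explanatory variables
r = np.corrcoef(education, income)[0, 1]
print(r)  # well above the 0.80 rule of thumb

# One remedy: replace the overlapping variables with a single combined index
# (equal weights here purely for illustration)
ses_index = 0.5 * education + 0.5 * income
```

In a real analysis the weights for such an index would come from theory or from a technique like principal components, not be picked arbitrarily.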

Multiple Linear Regression
Multiple linear regression is similar to simple linear regression but it can produce more expansive models by allowing researchers to include two or more explanatory variables. The formula for multiple linear regression is shown below. Multiple linear regression is the topic of Module 3.
Y_{i} = (b_{0}+b_{1}X_{1}+b_{2}X_{2}+...+b_{n}X_{n}) + ε_{i}
Y = outcome variable, X_{1} = first explanatory variable, X_{2} = second explanatory variable, X_{n} = nth explanatory variable, b_{0} = value of outcome when all explanatory variables are zero, b_{1} = regression coefficient for the first explanatory variable, b_{2} = regression coefficient for the second explanatory variable, b_{n} = regression coefficient for the nth explanatory variable, ε_{i} = error.
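The formula above can be sketched numerically (with invented true coefficients, purely for illustration): simulate an outcome from two explanatory variables plus error, then recover b_{0}, b_{1} and b_{2} by least squares.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
X1 = rng.uniform(0, 10, n)
X2 = rng.uniform(0, 10, n)
eps = rng.normal(scale=1.0, size=n)  # the error term ε_i

# Outcome built exactly as in the formula: Y = b0 + b1*X1 + b2*X2 + ε
Y = 4.0 + 0.7 * X1 - 1.2 * X2 + eps

# Estimate the coefficients; the column of ones carries the intercept b0
design = np.column_stack([np.ones(n), X1, X2])
b, *_ = np.linalg.lstsq(design, Y, rcond=None)
print(b)  # close to the true values [4.0, 0.7, -1.2]
```

SPSS produces the same estimates through its regression dialogs; the point of the sketch is only to show what the b coefficients in the formula represent.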

Multiple R
Multiple R is the correlation between the actual values of an outcome variable and the values predicted by a multiple regression model. Multiple R is interpreted in a similar way to Pearson's r and is useful for judging how well a model fits the data.
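The definition can be sketched directly (with simulated data and invented coefficients, for illustration only): fit a multiple regression, generate the predicted values, and correlate them with the actual outcome.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 250
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 1.0 * x1 + 0.5 * x2 + rng.normal(scale=1.0, size=n)

# Fit the model and compute the predicted outcome values
X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b

# Multiple R: the correlation between actual and predicted outcomes
multiple_R = np.corrcoef(y, y_hat)[0, 1]
print(multiple_R)
```

Squaring this value gives R², the proportion of variance in the outcome explained by the model, which is how model fit is usually reported.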
