In the social sciences, positivism has come to be something of a dirty word, so most social scientists would not identify themselves as positivists.
Positivism, in the social sciences, refers to the idea that human institutions and societies can be studied in much the same way that natural phenomena can be. Positivists try to take the methods of the natural sciences and apply them to the social sciences.
Positivists (though they might not call themselves by that name) base their studies on the idea that the world of human society is a real world with real properties that can be studied scientifically. On this view, it is possible to study this world and derive laws about it that are as solid as the laws of the physical sciences. Positivists, therefore, favor empirical studies over more theoretical or philosophical ones. By gathering data, they believe they can discern the laws that govern human societies.
It is this sort of perspective that allowed Durkheim (since you asked about him a bit earlier) to believe that sociologists could diagnose what was wrong with human societies and propose remedies for those problems.