Meaning of positivism

Definition of positivism

(noun) the form of empiricism that bases all knowledge on perceptual experience (not on intuition or revelation)
