>people don't always believe evidence even if it's there.
I frankly admit that some of the "facts" [0] known to me are wrong [1]. I know that for certain, since some of them conflict with each other; that alone, however, doesn't tell me which one is wrong, or which ones to base decisions and judgements on. This causes various headaches: I end up resorting to fallible heuristics to sort out the good ones in time to make the necessary judgements and decisions. I actively try to gather more facts, hoping to improve my decisive power over time.
However, there's also a meta aspect: the trustworthiness of any given "fact" we learn. It's common to see people acting vigorously on information that's high impact but low trustworthiness. Another common sight is, as you say, people refusing to learn a new "fact" because it conflicts with the other "facts" they already know, with little regard for whether the new one is more trustworthy.
Somewhere along the road we fail, or maybe even refuse, to associate the "facts" we know with how much trust we can put in them. This is a matter of handling and processing meta-information, and frankly our current education and upbringing curricula don't seem to help much with it.
I hold it to be generally immoral to perform high-impact acts based on "facts" known with only low trustworthiness. And as you say, science helps us obtain an ever better set of facts.
--
[0] scare quotes to differentiate between information as it is known vs. idealized truthful facts
[1] either running counter to the idealized truthful facts, or imprecise enough to be misleading
Someone once told me that facts are political: that perhaps we shouldn't evaluate all facts based on what's true, but on whether believing them will create a better society and world to live in.
In many cases these will overlap, but in some they won't, and that means facts that would create a worse world if everyone believed them should be considered false, regardless of the actual truth.
My first feeling is that this might create severe trouble somewhere down the line, but it might be less trouble than the alternative? An idea to ponder.
edit: The ideas in question touched the worth of people, for example. We tie worth to things like earning power, intelligence, and beauty. Changing how society views these things changes society. That's the surface; some aspects can go much deeper into who we are as a people, since we're storytellers.