
Technology Connections had a great observation about this in his video about touch lamps: the problem with Google's knowledge graph is that it takes everything it reads on the internet at face value.

https://youtu.be/TbHBHhZOglw?t=58 (0:58 through 4:20)



> takes everything it reads on the internet at face value.

it's terrible, but a lot of people [0] think digital information, and thus the internet, is truth

[0] and if Google itself falls for this, no wonder people do too


Google isn’t interested in truth, they are interested in information. They provide a search engine, not a truth engine.

Sarcasm aside, how do we propose they determine what is truth? If we assume the internet is full of information and more truthful than not, then Google’s assumption could be accurate. Of course, they do try to solve this with the knowledge graph and expert curation. Connections to verified information might lend validity to that information, but not always.


> Google isn’t interested in truth, they are interested in information. They provide a search engine, not a truth engine

Google has been transitioning to attempting to provide a "truth engine" for several years. Whenever I try a complex keyword search, it suggests a question format for it (often with worse results, but sometimes OK). When I have finally got the keywords down to filter just what I want, Google whines about "Not very many results, here's what you should do...", and, of course, Google often gives explicit answers for questions in its search results (a notable percentage of which are wrong, as noted).

And Google being a half-assed truth engine is all sorts of bad...


That doesn't need to be sarcasm, because it's true: at its core, Google's search is a method of finding information, not a method of directly ascertaining truth. It's not really possible for it to be a truth engine, and if you realize that, it's not even a flaw. You're left with a way of finding information you will need to evaluate for yourself, which is fine.

The problem is in the presentation: Google's tools in general, not limited to search, present themselves as though they can identify truth. That's the flaw, the lie, if you prefer.


Given that they are actively curating the information and censoring "misinformation", they certainly think they are a truth engine. And present it this way. Of course you'd only believe it if you believe Google is omniscient, omnipotent and benevolent.


That's the problem, isn't it. Most people want to be good, so most information on the internet is the-truth-as-they-know-it. It lulls you into a false sense of security.


I don't, I stopped caring about truths, at least in a large-scale social context.. now if someone could tell this to the world.


If you peek outside of math/physics, it's pretty much a landscape of relative truths; imho an ideal truth/fake-news-detecting machine would simply require axiomatic/weighted input, i.e. "I trust MIT with public key 0x..., youtube jesus from LA with public key 0x..., and my childhood mate 0x..." – and based on that, is "X true, false or undefined"? (weighted output). Because I trust MIT with some weight, e.g. 500%, and MIT trusts Caltech, my trust graph will favour Caltech's view of the world. Yes, you can throw blockchain and AI into it, and it actually makes sense.
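The idea above can be sketched in a few lines. Everything here is an illustrative assumption, not a real protocol: the names, the weights, the one-hop trust propagation with a decay factor, and the simple weighted vote over true/false assertions are all made up to show the shape of the mechanism.

```python
from collections import defaultdict

DECAY = 0.5  # assumed damping for indirect trust (e.g. MIT -> Caltech)

def effective_trust(my_trust, endorsements):
    """Direct trust plus one hop of decayed trust via endorsements."""
    trust = defaultdict(float, my_trust)
    for endorser, endorsee in endorsements:
        trust[endorsee] += my_trust.get(endorser, 0.0) * DECAY
    return trust

def verdict(trust, assertions, threshold=0.0):
    """Weighted vote: each source asserts True/False about claim X."""
    score = sum(trust.get(src, 0.0) * (1 if v else -1)
                for src, v in assertions.items())
    if abs(score) <= threshold:
        return "undefined"
    return "true" if score > 0 else "false"

# Axiomatic input: who I trust, and how much (hypothetical weights).
my_trust = {"MIT": 5.0, "youtube_jesus": 0.5, "childhood_mate": 1.0}
endorsements = [("MIT", "Caltech")]  # MIT trusts Caltech

trust = effective_trust(my_trust, endorsements)

# Sources weigh in on claim X; Caltech inherits weight through MIT.
assertions = {"Caltech": True, "youtube_jesus": False, "childhood_mate": False}
print(verdict(trust, assertions))  # Caltech's inherited 2.5 outweighs 1.5
```

Because MIT's 5.0 is decayed to 2.5 for Caltech, Caltech's "true" outweighs the combined 1.5 of the two dissenters, so the weighted output is "true" — exactly the "my trust graph will favour Caltech's view" effect. No blockchain required for the sketch; keys would only matter for authenticating who actually signed each assertion.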


Worth watching for the capacitater joke alone



