David Hume (1748) famously said, "when anyone tells me, that he saw a dead man restored to life, I immediately consider with myself, whether it be more probable, that this person should either deceive or be deceived, or that the fact, which he relates, should really have happened." Of course, intentionally deceptive information on many topics (not just reports of miracles) can interfere with our ability to achieve our epistemic goals of acquiring true beliefs and avoiding false beliefs. Thus, it would be beneficial to reduce the spread of such disinformation. In order to do this, we need to identify what sorts of things affect the amount of disinformation and how they affect it. Toward this end, I offer an analysis of what disinformation is. I then use this analysis to develop a game-theoretic model (which is inspired by the work of Elliott Sober and of Brian Skyrms and which appeals to philosophical work on epistemic values) of the sending and receiving of disinformation.
Published: Mar 20, 2014
Keywords: Equilibrium Point; True Belief; Pure Strategy; Epistemic State; Monarch Butterfly
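The abstract and keywords (equilibrium point, pure strategy, epistemic state) gesture at a Skyrms-style sender-receiver game. The sketch below is not the paper's actual model; it is a minimal illustration, under assumed payoffs, of how a sender with a deceptive interest and a receiver who values true belief can be modeled with pure strategies and checked for equilibrium points. All numbers, strategy encodings, and the uniform prior are assumptions introduced for illustration.

```python
# Illustrative Skyrms-style sender-receiver game (NOT Fallis's model).
# Sender observes the state and chooses a signal; receiver maps signals
# to beliefs. Payoffs below are assumed for the sake of the example.

import itertools

STATES = ["T", "F"]            # the reported fact really happened (T) or not (F)
SIGNALS = ["say-T", "say-F"]   # what the sender reports
BELIEFS = ["believe-T", "believe-F"]

# Pure strategies: sender maps each state to a signal,
# receiver maps each signal to a belief.
sender_strategies = list(itertools.product(SIGNALS, repeat=len(STATES)))
receiver_strategies = list(itertools.product(BELIEFS, repeat=len(SIGNALS)))

def receiver_payoff(state, belief):
    # Epistemic value: +1 for a true belief, -1 for a false one (assumed).
    return 1 if belief.endswith(state) else -1

def sender_payoff(state, belief):
    # Assumed deceptive interest: the sender gains whenever the receiver
    # comes to believe T, regardless of the actual state.
    return 1 if belief == "believe-T" else 0

def expected_payoffs(s, r):
    # Uniform prior over states (assumption); average payoffs over states.
    su = ru = 0.0
    for i, state in enumerate(STATES):
        signal = s[i]
        belief = r[SIGNALS.index(signal)]
        su += sender_payoff(state, belief) / len(STATES)
        ru += receiver_payoff(state, belief) / len(STATES)
    return su, ru

def is_pure_nash(s, r):
    # Neither player can gain by a unilateral deviation in pure strategies.
    su, ru = expected_payoffs(s, r)
    if any(expected_payoffs(s2, r)[0] > su for s2 in sender_strategies):
        return False
    if any(expected_payoffs(s, r2)[1] > ru for r2 in receiver_strategies):
        return False
    return True

equilibria = [(s, r) for s in sender_strategies for r in receiver_strategies
              if is_pure_nash(s, r)]
for s, r in equilibria:
    print("sender:", dict(zip(STATES, s)), "receiver:", dict(zip(SIGNALS, r)))
```

Running the brute-force check enumerates the pure-strategy equilibrium points of this toy game; changing the assumed payoff functions (e.g., penalizing detected deception) is the kind of variation a model of disinformation would explore.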