Belief

A belief is an attitude toward a proposition: the attitude of holding it to be true.


1   Study

Epistemology is the branch of philosophy that studies knowledge and belief: how they are acquired, how they are justified, and what is true. [1]

2   Function

Knowledge expands what actions we are capable of; it increases our freedom.


Knowledge is only useful when it is organized to be actionable and can be guided by wisdom.

3   System

3.1   Logical Positivism (Logical Empiricism)

Two sources of knowledge:

  • Logical reasoning (A priori)
  • Empirical experience (A posteriori)

A statement is meaningful if and only if it can be proved true or false by means of experience.

  • "Verifiability principle"
  • The meaning of a statement is its method of verification
    • We know the meaning of a statement if we know the conditions under which the statement is true or false
    • Philosophy becomes the activity by means of which the meaning of statements is clarified and defined

Statements are either true, false, or meaningless.

3.2   Reductionism

Reductionism is a philosophical position which holds that a complex system is nothing but the sum of its parts, and that an account of it can be reduced to accounts of its individual constituents.

Kinds:

  • Theoretical reductionism
  • Methodological reductionism
  • Ontological reductionism

4   Properties

4.1   True & False

A belief may or may not be true. A belief which is true is called a fact.

4.2   Justified & Unjustified

Most beliefs seem to be unjustified.

It was easier to know it than to explain why I knew it. If you were asked to prove that two and two made four, you might find some difficulty, and yet you are quite sure of the fact. - Sherlock Holmes

Knowledge is graded based on how credible we believe it to be.

  • Certainty is the highest level of belief: the complete lack of doubt.
  • Doubt is a status between belief and disbelief.

How do we know what we know?

See: criticism

People who have not studied philosophy will often dismiss it because, roughly, they believe we should only take claims about knowledge seriously if they involve something that we can test empirically. And since we can't empirically test claims about morality, there is nothing to morality beyond people's attitudes or beliefs about what's right or wrong. The most obvious objection to this claim is that the same is true of mathematics, yet we take its claims seriously.

4.3   Public & Private

Knowledge that is private is called a secret. (For example, a trade secret.)

4.4   Intuitive & Counterintuitive

Knowledge is intuitive if it can be directly perceived, without the need for instructors. For example, how to run.

Knowledge is counterintuitive if it appears to be intuitive, but is not. With counterintuitive topics, you cannot trust your instincts and it is useful to have instructors. For example, skiing. Part of learning counterintuitive topics is learning to suppress impulse and stay conscious, until new habits are formed.

“Intuition is defined as knowing without knowing how you know,” he explained. “That’s the wrong definition. Because by that definition, you cannot have the wrong intuition. It presupposes that we know, and there is really a prejudice in favor of intuition. We like intuitions to be right.”

According to Kahneman, a better definition — or a more precise one — would be that “intuition is thinking that you know without knowing why you do.” By this definition, the intuition could be right or it could be wrong, he added.

“Intuitions of master chess players when they look at the board [and make a move], they’re accurate,” he said. “Everybody who’s been married could guess their wife’s or their husband’s mood by one word on the telephone. That’s an intuition and it’s generally very good, and very accurate.”

According to Kahneman, who’s studied when one can trust intuition and when one cannot, there are three conditions that need to be met in order to trust one’s intuition.

The first is that there has to be some regularity in the world that someone can pick up and learn.

The second condition for accurate intuition is “a lot of practice,” according to Kahneman.

And the third condition is immediate feedback. Kahneman said that “you have to know almost immediately whether you got it right or got it wrong.”

When those three kinds of conditions are satisfied, people develop expert intuition.

5   Classification

5.1   Knowledge

Knowledge is justified true belief. [*]

Knowledge includes facts, descriptions, information, and skills.

http://plato.stanford.edu/entries/episteme-techne/

The Greeks divided knowledge into three kinds: sophia, techne, and episteme.

5.2   Biologically primary and biologically secondary

Biologically primary knowledge refers to knowledge that humans have evolved to rapidly acquire. For example, the ability to walk and speak.

Biologically secondary knowledge refers to knowledge that we can acquire but have not evolved to acquire readily. For example, carrying out mathematical operations or playing chess.

This distinction can explain why toddlers, but not computers, can walk and use language with ease and why computers, but not toddlers, can play chess with ease.

5.3   Acquaintance & Description

Knowledge by acquaintance is knowledge obtained through direct perception.

  • One's self
  • Pain

Knowledge by description is knowledge obtained through mediated perception or inference.

  • The center of mass of the universe
  • Physical objects
  • Past events
  • Future states of affairs
  • Another person's mind

Note: All thinking has to start from acquaintance; but it succeeds in thinking about many things with which we have no acquaintance.

5.4   Mediate & Immediate

Knowledge is mediated iff ...

  • Past experience and reflection enable us to recognize things in the stream of impressions.

Knowledge is immediate iff it was gained without proof, by a direct contemplation of truth.

6   Acquisition

Since knowledge is only useful if it is actionable, the acquisition of knowledge should be guided by reason rather than by whim. This means that before seeking knowledge, a person should state their goals for the knowledge (a sort of book proposal) and argue for the contents and the order of their course. For example, rather than storing an unsorted list of books (which is not helpful), a person should store them in order, with annotations describing why each is on the list and why they are in that order.

The two primary ways of acquiring knowledge are discovery and communication.

Knowledge can be acquired actively or passively.

See science.

6.1   Passive learning

Passive learning would consist of increasing the number of feedback loops and decreasing their cycle time. This is somewhat dangerous because customers may never propose the kind of improvements they actually need. For instance, listening to customers would have led to faster horses, not cars.

6.2   Active learning

Active learning would consist of proposing a hypothesis, and then building an MVP to test the hypothesis.

7   Usage

static/images/Athena.jpg

Mattei Athena at Louvre. Roman copy from the 1st century BC/AD after a Greek original of the 4th century BC, attributed to Cephisodotos or Euphranor.

Wisdom or sophistication is knowledge of how to use knowledge; it is knowing what is good and bad and how to gain good things and avoid bad things; it is knowing what to do (i.e. what is of value), and when and how to do it. For example, wisdom is knowledge of good food, music, and art. The opposite of wisdom is folly (foolishness) or ignorance: willfully not using knowledge one has, or failing to seek out knowledge that one ought to know.

We do not consider any of the senses to be Wisdom. They are indeed our chief sources of knowledge about particulars, but they do not tell us the reason for anything, as for example why fire is hot, but only that it is hot... The man of experience is held to be wiser than the mere possessors of any power of sensation, the artist than the man of experience, the master craftsman than the artisan; and the speculative sciences to be more learned than the productive. Thus it is clear that Wisdom is knowledge of certain principles and causes.

Since we are investigating this kind of knowledge, we must consider what these causes and principles are whose knowledge is Wisdom. Perhaps it will be clearer if we take the opinions which we hold about the wise man. We consider first, then, that the wise man knows all things, so far as it is possible, without having knowledge of every one of them individually; next, that the wise man is he who can comprehend difficult things, such as are not easy for human comprehension (for sense-perception, being common to all, is easy, and has nothing to do with Wisdom); and further that in every branch of knowledge a man is wiser in proportion as he is more accurately informed and better able to expound the causes. Again among the sciences we consider that that science which is desirable in itself and for the sake of knowledge is more nearly Wisdom than that which is desirable for its results, and that the superior is more nearly Wisdom than the subsidiary; for the wise man should give orders, not receive them; nor should he obey others, but the less wise should obey him.

—Aristotle, Metaphysics, Book I

Ignorance is distinguished from stupidity, although both can lead to bad action. For example, when a child touches a fire for the first time and gets burned, we would call them ignorant but not a fool, because they were not expected to know that touching fire causes a burn. But we call adults fools when they touch fires and get burned, because we expect them to know the consequences.

Similarly, if a person falls for a scam, or buys a car without inspecting it first only to discover it is a lemon, or that it doesn't fit in their garage, or that they could have gotten it for a cheaper price, we call them a fool, because we expect them to do due diligence before buying. At a higher level, we could call people fools for not recognizing business opportunities.

Gaining knowledge faster than wisdom can be dangerous. For this reason, we do not let children use matches, hold weapons, or otherwise make important decisions. Similarly, we can argue that men should not have invented nuclear weapons and should not attempt to invent strong AI or study cloning. Similarly, we frequently do not teach people what they are capable of. For example, we do not teach children about sex because they might become parents.

7.1   Etymology

The Greek word for wisdom is "sophia", the root of words such as "sophist", "philosophy", "sophisticated", "sophomore", and the name "Sophia".

To Socrates, philosophy was literally the love of Wisdom.

7.2   Study

The study of wisdom is called philosophy.

7.3   Acquisition

Wisdom can be acquired through studying history.

Wisdom requires experience.


Related to pattern recognition.

7.4   Distinction

We are not suffering from too much reason but from too little.[1] Knowledge inquiry is profoundly irrational when judged from the standpoint of contributing to human welfare.[1] Scientific rationality merely poses as rationality.

Knowledge inquiry demands a sharp split between social and humanitarian aims and the intellectual aim. The intellectual aim is to acquire knowledge of truth.

Four elementary rules of reason are:[1]

  1. Articulate and seek to improve the articulation of the basic problems to be solved.
  2. Propose and critically assess alternative possible solutions.
  3. When necessary, break up the basic problems to be solved into a number of specialized problems - preliminary, simpler, analogous, subordinate problems - in an attempt to work gradually toward a solution to the basic problems.
  4. Interconnect attempts to solve the basic problems and the specialized problems, so that basic problem solving may guide, and be guided by, specialized problem solving.

No enterprise that violates 1 and 4 can be judged rational.[1]

Granted academic inquiry has as its fundamental aim to help promote human welfare by intellectual and educational means, then the problems that inquiry fundamentally ought to try to help solve are problems of living, problems of action.[1]

8   Grading

It is the mark of an educated man to look for precision in each class of things just so far as the nature of the subject admits; it is evidently equally foolish to accept probable reasoning from a mathematician and to demand from a rhetorician scientific proofs. - Aristotle, Nicomachean Ethics, Book 1, Chapter 3

8.1   Credibility

One simple way to assess the credibility of a source is whether it cites its own sources. If it doesn't, it's probably not worth trusting on its own.

It would be interesting to see how the credibility of higher-tier papers and magazines is ranked.

A search engine that prioritized credibility (unlike Google) would be really valuable.

9   Contrarianism

Peter Thiel is a fan of asking the question "What is one thing you believe to be true that most people don't?" It's another way of asking "What is an unpopular opinion you hold?"

Tyler Cowen's habit of asking people "Underrated vs overrated: X" encourages contrarian thinking.

10   Competence

Competence is the ability to do something successfully.

Conscious incompetence is necessary for deliberate self-improvement. (One may be unconsciously incompetent and improve with the aid of a coach, or otherwise accidentally.)

10.1   Four stages of competence

Competence has four stages:

  1. Unconscious incompetence: A person does not know that they do not know something.
  2. Conscious incompetence: A person knows that they do not know something.
  3. Conscious competence: A person knows that they know something.
  4. Unconscious competence: A person does not know that they know something.

---

This is an interesting model for a lot of things.

Perhaps most commonly, it is a good model for one's ability to teach, since teaching comes with the curse of knowledge. (That is to say, many teachers do not know they do not know how to teach, fewer teachers know they don't know, fewer still know how to teach, and very few are so proficient they forget what they know.)

The last stage is called expertise or mastery, and is pretty much the definition of intuition.

The stages also model a person's confidence about a domain. Incompetent people and masters are the most confident, while amateurs tend to have little confidence.

It also models ignorance in many ways. Masters become so accustomed to knowing things they forget that they ever learned them, and consequently forget that it could be different. Hence the hatred for American tourists. In that sense, a master of a subject becomes an incompetent in more general subjects. (An expert in American culture, i.e. an American, is an idiot in world culture.) This error is one of induction.


Curse of Expertise is not that experts forgot how they learned; it's that they don't really KNOW what they know & use

https://pbs.twimg.com/media/BgtAAnYCEAAjxRl.jpg:large


If a professor needs to prepare to give a class, don't attend. People should only teach what they have learned organically… or get another job.

Yes but some imbeciles are not getting the aphorism. Didn't say prof SHOULD not prepare, said shd NOT NEED to.

The same goes for exams. Cramming is a tool used by those lacking true knowledge.

11   Experts

How do we trust experts?

12   Compounding

13   Notes

[*] This was argued by Plato, but is subject to criticism.

14   See also

15   Credible resources

16   Further reading

17   References

[0] Jacques Maritain. 1932. The Degrees of Knowledge.
[1] Wolf, 2008.
[2] Benedict Carey. Sep 4, 2014. Why Flunking Exams Is Actually a Good Thing. http://www.nytimes.com/2014/09/07/magazine/why-flunking-exams-is-actually-a-good-thing.html?action=click&pgtype=Homepage&version=Moth-Visible&module=inside-nyt-region&region=inside-nyt-region&WT.nav=inside-nyt-region&_r=0
[3] Nicholas Maxwell. 2008. From Knowledge to Wisdom.
[4] Aristotle. Metaphysics.
[5] Richard Hamming. March 28, 1995. The Art of Doing Science and Engineering: Learning to Learn. https://www.youtube.com/watch?v=AD4b-52jtos&list=PL2FF649D0C4407B30&index=1
[6] Justin P. McBrayer. March 2, 2015. Why Our Children Don't Think There Are Moral Facts. http://opinionator.blogs.nytimes.com/2015/03/02/why-our-children-dont-think-there-are-moral-facts/?_r=0
[7] Paul Graham. Dec 2014. How to Be an Expert in a Changing World. http://paulgraham.com/ecw.html

There are four large problems we have evolved to deal with: 1) how to filter large amounts of information, 2) how to construct meaning from information, 3) how to act quickly before we lose out on opportunities, and 4) how to remember what's important so we can improve over time.

Not sure I like this source.



What I mean by having porous knowledge is having memory of related facts, without understanding their relations. It's basically rote -- think memorizing a random series of numbers, words in a foreign language, or people's names -- except that in truth there is some relation to it all.


On understanding: it is not enough to be able to act. See Chinese Room. There is something more to it.


Knowledge generates stupidity like a ripple on a pond. With the increased circumference comes more stupidity.

Question propagation. - Kant

See: Competence

The purpose of knowledge is to be able to ask thoughtful questions, not just to have knowledge.


Fool's Paradise


http://en.wikipedia.org/wiki/Fluid_and_crystallized_intelligence


Paul J. H. Schoemaker and Robert E. Gunther. June 2006. The Wisdom of Deliberate Mistakes.

Why focus on faith? The main reason is that it’s a faulty epistemology, that is, a bad way of arriving at the truth. Apart from relativists, most people want to believe things that are true. But too often this noble desire is dwarfed by the desire to avoid cognitive dissonance and uncertainty. Boghossian recognises that faith panders to the latter, and it therefore prioritises psychological satisfaction over truth. With such a faulty epistemology in place, a person is left vulnerable to lies, manipulation, poor life choices, conspiracy theories, woo medicine, and other dangers. To allow for a more reliable option, he wants to eradicate faith — not by coercion, which never works and is a pointless violation of rights, but by gently persuading others to see why faith isn’t a reliable path to knowledge.


Before the breakup of AT&T’s Bell System, U.S. telephone companies were required to offer service to every household in their regions, no matter how creditworthy. Throughout the United States, there were about 12 million new subscribers each year, with bad debts exceeding $450 million annually. To protect themselves against this credit risk and against equipment theft and abuse by customers, the companies were permitted by law to demand a security deposit from a small percentage of subscribers. Each Bell operating company developed its own complex statistical model for figuring out which customers posed the greatest risk and should therefore be charged a deposit. But the companies never really knew whether the models were right. They decided that the way to test them was to make a deliberate, multimillion-dollar mistake.

For almost a year, the companies asked for no deposit from nearly 100,000 new customers who were randomly selected from among those considered high risks. It was clearly a mistake: Some of these customers would surely not pay their bills or would run off with the equipment, costing the providers millions. But the companies were concerned enough about what they didn’t know to study how these customers compared with the rest of the population.

To the companies’ surprise, many of the presumed bad customers paid their bills fully and on time and did not steal or damage the phones. Armed with these new insights, Bell Labs helped the operating companies recalibrate their credit scoring models and institute a much smarter screening strategy, which added, on average, $137 million to the Bell System’s bottom line every year for the next decade.

While few companies are willing to commit to a course that looks like an error, the power of intentionally taking the wrong road can be seen in the high payoffs that have come from strategies that initially seemed like mistakes. Great business ideas such as FedEx’s distribution system were judged by savvy people to be wrongheaded. Before Enterprise, it was considered foolish to offer rental cars anywhere except at an airport or city center. Thomas Edison doggedly pursued the development of the phonograph even though he considered the idea to have no commercial value. He made the “mistake” of investing time and energy in an invention that he assumed not many people would buy. When advertising pioneer David Ogilvy tested his ideas, he deliberately included ads that he thought would not work in order to test and improve his decision rules for evaluating advertising. Most of the mistake ads were, as expected, dismal failures, but the few that succeeded pointed to innovative approaches in the fickle world of advertising. And the value of mistakes is explicitly highlighted in Google’s recent IPO prospectus: “We would fund projects that have a 10% chance of earning a billion dollars…Do not be surprised if we place smaller bets in areas that seem very speculative or even strange.” Google is alerting investors to expect company actions that may look like mistakes.

Although organizations need to make mistakes in order to improve, they go to great lengths to avoid anything resembling an error. That’s because most companies are designed for optimum performance rather than learning, and mistakes are seen as defects that need to be minimized.

True deliberate mistakes are expected, on the basis of current assumptions, to fail and not be worth the cost of the experiment. According to conventional wisdom, they have a negative expected value. But if such a mistake unexpectedly succeeds, then it has undermined at least one current assumption (and, often, more). That is what creates opportunities for profitable learning.

Philosophers of science have long emphasized the virtues of falsification—disproving your hypotheses and then testing new ones—as the fastest way to the truth. Sometimes making mistakes can be the quickest way to discover a problem’s solution. In our executive education programs on decision making, we ask managers to find the underlying pattern in a sequence of three numbers, such as 2, 4, 6. The participants are allowed to propose alternative sets of three numbers and ask whether they fit the pattern.

Most people formulate a preliminary hypothesis—for 2, 4, 6, they might guess the pattern is ascending, adjacent even numbers. Forming a hypothesis is generally a good idea, but problems arise when people devise strategies to test their ideas. Should they propose a set of numbers that fits the hypothesis or violates it? They typically propose sequences that fit their rules, as illustrated in the “Testing a hypothesis” column of the table (which includes our yes-or-no answers to whether each test sequence fits the hidden pattern). After three successful tests, the participants usually state with great confidence that the “ascending, adjacent even numbers” hypothesis is correct. But they are wrong.

Consider the alternative approach of testing sequences that violate the hypothesis—in other words, making deliberate mistakes. Participants who choose numbers that don’t fit the hypothesis are likely to discover more quickly than their colleagues that the real pattern is any ascending sequence. In this experiment, the pattern is rarely uncovered unless subjects are willing to make mistakes by testing numbers that violate their hypotheses.

Sometimes, committing errors is not just the fastest way to the correct answer, it’s the only way. When college students were allowed to test as many sets of three numbers as they wished, fewer than 10% discovered the pattern. The vast majority became locked into a narrow hypothesis and tested only combinations of numbers that would confirm it.
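
A small Python sketch of the 2-4-6 task described above (my own illustration; the hidden rule "any ascending sequence" is taken from the text) shows why confirming probes never expose a too-narrow hypothesis while deliberate violations do:

    # Hidden rule (from the text): any strictly ascending sequence.
    def fits_hidden_rule(seq) -> bool:
        return all(a < b for a, b in zip(seq, seq[1:]))

    # Confirmation strategy: probes that fit the guess
    # "ascending, adjacent even numbers" -- every answer is yes,
    # so the (wrong) hypothesis is never challenged.
    confirming = [(4, 6, 8), (10, 12, 14), (20, 22, 24)]
    print([fits_hidden_rule(s) for s in confirming])   # [True, True, True]

    # Deliberate-mistake strategy: probes that violate the guess.
    # (1, 2, 3) and (5, 10, 20) also get a yes, refuting the narrow
    # hypothesis and pointing toward the real rule; (6, 4, 2) gets a no.
    violating = [(1, 2, 3), (5, 10, 20), (6, 4, 2)]
    print([fits_hidden_rule(s) for s in violating])    # [True, True, False]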

The expense of a failed mistake should not be too high in comparison with the potential rewards, including learning. The Bell System’s mistake cost millions and might have resulted in no improvement, but the potential payoff—reducing a $450 million bad debt—was very large. David Ogilvy’s cost of running additional ads was small, considering the potential benefit of learning about what worked.

A strategy of knowingly making errors is likely to be valuable in environments where core assumptions drive large numbers of routine decisions, such as those about hiring, running ads, devising promotional tactics, or assessing credit risks.

In an environment that is changing quickly, the strategic advantage shifts to those who learn fastest—and rapid learning may require deliberate mistakes.


What ways do we have of knowing?

So the most obvious way we come to know things is through entailment. If I know that the bus arrives at 8 o’clock and it is 5 past 8, but the bus has not arrived, I know that the bus is not on time. Obviously! However, this process of reasoning towards knowledge won’t work for everything that we rightly claim to know. I mentioned that I know that the bus arrives at 8 o’clock, but how do I know this? Well perhaps I looked at the schedule printed at the bus stop and that’s what it said. But how do I know that the schedule was accurate? How do I know that my eyes weren’t deceiving me? And so on. We can continue on and on demanding justification for the justification we gave for our first claim, that the bus arrives at 8 o’clock. However, this won’t do. If justification proceeds downwards forever and ever then our everyday mundane beliefs about buses and such are at stake for they all begin to take the form “If I’m justified in believing X, then I’m justified in believing Y.” Yet we’re never justified in believing X because there’s another layer: “If I’m justified in believing Z, then I’m justified in believing X.” And so on forever and ever. So in order to save all of our beliefs (including our scientific beliefs) about the world, we need for our chain of justification to stop at some belief that does not rely on any other belief for its justification.

We need some solution to this worry about infinite regress for justification. There have been several proposed over the years, but the one I’ll favor here is a form of foundationalism, the view that there are some foundational beliefs that are justified without reference to any other beliefs. I’m going with foundationalism because it’s (as far as I know) the more popular view among philosophers right now and because I find it to be a more satisfying solution. However, I think that my points can probably be supported by foundationalism’s main competitors, such as coherentism, as well, so it shouldn't really matter.

What sort of beliefs do we have non-inferential justification for, then? Well an obvious candidate would just be our day-to-day beliefs about the world. For example, when I see the bus pulling up to the stop, I’m justified in believing that the bus has arrived. However, we’re trying to avoid the regress worry, so we don’t want to ground my justification in some other beliefs. Instead, I think we can ground it in the fact that it seems to me that the bus has arrived.

http://www.reddit.com/r/philosophy/comments/26cl0i/why_should_we_take_morality_seriously/


http://www.reddit.com/r/compsci/comments/24yacd/simon_peyton_jones_how_to_write_a_great_research/

How to write a research paper.

Idea -> Write Paper -> Do research.

Don't do research first. If you do, you will realize a lot of the work was misdirected and some key parts of the paper are missing. Writing the paper is a mechanism for doing research, not just for reporting it. Somehow we think more clearly when we write.

Where does the idea come from? Write the idea down, no matter how insignificant it seems. It's like a snowflake: it grows as it's written.

Identify your key idea. You want to infect the mind of your reader with the idea like a virus. Your paper should have one clear, sharp idea. You may not know it when you start, but you must know it at the end. If the reader cannot articulate it, then the paper failed. If you have lots of ideas, write lots of papers. Often you should write "The main idea of this paper is...".

When writing, imagine you are standing in front of a white board. People explain in a much more accessible and engaging way at a whiteboard.

Write in an inverted pyramid, assuming readers will drop off or that your work will be cut off.

Introduction. Describe the problem (with an example) and state your contributions.

Don't frame your work too ambitiously or people won't believe you. Compare "Computer programs often have bugs. It is very important to eliminate these bugs [1, 2]. Many researchers have tried [3, 4, 5, 6]. It really is very important" with "Consider this program, which has an interesting bug. <brief description>. We will show an automatic technique for identifying and removing such bugs."

The introduction makes claims; the body of the paper substantiates them. You can write the claims using bullets and give a forward reference to the evidence in the paper (e.g. "We explain X (Section 4)" or "We discuss these effects in Sections 5 and 6, and contrast them in Section 7."). These forward references eliminate the need for a paragraph explaining the structure of the paper. Contributions should be refutable. Don't write "we describe X" or "we study Y"; write "We give the syntax and semantics of a language that supports concurrent processes (Section 3)" or "We prove that the type system is sound, and that type checking is decidable".


[5]

The amount of knowledge increases quickly.

Knowledge becomes obsolete.

Stopped at 21:08. Nothing hugely interesting so far. Watch this at night when I'm tired.


https://news.ycombinator.com/item?id=9101094


On the importance of organizing knowledge-- it's much like having the "best cards" in a trading card game and just stuffing them all together rather than taking advantage of synergy or strategy. Or having the "best players" on a soccer team but not putting them into formation.


[I do not] carry such information in my mind since it is readily available in books. ...The value of a college education is not the learning of many facts but the training of the mind to think. -- Albert Einstein

To teach well, I think you need to understand well. Understanding well makes something hard simple. And then teaching becomes simple. Too often, in an attempt to simplify things, people make things more complex. How can adding things make things simpler?


I evaluate intelligence based on 1) how many true beliefs a person holds and 2) how well these beliefs are justified and (loosely) organized. If a person can't make a connection between related ideas, then they aren't as smart as someone else.


[6]

Given the presence of moral relativism in some academic circles, some people might naturally assume that philosophers themselves are to blame. But they aren’t. There are historical examples of philosophers who endorse a kind of moral relativism, dating back at least to Protagoras who declared that “man is the measure of all things,” and several who deny that there are any moral facts whatsoever. But such creatures are rare. [6]

When I went to visit my son’s second grade open house, I found a troubling pair of signs hanging over the bulletin board. They read:

Fact: Something that is true about a subject and can be tested or proven.

Opinion: What someone thinks, feels, or believes.

First, the definition of a fact waffles between truth and proof — two obviously different features. Things can be true even if no one can prove them. For example, it could be true that there is life elsewhere in the universe even though no one can prove it. Conversely, many of the things we once “proved” turned out to be false. For example, many people once thought that the earth was flat. It’s a mistake to confuse truth (a feature of the world) with proof (a feature of our mental lives). Furthermore, if proof is required for facts, then facts become person-relative. Something might be a fact for me if I can prove it but not a fact for you if you can’t. In that case, E=MC2 is a fact for a physicist but not for me.

But second, and worse, students are taught that claims are either facts or opinions. They are given quizzes in which they must sort claims into one camp or the other but not both.

How does the dichotomy between fact and opinion relate to morality? I learned the answer to this question only after I investigated my son’s homework (and other examples of assignments online). Kids are asked to sort facts from opinions and, without fail, every value claim is labeled as an opinion.

In summary, our public schools teach students that all claims are either facts or opinions and that all value and moral claims fall into the latter camp. The punchline: there are no moral facts. And if there are no moral facts, then there are no moral truths.

The inconsistency in this curriculum is obvious. For example, at the outset of the school year, my son brought home a list of student rights and responsibilities. Had he already read the lesson on fact vs. opinion, he might have noted that the supposed rights of other students were based on no more than opinions. According to the school’s curriculum, it certainly wasn’t true that his classmates deserved to be treated a particular way — that would make it a fact. Similarly, it wasn’t really true that he had any responsibilities — that would be to make a value claim a truth.

If it’s not true that it’s wrong to murder a cartoonist with whom one disagrees, then how can we be outraged? If there are no truths about what is good or valuable or right, how can we prosecute people for crimes against humanity? If it’s not true that all humans are created equal, then why vote for any political system that doesn’t benefit you over others?

Facts are things that are true. Opinions are things we believe. Some of our beliefs are true.


.

Writing is nature's way of letting you know how sloppy your thinking is.

—Guindon

No battle was ever won according to plan, but no battle was won without one.

—Dwight Eisenhower

If you are thinking without writing, you only think you're thinking.

When it's important or difficult to get right, you should have a formal specification.

The two most useful models are functions and the standard behavioral model (a behavior is a sequence of states; a state is an assignment of values to variables). 1) Model a program as a function.

The limitation of the function model is that it specifies what a program does but not how. For example, quicksort and bubble sort compute the same function.

2) Model a program as a set of behaviors. For example, Euclid's algorithm for computing the gcd. To describe a set of behaviors, describe the set of possible initial states and a next-state relation that describes all possible successor states of any state.
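
A minimal Python sketch (mine, not from the talk) of the same example in both models: gcd written as a function, and gcd described by an initial state plus a next-state relation whose runs are behaviors.

    # Function model: gcd as a mapping from inputs to an output.
    def gcd(a: int, b: int) -> int:
        while b:
            a, b = b, a % b
        return a

    # Behavioral model: a state assigns values to the variables x and y;
    # the next-state relation lists the possible successors of a state.
    def initial_state(a: int, b: int) -> dict:
        return {"x": a, "y": b}

    def next_states(state: dict) -> list:
        x, y = state["x"], state["y"]
        if y == 0:
            return []                      # terminal state: no successors
        return [{"x": y, "y": x % y}]      # Euclid's step is deterministic

    # A behavior is a sequence of states produced by the next-state relation.
    state = initial_state(12, 18)
    behavior = [state]
    while next_states(state):
        state = next_states(state)[0]
        behavior.append(state)

    print(gcd(12, 18))   # 6
    print(behavior)      # [{'x': 12, 'y': 18}, {'x': 18, 'y': 12},
                         #  {'x': 12, 'y': 6}, {'x': 6, 'y': 0}]

The function version only says what is computed; the behavioral version also exposes how, which is why quicksort and bubble sort are identical in the first model but different in the second.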


[7]

If the world were static, we could have monotonically increasing confidence in our beliefs. The more (and more varied) experience a belief survived, the less likely it would be false.

Most really good startup ideas look like bad ideas at first, and many of those look bad specifically because some change in the world just switched them from bad to good.

To protect yourself against obsolete beliefs, the first step is to have an explicit belief in change. Instead of trying to point yourself in the right direction, admit you have no idea what the right direction is, and try instead to be super sensitive to the winds of change.

I believe this passive m.o. works not just for evaluating new ideas but also for having them. The way to come up with new ideas is not to try explicitly to, but to try to solve problems and simply not discount weird hunches you have in the process.

Another trick I've found to protect myself against obsolete beliefs is to focus initially on people rather than ideas. Though the nature of future discoveries is hard to predict, I've found I can predict quite well what sort of people will make them.


The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge.


He is just being a cynic thinking he is an intellectual skeptic.

Cynic: Cynics are distrustful of any advice or information that they do not already agree with. Cynics do not accept any claim that challenges their belief system.

"Being skeptical, not cynical, helps us in forming beliefs that are in agreement with evidence."


Division between de jure and de facto mirrors the division between knowledge and wisdom.


Stupidity has a knack of getting its way. -- Albert Camus

The Greeks since Aristotle’s Nichomachean Ethics distinguished two different kinds of wisdom: phronesis, or practical wisdom, and sophia, or “transcendental” wisdom. To complicate things from a Stoic perspective, while phronesis is one of the four cardinal virtues (the others being temperance, courage, and justice), many Stoics thought — together with Socrates — that these are all aspects of one underlying virtue, which they referred to as wisdom. Clearly, a bit of unpacking is in order.

A good, if somewhat unusual, place where to start is a special issue of the journal Research in Human Development (volume 8, issue 2, 2011), which published a collection of essays on “Sophia and Phronesis in Psychology, Philosophy, and Traditional Wisdom,” as the title of the introductory article, by Richard Hawley Trowbridge and Michel Ferrari, spells out.

Until reading the paper, I was unaware, for instance, that psychologists have published over 100 articles featuring empirical research on wisdom, or of the existence of the Berlin Wisdom Paradigm, an ongoing research effort on wisdom based on principles of cognitive psychology.

If we drop talk of somewhat anthropomorphic gods, this isn’t that different from the Stoic concept that in order to be wise one has to understand the way the world works (by studying “physics”) and locate one’s place in it. Not doing so leads to foolishness, not wisdom, because one begins to wish for things that are not “in accordance with nature.”

So, again, while phronesis is a practical type of wisdom, sophia is more general, more abstract. It actually helped me recently to be reminded that the Romans translated phronesis as prudentia, or prudence, which comes from providentia, meaning “seeing ahead, sagacity,” and is the ability to govern and discipline oneself by the use of reason. If we follow the Romans, then the four cardinal Stoic virtues become prudence, temperance, justice and courage, and we can use the word “wisdom” for that kind of deeper understanding of things from which, presumably, the four virtues themselves spring. I find this solution of the one-vs-many virtues issue as well as of the difference between sophia and phronesis both elegant and helpful.

Interestingly, Trowbridge and Ferrari note that the emphasis on wisdom continued during the Middle Ages and into the Renaissance, but tapered off after the Scientific Revolution of the 17th and 18th centuries: “With the advent of modern science, this tradition ended up disparaged, forgotten, and ignored in the leading circles of learned thought in the West. In the Routledge Encyclopedia of Philosophy, Smith wrote that after the mid–17th century, ‘wisdom is mentioned only in passing, or simply passed by altogether by philosophers’. Scholars in Europe increasingly turned from a pursuit of wisdom and happiness to a pursuit of truth and usefulness.”


https://www.bloomberg.com/view/articles/2016-12-29/can-you-spot-fake-news-don-t-be-so-sure

Brown University psychology professor Steven Sloman has been investigating this tendency. In one 2013 study, he asked subjects how much they know about complex policies such as unilateral sanctions on Iran. Most people reported knowing a lot -- but when asked to explain how the policies worked, they couldn’t. Shattering what Sloman calls “the knowledge illusion” leads people to downgrade their self-assessment -- but then their overconfidence returns. More recently, Sloman has been researching the way people overestimate their understanding of everything from glue to coffee-makers to toilets for his forthcoming book “The Knowledge Illusion: Why We Never Think Alone.”

Not everyone is equally self-deluded, of course. As a test, Sloman asks a simple math question: A bat costs $1 more than a ball, and together they cost $1.10. What does the ball cost? About 20 percent of people get this right, he said, and they are not as vulnerable to the knowledge illusion.
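
For reference, the intuitive answer (10 cents) is wrong; here is a quick check of the arithmetic (my working, not Sloman's), in integer cents:

    # ball + bat = 110 cents and bat = ball + 100 cents,
    # so 2 * ball + 100 = 110, i.e. the ball costs 5 cents (not 10).
    ball = (110 - 100) // 2    # 5
    bat = ball + 100           # 105
    assert ball + bat == 110 and bat - ball == 100
    print(ball, bat)           # 5 105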

But what about content knowledge -- knowing facts about the world? You might think that would arm people against fake news, but experiments show that’s not necessarily the case. Education professor Joseph Kahne of the University of California Riverside gave young subjects, age 15 to 27, a short test for political literacy and then showed them a mix of fake and real news stories presented as Facebook posts. He found high scorers were no better than the rest at separating fake stories from evidence-based ones.

What did matter was whether a news story bolstered the subjects’ existing beliefs. “The judgments people make are heavily influenced by whether or not information aligns with a policy position they already hold,” Kahne said. People who identify as liberals have no trouble pooh-poohing the rumor that Barack Obama was born in Kenya, for example, while those on the conservative end were more likely to believe it. Likewise, liberals were more likely than conservatives to swallow a false claim that 90 percent of rich Americans pay no taxes.

And false stories are easier than ever to generate and spread. In decades past, Kahne said, people trusted established newspapers, magazines and TV news programs. But trust in the mainstream media has declined massively over the past 20 years, while a majority of Americans now get news from Facebook.


You cannot learn a thing you think you know.


'In the minds of geniuses, we find our own neglected thoughts' (Emerson).

It is when you feel most righteous that you should be most on your guard. Take care that what you are fighting for isn't your own ego.


... But two scientific discoveries that would soon dominate the world were absent at the fair: nuclear energy and electronic computers.

The very beginnings of both technologies, however, could be found at an institution that had been Einstein’s academic home since 1933: the Institute for Advanced Study in Princeton, N.J. The institute was the brainchild of its first director, Abraham Flexner. Intended to be a "paradise for scholars" with no students or administrative duties, it allowed its academic stars to fully concentrate on deep thoughts, as far removed as possible from everyday matters and practical applications.

By setting up his academic paradise, Flexner enabled the nuclear and digital revolutions. Among his first appointments was Einstein, who would follow his speech at the World’s Fair with his famous letter to President Roosevelt in August 1939, urging him to start the atomic-bomb project. Another early Flexner appointee was the Hungarian mathematician John von Neumann, perhaps an even greater genius than Einstein.

Von Neumann’s early reputation was based on his work in pure mathematics and the foundations of quantum theory. Together with the American logician Alonzo Church, he made Princeton a center for mathematical logic in the 1930s, attracting such luminaries as Kurt Gödel and Alan Turing.

Von Neumann was fascinated by Turing’s abstract idea of a universal calculating machine that could mechanically prove mathematical theorems. When the nuclear bomb program required large-scale numeric modeling, von Neumann gathered a group of engineers at the institute to begin designing, building, and programming an electronic digital computer — the physical realization of Turing’s universal machine.

Rather than attempting to demarcate the nebulous and artificial distinction between "useful" and "useless" knowledge, we may follow the example of the British chemist and Nobel laureate George Porter, who spoke instead of applied and "not-yet-applied" research.

Supporting applied and not-yet-applied research is not just smart but a social imperative. In order to enable and encourage the full cycle of scientific innovation, which feeds into society in numerous important ways, it is more productive to think of developing a solid portfolio of research in much the same way as we approach well-managed financial resources. Such a balanced portfolio would contain predictable and stable short-term investments, as well as long-term bets that are intrinsically more risky but can potentially earn off-the-scale rewards.

Flexner’s efforts and vision led to his joining the General Education Board of the Rockefeller Foundation in 1912, lending him added stature and resources as an influential force in higher education and philanthropy. He soon became its executive secretary, a position he held until his retirement in 1927. It was in this capacity that he formed the ideas underlying his essay "The Usefulness of Useless Knowledge." It would eventually be published in Harper’s magazine in October 1939, but it began as a 1921 internal memo prepared for the board.

The birth of quantum theory was long and painful. The German physicist Max Planck described his revolutionary thesis, first proposed in 1900, that energy could only occur in packets or "quanta" as "an act of desperation." In his words, "I was willing to make any offer to the principles in physics that I then held." His gambit played out very well. Without quantum theory, we wouldn’t understand the nature of any material, including its color, texture, and chemical and nuclear properties. These days, in a world totally dependent on microprocessors, lasers, and nanotechnology, it has been estimated that 30 percent of the U.S. gross national product is based on inventions made possible by quantum mechanics... Within a hundred years, an esoteric theory of young physicists became a mainstay of the modern economy.

It took nearly as long for Einstein’s own theory of relativity, first published in 1905, to be used in everyday life in an entirely unexpected way. The accuracy of the global positioning system, the space-based navigation system that provides location and time information in today’s mobile society, depends on reading time signals of orbiting satellites. The presence of Earth’s gravitational field and the movement of these satellites cause clocks to speed up and slow down, shifting them by 38 microseconds a day. In one day, without Einstein’s theory, our GPS tracking devices would be inaccurate by about seven miles.
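
The 38-microseconds-to-seven-miles figure is easy to sanity-check: multiply the daily clock offset by the speed of light to get the accumulated ranging error (a rough back-of-the-envelope check of my own, ignoring how GPS receivers actually combine signals):

    c = 299_792_458           # speed of light, m/s
    drift = 38e-6             # relativistic clock offset per day, in seconds
    error_m = c * drift       # ~11,392 m of ranging error per day
    print(error_m / 1609.34)  # ~7.1 miles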

The path from exploratory blue-sky research to practical applications is not one-directional and linear, but rather complex and cyclic, with resultant technologies enabling even more fundamental discoveries. Take, for example, superconductivity, the phenomenon discovered by the Dutch physicist Heike Kamerlingh Onnes in 1911. Certain materials, when cooled down to ultralow temperatures, turn out to conduct electricity without any resistance, allowing large electric currents to flow at no energy costs. The powerful magnets that can be so constructed have led to many innovative applications, from the maglev transport technology that allows trains to travel at very high speeds as they levitate through magnetic fields to the fMRI technology used to make detailed brain scans for diagnosis and treatment.

Through these breakthrough technologies, superconductivity has in turn pushed the frontiers of basic research in many directions. High-precision scanning has made possible the flourishing field of present-day neuroscience, which is probing the deepest questions about human cognition and consciousness. Superconductivity is playing a crucial role in the development of quantum computers and the next revolution in our computational powers with unimaginable consequences. And in fundamental physics, it has produced the largest and strongest magnets on the planet, buried a hundred meters underground in the 17-miles-long ring of the Large Hadron Collider, the particle accelerator built in the CERN laboratory in Geneva. The resulting 2012 discovery of the Higgs boson was the capstone that completed the Standard Model of particle physics, enabling physicists to further probe and unravel the mysteries of the universe.

One of the least-known success stories in human history is how over the past two-and-a-half centuries advances in medicine and hygiene have tripled life expectancy in the West. The discovery of the double helical structure of DNA in 1953 jump-started the age of molecular biology, unraveling the genetic code and the complexity of life. The advent of recombinant DNA technology in the 1970s and the completion of the Human Genome Project in 2003 revolutionized pharmaceutical research and created the modern biotech industry. Currently, the Crispr-Cas9 technology for gene editing allows scientists to rewrite the genetic code with unbounded potential for preventing and treating diseases and improving agriculture and food security. We should never forget that these groundbreaking discoveries, with their immense consequences for health and diseases, were products of addressing deep basic questions about living systems, without any thoughts of immediate applications.

As is often said, knowledge is the only resource that increases when used.

It is estimated that more than half of all economic growth comes from innovation. Leading information technology and biotech industries can trace their success directly to the fruits of fundamental research grown in the fertile environments around research universities as in Silicon Valley and the Boston area, often infused by generous public investments. MIT estimates that it has given rise to more than 30,000 companies with roughly 4.6 million employees, including giants such as Texas Instruments, McDonnell Douglas, and Genentech. The two founders of Google worked as graduate students at Stanford University on a project supported by the Digital Libraries Initiative of the National Science Foundation.

The postwar decades saw an unprecedented worldwide growth of science, including the creation of funding councils like the National Science Foundation and massive investments in research infrastructure. Recent decades have seen a marked retrenchment. One can argue that the state of scholarship has now reached a critical stage that in many ways mirrors the crisis that Flexner discussed. Steadily declining public funding is currently insufficient to keep up with the expanding role of the scientific enterprise in a modern knowledge-based society. The U.S. federal research and development budget, measured as a fraction of the gross domestic product, has steadily declined, from a high of 2.1 percent in 1964, at the height of the Cold War and the space race, to currently less than .8 percent. (Note that roughly half of that budget has remained defense oriented.) The budget for the National Institutes of Health, the largest supporter of medical research in the United States, has fallen by 25 percent in inflation-adjusted dollars over the past decade.

On top of this, industry, driven by short-term shareholder pressure, has been steadily decreasing its research activities, transferring that responsibility largely to public funding and private philanthropy. A committee of the U.S. Congress found that in 2012 business only provided 6 percent of basic research funding, with the lion’s share — 53 percent — shouldered by the federal government and the remainder coming from universities and foundations.


.

“Of course!” Joe thinks. “It’s all so simple now. The key to understanding monads is that they are Like Burritos. If only I had thought of this before!” The problem, of course, is that if Joe HAD thought of this before, it wouldn’t have helped: the week of struggling through details was a necessary and integral part of forming Joe’s Burrito intuition, not a sad consequence of his failure to hit upon the idea sooner.

But now Joe goes and writes a monad tutorial called “Monads are Burritos,” under the well-intentioned but mistaken assumption that if other people read his magical insight, learning about monads will be a snap for them. “Monads are easy,” Joe writes. “Think of them as burritos.” Joe hides all the actual details about types and such because those are scary, and people will learn better if they can avoid all that difficult and confusing stuff. Of course, exactly the opposite is true, and all Joe has done is make it harder for people to learn about monads, because now they have to spend a week thinking that monads are burritos and getting utterly confused, and then a week trying to forget about the burrito analogy, before they can actually get down to the business of learning about monads.

If you ever find yourself frustrated and astounded that someone else does not grasp a concept as easily and intuitively as you do, even after you clearly explain your intuition to them (“look, it’s really quite simple,” you say…) then you are suffering from the monad tutorial fallacy.


With respect, I will not provide answers if I feel that it violates OPSEC, I will tell you when I am providing first or second hand knowledge, when I am making educated guesses (based on 6 years in 4-25, multiple deployments, and my expertise as an Sniper, Tracker, and infantryman), and when I am speculating. If you have any questions about the situation, or on any of the military lingo that has been presented so far, I'll do my best to answer it here.

A mistake I see smart people make all the time: Presenting one scientist's take on a contested topic, and saying "See, scientists think X"


This pairing of interest with ignorance has created a perfect storm for a misinformation epidemic. The outsize demand for stories about AI has created a tremendous opportunity for impostors to capture some piece of this market.

When founding Approximately Correct, I lamented that too few academics possessed either the interest or the talent for both expository writing and for addressing social issues. And on the other hand, too few journalists possess the technical strength to relate developments in machine learning to the public faithfully. As a result, there are not enough voices engaging the public in the non-sensational way that seems necessary now.

As I quickly learned as a young PhD student blogging for KDnuggets, articles about deep learning get lots of clicks. Everything else being equal, in my experience, articles about deep learning attracted on the order of ten times as many eyeballs as other data science stories.

If machine learning had no societal impact, the present situation might not be so alarming. After all, the cartoonish descriptions of string theory and particle physics seem relatively harmless, since developments in particle physics show no sign of impacting free speech or the employment markets in the near future.


The measure of wisdom is how calm you are when facing any given situation.


"Yes, there is a conspiracy, indeed there are a great number of conspiracies, all tripping each other up ... the main thing that I learned about conspiracy theories is that conspiracy theorists actually believe in the conspiracy because that is more comforting. The truth of the world is that it is chaotic. The truth is, that it is not the Jewish banking conspiracy, or the grey aliens, or the twelve-foot reptiloids from another dimension that are in control, the truth is far more frightening; no-one is in control, the world is rudderless." - Alan Moore


Use Enrico Fermi’s guesstimation techniques to check the plausibility of data-based claims. Fermi, the Italian physicist who created the first controlled, self-sustaining nuclear chain reaction, was also known for his eerily accurate approximations, which he made by replacing each variable in a problem with a reasonable assumption. In July, 1945, as Fermi watched the Trinity Test in the New Mexico desert, he observed the effect of the explosion on small pieces of falling paper—then used that measurement to accurately estimate the strength of the blast to within an order of magnitude.
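
A hedged sketch of the technique in Python (the scenario and every number below are illustrative assumptions of mine, not data from the article): estimate a quantity by multiplying a chain of rough factors, then treat the product as an order-of-magnitude plausibility check.

    # Classic Fermi-style estimate: annual piano tunings in a city of ~3M people.
    population = 3_000_000
    people_per_household = 2.5          # assumed
    households_with_piano = 1 / 20      # assumed fraction
    tunings_per_piano_per_year = 1      # assumed
    households = population / people_per_household
    pianos = households * households_with_piano
    tunings_per_year = pianos * tunings_per_piano_per_year
    print(round(tunings_per_year))      # ~60,000
    # Each factor is only roughly right, but over- and under-estimates tend to
    # partially cancel, so the product is usually good to within an order of
    # magnitude, which is enough to check a data-based claim's plausibility.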

Beware of Big Data hubris. The Google Flu Trends project, which claimed, with much fanfare, to anticipate seasonal flu outbreaks by tracking user searches for flu-related terms, proved to be a less reliable predictor of outbreaks than a simple model of local temperatures. (One problem was that Google’s algorithm was hoodwinked by meaningless correlations—between flu outbreaks and high-school basketball seasons, for example, both of which occur in winter.)

Mind the Bullshit Asymmetry Principle, articulated by the Italian software developer Alberto Brandolini in 2013: the amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it. Or, as Jonathan Swift put it in 1710, “Falsehood flies, and truth comes limping after it.” Plus ça change.


There are two things you need to be successful. You have to come up with the best decisions and you have to have the courage to make them. And I think that the real problem about most people with the best decisions is that they think it's in their heads, and most people tragically are attached to opinions in their heads, and they don't put them out and stress test them. So the thing that I learned was really to take those ideas, put them out there in a certain way, and stress test them. Whatever success I've had in life had more to do with my knowing how to deal with my not-knowing than anything I know, and that taught me a lot more, and it taught me how to take in what others have.

—Ray Dalio


Many people are sloppy thinkers, and opt for “middle of the road” positions when faced with two opposing extremes. But compromise is often even less accurate than the extreme poles of a dialectic. In my experience, it’s common that deep truths exist at both extremes of a dialectic, and the wisest stance on an issue will incorporate “both of the opposites within itself”.

An obvious one is the product part of an organization. It’s true that a great product team will collect lots of user feedback, systematize it, and make data-driven decisions — a “scientific” approach to product. However, as we know from Sony in the 1980’s, Steve Jobs, and other iconic product organizations, it’s also true that greatness in products requires leaders to tell customers what they want, not merely to ask and respond to customer data. This requires leaps of creative intuition, or an “artistic” approach to product.

These are conflicting methodologies, but extreme forms of both must exist inside of a product organization for it to be great. It’s easy for experienced company-builders to see how mistakes will be made by only relying on one of these and not the other.

Another example is the dialectic between entrepreneurial vigor and lawyerly/bureaucratic restraint. As an entrepreneur, I had a strong bias to move quickly and decisively. I found it frustrating and suffocating when legal counsel or management restricted my freedom to experiment. I understood that most big corporations and government institutions function in massively inefficient ways (which created huge opportunities for Palantir and other ventures). Although I remain committed to these views, I have now grown to appreciate that a bunch of ardent entrepreneurs with large risk appetites can be too volatile; that you need corporate governance and procedural checks to keep a company from blowing up. What’s important to recognize here is that a company should not strive for a middling balance between these two extremes; it should cultivate both at once.

A third example is the dialectic between breadth and depth. It has been my experience that deep, obsessive focus at the expense of other areas of life yields exponential returns. The curve tracking productive output against the time a person invests in a single subject matter is highly convex. Yet I have also come to realize the importance of building bridges and relationships across disciplines, and in equally vital pursuits such as family life. A fully torqued engine burns out more quickly. I have discovered that extra-disciplinary conversations spark my intuition and create important synergies for the businesses I have been a part of. My bias is to work extremely hard and dive very deeply into topics, and I attribute some of my greatest work (such as at Palantir and Addepar) to periods of extreme focus. But developing relationships with experts outside of my chosen fields and cultivating other personal interests has helped me expand my worldview and pinpoint opportunities I might have otherwise missed entirely.


Aporia is an irresolvable internal contradiction.

Really like this https://streetepistemology.com/publications/street_epistemology_the_basics


A simple way to dive deeper into a vague statement: read a sentence like "We don't use your microphone to serve you ads" over and over, emphasizing a different word each time and trying to guess which one is the magic word. E.g., "WE don't use your microphone to serve you ads" (but maybe our partners do), or "We don't use YOUR MICROPHONE to serve you ads" (but maybe we use other data about you).
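As a toy version of this exercise, here is a small sketch that prints the sentence once per word with that word capitalized (the sentence is just the example above, not a quotation from any real policy):

    # Emphasize a different word on each pass through the example sentence.
    statement = "We don't use your microphone to serve you ads"

    words = statement.split()
    for i, word in enumerate(words):
        emphasized = words[:i] + [word.upper()] + words[i + 1:]
        print(" ".join(emphasized))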


The thing that frustrated me with Lance Armstrong over so many years was the smirk on his face when asked whether he used drugs, and he would reply, "Hey, I passed the test, isn't that good enough for people to stop asking?"

A fact is the close agreement of a series of observations of the same phenomenon. Observation can happen through our sense organs, or through an interpreter. In many cases phenomena lie outside of direct perception, and we must devise ways of causing them to produce effects which lie within that field.

All observations must be susceptible of confirmation. They must be so carried out that they may be repeated at will.

Assuming you have never visited Sydney, Australia, how do you know there is a city by that name? The reason you believe in the existence of this place is that the knowledge is of the kind that can be verified. If worst came to worst, you could go there yourself.

When Napoleon's chief spy, Karl Schulmeister, was working his way up the ranks of the Austrian secret service, he received almost daily a copy of a Parisian newspaper. He said an agent of his smuggled it across the border. Naturally, the Austrians got a lot of information about conditions in France. The truth was that the newspaper was printed solely for Schulmeister and the Austrian generals, and each edition consisted of only one copy. It was all false, all exactly what Napoleon wanted his enemy to know.

A definition is an agreement, wholly arbitrary in character, among men; while a fact is an agreement among investigations carried out by men.

In the International Bureau of Weights and Measures near Paris is a certain bar of metal -- one only. It is an alloy of platinum and iridium. On this bar are two marks. A meter is defined as the distance between these two marks when the bar is at 0 degrees Centigrade.

It is the postulates, three in number, that are the foundations of science: (1) the external world actually is, (2) nature is uniform, and (3) there are symbols in the 'mind' which stand for events and things in the external world.

The first allows us to work on things without having to establish their existence.

The second means we do not have to worry that nature will behave chaotically.

The third postulate establishes a one-to-one correspondence between all that is in our minds and all that is in the external world. A corollary of this is that there is nothing in all the world that has the a priori quality of being unknowable.

All scientists agree not to question the postulates, nor require proof thereof.

Science is, in a dynamic sense, essentially a method of prediction. It has been defined as being the method of the determination of the most probable.


1650s, from Greek oxymoron, noun use of neuter of oxymoros (adj.) "pointedly foolish," from oxys "sharp, pointed" + moros "stupid" (see moron). A rhetorical figure by which contradictory terms are conjoined so as to give point to the statement or expression; the word itself is an illustration of the thing.


If the primary purpose of school was education, the Internet should obsolete it. But school is mainly about credentialing. Employers looking past traditional credentials can arbitrage the gap. @ycombinator made $Bs doing this for young founders. The more meritocratic an industry, the faster it moves away from false credentialing. I.e., the MBA and tech startups. A generation of auto-didacts, educated by the Internet & leveraged by technology, will eventually starve the industrial-education system. Until then, only the most desperate and talented students will make the leap. Even today, what to study and how to study it are more important than where to study it and for how long.

Educational credentials are badges that admit one to the elite class. Expect elites to struggle mightily to justify the current system.

—Naval Ravikant


Quoth Matt Cohler from Benchmark: if there is one distinguishing trait in the best people I work with, it is that they are learn-it-alls, not know-it-alls.


Usually people think of learning as acquiring new facts. But isn't correcting existing beliefs equally important? What percent of your beliefs do you think are actually true (noting that we often have to update our beliefs)?


It's possible to believe in something knowing it may not be true. For example, I believe that most things are okay to eat so long as they are consumed in moderation, and I also believe physical balance is unimportant. However, I do not know if either is true.

This seems to occur when I need rules to make decisions and only have limited knowledge. I'll go off what limited information I have knowing it may be incorrect.


A myth is a belief that many people hold and act on, but which is not true. For example, the Greek myths, or the "myths" of the constitution, the power of the supreme court, and money.


When reading an article, always ask yourself: Who writes the stories? Who benefits from the stories? Who is missing from the stories?


All curation grows until it requires search. All search grows until it requires curation. All newsfeeds grow until they need an algorithm. All algorithmic feeds grow until they over-fit to ‘engagement’? Yahoo’s directory grew until it broke and we needed search instead. FB’s newsfeed grew until it broke and we needed an algorithm instead. But search breaks discovery; what exactly is it that the algorithmic feed breaks?

—Benedict Evans

Algorithmic feeds break sharing. You think you're sharing with one set of friends, but you're really sharing with unknown subsets of each of your social circles. Yes, why would you share anything important when you have no idea who will see it?


Yonatan Zunger. Jan 2. Tolerance is not a moral precept. https://extranewsfeed.com/tolerance-is-not-a-moral-precept-1af7007d6376

Tolerance is not a moral absolute; it is a peace treaty. Tolerance is a social norm because it allows different people to live side-by-side without being at each other’s throats. It means that we accept that people may be different from us, in their customs, in their behavior, in their dress, in their sex lives, and that if this doesn’t directly affect our lives, it is none of our business. But the model of a peace treaty differs from the model of a moral precept in one simple way: the protection of a peace treaty only extends to those willing to abide by its terms. It is an agreement to live in peace, not an agreement to be peaceful no matter the conduct of others.

When viewed through this lens, the problems above have clear answers. The antisocial member of the group, who harms other people in the group on a regular basis, need not be accepted; the purpose of your group’s acceptance is to let people feel that they have a home, and someone who actively tries to thwart this is incompatible with the broader purpose of that acceptance. Prejudice against Nazis is not the same as prejudice against Blacks, because one is based on people’s stated opposition to their neighbors’ lives and safety, the other on a characteristic that has nothing to do with whether they’ll live in peace with you or not.

After a breach, the moral rules which apply are not the rules of peace, but the rules of broken peace, and the rules of war. We might ask, is the response proportional? Is it necessary? Does it serve the larger purpose of restoring the peace? But we do not take an invaded country to task for defending its borders.

[…]

But even after six generations of fighting, and tens of millions of dead, these wars came to an end. The Peace of Westphalia, the series of treaties which ended them, was built on two radical tenets: that each ruler had the right to choose the religion of their state, and that Christians living in principalities where their faith was not the established faith still had the right to practice their religion. A decision was made, in essence, to accept the risk of the monster rather than the reality of the war.

The Peace of Westphalia was the political foundation for the concept of secularism: that religious matters are so uncertain that the state should not have the power to mandate them. It remains one of the classic peace treaties between fundamentally incompatible groups. It was also, in turn, the basis for the concept of religious freedom brought by European settlers to North America; the American Bill of Rights is its direct descendant.


People understand that if they roll a die 100 times, they will get some 1’s. But when they see a probability for one event, they tend to think: Is this going to happen or not?

They then effectively round to 0 or to 100 percent. That’s what the Israeli official did. It’s also what many Americans did when they heard Hillary Clinton had a 72 percent or 85 percent chance of winning.

Welch, a Dartmouth professor, pointed me to an online pictograph about breast-cancer risk. It shows 1,000 stick figures, of which 973 are gray (no cancer), 22 are yellow (future survivor) and five are red (die in next 10 years). You can see the most likely outcome without ignoring the others.
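A rough sketch of that kind of frequency display, reusing the 973/22/5 counts from the passage above (the symbols and layout are arbitrary choices, not the original graphic):

    # Render 1,000 outcomes as a text pictograph instead of a single percentage.
    outcomes = [
        ("no cancer", 973, "."),
        ("survives cancer", 22, "o"),
        ("dies within 10 years", 5, "X"),
    ]

    figures = "".join(symbol * count for _, count, symbol in outcomes)
    for start in range(0, len(figures), 50):  # 20 rows of 50 figures each
        print(figures[start:start + 50])

    for label, count, symbol in outcomes:
        print(f"{symbol} = {label}: {count} per 1,000")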


Jason Zweig says, “Being right is the enemy of staying right, partly because it makes you overconfident, even more importantly because it leads you to forget the way the world works.”


First, the age of abundance began only 200 years ago, so it would not be surprising if all people had a natural disposition toward pessimism. Second, our brain probably prioritizes processing information that is potentially threatening in order to keep us alive. There is also the availability heuristic, where the brain overestimates the likelihood of events which are easier to recall. Additionally, people can have difficulty imagining being much happier, but can vividly imagine being worse off. Finally, good and bad things happen on different timelines: good things happen over long timelines, while bad things happen quickly.


Smart people are often the most dangerous in terms of poor decision-making ability because they tend to be overconfident, make things too complex, and over-think things.

In 2012, psychologists Richard West, Russell Meserve, and Keith Stanovich tested the blind-spot bias—an irrationality where people are better at recognizing biased reasoning in others but are blind to bias in themselves. Overall, their work supported, across a variety of cognitive biases, that, yes, we all have a blind spot about recognizing our biases. The surprise is that blind-spot bias is greater the smarter you are. The researchers tested subjects for seven cognitive biases and found that cognitive ability did not attenuate the blind spot. “Furthermore, people who were aware of their own biases were not better able to overcome them.” In fact, in six of the seven biases tested, “more cognitively sophisticated participants showed larger bias blind spots.” They have since replicated this result.

—Annie Duke, Thinking in Bets

At our EBI East Conference in NYC a few months ago Jason Zweig stated that the worst bias an investor can have is the bias they don’t know they have. And the research shows that intelligent people are more prone to the blind-spot bias.


I have been reading the thought-provoking Elephant in the Brain, and will probably have more to say on it later. But if I understand correctly, a dominant theory of how humans came to be so smart is that they have been in an endless cat and mouse game with themselves, making norms and punishing violations on the one hand, and cleverly cheating their own norms and excusing themselves on the other (the ‘Social Brain Hypothesis’ or ‘Machiavellian Intelligence Hypothesis’). Intelligence purportedly evolved to get ourselves off the hook, and our ability to construct rocket ships and proofs about large prime numbers is just a lucky side product.

But if intelligence evolved for the prime purpose of evading rules, shouldn’t the smartest people be best at navigating rule evasion? Or at least reliably non-terrible at it? Shouldn’t they be the most delighted to find themselves in situations where the rules were ambiguous and the real situation didn’t match the claimed rules? Shouldn’t the people who are best at making rocket ships and proofs also be the best at making excuses and calculatedly risky norm-violations?

I offer a different theory. If the human brain grew out of an endless cat and mouse game, what if the thing we traditionally think of as ‘intelligence’ grew out of being the cat, not the mouse?

The skill it takes to apply abstract theories across a range of domains and to notice places where reality doesn’t fit sounds very much like policing norms, not breaking them. The love of consistency that fuels unifying theories sounds a lot like the one that insists on fair application of laws, and social codes that can apply in every circumstance.


People seem to have different epistemological standards for things they are biased towards or against. For beliefs they are biased towards, they ask themselves, "Can I believe this?"; for beliefs they are biased against, they ask themselves "Must I believe this?". -- Julia Galef


It's interesting that we accuse people both of having a conflict of interest and of not having skin in the game; these criticisms seem to be flip sides of the same coin.


Akrasia is the state of acting against your better judgment. It applies to procrastination, or a lack of self-control.

The present self values immediate reward over long-term reward. This is why the ability to delay gratification is such a great predictor of success in life.


One problem with humanity is that every generation starts with zero understanding of the way the world works and has to learn it all over again.


Bad (but typical) advice: "Listen to people with totally different views from yours”

Better advice: "Listen to people you have at least some common ground with (shared epistemology, goals, or key premises) but whose views are still meaningfully different from yours"

I see this as like rock climbing - you have to keep at least one hand/foot anchored on the cliff face in order to move the others.


Standards of evidence:

  1. A dodgy source, an educated guess, or having to rely on someone's memory
  2. Conflicting sources, with evidence for different facts
  3. A made-up guess (e.g. a date), but with no evidence that it's wrong