A Brief List of Laws of Logic, Reality, and Science to Remember. Put together to help you remember and as a homage to the discoveries of great men, while always keeping in mind that “being from an ‘authority’ is the weakest form of evidence” (Mr. H’s Law, hahaha). Use these where they work, drop them where they don’t. NEVER ACCEPT THESE OR ANYTHING WITHOUT FIRST OBJECTIVELY AND CRITICALLY TESTING IT. Thanks to Wikipedia for providing the source material.
1. Amara’s Law: “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.”
2. Bayes’ Theorem: relates the probabilities of A and B, P(A) and P(B), to the conditional probabilities of A given B and B given A, P(A | B) and P(B | A). In its most common form, it is:
P(A | B) = P(B | A) · P(A) / P(B)
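As a quick sanity check, here is a minimal Python sketch of the theorem applied to a hypothetical diagnostic test; all of the numbers (1% prevalence, 99% sensitivity, 5% false-positive rate) are assumed purely for illustration:

```python
def bayes(p_b_given_a, p_a, p_b):
    """Posterior P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical disease test: A = "has disease", B = "tests positive".
p_a = 0.01          # prevalence P(A), assumed
p_b_given_a = 0.99  # sensitivity P(B|A), assumed
false_pos = 0.05    # P(B|not A), assumed

# Total probability of a positive test: P(B) = P(B|A)P(A) + P(B|~A)P(~A)
p_b = p_b_given_a * p_a + false_pos * (1 - p_a)

posterior = bayes(p_b_given_a, p_a, p_b)
print(round(posterior, 3))  # probability of disease given a positive test
```

Note how a rare condition keeps the posterior low even with a sensitive test — most positives come from the much larger healthy population.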
3. Benford’s Law: In many naturally occurring collections of numbers, the leading digit is 1 roughly 30% of the time (more precisely, with probability log10(2) ≈ 30.1%), with each larger leading digit occurring progressively less often.
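The effect is easy to see empirically. A minimal Python sketch using the powers of 2, a sequence known to follow Benford’s law closely:

```python
from collections import Counter

# Tally the leading decimal digit of the first 1000 powers of 2.
counts = Counter(str(2 ** n)[0] for n in range(1, 1001))

freq_1 = counts['1'] / 1000
print(freq_1)  # close to log10(2) ~ 0.301, per Benford's law
```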
4. Campbell’s Law: “The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.”
5. Celine’s Laws:
I- National Security is the chief cause of national insecurity.
II- Accurate communication is possible only in a non-punishing situation.
III- An honest politician is a national calamity.
6. Clarke’s Laws:
I-When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
II-The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
III-Any sufficiently advanced technology is indistinguishable from magic.
7. De Morgan’s Laws:
NOT (A AND B) ≡ (NOT A) OR (NOT B)
NOT (A OR B) ≡ (NOT A) AND (NOT B)
The rules can be expressed in English as: the negation of a conjunction is the disjunction of the negations, and the negation of a disjunction is the conjunction of the negations.
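Since the laws involve only two Boolean variables, they can be verified exhaustively. A minimal Python check:

```python
from itertools import product

# Check both De Morgan equivalences over every truth assignment.
for a, b in product([False, True], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))
    assert (not (a or b)) == ((not a) and (not b))

print("De Morgan's laws hold for all Boolean inputs")
```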
8. Dunbar’s Number:
A theoretical cognitive limit to the number of people with whom one can maintain stable social relationships. No precise value has been proposed for Dunbar’s number, but a commonly cited approximation is 150.
9. Duverger’s Law:
Winner-take-all (or first-past-the-post) electoral systems tend to produce a two-party system, while proportional representation tends to produce a multi-party system.
10. Gall’s Law:
A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.
11. Gibrat’s Law:
“The size of a firm and its growth rate are independent.”
12. Gresham’s Law:
Coined by economist Henry Dunning Macleod and named for Sir Thomas Gresham; the underlying observation has also been attributed to Copernicus.
“Bad money drives good money out of circulation if their exchange rate is set by law.”
13. Grosch’s Law:
There is a fundamental rule, which I modestly call Grosch’s law, giving added economy only as the square root of the increase in speed — that is, to do a calculation 10 times as cheaply you must do it 100 times as fast.
This adage is more commonly stated as
Computer performance increases as the square of the cost. If computer A costs twice as much as computer B, you should expect computer A to be four times as fast as computer B.
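The two statements above are the same relation read in opposite directions: if performance grows as the square of cost, then cost grows as the square root of performance. A minimal Python sketch of the arithmetic:

```python
import math

# Grosch's law as commonly stated: performance is proportional to cost^2.
def expected_speedup(cost_ratio):
    """Speedup predicted for a machine costing cost_ratio times as much."""
    return cost_ratio ** 2

def cost_ratio_for_speedup(speedup):
    """Cost multiplier needed for a given speedup (the inverse reading)."""
    return math.sqrt(speedup)

print(expected_speedup(2))          # 2x the cost -> 4x the speed
print(cost_ratio_for_speedup(100))  # 100x the speed needs 10x the cost,
                                    # i.e. each calculation is 10x cheaper
```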
14. Hanlon’s Razor:
“Never attribute to malice that which can be explained by stupidity.” But don’t rule out malice. A similar saying is often attributed to Napoleon Bonaparte.
15. Hawthorne Effect:
Subjects improve an aspect of their behavior that is being experimentally measured simply in response to the fact that they are being studied.
16. Heisenberg’s Uncertainty Principle:
One cannot simultaneously measure, with arbitrary precision, the values of certain conjugate quantities, which are pairs of observables of a single elementary particle. The most familiar of these pairs is position and momentum.
17. Hebb’s Law:
“Neurons that fire together wire together.”
18. Herblock’s Law:
“If it’s good, they’ll stop making it.”
19. Hick’s Law:
Describes the time it takes for a person to make a decision as a function of the number of possible choices he or she has. The Hick–Hyman Law assesses cognitive information capacity in choice reaction experiments; the amount of time taken to process a certain amount of bits is known as the rate of gain of information. Given n equally probable choices, the average reaction time T required to choose among them is approximately
T = b · log2(n + 1)
where b is a constant that can be determined empirically by fitting a line to measured data. The logarithm expresses the depth of the “choice tree” hierarchy; the base-2 logarithm reflects that the person effectively performs a binary search through the options. According to Card, Moran, and Newell (1983), the +1 appears “because there is uncertainty about whether to respond or not, as well as about which response to make.” The law can be generalized to the case of choices with unequal probabilities pi of occurring:
T = bH
where H is the information-theoretic entropy of the decision, defined as
H = Σi pi · log2(1/pi + 1)
where pi is the probability of the ith alternative.
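A minimal Python sketch of both forms, using an assumed illustrative constant b = 0.2 seconds per bit; note that the entropy form reduces to the equal-probability form when every pi = 1/n:

```python
import math

# Hick's law, equal-probability form: T = b * log2(n + 1).
def hick_equal(n, b=0.2):
    return b * math.log2(n + 1)

# Generalized form: T = b * H, with H = sum(p * log2(1/p + 1)).
def hick_entropy(probs, b=0.2):
    return b * sum(p * math.log2(1 / p + 1) for p in probs)

n = 7
print(round(hick_equal(n), 3))            # reaction time for 7 equal choices
print(round(hick_entropy([1 / n] * n), 3))  # same value via the entropy form
```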
20. Hofstadter’s Law:
The law is a statement regarding the difficulty of accurately estimating the time it will take to complete tasks of any substantial complexity.
Hofstadter’s Law: It always takes longer than you expect, even when you take into account Hofstadter’s Law.
21. Hotelling’s Law:
Under some conditions, it is rational for competitors to make their products as nearly identical as possible.
22. Hubble’s Law:
Galaxies recede from an observer at a rate proportional to their distance to that observer.
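A minimal Python sketch of the proportionality; H0 ≈ 70 km/s/Mpc is an assumed round value (published measurements cluster roughly between 67 and 74):

```python
# Hubble's law: recession velocity v = H0 * d.
H0 = 70.0  # Hubble constant in km/s per megaparsec, assumed round value

def recession_velocity(distance_mpc):
    """Recession velocity in km/s for a galaxy at distance_mpc megaparsecs."""
    return H0 * distance_mpc

print(recession_velocity(100))  # a galaxy 100 Mpc away recedes at 7000.0 km/s
```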
23. Humphrey’s Law:
Conscious attention to a task normally performed automatically can impair its performance.
24. Hutber’s Law:
states that “improvement means deterioration”. It is founded on the cynical observation that a stated improvement actually hides a deterioration.