Blog Archive

Monday, May 15, 2023

05-14-2023-1827 - variety continued... (draft)

Pencil illustration of a brownie by a twenty-first-century artist, based in part on nineteenth-century descriptions

https://en.wikipedia.org/wiki/Brownie_(folklore)

In early modern Scotland, between the early 16th century and the mid-18th century, judicial proceedings concerned with the crimes of witchcraft (Scottish Gaelic: buidseachd) took place as part of a series of witch trials in Early Modern Europe. In the Late Middle Ages there were a handful of prosecutions for harm done through witchcraft, but the passing of the Witchcraft Act 1563 made witchcraft, or consulting with witches, capital crimes. The first major series of trials under the new act were the North Berwick witch trials, beginning in 1590, in which King James VI played a major part as "victim" and investigator. He became interested in witchcraft and published a defence of witch-hunting, Daemonologie, in 1597, but he appears to have become increasingly sceptical and eventually took steps to limit prosecutions.

An estimated 4,000 to 6,000 people, mostly from the Scottish Lowlands, were tried for witchcraft in this period, a much higher rate than for neighbouring England. There were major series of trials in 1590–91, 1597, 1628–31, 1649–50 and 1661–62. Seventy-five per cent of the accused were women. Modern estimates indicate that more than 1,500 persons were executed; most were strangled and then burned. The hunts subsided under English occupation after the Civil Wars during the period of the Commonwealth led by Oliver Cromwell in the 1650s, but returned after the Restoration in 1660, causing some alarm and leading to the Privy Council of Scotland limiting arrests, prosecutions and torture. There was also growing scepticism in the later seventeenth century, while some of the factors that may have contributed to the trials, such as economic distress, subsided. Although there were occasional local outbreaks of witch-hunting, the last recorded executions were in 1706 and the last trial in 1727. The Scottish and English parliaments merged in 1707, and the unified British parliament repealed the 1563 Act in 1736.

Many causes have been suggested for the hunts, including economic distress, changing attitudes to women, the rise of a "godly state",[1] the inquisitorial Scottish judicial system, the widespread use of judicial torture, the role of the local kirk, decentralised justice and the prevalence of the idea of the diabolic pact. The proliferation of partial explanations for the witch-hunt has led some historians to proffer the concept of "associated circumstances", rather than one single significant cause.[2]

https://en.wikipedia.org/wiki/Witch_trials_in_early_modern_Scotland

St Andrews Castle is a ruin located in the coastal Royal Burgh of St Andrews in Fife, Scotland. The castle sits on a rocky promontory overlooking a small beach called Castle Sands and the adjoining North Sea. There has been a castle standing at the site since the times of Bishop Roger (1189–1202), son of the Earl of Leicester. It housed the burgh’s wealthy and powerful bishops while St Andrews served as the ecclesiastical centre of Scotland during the years before the Protestant Reformation. In their Latin charters, the Archbishops of St Andrews wrote of the castle as their palace, signing, "apud Palatium nostrum."[1] 

https://en.wikipedia.org/wiki/St_Andrews_Castle

The Scottish Reformation Parliament was the assembly commencing in 1560 that claimed to pass major pieces of legislation establishing the Scottish Reformation, most importantly the Confession of Faith Ratification Act 1560;[1] and Papal Jurisdiction Act 1560.[2]

https://en.wikipedia.org/wiki/Scottish_Reformation_Parliament

Heresy is any belief or theory that is strongly at variance with established beliefs or customs, particularly the accepted beliefs or religious law of a religious organization.[1][2] A heretic is a proponent of heresy.[1]

Heresy in Christianity, Judaism, and Islam has at times been met with censure ranging from excommunication to the death penalty.[3]

Heresy is distinct from apostasy, which is the explicit renunciation of one's religion, principles or cause;[4] and from blasphemy, which is an impious utterance or action concerning God or sacred things.[5] Heresiology is the study of heresy. 

Etymology

Derived from Ancient Greek haíresis (αἵρεσις), the English heresy originally meant "choice" or "thing chosen".[6] However, it came to mean the "party, or school, of a man's choice",[7] and also referred to that process whereby a young person would examine various philosophies to determine how to live.[citation needed]

The word heresy is usually used within a Christian, Jewish, or Islamic context, and implies slightly different meanings in each. The founder or leader of a heretical movement is called a heresiarch, while individuals who espouse heresy or commit heresy are known as heretics.

 

https://en.wikipedia.org/wiki/Heresy


In metaphysics, nominalism is the view that universals and abstract objects do not actually exist other than being merely names or labels.[1][2] There are at least two main versions of nominalism. One version denies the existence of universals – things that can be instantiated or exemplified by many particular things (e.g., strength, humanity). The other version specifically denies the existence of abstract objects – objects that do not exist in space and time.[3]

Most nominalists have held that only physical particulars in space and time are real, and that universals exist only post res, that is, subsequent to particular things.[4] However, some versions of nominalism hold that some particulars are abstract entities (e.g., numbers), while others are concrete entities – entities that do exist in space and time (e.g., pillars, snakes, bananas).

Nominalism is primarily a position on the problem of universals. It is opposed to realist philosophies, such as Platonic realism, which assert that universals do exist over and above particulars, and to the hylomorphic substance theory of Aristotle, which asserts that universals are immanently real within them. However, the name "nominalism" emerged from debates in medieval philosophy with Roscellinus.

The term nominalism stems from the Latin nomen, "name". John Stuart Mill summarised nominalism in the apothegm "there is nothing general except names".[5]

In philosophy of law, nominalism finds its application in what is called constitutional nominalism.[6] 

https://en.wikipedia.org/wiki/Nominalism

In philosophy, similarity or resemblance is a relation between objects that constitutes how much these objects are alike. Similarity comes in degrees: e.g. oranges are more similar to apples than to the moon. It is traditionally seen as an internal relation and analyzed in terms of shared properties: two things are similar because they have a property in common.[1] The more properties they share, the more similar they are. They resemble each other exactly if they share all their properties. So an orange is similar to the moon because they both share the property of being round, but it is even more similar to an apple because additionally, they both share various other properties, like the property of being a fruit. On a formal level, similarity is usually considered to be a relation that is reflexive (everything resembles itself), symmetric (if a is similar to b then b is similar to a) and non-transitive (a need not resemble c despite a resembling b and b resembling c).[2] Similarity comes in two forms: respective similarity, which is relative to one respect or feature, and overall similarity, which expresses the degree of resemblance between two objects all things considered. There is no general consensus whether similarity is an objective, mind-independent feature of reality, and, if so, whether it is a fundamental feature or reducible to other features.[3][4] Resemblance is central to human cognition since it provides the basis for the categorization of entities into kinds and for various other cognitive processes like analogical reasoning.[3][5] Similarity has played a central role in various philosophical theories, e.g. as a solution to the problem of universals through resemblance nominalism or in the analysis of counterfactuals in terms of similarity between possible worlds.[6][7]
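
As a rough illustration of those formal properties, here is a toy sketch (my own, not from the article; the property table and the similar() helper are invented for the example) in which respective similarity is modelled as sharing at least one property, giving a relation that is reflexive and symmetric but not transitive:

```python
# Toy model of "respective similarity" as sharing at least one property.
# The property assignments are invented purely for illustration.

properties = {
    "orange": {"round", "fruit"},
    "apple":  {"round", "fruit"},
    "moon":   {"round"},
    "banana": {"fruit", "elongated"},
}

def similar(a: str, b: str) -> bool:
    """Two things count as similar here if they share at least one property."""
    return bool(properties[a] & properties[b])

assert similar("orange", "orange")                             # reflexive
assert similar("moon", "orange") == similar("orange", "moon")  # symmetric
# Non-transitive: the moon resembles an orange (round), and an orange
# resembles a banana (fruit), yet the moon and a banana share nothing here.
assert similar("moon", "orange") and similar("orange", "banana")
assert not similar("moon", "banana")
```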

https://en.wikipedia.org/wiki/Similarity_(philosophy)

Analogy (from Greek analogia, "proportion", from ana- "upon, according to" [also "against", "anew"] + logos "ratio" [also "word, speech, reckoning"][1][2]) is a cognitive process of transferring information or meaning from a particular subject (the analog, or source) to another (the target), or a linguistic expression corresponding to such a process. In a narrower sense, analogy is an inference or an argument from one particular to another particular, as opposed to deduction, induction, and abduction, in which at least one of the premises, or the conclusion, is general rather than particular in nature. The term analogy can also refer to the relation between the source and the target themselves, which is often (though not always) a similarity, as in the biological notion of analogy.

Ernest Rutherford's model of the atom (modified by Niels Bohr) made an analogy between the atom and the Solar System.

Analogy plays a significant role in problem solving, as well as decision making, argumentation, perception, generalization, memory, creativity, invention, prediction, emotion, explanation, conceptualization and communication. It lies behind basic tasks such as the identification of places, objects and people, for example, in face perception and facial recognition systems. It has been argued that analogy is "the core of cognition".[3] Specific analogical language comprises exemplification, comparisons, metaphors, similes, allegories, and parables, but not metonymy. Phrases like and so on, and the like, as if, and the very word like also rely on an analogical understanding by the receiver of a message including them. Analogy is important not only in ordinary language and common sense (where proverbs and idioms give many examples of its application) but also in science, philosophy, law and the humanities. The concepts of association, comparison, correspondence, mathematical and morphological homology, homomorphism, iconicity, isomorphism, metaphor, resemblance, and similarity are closely related to analogy. In cognitive linguistics, the notion of conceptual metaphor may be equivalent to that of analogy. Analogy is also a basis for any comparative arguments as well as experiments whose results are transmitted to objects that have not been under examination (e.g., experiments on rats when results are applied to humans).

Analogy has been studied and discussed since classical antiquity by philosophers, scientists, theologists and lawyers. The last few decades have shown a renewed interest in analogy, most notably in cognitive science.

https://en.wikipedia.org/wiki/Analogy

An amateur (from French 'one who loves'[1]) is generally considered a person who pursues an avocation independent from their source of income. Amateurs and their pursuits are also described as popular, informal, self-taught, user-generated, DIY, and hobbyist.[2]

https://en.wikipedia.org/wiki/Amateur

Common sense (or simply sense) is sound, practical judgment concerning everyday matters, or a basic ability to perceive, understand, and judge in a manner that is shared by (i.e., "common to") nearly all people.[1]

The everyday understanding of common sense derives from historical philosophical discussion involving several European languages. Related terms in other languages include Latin sensus communis, Greek αἴσθησις κοινὴ (aísthēsis koinḕ), and French bon sens, but these are not straightforward translations in all contexts. Similarly in English, there are different shades of meaning, implying more or less education and wisdom: "good sense" is sometimes seen as equivalent to "common sense", and sometimes not.[2]

"Common sense" has at least two specific philosophical meanings. One is as a capability of the animal soul (ψῡχή, psūkhḗ) proposed by Aristotle to explain how the different senses join together and enable discrimination of particular objects by people and other animals. This common sense is distinct from basic sensory perception and from human rational thought, but cooperates with both.

A second philosophical use of the term is Roman-influenced and is used for the natural human sensitivity for other humans and the community.[3] Just like the everyday meaning, both of these refer to a type of basic awareness and ability to judge that most people are expected to share naturally, even if they cannot explain why. All these meanings of "common sense", including the everyday ones, are interconnected in a complex history and have evolved during important political and philosophical debates in modern Western civilisation, notably concerning science, politics and economics.[4] The interplay between the meanings has come to be particularly notable in English, as opposed to other western European languages, and the English term has become international.[5]

Since the Age of Enlightenment the term "common sense" has been used for rhetorical effect both approvingly, as a standard for good taste and source of scientific and logical axioms, and disapprovingly, as equivalent to vulgar prejudice and superstition.[6] It was at the beginning of the 18th century that this old philosophical term first acquired its modern English meaning: "Those plain, self-evident truths or conventional wisdom that one needed no sophistication to grasp and no proof to accept precisely because they accorded so well with the basic (common sense) intellectual capacities and experiences of the whole social body."[7] This began with Descartes's criticism of it, and what came to be known as the dispute between "rationalism" and "empiricism". In the opening line of one of his most famous books, Discourse on Method, Descartes established the most common modern meaning, and its controversies, when he stated that everyone has a similar and sufficient amount of common sense (bon sens), but it is rarely used well. Therefore, a skeptical logical method described by Descartes needs to be followed and common sense should not be overly relied upon.[8] In the ensuing 18th century Enlightenment, common sense came to be seen more positively as the basis for modern thinking. It was contrasted to metaphysics, which was, like Cartesianism, associated with the Ancien Régime. Thomas Paine's polemical pamphlet Common Sense (1776) has been described as the most influential political pamphlet of the 18th century, affecting both the American and French revolutions.[6] Today, the concept of common sense, and how it should best be used, remains linked to many of the most perennial topics in epistemology and ethics, with special focus often directed at the philosophy of the modern social sciences

https://en.wikipedia.org/wiki/Common_sense

Prejudice[1] can be an affective feeling towards a person based on their perceived group membership.[2] The word is often used to refer to a preconceived (usually unfavourable) evaluation or classification of another person based on that person's perceived personal characteristics, such as political affiliation, sex, gender, gender identity, beliefs, values, social class, age, disability, religion, sexuality, race, ethnicity, language, nationality, culture, complexion, beauty, height, body weight, occupation, wealth, education, criminality, sport-team affiliation, music tastes or other perceived characteristics.[3]

The word "prejudice" can also refer to unfounded or pigeonholed beliefs[4][5] and it may apply to "any unreasonable attitude that is unusually resistant to rational influence".[6] Gordon Allport defined prejudice as a "feeling, favorable or unfavorable, toward a person or thing, prior to, or not based on, actual experience".[7] Auestad (2015) defines prejudice as characterized by "symbolic transfer", transfer of a value-laden meaning content onto a socially-formed category and then on to individuals who are taken to belong to that category, resistance to change, and overgeneralization.[8] 

https://en.wikipedia.org/wiki/Prejudice

A superstition is any belief or practice considered by non-practitioners to be irrational or supernatural, attributed to fate or magic, perceived supernatural influence, or fear of that which is unknown. It is commonly applied to beliefs and practices surrounding luck, amulets, astrology, fortune telling, spirits, and certain paranormal entities, particularly the belief that future events can be foretold by specific (apparently) unrelated prior events.[1][2]

Also, the word superstition is often used to refer to a religion not practiced by the majority of a given society, regardless of whether the prevailing religion contains alleged superstitions, or to all religions by the antireligious.[1] 

https://en.wikipedia.org/wiki/Superstition

In argumentation theory, an argumentum ad populum (Latin for "appeal to the people")[1] is a fallacious argument which is based on claiming a truth or affirming something is good because the majority thinks so.[2]

Alternative names

Other names for the fallacy include:

  • appeal to (common) belief[3][4]
  • appeal to popularity[5][6]
  • appeal to the majority[7]
  • appeal to the masses[8]
  • argument from consensus[9]
  • authority of the many[9][10]
  • bandwagon fallacy[6][11]
  • common belief fallacy[3][4]
  • democratic fallacy[12]
  • mob appeal[citation needed]
  • truth by association[13]
  • consensus gentium (Latin for "agreement of the people")[11]

https://en.wikipedia.org/wiki/Argumentum_ad_populum

Versus populum (Latin for "towards the people") is the liturgical stance of a priest who, while celebrating Mass, faces the people from the other side of the altar. The opposite stance, that of a priest facing in the same direction as the people, is today called ad orientem (literally, "towards the east" − even if the priest is really facing in some other direction) or ad apsidem ("towards the apse" − even if the altar is unrelated to the apse of the church or even if the church or chapel has no apse).

In the early history of Christianity it was considered the norm to pray facing the geographical east.[1] From the middle of the 17th century, almost all new Roman Rite altars were built against a wall or backed by a reredos, with a tabernacle placed on the main altar or inserted into the reredos. This meant that the priest turned to the people, putting his back to the altar, for a few short moments at Mass. However, the Tridentine Missal is not celebrated versus populum, since the Ritus Servandus gives corresponding instructions for the priest when performing actions that require him to face the people. In the Ritus Servandus, the rubrics say "with his hands joined before his breast, and with his eyes downcast, he turns toward the people from left to right." Such an instruction would make no sense in a versus populum celebration, since the priest would already be facing the people.[2]

https://en.wikipedia.org/wiki/Versus_populum

Argumentum ad populum is a type of informal fallacy,[1][14] specifically a fallacy of relevance,[15][16] and is similar to an argument from authority (argumentum ad verecundiam).[14][4][9] It uses an appeal to the beliefs, tastes, or values of a group of people,[12] stating that because a certain opinion or attitude is held by a majority, it is therefore correct.[12][17] 

https://en.wikipedia.org/wiki/Argumentum_ad_populum

https://en.wikipedia.org/wiki/Category:Genetic_fallacies

https://en.wikipedia.org/wiki/Appeal_to_tradition

https://en.wikipedia.org/wiki/Bandwagon_effect

https://en.wikipedia.org/wiki/Common_sense

https://en.wikipedia.org/wiki/Communal_reinforcement

https://en.wikipedia.org/wiki/Conformity

https://en.wikipedia.org/wiki/Ad_hominem

https://en.wikipedia.org/wiki/Consensus_reality

https://en.wikipedia.org/wiki/Consensus_theory_of_truth

https://en.wikipedia.org/wiki/Fundamental_attribution_error

https://en.wikipedia.org/wiki/Scientific_consensus


https://en.wikipedia.org/wiki/Three_men_make_a_tiger 

https://en.wikipedia.org/wiki/Appeal_to_spite


https://en.wikipedia.org/wiki/Propaganda_techniques


https://en.wikipedia.org/wiki/Resemblance


https://en.wikipedia.org/wiki/Informal_fallacy


https://en.wikipedia.org/wiki/Irrelevant_conclusion

https://en.wikipedia.org/w/index.php?title=Fallacy_of_relevance&redirect=no



An irrelevant conclusion,[1] also known as ignoratio elenchi (Latin for 'ignoring refutation') or missing the point, is the informal fallacy of presenting an argument that may or may not be logically valid and sound, but (whose conclusion) fails to address the issue in question. It falls into the broad class of relevance fallacies.[2]

An irrelevant conclusion should not be confused with a formal fallacy, in which an argument's conclusion does not follow from its premises; here the argument may be formally consistent, yet its conclusion is not relevant to the subject being discussed. 

https://en.wikipedia.org/wiki/Irrelevant_conclusion

Subjective idealism, or empirical idealism, is a form of philosophical monism that holds that only minds and mental contents exist. It entails and is generally identified or associated with immaterialism, the doctrine that material things do not exist. Subjective idealism rejects dualism, neutral monism, and materialism; indeed, it is the contrary of eliminative materialism, the doctrine that all or some classes of mental phenomena (such as emotions, beliefs, or desires) do not exist, but are sheer illusions.  

https://en.wikipedia.org/wiki/Subjective_idealism

A phenomenon (PL: phenomena), sometimes spelled phaenomenon, is an observable event.[1] The term came into its modern philosophical usage through Immanuel Kant, who contrasted it with the noumenon, which cannot be directly observed. Kant was heavily influenced by Gottfried Wilhelm Leibniz in this part of his philosophy, in which phenomenon and noumenon serve as interrelated technical terms. Far predating this, the ancient Greek Pyrrhonist philosopher Sextus Empiricus also used phenomenon and noumenon as interrelated technical terms. 

https://en.wikipedia.org/wiki/Phenomenon

Eliminative materialism (also called eliminativism) is a materialist position in the philosophy of mind. It is the idea that the majority of mental states in folk psychology do not exist.[1] Some supporters of eliminativism argue that no coherent neural basis will be found for many everyday psychological concepts such as belief or desire, since they are poorly defined. The argument is that psychological concepts of behaviour and experience should be judged by how well they reduce to the biological level.[2] Other versions entail the non-existence of conscious mental states such as pain and visual perceptions.[3]

Eliminativism about a class of entities is the view that the class of entities does not exist.[4] For example, materialism tends to be eliminativist about the soul; modern chemists are eliminativist about phlogiston; and modern physicists are eliminativist about the existence of luminiferous aether. Eliminative materialism is the relatively new (1960s–1970s) idea that certain classes of mental entities that common sense takes for granted, such as beliefs, desires, and the subjective sensation of pain, do not exist.[5][6] The most common versions are eliminativism about propositional attitudes, as expressed by Paul and Patricia Churchland,[7] and eliminativism about qualia (subjective interpretations about particular instances of subjective experience), as expressed by Daniel Dennett, Georges Rey,[3] and Jacy Reese Anthis.[8] These philosophers often appeal to an introspection illusion.

In the context of materialist understandings of psychology, eliminativism is the opposite of reductive materialism, which argues that mental states as conventionally understood do exist and directly correspond to the physical state of the nervous system.[9] An intermediate position is revisionary materialism, which argues that the mental state in question will often prove to be somewhat reducible to physical phenomena, with some changes needed to the common-sense concept.

Since eliminative materialism arguably claims that future research will fail to find a neuronal basis for various mental phenomena, it may need to wait for science to progress further. One might question the position on these grounds, but other philosophers like Churchland argue that eliminativism is often necessary in order to open the minds of thinkers to new evidence and better explanations.[9] Views closely related to eliminativism include illusionism and quietism.

https://en.wikipedia.org/wiki/Eliminative_materialism

Absolute idealism is an ontologically monistic philosophy chiefly associated with G. W. F. Hegel and Friedrich Schelling, both of whom were German idealist philosophers in the 19th century. The label has also been attached to others such as Josiah Royce, an American philosopher who was greatly influenced by Hegel's work, and the British idealists.[1][2]

A form of idealism, absolute idealism is Hegel's account of how being is ultimately comprehensible as an all-inclusive whole (das Absolute). Hegel asserted that in order for the thinking subject (human reason or consciousness) to be able to know its object (the world) at all, there must be in some sense an identity of thought and being. Otherwise, the subject would never have access to the object and we would have no certainty about any of our knowledge of the world.

To account for the differences between thought and being, however, as well as the richness and diversity of each, the unity of thought and being cannot be expressed as the abstract identity "A=A". Absolute idealism is the attempt to demonstrate this unity using a new "speculative" philosophical method, which requires new concepts and rules of logic. According to Hegel, the absolute ground of being is essentially a dynamic, historical process of necessity that unfolds by itself in the form of increasingly complex forms of being and of consciousness, ultimately giving rise to all the diversity in the world and in the concepts with which we think and make sense of the world.[3]

The absolute idealist position dominated philosophy in nineteenth-century Britain and Germany, while exerting significantly less influence in the United States. The absolute idealist position should be distinguished from the subjective idealism of Berkeley, the transcendental idealism of Kant, or the post-Kantian transcendental idealism (also known as critical idealism)[4] of Fichte and of the early Schelling.[5] 

https://en.wikipedia.org/wiki/Absolute_idealism

Dialectic (Greek: διαλεκτική, dialektikḗ; related to dialogue; German: Dialektik), also known as the dialectical method, is a discourse between two or more people holding different points of view about a subject but wishing to establish the truth through reasoned argumentation. Dialectic resembles debate, but the concept excludes subjective elements such as emotional appeal and rhetoric (in the modern pejorative sense).[1][2] Dialectic may thus be contrasted with both the eristic, which refers to argument that aims to successfully dispute another's argument (rather than searching for truth), and the didactic method, wherein one side of the conversation teaches the other. Dialectic is alternatively known as minor logic, as opposed to major logic or critique.

Within Hegelianism, the word dialectic has the specialised meaning of a contradiction between ideas that serves as the determining factor in their relationship. Dialectical materialism, a theory or set of theories produced mainly by Karl Marx and Friedrich Engels, adapted the Hegelian dialectic into arguments regarding traditional materialism. The dialectics of Hegel and Marx were criticized in the twentieth century by the philosophers Karl Popper and Mario Bunge.

Dialectic tends to imply a process of evolution and so does not naturally fit within classical logics, but was given some formalism in the twentieth century. The emphasis on process is particularly marked in Hegelian dialectic, and even more so in Marxist dialectical logic, which tried to account for the evolution of ideas over longer time periods in the real world. 

https://en.wikipedia.org/wiki/Dialectic

Philosophical system

Hegel's philosophical system is divided into three parts: the science of logic, the philosophy of nature, and the philosophy of spirit (the latter two of which together constitute the real philosophy). This structure is adopted from Proclus's Neoplatonic triad of "'remaining-procession-return' and from the Christian Trinity."[73][d] Although evident in draft writings dating back as early as 1805, the system was not completed in published form until the 1817 Encyclopedia (1st ed.).[75]

Frederick C. Beiser argues that the position of the logic with respect to the real philosophy is best understood in terms of Hegel's appropriation of Aristotle's distinction between "the order of explanation" and "the order of being."[e] To Beiser, Hegel is neither a Platonist who believes in abstract logical entities, nor a nominalist according to whom the particular is first in the orders of explanation and being alike. Rather, Hegel is a holist. For Hegel, the universal is always first in the order of explanation even if what is naturally particular is first in the order of being. With respect to the system as a whole, that universal is supplied by the logic.[77]

Michael J. Inwood plainly states, "The logical idea is non-temporal and therefore does not exist at any time apart from its manifestations." To ask 'when' it divides into nature and spirit is analogous to asking 'when' 12 divides into 5 and 7. The question does not have an answer because it is predicated upon a fundamental misunderstanding of its terms.[78] The task of the logic (at this high systemic level) is to articulate what Hegel calls "the identity of identity and non-identity" of nature and spirit. Put another way, it aims to overcome subject-object dualism.[79] This is to say that, among other things, Hegel's philosophical project endeavors to provide the metaphysical basis for an account of spirit that is continuous with, yet distinct from, the 'merely' natural world – without thereby reducing either term to the other.[80]

Furthermore, the final sections of Hegel's Encyclopedia suggest that to give priority to any one of its three parts is to have an interpretation that is "one-sided," incomplete or otherwise inaccurate.[81][80][82] As Hegel famously declares, "The true is the whole."[83] 

https://en.wikipedia.org/wiki/Georg_Wilhelm_Friedrich_Hegel#Philosophical_system

A red herring is something that misleads or distracts from a relevant or important question.[1] It may be either a logical fallacy or a literary device that leads readers or audiences toward a false conclusion. A red herring may be used intentionally, as in mystery fiction or as part of rhetorical strategies (e.g., in politics), or may be used in argumentation inadvertently.[2]

The term was popularized in 1807 by English polemicist William Cobbett, who told a story of having used a strong-smelling smoked fish to divert and distract hounds from chasing a rabbit.[3] 

https://en.wikipedia.org/wiki/Red_herring

In grammar, the genitive case (abbreviated gen)[2] is the grammatical case that marks a word, usually a noun, as modifying another word, also usually a noun—thus indicating an attributive relationship of one noun to the other noun.[3] A genitive can also serve purposes indicating other relationships. For example, some verbs may feature arguments in the genitive case; and the genitive case may also have adverbial uses (see adverbial genitive).

Genitive construction includes the genitive case, but is a broader category. Placing a modifying noun in the genitive case is one way of indicating that it is related to a head noun, in a genitive construction. However, there are other ways to indicate a genitive construction. For example, many Afroasiatic languages place the head noun (rather than the modifying noun) in the construct state.

Possessive grammatical constructions, including the possessive case, may be regarded as a subset of genitive construction. For example, the genitive construction "pack of dogs" is similar, but not identical in meaning to the possessive case "dogs' pack" (and neither of these is entirely interchangeable with "dog pack", which is neither genitive nor possessive). Modern English is an example of a language that has a possessive case rather than a conventional genitive case. That is, Modern English indicates a genitive construction with either the possessive clitic suffix "-'s", or a prepositional genitive construction such as "x of y". However, some irregular English pronouns do have possessive forms which may more commonly be described as genitive (see English possessive). The names of the astronomical constellations have genitive forms which are used in star names, for example the star Mintaka in the constellation Orion (genitive Orionis) is also known as Delta Orionis or 34 Orionis.

Many languages have a genitive case, including Albanian, Arabic, Armenian, Basque, Danish, Dutch, Estonian, Finnish, Georgian, German, Greek, Gothic, Hungarian, Icelandic, Irish, Latin, Latvian, Lithuanian, Nepali, Romanian, Sanskrit, Scottish Gaelic, Swedish, Kannada, Tamil, Telugu, Turkish and all Slavic languages except Bulgarian and Macedonian.

https://en.wikipedia.org/wiki/Genitive_case

In linguistics, inalienable possession[1] (abbreviated INAL) is a type of possession in which a noun is obligatorily possessed by its possessor. Nouns or nominal affixes in an inalienable possession relationship cannot exist independently or be "alienated" from their possessor.[2] Inalienable nouns include body parts (such as leg, which is necessarily "someone's leg" even if it is severed from the body), kinship terms (such as mother), and part-whole relations (such as top).[3] Many languages reflect the distinction but vary in how they mark inalienable possession.[4] Cross-linguistically, inalienability correlates with many morphological, syntactic, and semantic properties.

In general, the alienable–inalienable distinction is an example of a binary possessive class system in which a language distinguishes two kinds of possession (alienable and inalienable). The alienability distinction is the most common kind of binary possessive class system, but it is not the only one.[4] Some languages have more than two possessive classes. In Papua New Guinea, for example, Anêm has at least 20 classes, and Amele has 32.[5][4]

Statistically, 15–20% of the world's languages have obligatory possession.[6] 

https://en.wikipedia.org/wiki/Inalienable_possession

In linguistics, the term nominal refers to a category used to group together nouns and adjectives based on shared properties. The motivation for nominal grouping is that in many languages nouns and adjectives share a number of morphological and syntactic properties. The systems used in such languages to show agreement can be classified broadly as gender systems, noun class systems or case marking, classifier systems, and mixed systems.[1] Typically an affix related to the noun appears attached to the other parts of speech within a sentence to create agreement. Such morphological agreement usually occurs in parts within the noun phrase, such as determiners and adjectives. Languages with overt nominal agreement vary in how and to what extent agreement is required.  

https://en.wikipedia.org/wiki/Nominal_(linguistics)

In parapsychology, an apparitional experience is an anomalous experience characterized by the apparent perception of either a living being or an inanimate object without there being any material stimulus for such a perception.

In academic discussion, the term "apparitional experience" is preferred to the term "ghost" because:

  1. The term ghost implies that some element of the human being survives death and, at least under certain circumstances, can make itself perceptible to living human beings. There are other competing explanations of apparitional experiences.
  2. Firsthand accounts of apparitional experiences differ in many respects from their fictional counterparts in literary or traditional ghost stories and films (see below).
  3. The content of apparitional experiences includes living beings, both human and animal, and even inanimate objects.[1]
 

 https://en.wikipedia.org/wiki/Apparitional_experience

Synchronicity (German: Synchronizität) is a concept first introduced by analytical psychologist Carl G. Jung "to describe circumstances that appear meaningfully related yet lack a causal connection."[1] In contemporary research, synchronicity experiences refer to one's subjective experience that coincidences between events in one's mind and the outside world may be causally unrelated to each other yet have some other unknown connection.[2] Jung held that this was a healthy, even necessary, function of the human mind that can become harmful within psychosis.[3][4]

Jung developed the theory of synchronicity as a hypothetical noncausal principle serving as the intersubjective or philosophically objective connection between these seemingly meaningful coincidences.[1][5] Mainstream science generally regards that any such hypothetical principle either does not exist or falls outside the bounds of science.[6][7] After first coining the term in the late 1920s[5] or early 30s,[8] Jung further developed the concept in collaboration with physicist and Nobel laureate Wolfgang Pauli through long correspondences and in their eventual 1952 work The Interpretation of Nature and the Psyche (German: Naturerklärung und Psyche) which comprises one paper from each of the two thinkers.[9][10][11][12] Their work together culminated in what is now called the Pauli–Jung conjecture.[13] During his career, Jung furnished several different definitions of synchronicity,[14] defining it as "a hypothetical factor equal in rank to causality as a principle of explanation",[15] "an acausal connecting principle", "acausal parallelism", and as the "meaningful coincidence of two or more events where something other than the probability of chance is involved".[16] In Pauli's words, synchronicities were "corrections to chance fluctuations by meaningful and purposeful coincidences of causally unconnected events", though he had also proposed to move the concept away from coincidence towards instead a "correspondence", "connection", or "constellation" of discrete factors.[17] Jung and Pauli's view was that, just as causal connections can provide a meaningful understanding of the psyche and the world, so too may acausal connections.[3][17][8]

A 2016 study found that two thirds of therapists surveyed agreed that synchronicity experiences could be useful for therapy.[18] Analytical psychologists likewise hold that individuals must come to understand the compensatory meaning of these experiences in order to "enhance consciousness rather than merely build up superstitiousness".[19] However, clients who disclose synchronicity experiences in a clinical setting often report not being listened to, accepted, or understood.[20] Furthermore, the experiencing of an overabundance of meaningful coincidences is characteristic of the earliest stages of schizophrenic delusion.[21] M. K. Johansen and M. Osman write that "prevalent among many scientists, particularly psychologists studying coincidences, is [the view] that the occurrence of coincidences, as psychologically experienced, is induced by noisy chance occurrences out in the world which are then misconstrued via irrational cognitive biases into unfounded, possibly even paranormal, beliefs in the mind."[7] One study has shown that both counselors and psychoanalysts were less likely than psychologists to agree that chance coincidence was an adequate explanation for synchronicity, while more likely than psychologists to agree that a need for unconscious material to be expressed could be an explanation for synchronicity experiences in the clinical setting.[18]

Jung used the concept of synchronicity in arguing for the existence of the paranormal.[22] This idea was similarly explored by writer Arthur Koestler in his 1972 work The Roots of Coincidence[23] and was also taken up by the New Age movement.[6] Unlike magical thinking, which believes causally unrelated events to have some paranormal causal connection, the synchronicity principle supposes that events may truly be causally unrelated yet have some unknown noncausal connection.[24] The objection from a scientific standpoint, however, is that this is neither testable nor falsifiable and therefore does not fall within the realm of empirical study.[6] Scientific scepticism regards it as pseudoscience.[6] Jung stated that synchronicity events are nothing but chance occurrences from a statistical point of view, but are meaningful in that they may seem to validate paranormal ideas. However, no empirical studies of synchronicity experiences based on observable mental states and scientific data were conducted by Jung in order to draw his conclusions,[6] though some studies have since been done in this area (see § Studies, below).

While a given observer may subjectively experience a coincidence as meaningful, this alone cannot prove any objective meaning to the coincidence.[6] Various statistical laws, such as Littlewood's law and the law of truly large numbers, show how unexpected occurrences can be encountered more often than people otherwise assume. These serve to explain coincidences such as synchronicity experiences as chance events which have been misinterpreted by confirmation biases, spurious correlations, or underestimated probability.[25][26][27] 

https://en.wikipedia.org/wiki/Synchronicity

https://en.wikipedia.org/wiki/Memories,_Dreams,_Reflections

 

https://en.wikipedia.org/wiki/Category:Relevance_fallacies

https://en.wikipedia.org/wiki/Evasion_(ethics)

https://en.wikipedia.org/wiki/Enthymeme

https://en.wikipedia.org/wiki/Genetic_fallacy

https://en.wikipedia.org/wiki/Sophist

https://en.wikipedia.org/wiki/Formal_fallacy

https://en.wikipedia.org/wiki/Accident_(fallacy)

https://en.wikipedia.org/wiki/Argumentum_ad_baculum

https://en.wikipedia.org/wiki/Base_rate_fallacy

 

https://en.wikipedia.org/wiki/Chronological_snobbery

https://en.wikipedia.org/wiki/Moralistic_fallacy

https://en.wikipedia.org/wiki/Invincible_ignorance_fallacy

https://en.wikipedia.org/wiki/Etymological_fallacy

https://en.wikipedia.org/wiki/Proof_by_assertion

https://en.wikipedia.org/wiki/Shaggy_defense

https://en.wikipedia.org/wiki/Victim_blaming

https://en.wikipedia.org/wiki/Begging_the_question

 

 

In classical rhetoric and logic, begging the question or assuming the conclusion (Latin: petitio principii) is an informal fallacy that occurs when an argument's premises assume the truth of the conclusion. A question-begging inference is valid, in the sense that the conclusion is as true as the premise, but it does not establish its conclusion, since the premise already assumes it.[1]

For example, the statement that "wool sweaters are superior to nylon jackets because wool sweaters have higher wool content" begs the question because this statement assumes that higher wool content implies being a superior material.[2] Begging the question is a type of circular reasoning, and often occurs in an indirect way such that the fallacy's presence is hidden, or at least not easily apparent.[3]

The phrase "begs the question" is also commonly used to mean "prompts a question" or "raises a question".[4]

https://en.wikipedia.org/wiki/Begging_the_question


https://en.wikipedia.org/wiki/Begging_the_question

https://en.wikipedia.org/wiki/Evasion_(ethics)#Question_dodging

https://en.wikipedia.org/wiki/Circular_reasoning

https://en.wikipedia.org/wiki/Infinite_regress#Failure_to_explain

https://en.wikipedia.org/wiki/Ambiguity


https://en.wikipedia.org/wiki/Catch-22_(logic)


https://en.wikipedia.org/wiki/Consequentia_mirabilis

https://en.wikipedia.org/wiki/Fallacies_of_definition

https://en.wikipedia.org/wiki/Open-question_argument

https://en.wikipedia.org/wiki/Polysyllogism

https://en.wikipedia.org/wiki/Presuppositional_apologetics



Presuppositionalism is an epistemological school of Christian apologetics that examines the presuppositions on which worldviews are based, and invites comparison and contrast between the results of those presuppositions.

It claims that apart from presuppositions, one could not make sense of any human experience, and there can be no set of neutral assumptions from which to reason with a non-Christian.[1] Presuppositionalists claim that Christians cannot consistently declare their belief in the necessary existence of the God of the Bible and simultaneously argue on the basis of a different set of assumptions that God may not exist and Biblical revelation may not be true.[2][failed verification] Two schools of presuppositionalism exist, based on the different teachings of Cornelius Van Til and Gordon Haddon Clark. Presuppositionalism contrasts with classical apologetics and evidential apologetics.

Presuppositionalists compare their presupposition against other ultimate standards such as reason, empirical experience, and subjective feeling, claiming presupposition in this context is:

a belief that takes precedence over another and therefore serves as a criterion for another. An ultimate presupposition is a belief over which no other takes precedence. For a Christian, the content of Scripture must serve as his ultimate presupposition… This doctrine is merely the outworking of the 'lordship of the Christian god' in the area of human thought. It merely applies the doctrine of scriptural infallibility to the realm of knowing.[3]

https://en.wikipedia.org/wiki/Presuppositional_apologetics


In epistemology, the regress argument is the argument that any proposition requires a justification. However, any justification itself requires support. This means that any proposition whatsoever can be endlessly (infinitely) questioned, resulting in infinite regress. It is a problem in epistemology and in any general situation where a statement has to be justified.[1][2][3]

The argument is also known as diallelus[4] (Latin) or diallelon, from Greek di' allelon "through or by means of one another" and as the epistemic regress problem. It is an element of the Münchhausen trilemma.[5]

https://en.wikipedia.org/wiki/Regress_argument

https://en.wikipedia.org/wiki/Spin_(propaganda)

 

In logic, reductio ad absurdum (Latin for "reduction to absurdity"), also known as argumentum ad absurdum (Latin for "argument to absurdity") or apagogical arguments, is the form of argument that attempts to establish a claim by showing that the opposite scenario would lead to absurdity or contradiction.[1][2][3][4] This argument form traces back to Ancient Greek philosophy and has been used throughout history in both formal mathematical and philosophical reasoning, as well as in debate. 

https://en.wikipedia.org/wiki/Reductio_ad_absurdum

Negation introduction is a rule of inference, or transformation rule, in the field of propositional calculus.

Negation introduction states that if a given antecedent implies both the consequent and its complement, then the antecedent is a contradiction.[1][2] 
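
Written out in standard notation (not quoted from the article), the rule licenses the following inference:

```latex
% Negation introduction: if P implies both Q and its negation, infer not-P.
(P \to Q) \land (P \to \neg Q) \;\vdash\; \neg P
```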

https://en.wikipedia.org/wiki/Negation_introduction

In the philosophy of logic, a rule of inference, inference rule or transformation rule is a logical form consisting of a function which takes premises, analyzes their syntax, and returns a conclusion (or conclusions). For example, the rule of inference called modus ponens takes two premises, one in the form "If p then q" and another in the form "p", and returns the conclusion "q". The rule is valid with respect to the semantics of classical logic (as well as the semantics of many other non-classical logics), in the sense that if the premises are true (under an interpretation), then so is the conclusion.

Typically, a rule of inference preserves truth, a semantic property. In many-valued logic, it preserves a general designation. But a rule of inference's action is purely syntactic, and does not need to preserve any semantic property: any function from sets of formulae to formulae counts as a rule of inference. Usually only rules that are recursive are important; i.e. rules such that there is an effective procedure for determining whether any given formula is the conclusion of a given set of formulae according to the rule. An example of a rule that is not effective in this sense is the infinitary ω-rule.[1]

Popular rules of inference in propositional logic include modus ponens, modus tollens, and contraposition. First-order predicate logic uses rules of inference to deal with logical quantifiers.
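
As a small illustrative sketch (mine, not from the article; the implies() helper is invented for it), one can check by brute force over all truth assignments that modus ponens is truth-preserving under classical two-valued semantics:

```python
# Check that whenever both premises of modus ponens ("if p then q" and "p")
# are true, the conclusion "q" is also true, for every truth assignment.
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material conditional: false only when p is true and q is false."""
    return (not p) or q

valid = all(
    q
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and p  # keep only the rows where both premises hold
)
print(valid)  # True: modus ponens never leads from true premises to a false conclusion
```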

https://en.wikipedia.org/wiki/Rule_of_inference

In logic, proof by contradiction is a form of proof that establishes the truth or the validity of a proposition, by showing that assuming the proposition to be false leads to a contradiction. Although it is quite freely used in mathematical proofs, not every school of mathematical thought accepts this kind of nonconstructive proof as universally valid.

More broadly, proof by contradiction is any form of argument that establishes a statement by arriving at a contradiction, even when the initial assumption is not the negation of the statement to be proved. In this general sense, proof by contradiction is also known as indirect proof, proof by assuming the opposite,[citation needed] and reductio ad impossibile.[1]

A mathematical proof employing proof by contradiction usually proceeds as follows:

  1. The proposition to be proved is P.
  2. We assume P to be false, i.e., we assume ¬P.
  3. It is then shown that ¬P implies falsehood. This is typically accomplished by deriving two mutually contradictory assertions, Q and ¬Q, and appealing to the law of noncontradiction.
  4. Since assuming P to be false leads to a contradiction, it is concluded that P is in fact true.

An important special case is the existence proof by contradiction: in order to demonstrate that an object with a given property exists, we derive a contradiction from the assumption that all objects satisfy the negation of the property. 
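
As a worked illustration of the four-step outline above (the classic textbook example, not taken from the excerpt), let P be the proposition that the square root of 2 is irrational:

```latex
% Step 2: assume \neg P, i.e. \sqrt{2} = a/b with integers a, b sharing no common factor.
\sqrt{2} = \tfrac{a}{b}
  \;\Rightarrow\; a^2 = 2b^2
  \;\Rightarrow\; a \text{ is even, say } a = 2k
  \;\Rightarrow\; b^2 = 2k^2
  \;\Rightarrow\; b \text{ is even.}
% Step 3: a and b share the factor 2 (Q) while being coprime by assumption (\neg Q).
% Step 4: since \neg P leads to a contradiction, P holds: \sqrt{2} is irrational.
```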

https://en.wikipedia.org/wiki/Proof_by_contradiction

In logic, the law of non-contradiction (LNC) (also known as the law of contradiction, principle of non-contradiction (PNC), or the principle of contradiction) states that contradictory propositions cannot both be true in the same sense at the same time, e.g., the two propositions "p is the case" and "p is not the case" are mutually exclusive. Formally this is expressed as the tautology ¬(p ∧ ¬p). The law is not to be confused with the law of excluded middle, which states that at least one of "p is the case" and "p is not the case" holds.
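
A two-row truth table (standard fare, not drawn from the article) is enough to verify that ¬(p ∧ ¬p) comes out true under every assignment:

```latex
\begin{array}{c|c|c|c}
p & \neg p & p \land \neg p & \neg(p \land \neg p) \\ \hline
T & F      & F              & T \\
F & T      & F              & T
\end{array}
```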

One reason to have this law is the principle of explosion, which states that anything follows from a contradiction. The law is employed in a reductio ad absurdum proof.

To express the fact that the law is tenseless and to avoid equivocation, sometimes the law is amended to say "contradictory propositions cannot both be true 'at the same time and in the same sense'".

It is one of the so-called three laws of thought, along with its complement, the law of excluded middle, and the law of identity. However, no system of logic is built on just these laws, and none of these laws provide inference rules, such as modus ponens or De Morgan's laws.

The law of non-contradiction and the law of excluded middle create a dichotomy in "logical space", wherein the two parts are "mutually exclusive" and "jointly exhaustive". The law of non-contradiction is merely an expression of the mutually exclusive aspect of that dichotomy, and the law of excluded middle, an expression of its jointly exhaustive aspect. 

https://en.wikipedia.org/wiki/Law_of_noncontradiction

Total depravity (also called radical corruption[1] or pervasive depravity) is a Protestant theological doctrine derived from the concept of original sin. It teaches that, as a consequence of man's fall, every person born into the world is enslaved to the service of sin as a result of their fallen nature and, apart from the efficacious (irresistible) or prevenient (enabling) grace of God, is completely unable to choose by themselves to follow God, refrain from evil, or accept the gift of salvation as it is offered.

The doctrine is advocated to various degrees by many Protestant denominations, including some Lutheran synods,[2][3] and all Calvinist churches.[4][5][6][7] Arminian denominations, such as Methodists, believe and teach total depravity, but with distinct differences,[8][9] the most important of which is the distinction between irresistible grace and prevenient grace.[10] 

https://en.wikipedia.org/wiki/Total_depravity

https://en.wikipedia.org/wiki/Deprivity