Blog Archive

Monday, May 15, 2023

05-14-2023-1802 - various etc. (draft)

A ghostwriter is a person hired to write literary or journalistic works, speeches, or other texts that are putatively credited to another person as the author. Celebrities, executives, participants in timely news stories, and political leaders often hire ghostwriters to draft or edit autobiographies, memoirs, magazine articles, or other written material.

Memoir ghostwriters often pride themselves on "disappearing" when impersonating others, since such disappearance signals the quality of their craftsmanship.[1] In music, ghostwriters are often used to write songs, lyrics, and instrumental pieces. Screenplay authors can also use ghostwriters to either edit or rewrite their scripts to improve them. Usually, there is a confidentiality clause in the contract between the ghostwriter and the credited author that obligates the former to remain anonymous. Sometimes the ghostwriter is acknowledged by the author or publisher for their writing services, euphemistically called a "researcher" or "research assistant", but often the ghostwriter is not credited.

Ghostwriting (or simply "ghosting") also occurs in other creative fields. Composers have long hired ghostwriters to help them write musical pieces and songs; Wolfgang Amadeus Mozart is an example of a well-known composer who was paid to ghostwrite music for wealthy patrons. Ghosting also occurs in popular music. A pop music ghostwriter writes lyrics and a melody in the style of the credited musician. However, it is most notable today among rap artists, since there is pressure to produce lyrically challenging content. Notable rappers such as Nicki Minaj have criticized artists like Cardi B for using ghostwriters to make their songs.[2] In hip hop music, the increasing use of ghostwriters by high-profile hip-hop stars has led to controversy.[3] In the visual arts, it is not uncommon in either fine art or commercial art such as comics for a number of assistants to do work on a piece that is credited to a single artist; Andy Warhol engaged in this practice, supervising an assembly-line silk screen process for his artwork.[4] However, when credit is established for the writer, the acknowledgment of their contribution is public and the writer in question would not be considered a ghostwriter.

https://en.wikipedia.org/wiki/Ghostwriter

Ghost riding, frequently used in the context of "ghost riding the whip" (a "whip" being a vehicle) or simply ghostin', is when a person exits their moving vehicle and dances beside and around it.

American rapper E-40's 2006 song "Tell Me When to Go", produced by Lil Jon, brought mainstream attention to "ghost riding".[1][2] "Ghost riding" is also used as another term for car surfing, and the term is also occasionally used to describe a moving vehicle with no occupant, such as when a car without the hand brake applied starts to roll down an incline.[3] The practice originated in Northern California, specifically the Bay Area. It gets its name from the fact that while the driver is dancing beside the moving vehicle, it appears that the vehicle is being driven by an invisible driver.

Ghost riding is an activity that has been practiced in the San Francisco Bay Area and Oakland, California for many years, during what are called sideshows.[4][5] It is thought to have started as a trend around 2006.[6] The popularization of ghost riding is a byproduct of popular Bay Area music and the hyphy subculture in general; additionally, it has been suggested that ghost riding is a copycat crime popularized by YouTube videos and online social media.[7][4][6][5]

Ghost riding is performed by exiting an automobile while it is left in gear. The automobile's engine runs at idle speed, slowly propelling the car forward. As with car surfing, ghost riding is dangerous and has resulted in deaths in North America.[7] 

https://en.wikipedia.org/wiki/Ghost_riding

An encyclical was originally a circular letter sent to all the churches of a particular area in the ancient Roman Church. At that time, the word could be used for a letter sent out by any bishop. The word comes from the Late Latin encyclios (originally from the Latin encyclius, a Latinization of Greek ἐνκύκλιος (enkyklios), meaning "circular", "in a circle", or "all-round", also part of the origin of the word encyclopedia).[1] The term has been used by Catholics, Anglicans and the Eastern Orthodox Church.

https://en.wikipedia.org/wiki/Encyclical

Communication is usually defined as the transmission of information. The term can also refer to the message itself, or to the field of inquiry studying these transmissions, also known as communication studies. There is some disagreement about the precise definition of communication, for example, whether unintentional or failed transmissions are included and whether communication not only transmits meaning but also creates it. Models of communication aim to provide a simplified overview of its main components and their interaction. Many models include the idea that a source uses a coding system to express information in the form of a message. The source uses a channel to send the message to a receiver, who has to decode it in order to understand its meaning. Channels are usually discussed in terms of the senses used to perceive the message, like hearing, sight, smell, touch, and taste.
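
To make the source–channel–receiver model described above concrete, here is a minimal, self-contained Python sketch of that flow (my own toy illustration, not part of the article); the choice of UTF-8 text encoding as the "coding system" is purely an assumption for the example.

    # Toy model of the source -> encoder -> channel -> receiver -> decoder flow
    # described above. The "coding system" here is just UTF-8 text encoding,
    # chosen only for illustration.
    def encode(message: str) -> bytes:
        return message.encode("utf-8")   # source expresses meaning as a coded message

    def channel(signal: bytes) -> bytes:
        return signal                    # an ideal, noise-free channel

    def decode(signal: bytes) -> str:
        return signal.decode("utf-8")    # receiver recovers the meaning

    received = decode(channel(encode("hello")))
    print(received)                      # -> hello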

Communication can be classified based on whether information is exchanged between humans, members of other species, or non-living entities such as computers. For human communication, a central distinction is between verbal and non-verbal communication. Verbal communication involves the exchange of messages in linguistic form. This can happen through natural languages, like English or Japanese, or through artificial languages, like Esperanto. Verbal communication includes spoken and written messages as well as the use of sign language. Non-verbal communication happens without the use of a linguistic system. There are many forms of non-verbal communication, for example, using body language, body position, touch, and intonation. Another distinction is between interpersonal and intrapersonal communication. Interpersonal communication happens between distinct individuals, such as greeting someone on the street or making a phone call. Intrapersonal communication, on the other hand, is communication with oneself. This can happen internally, as a form of inner dialog or daydreaming, or externally, for example, when writing down a shopping list or engaging in a monologue.

Non-human forms of communication include animal and plant communication. Researchers in this field often formulate additional criteria for their definition of communicative behavior, like the requirement that the behavior serves a beneficial function for natural selection or that a response to the message is observed. Animal communication plays important roles for various species in the areas of courtship and mating, parent-offspring relations, social relations, navigation, self-defense, and territoriality. In the area of courtship and mating, for example, communication is used to identify and attract potential mates. An often-discussed example concerning navigational communication is the waggle dance used by bees to indicate to other bees where flowers are located. Due to the rigid cell walls of plants, their communication often happens through chemical means rather than movement. For example, various plants, like maple trees, release so-called volatile organic compounds into the air to warn other plants of a herbivore attack. Most communication takes place between members of the same species since its purpose is usually some form of cooperation, which is not as common between species. However, there are also forms of interspecies communication, mainly in cases of symbiotic relationships. For example, many flowers use symmetrical shapes and colors that stand out from their surroundings in order to communicate to insects where nectar is located to attract them. Humans also practice interspecies communication, for example, when interacting with pets.

The field of communication includes various other issues, like communicative competence and the history of communication. Communicative competence is the ability to communicate well and applies both to the capability to formulate messages and to understand them. Two central aspects are that the communicative behavior is effective, i.e. that it achieves the individual's goal, and that it is appropriate, i.e. that it follows social standards and expectations. Human communication has a long history and how people exchange information has changed over time. These changes were usually triggered by the development of new communication technologies, such as the invention of writing systems (first pictographic and later alphabetic), the development of mass printing, the use of radio and television, and the invention of the internet. 

https://en.wikipedia.org/wiki/Communication

Discourse is a generalization of the notion of a conversation to any form of communication.[1] Discourse is a major topic in social theory, with work spanning fields such as sociology, anthropology, continental philosophy, and discourse analysis. Following pioneering work by Michel Foucault, these fields view discourse as a system of thought, knowledge, or communication that constructs our experience of the world. Since control of discourse amounts to control of how the world is perceived, social theory often studies discourse as a window into power. Within theoretical linguistics, discourse is understood more narrowly as linguistic information exchange and was one of the major motivations for the framework of dynamic semantics, in which expressions' denotations are equated with their ability to update a discourse context.  

https://en.wikipedia.org/wiki/Discourse

Rhetoric (/ˈrɛtərɪk/)[note 1] is the art of persuasion, which, along with grammar and logic (or dialectic), is one of the three ancient arts of discourse. Rhetoric aims to study the techniques writers or speakers use to inform, persuade, or motivate particular audiences in specific situations.[5] Aristotle defines rhetoric as "the faculty of observing in any given case the available means of persuasion", and since mastery of the art was necessary for victory in a case at law, for passage of proposals in the assembly, or for fame as a speaker in civic ceremonies, he calls it "a combination of the science of logic and of the ethical branch of politics".[6] Rhetoric typically provides heuristics for understanding, discovering, and developing arguments for particular situations, such as Aristotle's three persuasive audience appeals: logos, pathos, and ethos. The five canons of rhetoric, or phases of developing a persuasive speech, were first codified in classical Rome: invention, arrangement, style, memory, and delivery.

From Ancient Greece to the late 19th century, rhetoric played a central role in Western education in training orators, lawyers, counsellors, historians, statesmen, and poets.[7][note 2] 

https://en.wikipedia.org/wiki/Rhetoric

An essay mill (also term paper mill) is a business that allows customers to commission an original piece of writing on a particular topic so that they may commit academic fraud. Customers provide the company with specific information about the essay, including a page length, a general topic, and a time frame with which to work. The customer is then charged a certain amount per page.[1] The similar essay bank concept is a company from which students can purchase pre-written but less expensive essays on various topics, at higher risk of being caught. Both forms of business are under varying legal restraints in some jurisdictions.

History

The idea behind term paper mills can be dated back to the mid-nineteenth century in which "paper reservoirs" were located in the basements of fraternity houses. Otherwise known as "fraternity files," these essay banks were practices in which students shared term papers and submitted work that had been done by other students.[2][clarification needed] These essay banks inspired the commercialization of ghostwritten essay-writing practices. As early as the 1950s, advertisements circulating in college campuses described services that included ghostwritten work for dissertations, theses, and term papers.[2]

In conjunction with this practice, students' attitudes in the 1960s and 1970s shifted away from diligent and engaged course work toward an emphasis on the benefits of community involvement. A new focus on activities outside the classroom took time away from class work, thus promoting these writing services throughout college campuses.[2]

Soon, actual businesses located near college campuses were providing custom-written essays for students in exchange for compensation.[3] One could walk into a building and peruse pricing pamphlets, speak to someone directly to place an order, or possibly make a selection from a vault of recycled research papers stored in the basement of these businesses.[2]

https://en.wikipedia.org/wiki/Essay_mill

A literary agent is an agent who represents writers and their written works to publishers, theatrical producers, film producers, and film studios, and assists in sale and deal negotiation. Literary agents most often represent novelists, screenwriters, and non-fiction writers.

Reputable literary agents generally charge a commission and do not charge a fee upfront. The commission rate is generally 15%.[1]

Diversity

Literary agencies can range in size from a single agent who represents perhaps a dozen authors, to a substantial firm with senior partners, sub-agents, specialists in areas like foreign rights or licensed merchandise tie-ins, and clients numbering in the hundreds. Most agencies, especially smaller ones, specialize to some degree. They may represent—for example—authors of science fiction, mainstream thrillers and mysteries, children's books, romance, or highly topical nonfiction. Very few agents represent short stories or poetry.

Legitimate agents and agencies in the book world are not required to be members of the Association of Authors' Representatives (AAR), but according to Writer's Market listings, many agents in the United States are. To qualify for AAR membership, agents must have sold a minimum number of books and pledge to abide by a Canon of Ethics.[2] 

https://en.wikipedia.org/wiki/Literary_agent

Communication studies or communication science is an academic discipline that deals with processes of human communication and behavior, patterns of communication in interpersonal relationships, social interactions and communication in different cultures.[1] Communication is commonly defined as giving, receiving or exchanging ideas, information, signals or messages through appropriate media, enabling individuals or groups to persuade, to seek information, to give information or to express emotions effectively.[2][3] Communication studies is a social science that uses various methods of empirical investigation and critical analysis to develop a body of knowledge that encompasses a range of topics, from face-to-face conversation at a level of individual agency and interaction to social and cultural communication systems at a macro level.[4][5]

Scholarly communication theorists[citation needed] focus primarily on refining the theoretical understanding of communication, examining statistics in order to help substantiate claims. The range of social scientific methods to study communication has been expanding. Communication researchers draw upon a variety of qualitative and quantitative techniques. The linguistic and cultural turns of the mid-20th century led to increasingly interpretative, hermeneutic, and philosophic approaches towards the analysis of communication.[6] Conversely, the end of the 1990s and the beginning of the 2000s have seen the rise of new analytically, mathematically, and computationally focused techniques.[7][failed verification]

As a field of study, communication is applied to journalism, business, mass media, public relations, marketing, news and television broadcasting, interpersonal and intercultural communication, education, public administration—and beyond.[8][9] As all spheres of human activity and conveyance are affected by the interplay between social communication structure and individual agency,[5][10] communication studies has gradually expanded its focus to other domains, such as health, medicine, economy, military and penal institutions, the Internet, social capital, and the role of communicative activity in the development of scientific knowledge.

https://en.wikipedia.org/wiki/Communication_studies

The following outline is provided as an overview of and topical guide to communication:

Communication – purposeful activity of exchanging information and meaning across space and time using various technical or natural means, whichever is available or preferred. Communication requires a sender, a message, a medium and a recipient, although the receiver does not have to be present or aware of the sender's intent to communicate at the time of communication; thus communication can occur across vast distances in time and space.


https://en.wikipedia.org/wiki/Outline_of_communication

A heuristic (/hjʊˈrɪstɪk/; from Ancient Greek εὑρίσκω (heurískō) 'to find, discover'), or heuristic technique, is any approach to problem solving or self-discovery that employs a practical method that is not guaranteed to be optimal, perfect, or rational, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.[1][2]

Examples that employ heuristics include using trial and error, a rule of thumb or an educated guess.

Heuristics are strategies derived from previous experiences with similar problems. These strategies depend on using readily accessible, though loosely applicable, information to control problem solving in human beings, machines and abstract issues.[3][4] When an individual applies a heuristic in practice, it generally performs as expected. However, it can alternatively create systematic errors.[5]

The most fundamental heuristic is trial and error, which can be used in everything from matching nuts and bolts to finding the values of variables in algebra problems. In mathematics, some common heuristics involve the use of visual representations, additional assumptions, forward/backward reasoning and simplification. Here are a few commonly used heuristics from George Pólya's 1945 book, How to Solve It:[6]

  • If you are having difficulty understanding a problem, try drawing a picture.
  • If you can't find a solution, try assuming that you have a solution and seeing what you can derive from that ("working backward").
  • If the problem is abstract, try examining a concrete example.
  • Try solving a more general problem first (the "inventor's paradox": the more ambitious plan may have more chances of success).
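
To make the trial-and-error heuristic mentioned above concrete, here is a minimal Python sketch (my own toy example, not from the article): it simply tries candidate integers until one satisfies a condition, mirroring the "good enough, stop when it works" spirit of a heuristic.

    # Trial and error: try candidate values until one satisfies the condition.
    # Hypothetical example: find an integer x with x**2 + x == 132.
    def trial_and_error(condition, candidates):
        for candidate in candidates:
            if condition(candidate):
                return candidate   # good enough; stop at the first success
        return None                # no candidate worked

    solution = trial_and_error(lambda x: x**2 + x == 132, range(-100, 101))
    print(solution)                # -> -12 (the first integer found that satisfies the equation)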

In psychology, heuristics are simple, efficient rules, either learned or inculcated by evolutionary processes. These psychological heuristics have been proposed to explain how people make decisions, come to judgements, and solve problems. These rules typically come into play when people face complex problems or incomplete information. Researchers employ various methods to test whether people use these rules. The rules have been shown to work well under most circumstances, but in certain cases can lead to systematic errors or cognitive biases.[7]

History

The study of heuristics in human decision-making was developed in the 1970s and the 1980s by the psychologists Amos Tversky and Daniel Kahneman,[8] although the concept had originally been introduced by the Nobel laureate Herbert A. Simon. Simon's original, primary object of research was problem solving, and he showed that we operate within what he called bounded rationality. He coined the term satisficing, which denotes a situation in which people seek solutions, or accept choices or judgements, that are "good enough" for their purposes although they could be optimised.[9]

Rudolf Groner analysed the history of heuristics from its roots in ancient Greece up to contemporary work in cognitive psychology and artificial intelligence,[10] proposing a cognitive style "heuristic versus algorithmic thinking", which can be assessed by means of a validated questionnaire.[11]

Adaptive toolbox

Gerd Gigerenzer and his research group argued that models of heuristics need to be formal to allow for predictions of behavior that can be tested.[12] They study the fast and frugal heuristics in the "adaptive toolbox" of individuals or institutions, and the ecological rationality of these heuristics; that is, the conditions under which a given heuristic is likely to be successful.[13] The descriptive study of the "adaptive toolbox" is done by observation and experiment, while the prescriptive study of ecological rationality requires mathematical analysis and computer simulation. Heuristics – such as the recognition heuristic, the take-the-best heuristic and fast-and-frugal trees – have been shown to be effective in predictions, particularly in situations of uncertainty. It is often said that heuristics trade accuracy for effort, but this is only the case in situations of risk. Risk refers to situations where all possible actions, their outcomes and probabilities are known. In the absence of this information, that is, under uncertainty, heuristics can achieve higher accuracy with lower effort.[14] This finding, known as a less-is-more effect, would not have been found without formal models. The valuable insight of this program is that heuristics are effective not despite their simplicity but because of it. Furthermore, Gigerenzer and Wolfgang Gaissmaier found that both individuals and organisations rely on heuristics in an adaptive way.[15]
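
As a rough illustration of the kind of formal model this program has in mind, here is a minimal Python sketch of a take-the-best-style decision rule: compare two options on cues ordered by validity and decide on the first cue that discriminates. The cue names, values, and validity ordering below are invented for illustration and are not drawn from Gigerenzer's work.

    # Take-the-best (sketch): compare two options on cues ordered by validity
    # and decide as soon as one cue discriminates between them.
    # Cue values and the validity ordering below are invented for illustration.
    def take_the_best(option_a, option_b, cues_by_validity):
        for cue in cues_by_validity:
            a, b = option_a.get(cue, 0), option_b.get(cue, 0)
            if a != b:        # first discriminating cue decides
                return "A" if a > b else "B"
        return None           # no cue discriminates: guess

    city_a = {"has_airport": 1, "is_capital": 0, "has_university": 1}
    city_b = {"has_airport": 1, "is_capital": 1, "has_university": 1}
    # Cues ordered from most to least valid (hypothetical ordering):
    print(take_the_best(city_a, city_b, ["has_airport", "is_capital", "has_university"]))
    # -> "B" (decided by the first cue on which the two cities differ)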

Cognitive-experiential self-theory

Heuristics, through greater refinement and research, have begun to be applied to other theories, or to be explained by them. For example, the cognitive-experiential self-theory (CEST) is also an adaptive view of heuristic processing. CEST describes two systems that process information. At some times, roughly speaking, individuals consider issues rationally, systematically, logically, deliberately, effortfully and verbally. On other occasions, individuals consider issues intuitively, effortlessly, globally, and emotionally.[16] From this perspective, heuristics are part of a larger experiential processing system that is often adaptive, but vulnerable to error in situations that require logical analysis.[17]

Attribute substitution

In 2002, Daniel Kahneman and Shane Frederick proposed that cognitive heuristics work by a process called attribute substitution, which happens without conscious awareness.[18] According to this theory, when somebody makes a judgement (of a "target attribute") that is computationally complex, a more easily calculated "heuristic attribute" is substituted. In effect, a cognitively difficult problem is dealt with by answering a rather simpler problem, without being aware of this happening.[18] This theory explains cases where judgements fail to show regression toward the mean.[19] Heuristics can be considered to reduce the complexity of clinical judgments in health care.[20]

Psychology

Heuristics are the processes by which humans use mental shortcuts to arrive at decisions. Heuristics are simple strategies that humans, animals,[21][22][23] organizations,[24] and even machines[25] use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution.[26][27][28][29] While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate.[30] Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete.[31] In that sense they can differ from answers given by logic and probability.

The economist and cognitive psychologist Herbert A. Simon introduced the concept of heuristics in the 1950s, suggesting there were limitations to rational decision making. In the 1970s, psychologists Amos Tversky and Daniel Kahneman added to the field with their research on cognitive bias. It was their work that introduced specific heuristic models, a field which has only expanded since. While some argue that pure laziness is behind the heuristics process, others argue that it can be more accurate than decisions based on every known factor and consequence, the less-is-more effect.

Philosophy

A heuristic device is used when an entity X exists to enable understanding of, or knowledge concerning, some other entity Y.

A good example is a model that, as it is never identical with what it models, is a heuristic device to enable understanding of what it models. Stories, metaphors, etc., can also be termed heuristic in this sense. A classic example is the notion of utopia as described in Plato's best-known work, The Republic. This means that the "ideal city" as depicted in The Republic is not given as something to be pursued, or to present an orientation-point for development. Rather, it shows how things would have to be connected, and how one thing would lead to another (often with highly problematic results), if one opted for certain principles and carried them through rigorously.

Heuristic is also often used as a noun to describe a rule-of-thumb, procedure, or method.[32] Philosophers of science have emphasised the importance of heuristics in creative thought and the construction of scientific theories.[33] Seminal works include Karl Popper's The Logic of Scientific Discovery and others by Imre Lakatos,[34] Lindley Darden, and William C. Wimsatt.

Law

In legal theory, especially in the theory of law and economics, heuristics are used in the law when case-by-case analysis would be impractical, insofar as "practicality" is defined by the interests of a governing body.[35]

The present securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects. For instance, in all states in the United States the legal drinking age for unsupervised persons is 21 years, because it is argued that people need to be mature enough to make decisions involving the risks of alcohol consumption. However, assuming people mature at different rates, the specific age of 21 would be too late for some and too early for others. In this case, the somewhat arbitrary delineation is used because it is impossible or impractical to tell whether an individual is sufficiently mature for society to trust them with that kind of responsibility. Some proposed changes, however, have included the completion of an alcohol education course rather than the attainment of 21 years of age as the criterion for legal alcohol possession. This would put youth alcohol policy more on a case-by-case basis and less on a heuristic one, since the completion of such a course would presumably be voluntary and not uniform across the population.

The same reasoning applies to patent law. Patents are justified on the grounds that inventors must be protected so they have incentive to invent. It is therefore argued that it is in society's best interest that inventors receive a temporary government-granted monopoly on their idea, so that they can recoup investment costs and make economic profit for a limited period. In the United States, the length of this temporary monopoly is 20 years from the date the patent application was filed, though the monopoly does not actually begin until the application has matured into a patent. However, like the drinking-age problem above, the specific length of time would need to be different for every product to be efficient. A 20-year term is used because it is difficult to tell what the number should be for any individual patent. More recently, some, including University of North Dakota law professor Eric E. Johnson, have argued that patents in different kinds of industries – such as software patents – should be protected for different lengths of time.[36]

Stereotyping

Stereotyping is a type of heuristic that people use to form opinions or make judgements about things they have never seen or experienced.[37] Stereotypes work as a mental shortcut to assess everything from the social status of a person (based on their actions)[2] to whether a plant is a tree, based on the assumption that it is tall, has a trunk and has leaves (even though the person making the evaluation might never have seen that particular type of tree before).
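
As a toy illustration of this kind of shortcut, the "is it a tree?" judgement above can be written as a crude rule over a few observable features. The feature names and the rule itself are invented for illustration, and, like any stereotype, the rule will misclassify some cases.

    # Crude "is it a tree?" heuristic from a few observable features.
    # Feature names and the rule are invented for illustration; like any
    # stereotype, it will be wrong for many real plants.
    def looks_like_a_tree(plant: dict) -> bool:
        return plant.get("tall", False) and plant.get("has_trunk", False) and plant.get("has_leaves", False)

    print(looks_like_a_tree({"tall": True, "has_trunk": True, "has_leaves": True}))    # -> True
    print(looks_like_a_tree({"tall": False, "has_trunk": False, "has_leaves": True}))  # -> False (e.g., a shrub)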

Stereotypes, as first described by journalist Walter Lippmann in his book Public Opinion (1922), are the pictures we have in our heads that are built around experiences as well as what we are told about the world.[38][39]

Artificial intelligence

A heuristic can be used in artificial intelligence systems while searching a solution space. The heuristic is derived by using some function that is put into the system by the designer, or by adjusting the weight of branches based on how likely each branch is to lead to a goal node.
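
A minimal Python sketch of such a heuristic search is shown below, using greedy best-first search on a toy graph: the frontier is ordered by a designer-supplied heuristic that estimates closeness to the goal. The graph and the heuristic values are invented for illustration.

    # Greedy best-first search (sketch): expand the node whose heuristic value
    # suggests it is closest to the goal. Graph and heuristic are made up.
    import heapq

    def greedy_best_first(graph, heuristic, start, goal):
        frontier = [(heuristic[start], start, [start])]
        visited = set()
        while frontier:
            _, node, path = heapq.heappop(frontier)
            if node == goal:
                return path
            if node in visited:
                continue
            visited.add(node)
            for neighbor in graph.get(node, []):
                if neighbor not in visited:
                    heapq.heappush(frontier, (heuristic[neighbor], neighbor, path + [neighbor]))
        return None   # no path found

    graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["G"]}
    heuristic = {"A": 3, "B": 2, "C": 1, "D": 1, "G": 0}   # estimated distance to goal G
    print(greedy_best_first(graph, heuristic, "A", "G"))   # -> ['A', 'C', 'D', 'G']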

Behavioural Economics

In behavioural economics, heuristics refers to the cognitive shortcuts that individuals use to simplify decision-making processes in economic situations. Behavioral economics is a field that integrates insights from psychology and economics to better understand how people make decisions.

Anchoring and adjustment is one of the most extensively researched heuristics in behavioural economics. Anchoring is the tendency of people to make future judgements or conclusions based too heavily on the original information supplied to them. This initial knowledge functions as an anchor, and it can influence future judgements even if the anchor is entirely unrelated to the decisions at hand. Adjustment, on the other hand, is the process through which individuals make gradual changes to their initial judgements or conclusions.

Anchoring and adjustment has been observed in a wide range of decision-making contexts, including financial decision-making, consumer behavior, and negotiation. Researchers have identified a number of strategies that can be used to mitigate the effects of anchoring and adjustment, including providing multiple anchors, encouraging individuals to generate alternative anchors, and providing cognitive prompts to encourage more deliberative decision-making.

Other heuristics studied in behavioral economics include the representativeness heuristic, which refers to the tendency of individuals to categorize objects or events based on how similar they are to typical examples,[40] and the availability heuristic, which refers to the tendency of individuals to judge the likelihood of an event based on how easily it comes to mind.[41]

Types of Heuristics

Availability Heuristic

According to Tversky & Kahneman (1973), the availability heuristic can be described as the tendency to consider events that people can recall more easily as more likely to occur than events that are more difficult to recall.[42] An example of this would be asking someone whether they believe they are more likely to be bitten by a shark or to die in a drowning incident. Someone may quickly answer with the incorrect belief that death by shark attack is more likely, since such events are more easily remembered and more often make big stories on the news than drowning incidents. In reality, the odds of drowning (1 in 1,134) are far higher than the odds of being bitten by a shark (1 in 4,332,817).[43]
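
For a rough sense of the gap between those two figures, here is a one-line calculation using the odds exactly as quoted above:

    # Compare the quoted lifetime odds: drowning (1 in 1,134) vs. shark bite (1 in 4,332,817).
    p_drown = 1 / 1134
    p_shark = 1 / 4332817
    print(round(p_drown / p_shark))   # -> 3821, i.e. drowning is roughly 3,800 times more likely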

Representative Heuristic

The representativeness heuristic refers to the cognitive bias where people rely on their preconceived mental image or prototype of a particular category or concept, rather than actual probabilities and statistical data, for making judgments. This behavior often leads to stereotyping and generalization from limited information, causing errors as well as distorted views about reality.[44]

For instance, when trying to guess someone's occupation based on their appearance, a representative heuristic might be used by assuming that an individual in a suit must be either a lawyer or businessperson while assuming that someone in uniform fits the police officer or soldier category. This shortcut could sometimes be useful but may also result in stereotypes and overgeneralizations.


References


  • Myers, David G. (2010). Social psychology (Tenth ed.). New York, NY: McGraw-Hill. p. 94. ISBN 978-0-07337-066-8. OCLC 667213323.

  • "Heuristics—Explanation and examples". Conceptually. Retrieved 23 October 2019.

  • Pearl, Judea (1983). Heuristics: Intelligent Search Strategies for Computer Problem Solving. New York, NY: Addison-Wesley. p. vii. ISBN 978-0-201-05594-8.

  • Emiliano, Ippoliti (2015). Heuristic Reasoning: Studies in Applied Philosophy, Epistemology and Rational Ethics. Switzerland: Springer International Publishing. pp. 1–2. ISBN 978-3-319-09159-4.

  • Sunstein, Cass (2005). "Moral Heuristics". The Behavioral and Brain Sciences. 28 (4): 531–542. doi:10.1017/S0140525X05000099. PMID 16209802. S2CID 231738548.

  • Pólya, George (1945) How to Solve It: A New Aspect of Mathematical Method, Princeton, NJ: Princeton University Press. ISBN 0-691-02356-5 ISBN 0-691-08097-6

  • Gigerenzer, Gerd (1991). "How to Make Cognitive Illusions Disappear: Beyond "Heuristics and Biases"" (PDF). European Review of Social Psychology. 2: 83–115. CiteSeerX 10.1.1.336.9826. doi:10.1080/14792779143000033. Retrieved 14 October 2012.

  • Kahneman, Daniel; Slovic, Paul; Tversky, Amos, eds. (30 April 1982). Judgment Under Uncertainty. Cambridge, UK: Cambridge University Press. doi:10.1017/cbo9780511809477. ISBN 978-0-52128-414-1.

  • Heuristics and heuristic evaluation. Interaction-design.org. Retrieved 1 September 2013.

  • Groner, Rudolf; Groner, Marina; Bischof, Walter F. (1983). Methods of Heuristics. Hillsdale, NJ: Lawrence Erlbaum.

  • Groner, Rudolf; Groner, Marina (1991). "Heuristische versus algorithmische Orientierung als Dimension des individuellen kognitiven Stils" [Heuristic versus algorithmic orientation as a dimension of the individual cognitive style]. In K. Grawe; N. Semmer; R. Hänni (eds.). Über die richtige Art, Psychologie zu betreiben [About the right way to do psychology] (in German). Göttingen: Hogrefe. ISBN 978-3-80170-415-5.

  • Gigerenzer, Gerd; Todd, Peter M.; and the ABC Research Group (1999). Simple Heuristics That Make Us Smart. Oxford, UK: Oxford University Press. ISBN 978-0-19512-156-8.

  • Gigerenzer, Gerd; Selten, Reinhard, eds. (2002). Bounded Rationality: The Adaptive Toolbox. Cambridge, MA: MIT Press. ISBN 978-0-26257-164-7.

  • Gigerenzer, Gerd; Hertwig, Ralph; Pachur, Thorsten (15 April 2011). Heuristics: The Foundations of Adaptive Behavior. Oxford University Press. doi:10.1093/acprof:oso/9780199744282.001.0001. hdl:11858/00-001M-0000-0024-F172-8. ISBN 978-0-19989-472-7.

  • Gigerenzer, Gerd; Gaissmaier, Wolfgang (January 2011). "Heuristic Decision Making". Annual Review of Psychology. 62: 451–482. doi:10.1146/annurev-psych-120709-145346. hdl:11858/00-001M-0000-0024-F16D-5. PMID 21126183. SSRN 1722019.

  • De Neys, Wim (18 October 2008). "Cognitive experiential self theory". Perspectives on Psychological Science. 7 (1): 28–38. doi:10.1177/1745691611429354. PMID 26168420. S2CID 32261626. Archived from the original on 31 July 2013.

  • Epstein, S.; Pacini, R.; Denes-Raj, V.; Heier, H. (1996). "Individual differences in intuitive-experiential and analytical-rational thinking styles". Journal of Personality and Social Psychology. 71 (2): 390–405. doi:10.1037/0022-3514.71.2.390. PMID 8765488.

  • Kahneman, Daniel; Frederick, Shane (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich; Dale Griffin; Daniel Kahneman (eds.). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge, UK: Cambridge University Press. pp. 49–81. ISBN 978-0-52179-679-8. OCLC 47364085.

  • Kahneman, Daniel (December 2003). "Maps of Bounded Rationality: Psychology for Behavioral Economics" (PDF). American Economic Review. 93 (5): 1449–1475. CiteSeerX 10.1.1.194.6554. doi:10.1257/000282803322655392. ISSN 0002-8282. Archived from the original (PDF) on 19 February 2018.

  • Cioffi, Jane (1997). "Heuristics, servants to intuition, in clinical decision making". Journal of Advanced Nursing. 26 (1): 203–208. doi:10.1046/j.1365-2648.1997.1997026203.x. PMID 9231296.

  • Marsh, Barnaby (2002-01-01). "Do Animals Use Heuristics?". Journal of Bioeconomics. 4 (1): 49–56. doi:10.1023/A:1020655022163. ISSN 1573-6989. S2CID 142852213.

  • Gigerenzer, Gerd; Brighton, Henry (2009). "Homo Heuristicus: Why Biased Minds Make Better Inferences". Topics in Cognitive Science. 1 (1): 107–143. doi:10.1111/j.1756-8765.2008.01006.x. hdl:11858/00-001M-0000-0024-F678-0. ISSN 1756-8765. PMID 25164802.

  • Hutchinson, John M. C.; Gigerenzer, Gerd (2005-05-31). "Simple heuristics and rules of thumb: Where psychologists and behavioural biologists might meet". Behavioural Processes. Proceedings of the meeting of the Society for the Quantitative Analyses of Behavior (SQAB 2004). 69 (2): 97–124. doi:10.1016/j.beproc.2005.02.019. ISSN 0376-6357. PMID 15845293. S2CID 785187.

  • Gigerenzer, Gerd; Gaissmaier, Wolfgang (2011). "Heuristic Decision Making". Annual Review of Psychology. 62 (1): 451–482. doi:10.1146/annurev-psych-120709-145346. hdl:11858/00-001M-0000-0024-F16D-5. PMID 21126183.

  • Braun, T.D.; Siegal, H.J.; Beck, N.; Boloni, L.L.; Maheswaran, M.; Reuther, A.I.; Robertson, J.P.; Theys, M.D.; Bin Yao; Hensgen, D.; Freund, R.F. (1999). "A comparison study of static mapping heuristics for a class of meta-tasks on heterogeneous computing systems". Proceedings. Eighth Heterogeneous Computing Workshop (HCW'99). IEEE Comput. Soc: 15–29. doi:10.1109/hcw.1999.765093. hdl:10945/35227. ISBN 0-7695-0107-9. S2CID 2860157.

  • Alan, Lewis (2018). The Cambridge Handbook of Psychology and Economic Behavior. Cambridge University Press. p. 43. ISBN 978-0-521-85665-2.

  • Lori, Harris (2007). CliffsAP Psychology. John Wiley & Sons. p. 65. ISBN 978-0-470-19718-9.

  • Nevid, Jeffery (2008). Psychology: Concepts and Applications. Cengage Learning. p. 251. ISBN 978-0-547-14814-4.

  • Gigerenzer, Gerd; Brighton, Henry (2009). "Homo heuristicus: why biased minds make better inferences". Topics in Cognitive Science. 1 (1): 107–143. doi:10.1111/j.1756-8765.2008.01006.x. hdl:11858/00-001M-0000-0024-F678-0. ISSN 1756-8765. PMID 25164802.

  • Goldstein, E. Bruce (2018-07-23). Cognitive psychology : connecting mind, research, and everyday experience. ISBN 978-1-337-40827-1. OCLC 1055681278.

  • Scholz, R. W. (1983-11-01). Decision Making under Uncertainty: Cognitive Decision Research, Social Interaction, Development and Epistemology. Elsevier. ISBN 978-0-08-086670-3.

  • Jaszczolt, K. M. (2006). "Defaults in Semantics and Pragmatics". Stanford Encyclopedia of Philosophy. ISSN 1095-5054.

  • Frigg, Roman; Hartmann, Stephan (2006). "Models in Science". Stanford Encyclopedia of Philosophy. ISSN 1095-5054.

  • Kiss, Olga (2006). "Heuristic, Methodology or Logic of Discovery? Lakatos on Patterns of Thinking". Perspectives on Science. 14 (3): 302–317. doi:10.1162/posc.2006.14.3.302. S2CID 57559578.

  • Gigerenzer, Gerd; Engel, Christoph, eds. (2007). Heuristics and the Law. Cambridge, MA: MIT Press. ISBN 978-0-262-07275-5.

  • Johnson, Eric E. (2006). "Calibrating Patent Lifetimes" (PDF). Santa Clara Computer & High Technology Law Journal. 22: 269–314. Archived from the original (PDF) on 2011-10-05.

  • Bodenhausen, Galen V.; et al. (1999). "On the Dialectics of Discrimination: Dual Processes in Social Stereotyping". In Chaiken, Shelly; Trope, Yaacov (eds.). Dual-process Theories in Social Psychology. New York, NY: Guilford Press. pp. 271–292. ISBN 978-1-57230-421-5.

  • Kleg, Milton (1993). Hate Prejudice and Racism. Albany, NY: State University of New York Press. p. 135. ISBN 978-0-79141-536-8.

  • Gökçen, Sinan (20 November 2007). "Pictures in Our Heads". European Roma Rights Centre. Retrieved 24 March 2015.

  • Bhatia, Sudeep (2015). "Conceptualizing and studying linguistic representations across multiple levels of analysis: The case of L2 processing research" (PDF). Cognitive Science. 39: 122–148. Retrieved 2023-04-20.

  • Dale, Sarah (2015). "Heuristics and biases: The science of decision-making". Business Information Review. 32 (2): 93–99. doi:10.1177/0266382115592536.

  • Tversky, Amos; Kahneman, Daniel (1973-09-01). "Availability: A heuristic for judging frequency and probability". Cognitive Psychology. 5 (2): 207–232. doi:10.1016/0010-0285(73)90033-9. ISSN 0010-0285.

  • "Shark Attack Statistics - Frequency & Fatality Worldwide". 2023-02-02. Retrieved 2023-05-09.

  • Kahneman, Daniel; Tversky, Amos (July 1973). "On the psychology of prediction". Psychological Review. 80 (4): 237–251. doi:10.1037/h0034747. ISSN 1939-1471.


https://en.wikipedia.org/wiki/Heuristic

    Continental philosophy is a term used to describe some philosophers and philosophical traditions that do not fall under the umbrella of analytic philosophy. However, there is no academic consensus on the definition of continental philosophy. Prior to the twentieth century, the term "continental" was used broadly to refer to philosophy from continental Europe.[1][2] A different use of the term originated among English-speaking philosophers in the second half of the 20th century, who used it to refer to a range of thinkers and traditions outside the analytic movement.[3] Continental philosophy includes German idealism, phenomenology, existentialism (and its antecedents, such as the thought of Kierkegaard and Nietzsche), hermeneutics, structuralism, post-structuralism, deconstruction, French feminism, psychoanalytic theory, and the critical theory of the Frankfurt School as well as branches of Freudian, Hegelian and Western Marxist views.[4]

    The term continental philosophy lacks clear definition and may mark merely a family resemblance across disparate philosophical views. Simon Glendinning has suggested that the term was originally more pejorative than descriptive, functioning as a label for types of western philosophy rejected or disliked by analytic philosophers.[5] Nonetheless, Michael E. Rosen has ventured to identify common themes that typically characterize continental philosophy.[6] The themes proposed by Michael E. Rosen derive from a broadly Kantian thesis that knowledge, experience, and reality are bound and shaped by conditions best understood through philosophical reflection rather than exclusively empirical inquiry.[7]

    Definition

    The term continental philosophy, in the above sense, was first widely used by English-speaking philosophers to describe university courses in the 1970s, emerging as a collective name for the philosophies then widespread in France and Germany, such as phenomenology, existentialism, structuralism, and post-structuralism.[8]

    However, the term (and its approximate sense) can be found at least as early as 1840, in John Stuart Mill's 1840 essay on Coleridge, where Mill contrasts the Kantian-influenced thought of "Continental philosophy" and "Continental philosophers" with the English empiricism of Bentham and the 18th century generally.[9] This notion gained prominence in the early 20th century as figures such as Bertrand Russell and G. E. Moore advanced a vision of philosophy closely allied with natural science, progressing through logical analysis. This tradition, which has come to be known broadly as analytic philosophy, became dominant in Britain and the United States from roughly 1930 onward. Russell and Moore made a dismissal of Hegelianism and its philosophical relatives a distinctive part of their new movement.[10] Commenting on the history of the distinction in 1945, Russell distinguished "two schools of philosophy, which may be broadly distinguished as the Continental and the British respectively," a division he saw as operative "from the time of Locke;" Russell proposes the following broad points of distinction between Continental and British types of philosophy:[11]

    1. in method, deductive system-building vs. piecemeal induction;
    2. in metaphysics, rationalist theology vs. metaphysical agnosticism;
    3. in ethics, non-naturalist deontology vs. naturalist hedonism; and
    4. in politics, authoritarianism vs. liberalism.

    Since the 1970s, however, many philosophers in the United States and Britain have taken interest in continental philosophers since Kant, and the philosophical traditions in many European countries have similarly incorporated many aspects of the "analytic" movement. Self-described analytic philosophy flourishes in France, including philosophers such as Jules Vuillemin, Vincent Descombes, Gilles Gaston Granger, François Recanati, and Pascal Engel. Likewise, self-described "continental philosophers" can be found in philosophy departments in the United Kingdom, North America, and Australia.[12] "Continental philosophy" is thus defined in terms of a family of philosophical traditions and influences rather than a geographic distinction. The issue of geographical specificity has been raised again more recently in post-colonial and decolonial approaches to "continental philosophy," which critically examine the ways that European imperial and colonial projects have influenced academic knowledge production. For this reason, some scholars[who?] have advocated for "post-continental philosophy" as an outgrowth of continental philosophy.[13]

    Characteristics

    The term continental philosophy, like analytic philosophy, lacks a clear definition and may mark merely a family resemblance across disparate philosophical views. Simon Glendinning has suggested that the term was originally more pejorative than descriptive, functioning as a label for types of western philosophy rejected or disliked by analytic philosophers.[5] Nonetheless, Michael E. Rosen has ventured to identify common themes that typically characterize continental philosophy:[6]

    1. Continental philosophers generally reject the view that the natural sciences are the only or most accurate way of understanding natural phenomena. This contrasts with many analytic philosophers who consider their inquiries as continuous with, or subordinate to, those of the natural sciences. Continental philosophers often argue that science depends upon a "pre-theoretical substrate of experience" (a version of Kantian conditions of possible experience or the phenomenological "lifeworld") and that scientific methods are inadequate to fully understand such conditions of intelligibility.[14]
    2. Continental philosophy usually considers these conditions of possible experience as variable: determined at least partly by factors such as context, space and time, language, culture, or history. Thus continental philosophy tends toward historicism (or historicity). Where analytic philosophy tends to treat philosophy in terms of discrete problems, capable of being analyzed apart from their historical origins (much as scientists consider the history of science inessential to scientific inquiry), continental philosophy typically suggests that "philosophical argument cannot be divorced from the textual and contextual conditions of its historical emergence."[15]
    3. Continental philosophy typically holds that human agency can change these conditions of possible experience: "if human experience is a contingent creation, then it can be recreated in other ways."[16] Thus continental philosophers tend to take a strong interest in the unity of theory and practice, and often see their philosophical inquiries as closely related to personal, moral, or political transformation. This tendency is very clear in the Marxist tradition ("philosophers have only interpreted the world, in various ways; the point, however, is to change it"), but is also central in existentialism and post-structuralism.
    4. A final characteristic trait of continental philosophy is an emphasis on metaphilosophy. In the wake of the development and success of the natural sciences, continental philosophers have often sought to redefine the method and nature of philosophy.[17] In some cases (such as German idealism or phenomenology), this manifests as a renovation of the traditional view that philosophy is the first, foundational, a priori science. In other cases (such as hermeneutics, critical theory, or structuralism), it is held that philosophy investigates a domain that is irreducibly cultural or practical. And some continental philosophers (such as Kierkegaard, Nietzsche, or the later Heidegger) doubt whether any conception of philosophy can coherently achieve its stated goals.

    Ultimately, the foregoing themes derive from a broadly Kantian thesis that knowledge, experience, and reality are bound and shaped by conditions best understood through philosophical reflection rather than exclusively empirical inquiry.[7]

    History

    The history of continental philosophy (taken in the narrower sense of "late modern/contemporary continental philosophy") is usually thought to begin with German idealism.[i] Led by figures like Fichte, Schelling, and later Hegel, German idealism developed out of the work of Immanuel Kant in the 1780s and 1790s and was closely linked with romanticism and the revolutionary politics of the Enlightenment. Besides the central figures listed above, important contributors to German idealism also included Friedrich Heinrich Jacobi, Gottlob Ernst Schulze, Karl Leonhard Reinhold, and Friedrich Schleiermacher.


    As the institutional roots of "continental philosophy" in many cases directly descend from those of phenomenology,[ii] Edmund Husserl has always been a canonical figure in continental philosophy. Nonetheless, Husserl is also a respected subject of study in the analytic tradition.[18] Husserl's notion of a noema, the non-psychological content of thought, his correspondence with Gottlob Frege, and his investigations into the nature of logic continue to generate interest among analytic philosophers.

    J. G. Merquior argued that a distinction between analytic and continental philosophies can be first clearly identified with Henri Bergson (1859–1941), whose wariness of science and elevation of intuition paved the way for existentialism.[19] Merquior wrote: "the most prestigious philosophizing in France took a very dissimilar path [from the Anglo-Germanic analytic schools]. One might say it all began with Henri Bergson."[19]

    An illustration of some important differences between analytic and continental styles of philosophy can be found in Rudolf Carnap's "Elimination of Metaphysics through Logical Analysis of Language" (1932; "Überwindung der Metaphysik durch Logische Analyse der Sprache"), a paper some observers[who?] have described as particularly polemical. Carnap's paper argues that Heidegger's lecture "What Is Metaphysics?" violates logical syntax to create nonsensical pseudo-statements.[20][21] Moreover, Carnap claimed that many German metaphysicians of the era were similar to Heidegger in writing statements that were syntactically meaningless.

    With the rise of Nazism, many of Germany's philosophers, especially those of Jewish descent or leftist or liberal political sympathies (such as many in the Vienna Circle and the Frankfurt School), fled to the English-speaking world. Those philosophers who remained—if they remained in academia at all—had to reconcile themselves to Nazi control of the universities. Others, such as Martin Heidegger, among the most prominent German philosophers to stay in Germany, aligned themselves with Nazism when it came to power.

    20th-century French philosophy

    Both before and after World War II there was a growth of interest in German philosophy in France. A new interest in communism translated into an interest in Marx and Hegel, who were both studied extensively for the first time in the politically conservative French university system of the Third Republic. At the same time the phenomenological philosophy of Husserl and Heidegger became increasingly influential, perhaps owing to its resonances with French philosophies which placed great stock in the first-person perspective (an idea found in divergent forms such as Cartesianism, spiritualism, and Bergsonism). Most important in this popularization of phenomenology was the author and philosopher Jean-Paul Sartre, who called his philosophy existentialism.

    Another major strain of continental thought is structuralism/post-structuralism. Influenced by the structural linguistics of Ferdinand de Saussure, French anthropologists such as Claude Lévi-Strauss began to apply the structural paradigm to the humanities. In the 1960s and '70s, post-structuralists developed various critiques of structuralism. Post-structuralist thinkers include Jacques Derrida and Gilles Deleuze. After this wave, which spanned most of the late 20th century, the tradition has been carried into the 21st century by Quentin Meillassoux, Tristan Garcia, Francois Laruelle, and others.

    Recent Anglo-American developments

    From the early 20th century until the 1960s, continental philosophers were only intermittently discussed in British and American universities, despite an influx of continental philosophers, particularly German Jewish students of Nietzsche and Heidegger, to the United States on account of the persecution of the Jews and later World War II; Hannah Arendt, Herbert Marcuse, Leo Strauss, Theodor W. Adorno, and Walter Kaufmann are probably the most notable of this wave, arriving in the late 1930s and early 1940s. However, philosophy departments began offering courses in continental philosophy in the late 1960s and 1970s.

    Continental Philosophy features prominently in a number of British and Irish Philosophy departments, for instance at the University of Essex, Warwick, Newcastle, Sussex, Dundee, Aberdeen (Centre for Modern Thought), and University College Dublin; as well as Manchester Metropolitan, Kingston, Staffordshire (postgraduate only), and the Open University.

    American university departments in literature, the fine arts, film, sociology, and political theory have increasingly incorporated ideas and arguments from continental philosophers into their curricula and research. North American Philosophy departments offering courses in Continental Philosophy include the University of Hawaiʻi at Mānoa, Boston College, Stony Brook University, Vanderbilt University, DePaul University, Villanova University, the University of Guelph, The New School, Pennsylvania State University, University of Oregon, Emory University, University of Pittsburgh, Duquesne University, the University of Memphis, University of King's College, and Loyola University Chicago. The most prominent organization for continental philosophy in the United States is the Society for Phenomenology and Existential Philosophy (SPEP).[22]



    References

    Notes


  • Critchley 2001 and Solomon 1988 date the origins of continental philosophy a generation earlier, to the work of Jean-Jacques Rousseau.

  • E.g., the largest academic organization devoted to furthering the study of continental philosophy is the Society for Phenomenology and Existential Philosophy.

    Citations


  • Leiter 2007, p. 2: "As a first approximation, we might say that philosophy in Continental Europe in the nineteenth and twentieth centuries is best understood as a connected weave of traditions, some of which overlap, but no one of which dominates all the others."

  • Critchley, Simon (1998). "Introduction: what is continental philosophy?". In Critchley, Simon; Schroder, William (eds.). A Companion to Continental Philosophy. Blackwell Companions to Philosophy. Malden, MA: Blackwell Publishing Ltd. p. 4.

  • Critchley 2001, p. 32: "As such, Continental philosophy is an invention, or, more accurately, a projection of the Anglo-American academy onto a Continental Europe.."

  • The above list includes only those movements common to both lists compiled by Critchley 2001, p. 13 and Glendinning 2006, pp. 58–65.

  • Glendinning 2006, p. 12.

  • Rosen, Michael E. "Continental Philosophy from Hegel." In Philosophy 2: Further through the Subject, edited by A. C. Grayling. p. 665.

  • Continental philosophers usually identify such conditions with the transcendental subject or self: Solomon 1988, p. 6, "It is with Kant that philosophical claims about the self attain new and remarkable proportions. The self becomes not just the focus of attention but the entire subject-matter of philosophy. The self is not just another entity in the world, but in an important sense it creates the world, and the reflecting self does not just know itself, but in knowing itself knows all selves, and the structure of any and every possible self."

  • Critchley 2001, p. 38.

  • Mill, John Stuart (1950). On Bentham and Coleridge. Harper Torchbooks. New York: Harper & Row. pp. 104, 133, 155.

  • Russell, Bertrand (1959). My Philosophical Development. London: Allen & Unwin. p. 62. Hegelians had all kinds of arguments to prove this or that was not 'real'. Number, space, time, matter, were all professedly convicted of being self-contradictory. Nothing was real, so we were assured, except the Absolute, which could think only of itself since there was nothing else for it to think of and which thought eternally the sort of things that idealist philosophers thought in their books.

  • Russell, Bertrand. 1945. A History of Western Philosophy. Simon & Schuster. pp. 641, 643–47.

  • See, e.g., Brogan, Walter, and James Risser, eds. 2000. American Continental Philosophy: A Reader. Indiana University Press.

  • Laurie, Timothy; Stark, Hannah; Walker, Briohny (2019). "Critical Approaches to Continental Philosophy: Intellectual Community, Disciplinary Identity, and the Politics of Inclusion". Parrhesia: A Journal of Critical Philosophy. 30: 1–17.

  • Critchley 2001, p. 115.

  • Critchley 2001, p. 57.

  • Critchley 2001, p. 64.

  • Leiter 2007, p. 4: "While forms of philosophical naturalism have been dominant in Anglophone philosophy, the vast majority of authors within the Continental traditions insist on the distinctiveness of philosophical methods and their priority to those of the natural sciences."

  • Kenny, Anthony, ed. The Oxford Illustrated History of Western Philosophy. ISBN 0-19-285440-2.

  • Merquior, J. G. 1987. Foucault, Fontana Modern Masters series. University of California Press. ISBN 0-520-06062-8.

  • Gregory, Wanda T. 2001. "Heidegger, Carnap and Quine at the Crossroads of Language." Current Studies in Phenomenology and Hermeneutics 1(Winter). Archived from the original 2006-08-21.

  • Stone, Abraham D. 2005. "Heidegger and Carnap on the Overcoming of Metaphysics." Chapter 8 in Martin Heidegger, edited by S. Mulhall. doi:10.4324/9781315249636.

    1. "| Society for Phenomenology and Existential Philosophy". www.spep.org. Retrieved 2021-02-28.

    Sources

    • Babich, Babette (2003). "On the Analytic-Continental Divide in Philosophy: Nietzsche's Lying Truth, Heidegger's Speaking Language, and Philosophy." In: C. G. Prado, ed., A House Divided: Comparing Analytic and Continental Philosophy. Amherst, New York: Prometheus/Humanity Books. pp. 63–103.
    • Critchley, Simon (2001). Continental Philosophy: A Very Short Introduction. Oxford; New York: Oxford University Press. ISBN 978-0-19-285359-2.
    • Cutrofello, Andrew (2005). Continental Philosophy: A Contemporary Introduction. Routledge Contemporary Introductions to Philosophy. New York; Abingdon: Routledge Taylor & Francis Group.
    • Glendinning, Simon (2006). The idea of continental philosophy: a philosophical chronicle. Edinburgh: Edinburgh University Press Ltd.
    • Leiter, Brian; Rosen, Michael, eds. (2007). The Oxford Handbook of Continental Philosophy. Oxford; New York: Oxford University Press.
    • Schrift, Alan D. (2010). The History of Continental Philosophy. 8 Volumes. Chicago, Illinois: University of Chicago Press.
    • Solomon, Robert C. (1988). Continental philosophy since 1750: the rise and fall of the self. Oxford; New York: Oxford University Press.
    • Kenny, Anthony (2007). A New History of Western Philosophy, Volume IV: Philosophy in the Modern World. New York: Oxford University Press.

    External links

     https://en.wikipedia.org/wiki/Continental_philosophy

    https://en.wikipedia.org/wiki/Category:Western_philosophy

    Anthropology is the scientific study of humanity, concerned with human behavior, human biology, cultures, societies, and linguistics, in both the present and past, including past human species.[1][2][3] Social anthropology studies patterns of behavior, while cultural anthropology studies cultural meaning, including norms and values.[1][2][3] A portmanteau term sociocultural anthropology is commonly used today.[4] Linguistic anthropology studies how language influences social life. Biological or physical anthropology studies the biological development of humans.[1][2][3]

    Archaeological anthropology, often termed "anthropology of the past", studies human activity through investigation of physical evidence.[5][6] It is considered a branch of anthropology in North America and Asia, while in Europe, archaeology is viewed as a discipline in its own right or grouped under other related disciplines, such as history and palaeontology.[7] 

    https://en.wikipedia.org/wiki/Anthropology

    https://en.wikipedia.org/wiki/History_of_anthropology

    Theoretical linguistics is a term in linguistics which,[1] like the related term general linguistics,[2] can be understood in different ways. Both can be taken as a reference to theory of language, or the branch of linguistics which inquires into the nature of language and seeks to answer fundamental questions as to what language is, or what the common ground of all languages is.[2] The goal of theoretical linguistics can also be the construction of a general theoretical framework for the description of language.[1]

    Another use of the term depends on the organisation of linguistics into different sub-fields. The term theoretical linguistics is commonly juxtaposed with applied linguistics.[3] This perspective implies that the aspiring language professional, e.g. a trainee teacher, must first learn the theory, i.e. the properties of the linguistic system, or what Ferdinand de Saussure called internal linguistics.[4] This is followed by practice, or studies in the applied field. The dichotomy is not without its problems, however, because language pedagogy, language technology and other aspects of applied linguistics include theory, too.[3]

    Similarly, the term general linguistics is used to distinguish core linguistics from other types of study. However, because college and university linguistics is largely distributed across the institutes and departments of a relatively small number of national languages, some larger universities also offer courses and research programmes in 'general linguistics' which may cover exotic and minority languages, cross-linguistic studies and various other topics outside the scope of the main philological departments.[5]

    Fields of linguistics proper

    When the concept of theoretical linguistics is taken as referring to core or internal linguistics, it means the study of the parts of the language system. This traditionally means phonology, morphology, syntax and semantics. Pragmatics and discourse can also be included; delimitation varies between institutions. Furthermore, Saussure's definition of general linguistics consists of the dichotomy of synchronic and diachronic linguistics, thus including historical linguistics as a core issue.[4]

    Linguistic theories

    There are various frameworks of linguistic theory which include a general theory of language and a general theory of linguistic description. Current humanistic approaches include theories within structural linguistics and functional linguistics. Evolutionary linguistics includes various frameworks of generative grammar and cognitive linguistics.

    References


  • Hamp, Eric P.; Ivić, Pavle; Lyons, John (2020). Linguistics. Encyclopædia Britannica, inc. ISBN 9783110289770. Retrieved 2020-08-03.

  • Graffi, Giorgio (2009). "20th century linguistics: overview of trends". Concise Encyclopedia of Philosophy of Language and Linguistics. Elsevier. pp. 780–794. ISBN 9780080965017.

  • Harris, Tony (2001). "Linguistics in applied linguistics: a historical overview". Journal of English Studies. 3 (2): 99–114. doi:10.18172/jes.72. Retrieved 2020-08-03.

  • de Saussure, Ferdinand (1959) [First published 1916]. Course in General Linguistics (PDF). New York: Philosophical Library. ISBN 9780231157278. Archived from the original (PDF) on 2019-08-08. Retrieved 2020-08-03.

    1. "General linguistics". University of Helsinki. 2020. Retrieved 2020-08-03.

     https://en.wikipedia.org/wiki/Theoretical_linguistics

    Discourse analysis (DA), or discourse studies, is an approach to the analysis of written, vocal, or sign language use, or any significant semiotic event.

    The objects of discourse analysis (discourse, writing, conversation, communicative event) are variously defined in terms of coherent sequences of sentences, propositions, speech, or turns-at-talk. Contrary to much of traditional linguistics, discourse analysts not only study language use 'beyond the sentence boundary' but also prefer to analyze 'naturally occurring' language use, not invented examples.[1] Text linguistics is a closely related field. The essential difference between discourse analysis and text linguistics is that discourse analysis aims at revealing socio-psychological characteristics of a person/persons rather than text structure.[2]

    Discourse analysis has been taken up in a variety of disciplines in the humanities and social sciences, including linguistics, education, sociology, anthropology, social work, cognitive psychology, social psychology, area studies, cultural studies, international relations, human geography, environmental science, communication studies, biblical studies, public relations, argumentation studies, and translation studies, each of which is subject to its own assumptions, dimensions of analysis, and methodologies.

    https://en.wikipedia.org/wiki/Discourse_analysis

    In linguistics and related fields, pragmatics is the study of how context contributes to meaning. The field of study evaluates how human language is utilized in social interactions, as well as the relationship between the interpreter and the interpreted.[1] Linguists who specialize in pragmatics are called pragmaticians. The field has been represented since 1986 by the International Pragmatics Association (IPrA).

    Pragmatics encompasses phenomena including implicature, speech acts, relevance and conversation,[2] as well as nonverbal communication. Theories of pragmatics go hand-in-hand with theories of semantics, which studies aspects of meaning, and syntax which examines sentence structures, principles, and relationships. The ability to understand another speaker's intended meaning is called pragmatic competence.[3][4][5] Pragmatics emerged as its own subfield in the 1950s after the pioneering work of J.L. Austin and Paul Grice.[6][7] 

    https://en.wikipedia.org/wiki/Pragmatics

    Semiotics (also called semiotic studies) is the systematic study of sign processes (semiosis) and meaning making. Semiosis is any activity, conduct, or process that involves signs, where a sign is defined as anything that communicates something, usually called a meaning, to the sign's interpreter. The meaning can be intentional, such as a word uttered with a specific meaning; or unintentional, such as a symptom being a sign of a particular medical condition. Signs can also communicate feelings (which are usually not considered meanings) and may communicate internally (through thought itself) or through any of the senses: visual, auditory, tactile, olfactory, or gustatory (taste). Contemporary semiotics is a branch of science that studies meaning-making and various types of knowledge.[1]

    The semiotic tradition explores the study of signs and symbols as a significant part of communications. Unlike linguistics, semiotics also studies non-linguistic sign systems. Semiotics includes the study of signs and sign processes, indication, designation, likeness, analogy, allegory, metonymy, metaphor, symbolism, signification, and communication.

    Semiotics is frequently seen as having important anthropological and sociological dimensions; for example the Italian semiotician and novelist Umberto Eco proposed that every cultural phenomenon may be studied as communication.[2] Some semioticians focus on the logical dimensions of the science, however. They examine areas also belonging to the life sciences—such as how organisms make predictions about, and adapt to, their semiotic niche in the world (see semiosis). Fundamental semiotic theories take signs or sign systems as their object of study; applied semiotics analyzes cultures and cultural artifacts according to the ways they construct meaning through their being signs. The communication of information in living organisms is covered in biosemiotics (including zoosemiotics and phytosemiotics).

    Semiotics is not to be confused with the Saussurean tradition called semiology, which is a subset of semiotics.[3][4] 

    https://en.wikipedia.org/wiki/Semiotics

    Information is an abstract concept that refers to that which has the power to inform. At the most fundamental level, information pertains to the interpretation (perhaps formally) of that which may be sensed, or their abstractions. Any natural process that is not completely random and any observable pattern in any medium can be said to convey some amount of information. Whereas digital signals and other data use discrete signs to convey information, other phenomena and artefacts such as analogue signals, poems, pictures, music or other sounds, and currents convey information in a more continuous form.[1] Information is not knowledge itself, but the meaning that may be derived from a representation through interpretation.[2]

    Information is often processed iteratively: Data available at one step are processed into information to be interpreted and processed at the next step. For example, in written text each symbol or letter conveys information relevant to the word it is part of, each word conveys information relevant to the phrase it is part of, each phrase conveys information relevant to the sentence it is part of, and so on until at the final step information is interpreted and becomes knowledge in a given domain. In a digital signal, bits may be interpreted into the symbols, letters, numbers, or structures that convey the information available at the next level up. The key characteristic of information is that it is subject to interpretation and processing.

    The concept of information is relevant in various contexts,[3] including those of constraint, communication, control, data, form, education, knowledge, meaning, understanding, mental stimuli, pattern, perception, proposition, representation, and entropy.

    The derivation of information from a signal or message may be thought of as the resolution of ambiguity or uncertainty that arises during the interpretation of patterns within the signal or message.[4]

    Information may be structured as data. Redundant data can be compressed up to an optimal size, which is the theoretical limit of compression.

    The information available through a collection of data may be derived by analysis. For example, data may be collected from a single customer's order at a restaurant. The information available from many orders may be analyzed, and then becomes knowledge that is put to use when the business subsequently is able to identify the most popular or least popular dish.[5]

    Information can be transmitted in time, via data storage, and space, via communication and telecommunication.[6] Information is expressed either as the content of a message or through direct or indirect observation. That which is perceived can be construed as a message in its own right, and in that sense, all information is always conveyed as the content of a message.

    Information can be encoded into various forms for transmission and interpretation (for example, information may be encoded into a sequence of signs, or transmitted via a signal). It can also be encrypted for safe storage and communication.

    The uncertainty of an event is measured by its probability of occurrence. Uncertainty is inversely proportional to the probability of occurrence. Information theory takes advantage of this by concluding that more uncertain events require more information to resolve their uncertainty. The bit is a typical unit of information. It is 'that which reduces uncertainty by half'.[7] Other units such as the nat may be used. For example, the information encoded in one "fair" coin flip is log2(2/1) = 1 bit, and in two fair coin flips is log2(4/1) = 2 bits. A 2011 Science article estimates that 97% of technologically stored information was already in digital bits in 2007 and that the year 2002 was the beginning of the digital age for information storage (with digital storage capacity bypassing analogue for the first time).[8]
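
    As a rough illustration of the arithmetic above, the self-information of an event can be computed as the negative base-2 logarithm of its probability. The short Python sketch below is written for this post rather than taken from the cited sources, and simply reproduces the coin-flip figures:

        import math

        def information_content(probability: float) -> float:
            """Self-information, in bits, of an event with the given probability."""
            return -math.log2(probability)

        # One fair coin flip: two equally likely outcomes, p = 1/2 -> 1.0 bit
        print(information_content(1 / 2))

        # Two independent fair flips: four equally likely outcomes, p = 1/4 -> 2.0 bits
        print(information_content(1 / 4))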

    Etymology

    The English word "information" comes from Middle French enformacion/informacion/information 'a criminal investigation' and its etymon, Latin informatiō(n) 'conception, teaching, creation'.[9]

    In English, "information" is an uncountable mass noun.

    https://en.wikipedia.org/wiki/Information


    Category:Formal sciences

    A formal science is an abstract science that is studied through an axiomatic system. A formal science is not the same thing as a field of study that has been formally or officially recognized as a science. It does not include applied sciences (like engineering) or hard sciences (like chemistry).

     https://en.wikipedia.org/wiki/Category:Formal_sciences

    Dynamic semantics is a framework in logic and natural language semantics that treats the meaning of a sentence as its potential to update a context. In static semantics, knowing the meaning of a sentence amounts to knowing when it is true; in dynamic semantics, knowing the meaning of a sentence means knowing "the change it brings about in the information state of anyone who accepts the news conveyed by it."[1] In dynamic semantics, sentences are mapped to functions called context change potentials, which take an input context and return an output context. Dynamic semantics was originally developed by Irene Heim and Hans Kamp in 1981 to model anaphora, but has since been applied widely to phenomena including presupposition, plurals, questions, discourse relations, and modality.[2] 
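
    As a very rough sketch of this idea (and not of Heim's or Kamp's actual systems), a context can be modelled as a set of possible worlds and a context change potential as a function that discards the worlds incompatible with an accepted sentence. The toy Python below is invented purely for illustration:

        # Toy model: a "world" is a frozenset of the atomic sentences true in it,
        # and a context is the set of worlds still treated as live options.
        def ccp(atom):
            """Context change potential for an atomic sentence: a function mapping
            an input context to the output context in which the sentence holds."""
            def update(context):
                return {world for world in context if atom in world}
            return update

        w1 = frozenset({"rain", "wind"})
        w2 = frozenset({"rain"})
        w3 = frozenset({"wind"})

        context = {w1, w2, w3}
        context = ccp("rain")(context)   # accepting "it is raining" eliminates w3
        context = ccp("wind")(context)   # accepting "it is windy" eliminates w2
        print(context)                   # only w1 remains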

    https://en.wikipedia.org/wiki/Dynamic_semantics

    In linguistics, anaphora (/əˈnæfərə/) is the use of an expression whose interpretation depends upon another expression in context (its antecedent or postcedent). In a narrower sense, anaphora is the use of an expression that depends specifically upon an antecedent expression and thus is contrasted with cataphora, which is the use of an expression that depends upon a postcedent expression. The anaphoric (referring) term is called an anaphor. For example, in the sentence Sally arrived, but nobody saw her, the pronoun her is an anaphor, referring back to the antecedent Sally. In the sentence Before her arrival, nobody saw Sally, the pronoun her refers forward to the postcedent Sally, so her is now a cataphor (and an anaphor in the broader, but not the narrower, sense). Usually, an anaphoric expression is a pro-form or some other kind of deictic (contextually dependent) expression.[1] Both anaphora and cataphora are species of endophora, referring to something mentioned elsewhere in a dialog or text.

    Anaphora is an important concept for different reasons and on different levels: first, anaphora indicates how discourse is constructed and maintained; second, anaphora binds different syntactical elements together at the level of the sentence; third, anaphora presents a challenge to natural language processing in computational linguistics, since the identification of the reference can be difficult; and fourth, anaphora partially reveals how language is understood and processed, which is relevant to fields of linguistics interested in cognitive psychology.[2] 
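
    To give a sense of why reference identification is hard for software, here is a deliberately naive sketch, invented for this post, that links each pronoun to the nearest preceding capitalized token. Real coreference resolution requires syntactic, semantic, and world knowledge that this heuristic ignores.

        PRONOUNS = {"he", "she", "it", "him", "her", "they", "them"}

        def resolve_anaphora(tokens):
            """Guess an antecedent for each pronoun: the nearest preceding name-like token."""
            links = {}
            candidates = []                      # positions of capitalized tokens seen so far
            for i, tok in enumerate(tokens):
                if tok.lower() in PRONOUNS:
                    if candidates:
                        links[i] = candidates[-1]
                elif tok[:1].isupper():
                    candidates.append(i)
            return links

        tokens = "Sally arrived , but nobody saw her".split()
        print(resolve_anaphora(tokens))          # {6: 0}: "her" is linked back to "Sally"

    On the cataphoric example above ("Before her arrival, nobody saw Sally") this heuristic fails outright, which is part of the point.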

    https://en.wikipedia.org/wiki/Anaphora_(linguistics)

    https://en.wikipedia.org/wiki/Computational_linguistics

    In linguistics and philosophy, the denotation of an expression is its literal meaning. For instance, the English word "warm" denotes the property of being warm. Denotation is contrasted with other aspects of meaning including connotation. For instance, the word "warm" may evoke calmness or cosiness, but these associations are not part of the word's denotation. Similarly, an expression's denotation is separate from pragmatic inferences it may trigger. For instance, describing something as "warm" often implicates that it is not hot, but this is once again not part of the word's denotation.

    Denotation plays a major role in several fields. Within philosophy of language, denotation is studied as an important aspect of meaning. In mathematics and computer science, the assignment of denotations to expressions is a crucial step in defining interpreted formal languages. The main task of formal semantics is to reverse engineer the computational system which assigns denotations to expressions of natural languages.
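
    As a small, invented example of the computer-science usage, the sketch below defines a toy formal language of arithmetic expressions together with a denotation function that maps each expression to the number it denotes:

        def denote(expr):
            """Denotation function for a toy arithmetic language: numerals denote
            themselves; ("+", a, b) and ("*", a, b) denote the sum and product
            of the denotations of their parts."""
            if isinstance(expr, int):
                return expr
            op, left, right = expr
            if op == "+":
                return denote(left) + denote(right)
            if op == "*":
                return denote(left) * denote(right)
            raise ValueError(f"unknown operator: {op!r}")

        # The expression (2 + 3) * 4 denotes the number 20.
        print(denote(("*", ("+", 2, 3), 4)))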

    https://en.wikipedia.org/wiki/Denotation

    Humanities are academic disciplines that study aspects of human society and culture. In the Renaissance, the term contrasted with divinity and referred to what is now called classics, the main area of secular study in universities at the time. Today, the humanities are more frequently defined as any fields of study outside of natural sciences, social sciences, formal sciences (like mathematics) and applied sciences (or professional training).[1] They use methods that are primarily critical, or speculative, and have a significant historical element[2]—as distinguished from the mainly empirical approaches of the natural sciences;[2] yet, unlike the sciences, there is no general history of humanities as a distinct discipline in its own right.[3]

    The humanities include the studies of foreign languages, history, philosophy, language arts (literature, writing, oratory, rhetoric, poetry, etc.), performing arts (theater, music, dance, etc.), and visual arts (painting, sculpture, photography, filmmaking, etc.); culinary art or cookery is interdisciplinary and may be considered both a humanity and a science. Some definitions of the humanities include law and religion,[4] but these are not universally accepted. Although anthropology, archaeology, geography, linguistics, logic, and sociology share some similarities with the humanities, these are widely considered sciences; similarly economics, finance, and political science are not typically considered humanities.

    Scholars in the humanities are called humanities scholars or sometimes humanists.[5] (The term humanist also describes the philosophical position of humanism, which antihumanist scholars in the humanities reject. Renaissance scholars and artists are also known as humanists.) Some secondary schools offer humanities classes usually consisting of literature, global studies, and art.

    Human disciplines like history and language mainly use the comparative method[6] and comparative research. Other methods used in the humanities include hermeneutics, source criticism, esthetic interpretation, and speculative reason.

    https://en.wikipedia.org/wiki/Humanities

    History

    History is systematically collected information about the past. When used as the name of a field of study, history refers to the study and interpretation of the record of humans, societies, institutions, and any topic that has changed over time.

    Traditionally, the study of history has been considered a part of the humanities. In modern academia, history can occasionally be classified as a social science, though this definition is contested. 

    https://en.wikipedia.org/wiki/Humanities

    The modern division of philosophy into theoretical philosophy and practical philosophy[1][2] has its origin in Aristotle's categories of natural philosophy and moral philosophy.[3] The one has theory for its object, and the other practice.[1]

    https://en.wikipedia.org/wiki/Theoretical_philosophy

    https://en.wikipedia.org/w/index.php?title=Speculative_reason&redirect=no

    In the branch of linguistics known as pragmatics, a presupposition (or PSP) is an implicit assumption about the world or background belief relating to an utterance whose truth is taken for granted in discourse. Examples of presuppositions include:

    • Jane no longer writes fiction.
      • Presupposition: Jane once wrote fiction.
    • Have you stopped eating meat?
      • Presupposition: You had once eaten meat.
    • Have you talked to Hans?
      • Presupposition: Hans exists.

    A presupposition must be mutually known or assumed by the speaker and addressee for the utterance to be considered appropriate in context. It will generally remain a necessary assumption whether the utterance is placed in the form of an assertion, denial, or question, and can be associated with a specific lexical item or grammatical feature (presupposition trigger) in the utterance.

    Crucially, negation of an expression does not change its presuppositions: I want to do it again and I don't want to do it again both presuppose that the subject has done it already one or more times; My wife is pregnant and My wife is not pregnant both presuppose that the subject has a wife. In this respect, presupposition is distinguished from entailment and implicature. For example, The president was assassinated entails that The president is dead, but if the expression is negated, the entailment is not necessarily true.
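
    The negation test can be made concrete with a toy data structure (invented here, and one that merely records rather than derives the inferences): negating a sentence keeps its presuppositions but discards its entailments.

        class Sentence:
            """Toy record of a sentence along with its presuppositions and entailments."""
            def __init__(self, text, presuppositions, entailments):
                self.text = text
                self.presuppositions = set(presuppositions)
                self.entailments = set(entailments)

            def negate(self):
                # Presuppositions project through negation; entailments do not survive it.
                return Sentence("It is not the case that " + self.text.lower(),
                                self.presuppositions, set())

        s = Sentence("The president was assassinated",
                     presuppositions={"There is a president"},
                     entailments={"The president is dead"})

        print(s.negate().presuppositions)   # {'There is a president'} -- still presupposed
        print(s.negate().entailments)       # set() -- no longer entailed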

    https://en.wikipedia.org/wiki/Presupposition

    In formal semantics and philosophy of language, a definite description is a denoting phrase in the form of "the X" where X is a noun phrase or a singular common noun. The definite description is proper if X applies to a unique individual or object. For example, "the first person in space" and "the 42nd President of the United States of America" are proper. The definite descriptions "the person in space" and "the Senator from Ohio" are improper because the noun phrase X applies to more than one thing, and the definite descriptions "the first man on Mars" and "the Senator from some country" are improper because X applies to nothing. Improper descriptions raise some difficult questions about the law of excluded middle, denotation, modality, and mental content.
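
    On the classical Russellian analysis (mentioned here as standard background rather than drawn from the excerpt above), "The X is G" asserts that something is X, that nothing else is X, and that it is G:

        \exists x \, \big( X(x) \wedge \forall y \, ( X(y) \rightarrow y = x ) \wedge G(x) \big)

    A description is proper exactly when the first two conjuncts, existence and uniqueness, can be satisfied; "the person in space" fails uniqueness and "the first man on Mars" fails existence.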

    https://en.wikipedia.org/wiki/Definite_description

    In classical antiquity, including the Hellenistic world of ancient Greece and ancient Rome, historians and archaeologists view the public and private rituals associated with religion as part of everyday life. Examples of this phenomenon are found in the various state and cult temples, Jewish synagogues, and churches. These were important hubs for ancient peoples, representing a connection between the heavenly realms (the divine) and the earthly planes (the dwelling place of humanity). This context of magic has become an academic study, especially in the last twenty years.[1]

    https://en.wikipedia.org/wiki/Magic_in_the_Greco-Roman_world

    In linguistics and philosophy, modality refers to the ways language can express various relationships to reality or truth. For instance, a modal expression may convey that something is likely, desirable, or permissible. Quintessential modal expressions include modal auxiliaries such as "could", "should", or "must"; modal adverbs such as "possibly" or "necessarily"; and modal adjectives such as "conceivable" or "probable". However, modal components have been identified in the meanings of countless natural language expressions, including counterfactuals, propositional attitudes, evidentials, habituals, and generics.

    Modality has been intensely studied from a variety of perspectives. Within linguistics, typological studies have traced crosslinguistic variation in the strategies used to mark modality, with a particular focus on its interaction with tense–aspect–mood marking. Theoretical linguists have sought to analyze both the propositional content and discourse effects of modal expressions using formal tools derived from modal logic. Within philosophy, linguistic modality is often seen as a window into broader metaphysical notions of necessity and possibility. 

    https://en.wikipedia.org/wiki/Modality_(linguistics)

    Indefinite article

    An indefinite article is an article that marks an indefinite noun phrase. Indefinite articles are those such as English "some" or "a", which do not refer to a specific identifiable entity. Indefinites are commonly used to introduce a new discourse referent which can be referred back to in subsequent discussion:

    1. A monster ate a cookie. His name is Cookie Monster.

    Indefinites can also be used to generalize over entities who have some property in common:

    1. A cookie is a wonderful thing to eat.

    Indefinites can also be used to refer to specific entities whose precise identity is unknown or unimportant.

    1. A monster must have broken into my house last night and eaten all my cookies.
    2. A friend of mine told me that happens frequently to people who live on Sesame Street.

    Indefinites also have predicative uses:

    1. Leaving my door unlocked was a bad decision.

    Indefinite noun phrases are widely studied within linguistics, in particular because of their ability to take exceptional scope.

    Proper article

    A proper article indicates that its noun is proper, and refers to a unique entity. It may be the name of a person, the name of a place, the name of a planet, etc. The Māori language has the proper article a, which is used for personal nouns; so, "a Pita" means "Peter". In Māori, when a personal noun has the definite or indefinite article as an important part of it, both articles are present; for example, the phrase "a Te Rauparaha", which contains both the proper article a and the definite article Te, refers to the person named Te Rauparaha.

    The definite article is sometimes also used with proper names, which are already specified by definition (there is just one of them). For example: the Amazon, the Hebrides. In these cases, the definite article may be considered superfluous. Its presence can be accounted for by the assumption that they are shorthand for a longer phrase in which the name is a specifier, i.e. the Amazon River, the Hebridean Islands.[citation needed] Where the nouns in such longer phrases cannot be omitted, the definite article is universally kept: the United States, the People's Republic of China.

    This distinction can sometimes become a political matter: the former usage the Ukraine stressed the word's Russian meaning of "borderlands"; as Ukraine became a fully independent state following the collapse of the Soviet Union, it requested that formal mentions of its name omit the article. Similar shifts in usage have occurred in the names of Sudan and both Congo (Brazzaville) and Congo (Kinshasa); a move in the other direction occurred with The Gambia. In certain languages, such as French and Italian, definite articles are used with all or most names of countries: la France/le Canada/l'Allemagne, l'Italia/la Spagna/il Brasile.

    If a name [has] a definite article, e.g. the Kremlin, it cannot idiomatically be used without it: we cannot say Boris Yeltsin is in Kremlin.

    Some languages use definite articles with personal names, as in Portuguese (a Maria, literally: "the Maria"), Greek (η Μαρία, ο Γιώργος, ο Δούναβης, η Παρασκευή), and Catalan (la Núria, el/en Oriol). Such usage also occurs colloquially or dialectally in Spanish, German, French, Italian and other languages. In Hungarian, the colloquial use of definite articles with personal names, though widespread, is considered to be a Germanism.

    The definite article sometimes appears in American English nicknames such as "the Donald", referring to former president Donald Trump, and "the Gipper", referring to former president Ronald Reagan.[4] 

    https://en.wikipedia.org/wiki/Article_(grammar)#Indefinite_article

    Counterfactual conditionals (also subjunctive or X-marked) are conditional sentences which discuss what would have been true under different circumstances, e.g. "If Peter believed in ghosts, he would be afraid to be here." Counterfactuals are contrasted with indicatives, which are generally restricted to discussing open possibilities. Counterfactuals are characterized grammatically by their use of fake tense morphology, which some languages use in combination with other kinds of morphology including aspect and mood.

    Counterfactuals are one of the most studied phenomena in philosophical logic, formal semantics, and philosophy of language. They were first discussed as a problem for the material conditional analysis of conditionals, which treats them all as trivially true. Starting in the 1960s, philosophers and linguists developed the now-classic possible world approach, in which a counterfactual's truth hinges on its consequent holding at certain possible worlds where its antecedent holds. More recent formal analyses have treated them using tools such as causal models and dynamic semantics. Other research has addressed their metaphysical, psychological, and grammatical underpinnings, while applying some of the resultant insights to fields including history, marketing, and epidemiology. 
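
    A very small sketch in the spirit of the possible-world approach follows; the worlds, propositions, and crude overlap-based similarity measure are all invented for illustration, and actual Lewis/Stalnaker-style semantics is considerably more refined. The idea: "if A, would C" is judged true when C holds at the A-worlds most similar to the actual world.

        def counterfactual(antecedent, consequent, actual, worlds):
            """True iff the consequent holds at the antecedent-worlds closest to `actual`."""
            a_worlds = [w for w in worlds if antecedent in w]
            if not a_worlds:
                return True                          # vacuously true: antecedent holds nowhere

            def similarity(w):
                return len(actual & w)               # crude proxy for closeness to actuality

            best = max(similarity(w) for w in a_worlds)
            closest = [w for w in a_worlds if similarity(w) == best]
            return all(consequent in w for w in closest)

        actual = frozenset({"p", "r"})
        worlds = [actual,
                  frozenset({"a", "c", "r"}),        # an a-world sharing "r" with actuality
                  frozenset({"a"})]                  # a more remote a-world
        print(counterfactual("a", "c", actual, worlds))   # True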

    https://en.wikipedia.org/wiki/Counterfactual_conditional

    In linguistics, evidentiality[1][2] is, broadly, the indication of the nature of evidence for a given statement; that is, whether evidence exists for the statement and if so, what kind. An evidential (also verificational or validational) is the particular grammatical element (affix, clitic, or particle) that indicates evidentiality. Languages with only a single evidential have had terms such as mediative, médiatif, médiaphorique, and indirective used instead of evidential.  

    https://en.wikipedia.org/wiki/Evidentiality

    In linguistics, the aspect of a verb is a grammatical category that defines the temporal flow (or lack thereof) in a given action, event, or state.[1][2] As its name suggests, the habitual aspect (abbreviated HAB), not to be confused with iterative aspect or frequentative aspect, specifies an action as occurring habitually: the subject performs the action usually, ordinarily, or customarily. As such, the habitual aspect provides structural information on the nature of the subject referent, "John smokes" being interpretable as "John is a smoker", "Enjoh habitually gets up early in the morning" as "Enjoh is an early bird". The habitual aspect is a type of imperfective aspect, which does not depict an event as a single entity viewed only as a whole but instead specifies something about its internal temporal structure.

    Östen Dahl found that the habitual past, the most common tense context for the habitual, occurred in only seven of 60 languages sampled, including English.[2]: 101  Especially in Turkic languages such as Azerbaijani and Turkish, he found[2]: 111  that the habitual can occur in combination with the predictive mood.

    https://en.wikipedia.org/wiki/Habitual_aspect

    A haunted house, spook house or ghost house in ghostlore is a house or other building often perceived as being inhabited by disembodied spirits of the deceased who may have been former residents or were otherwise connected with the property. Parapsychologists often attribute haunting to the spirits of the dead who have suffered from violent or tragic events in the building's past such as murder, accidental death, or suicide.[1]

    In a majority of cases, upon scientific investigation, alternative causes to supernatural phenomena are found to be at fault, such as hoaxes, environmental effects, hallucinations or confirmation biases. Common symptoms of hauntings, like cold spots and creaking or knocking sounds, can be found in most homes regardless of suspected paranormal presences. People are more likely to experience a haunting when they are about to fall asleep or are just waking, or if they are intoxicated or sleep-deprived. Carbon monoxide poisoning has been cited as a cause of suspected hauntings. If there is an expectation of a preternatural encounter, it is more likely that one will be perceived or reported. 

    https://en.wikipedia.org/wiki/Haunted_house

    Gothic fiction, sometimes called Gothic horror (primarily in the 20th century), is a loose literary aesthetic of fear and haunting. The name refers to Gothic architecture of the European Middle Ages, which was characteristic of the settings of early Gothic novels.

    The first work to call itself Gothic was Horace Walpole's 1764 novel The Castle of Otranto, later subtitled "A Gothic Story". Subsequent 18th-century contributors included Clara Reeve, Ann Radcliffe, William Thomas Beckford, and Matthew Lewis. The Gothic influence continued into the early 19th century; works by the Romantic poets, and novelists such as Mary Shelley, Charles Maturin, Walter Scott and E. T. A. Hoffmann frequently drew upon gothic motifs in their works.

    The early Victorian period continued the use of gothic aesthetic in novels by Charles Dickens and the Brontë sisters, as well as works by the American writers Edgar Allan Poe and Nathaniel Hawthorne. Later well-known works were Dracula by Bram Stoker, Richard Marsh's The Beetle and Robert Louis Stevenson's Strange Case of Dr. Jekyll and Mr. Hyde. Twentieth-century contributors include Daphne du Maurier, Stephen King, Shirley Jackson, Anne Rice, and Toni Morrison.

     

    The ruins of Wolf's Crag castle in Walter Scott's The Bride of Lammermoor (1819)

    The Castle of Otranto (1764) is regarded as the first Gothic novel. The aesthetics of the book have shaped modern-day gothic books, films, art, music and the goth subculture.[1]

     

    https://en.wikipedia.org/wiki/Gothic_fiction

    A monastery is a building or complex of buildings comprising the domestic quarters and workplaces of monastics, monks or nuns, whether living in communities or alone (hermits). A monastery generally includes a place reserved for prayer which may be a chapel, church, or temple, and may also serve as an oratory, or in the case of communities anything from a single building housing only one senior and two or three junior monks or nuns, to vast complexes and estates housing tens or hundreds. A monastery complex typically comprises a number of buildings which include a church, dormitory, cloister, refectory, library, balneary and infirmary, and outlying granges. Depending on the location, the monastic order and the occupation of its inhabitants, the complex may also include a wide range of buildings that facilitate self-sufficiency and service to the community. These may include a hospice, a school, and a range of agricultural and manufacturing buildings such as a barn, a forge, or a brewery.

    In English usage, the term monastery is generally used to denote the buildings of a community of monks. In modern usage, convent tends to be applied only to institutions of female monastics (nuns), particularly communities of teaching or nursing religious sisters. Historically, a convent denoted a house of friars (reflecting the Latin), now more commonly called a friary. Various religions may apply these terms in more specific ways. 

    https://en.wikipedia.org/wiki/Monastery

    A fairy tale (alternative names include fairytale, fairy story, magic tale, or wonder tale) is a short story that belongs to the folklore genre.[1] Such stories typically feature magic, enchantments, and mythical or fanciful beings. In most cultures, there is no clear line separating myth from folk or fairy tale; all these together form the literature of preliterate societies.[2] Fairy tales may be distinguished from other folk narratives such as legends (which generally involve belief in the veracity of the events described)[3] and explicit moral tales, including beast fables. Prevalent elements include dwarfs, dragons, elves, fairies, giants, gnomes, goblins, griffins, mermaids, talking animals, trolls, unicorns, monsters, witches, wizards, and magic and enchantments.

    In less technical contexts, the term is also used to describe something blessed with unusual happiness, as in "fairy-tale ending" (a happy ending)[4] or "fairy-tale romance". Colloquially, the term "fairy tale" or "fairy story" can also mean any far-fetched story or tall tale; it is used especially of any story that not only is not true, but could not possibly be true. Legends are perceived as real within their culture; fairy tales may merge into legends, where the narrative is perceived both by teller and hearers as being grounded in historical truth. However, unlike legends and epics, fairy tales usually do not contain more than superficial references to religion and to actual places, people, and events; they take place "once upon a time" rather than in actual times.[5]

    Fairy tales occur both in oral and in literary form; the name "fairy tale" ("conte de fées" in French) was first ascribed to them by Madame d'Aulnoy in the late 17th century. Many of today's fairy tales have evolved from centuries-old stories that have appeared, with variations, in multiple cultures around the world.[6]

    The history of the fairy tale is particularly difficult to trace because only the literary forms can survive. Still, according to researchers at universities in Durham and Lisbon, such stories may date back thousands of years, some to the Bronze Age.[7][8] Fairy tales, and works derived from fairy tales, are still written today.

    The Jatakas are probably the oldest collection of such tales in literature, and the greater part of the rest are demonstrably more than a thousand years old. It is certain that much (perhaps one-fifth) of the popular literature of modern Europe is derived from those portions of this large bulk which came west with the Crusades through the medium of Arabs and Jews.[9]

    Folklorists have classified fairy tales in various ways. The Aarne-Thompson-Uther classification system and the morphological analysis of Vladimir Propp are among the most notable. Other folklorists have interpreted the tales' significance, but no school has been definitively established for the meaning of the tales. 

    https://en.wikipedia.org/wiki/Fairy_tale

    The underworld, also known as the netherworld, is the supernatural world of the dead in various religious traditions and myths, located below the world of the living.[1] Chthonic is the technical adjective for things of the underworld.

    The concept of an underworld is found in almost every civilization and "may be as old as humanity itself".[2] Common features of underworld myths are accounts of living people making journeys to the underworld, often for some heroic purpose. Other myths reinforce traditions that entrance of souls to the underworld requires a proper observation of ceremony, such as the ancient Greek story of the recently dead Patroclus haunting Achilles until his body could be properly buried for this purpose.[3] Persons having social status were dressed and equipped in order to better navigate the underworld.[4]

    A number of mythologies incorporate the concept of the soul of the deceased making its own journey to the underworld, with the dead needing to be taken across a defining obstacle such as a lake or a river to reach this destination.[5] Imagery of such journeys can be found in both ancient and modern art. The descent to the underworld has been described as "the single most important myth for Modernist authors".[6] 

    https://en.wikipedia.org/wiki/Underworld

    Mictlan (Nahuatl pronunciation: [ˈmikt͡ɬaːn]) is the underworld of Aztec mythology. Most people who died were believed to travel to Mictlan, although other destinations were also possible.[1] Mictlan consists of nine distinct levels.[1]

    The journey from the first level to the ninth is difficult and takes four years, but the dead are aided by the psychopomp, Xolotl. The dead must pass many challenges, such as crossing a mountain range where the mountains crash into each other, a field with wind that blows flesh-scraping knives, and a river of blood with fearsome jaguars.[citation needed]

    Mictlan also features in the Aztec creation myth. Mictlantecuhtli set a pit to trap Quetzalcoatl. When Quetzalcoatl entered Mictlan seeking bones with which to create humans, Mictlantecuhtli was waiting. He asked Quetzalcoatl to travel around Mictlan four times blowing a conch shell with no holes. Quetzalcoatl eventually put some bees in the conch shell to make sound. Fooled, Mictlantecuhtli showed Quetzalcoatl to the bones. But Quetzalcoatl fell into the pit and some of the bones broke. The Aztecs believed this is why people are of different heights.

    Mictlan is believed to be ruled by King Mictlantecuhtli ("Lord of the Underworld")[2] and his wife, Mictecacihuatl ("Lady of the Underworld").[3]

    Other deities in Mictlan include Cihuacoatl (who commanded Mictlan spirits called Cihuateteo), Acolmiztli, Chalmecacihuilt, Chalmecatl and Acolnahuacatl.[citation needed] 

    https://en.wikipedia.org/wiki/Mictl%C4%81n

    Sacrifice is the offering of material possessions or the lives of animals or humans to a deity as an act of propitiation or worship.[1][2] Evidence of ritual animal sacrifice has been seen at least since the time of the ancient Hebrews and Greeks, and possibly existed before that. Evidence of ritual human sacrifice can also be traced back to at least the pre-Columbian civilizations of Mesoamerica as well as to European civilizations. Varieties of ritual non-human sacrifices are practiced by numerous religions today.  

    https://en.wikipedia.org/wiki/Sacrifice

    In mythology and folklore, a vengeful ghost or vengeful spirit is said to be the spirit of a dead person who returns from the afterlife to seek revenge for a cruel, unnatural or unjust death. In certain cultures where funeral and burial or cremation ceremonies are important, such vengeful spirits may also be considered as unhappy ghosts of individuals who have not been given a proper funeral.[1]

    https://en.wikipedia.org/wiki/Vengeful_ghost

    The veneration of the dead, including one's ancestors, is based on love and respect for the deceased. In some cultures, it is related to beliefs that the dead have a continued existence, and may possess the ability to influence the fortune of the living. Some groups venerate their direct, familial ancestors. Certain sects and religions, in particular the Eastern Orthodox Church and Roman Catholic Church, venerate saints as intercessors with God; the latter also believes in prayer for departed souls in Purgatory. Other religious groups, however, consider veneration of the dead to be idolatry and a sin.

    In European, Asian, Oceanian, African and Afro-diasporic cultures, the goal of ancestor veneration is to ensure the ancestors' continued well-being and positive disposition towards the living, and sometimes to ask for special favours or assistance. The social or non-religious function of ancestor veneration is to cultivate kinship values, such as filial piety, family loyalty, and continuity of the family lineage. Ancestor veneration occurs in societies with every degree of social, political, and technological complexity, and it remains an important component of various religious practices in modern times. 

    https://en.wikipedia.org/wiki/Veneration_of_the_dead

    Animism (from Latin: anima meaning 'breath, spirit, life')[1][2] is the belief that objects, places, and creatures all possess a distinct spiritual essence.[3][4][5][6] Animism perceives all things—animals, plants, rocks, rivers, weather systems, human handiwork, and in some cases words—as animated and alive. Animism is used in anthropology of religion as a term for the belief system of many Indigenous peoples,[7] in contrast to the relatively more recent development of organized religions.[8] Animism focuses on the metaphysical universe, with a specific focus on the concept of the immaterial soul.[9]

    Although each culture has its own mythologies and rituals, animism is said to describe the most common, foundational thread of indigenous peoples' "spiritual" or "supernatural" perspectives. The animistic perspective is so widely held and inherent to most indigenous peoples that they often do not even have a word in their languages that corresponds to "animism" (or even "religion").[10] The term "animism" is an anthropological construct.

    Largely due to such ethnolinguistic and cultural discrepancies, opinions differ on whether animism refers to an ancestral mode of experience common to indigenous peoples around the world or to a full-fledged religion in its own right. The currently accepted definition of animism was only developed in the late 19th century (1871) by Edward Tylor. It is "one of anthropology's earliest concepts, if not the first."[11]

    Animism encompasses beliefs that all material phenomena have agency, that there exists no categorical distinction between the spiritual and physical world, and that soul, spirit, or sentience exists not only in humans but also in other animals, plants, rocks, geographic features (such as mountains and rivers), and other entities of the natural environment. Examples include water sprites, vegetation deities, and tree spirits, among others. Animism may further attribute a life force to abstract concepts such as words, true names, or metaphors in mythology. Some members of the non-tribal world also consider themselves animists, such as author Daniel Quinn, sculptor Lawson Oyekan, and many contemporary Pagans.[12] 

    https://en.wikipedia.org/wiki/Animism

    Numinous (/ˈnjuːmɪnəs/) is a term derived from the Latin numen, meaning "arousing spiritual or religious emotion; mysterious or awe-inspiring."[1] The term was given its present sense by the German theologian and philosopher Rudolf Otto in his influential 1917 German book The Idea of the Holy. He also used the phrase mysterium tremendum as another description for the phenomenon. Otto's concept of the numinous influenced thinkers including Carl Jung, Mircea Eliade, and C. S. Lewis. It has been applied to theology, psychology, religious studies, literary analysis, and descriptions of psychedelic experiences.  

    https://en.wikipedia.org/wiki/Numinous

    In religion, transcendence is the aspect of a deity's nature and power that is completely independent of the material universe, beyond all known physical laws. This is contrasted with immanence, where a god is said to be fully present in the physical world and thus accessible to creatures in various ways. In religious experience, transcendence is a state of being that has overcome the limitations of physical existence, and by some definitions, has also become independent of it. This is typically manifested in prayer, rituals, meditation, psychedelics and paranormal "visions".

    It is affirmed in various religious traditions' concept of the divine, which contrasts with the notion of a god (or, the Absolute) that exists exclusively in the physical order (immanentism), or is indistinguishable from it (pantheism). Transcendence can be attributed to the divine not only in its being, but also in its knowledge. Thus, a god may transcend both the universe and knowledge (is beyond the grasp of the human mind).

    Although transcendence is defined as the opposite of immanence, the two are not necessarily mutually exclusive. Some theologians and metaphysicians of various religious traditions affirm that a god is both within and beyond the universe (panentheism); in it, but not of it; simultaneously pervading it and surpassing it. 

    https://en.wikipedia.org/wiki/Transcendence_(religion)

    In folklore, a revenant is an animated corpse that is believed to have been revived from death to haunt the living.[6][7] The word revenant is derived from the Old French word revenant, "returning" (see also the related French verb revenir, meaning "to come back").

    Revenants are part of the legend of various cultures, including Old Irish Celtic and Norse mythology,[8] and stories of supposed revenant visitations were documented by English historians in the Middle Ages.[9]

    Revenant
    Grouping: Legendary creature
    Sub-grouping: Undead
    Region: The Americas, Europe, Asia, West Indies, Africa[1][2][3][4][5]

     

    https://en.wikipedia.org/wiki/Revenant

    The undead are beings in mythology, legend, or fiction that are deceased but behave as if alive. Most commonly the term refers to corporeal forms of formerly alive humans, such as mummies, vampires, and zombies, who have been reanimated by supernatural means, technology, or disease. In some cases (for example in Dungeons & Dragons) the term also includes incorporeal forms of the dead, such as ghosts.

    The undead are featured in the belief systems of most cultures, and appear in many works of fantasy and horror fiction. The term is also occasionally used for real-life attempts to resurrect the dead with science and technology, from early experiments like Robert E. Cornish's to future sciences such as "chemical brain preservation" and "cryonics." 

    https://en.wikipedia.org/wiki/Undead

    A fetch, based in Irish folklore, is a supernatural double or an apparition of a living person. The sighting of a fetch is regarded as an omen, usually for impending death.  

    https://en.wikipedia.org/wiki/Fetch_(folklore)

    https://en.wikipedia.org/wiki/Category:Counterparts

    Terminology

    The English word ghost continues Old English gāst. Stemming from Proto-Germanic *gaistaz, it is cognate with Old Frisian gāst, Old Saxon gēst, Old Dutch gēst, and Old High German geist. Although this form is not attested in North Germanic and East Germanic languages (the equivalent word in Gothic is ahma, Old Norse has andi m., önd f.), it appears to be a dental suffix derivative of pre-Germanic *ghois-d-oz ('fury, anger'), which is comparable to Sanskrit héḍas ('anger') and Avestan zōižda- ('terrible, ugly'). The prior Proto-Indo-European form is reconstructed as *ǵʰéys-d-os, from the root *ǵʰéys-, which is reflected in Old Norse geisa ('to rage') and *geiski ('fear'; cf. geiskafullr 'full of fear'), in Gothic usgaisjan ('to terrify') and usgaisnan ('to be terrified'), as well as in Avestan zōiš- (cf. zōišnu 'shivering, trembling').[15][16][17]

    The Germanic word is recorded as masculine only, but likely continues a neuter s-stem. The original meaning of the Germanic word would thus have been an animating principle of the mind, in particular capable of excitation and fury (compare óðr). In Germanic paganism, "Germanic Mercury", and the later Odin, was at the same time the conductor of the dead and the "lord of fury" leading the Wild Hunt.

    Besides denoting the human spirit or soul, both of the living and the deceased, the Old English word is used as a synonym of Latin spiritus also in the meaning of "breath" or "blast" from the earliest attestations (9th century). It could also denote any good or evil spirit, such as angels and demons; the Anglo-Saxon gospel refers to the demonic possession of Matthew 12:43 as se unclæna gast. Also from the Old English period, the word could denote the spirit of God, viz. the "Holy Ghost".

    The now-prevailing sense of "the soul of a deceased person, spoken of as appearing in a visible form" only emerges in Middle English (14th century). The modern noun does, however, retain a wider field of application, extending on one hand to "soul", "spirit", "vital principle", "mind", or "psyche", the seat of feeling, thought, and moral judgement; on the other hand used figuratively of any shadowy outline, or fuzzy or unsubstantial image; in optics, photography, and cinematography especially, a flare, secondary image, or spurious signal.[18]

    The synonym spook is a Dutch loanword, akin to Low German spôk (of uncertain etymology); it entered the English language via American English in the 19th century.[19][20][21][22] Alternative words in modern usage include spectre (altn. specter; from Latin spectrum), the Scottish wraith (of obscure origin), phantom (via French ultimately from Greek phantasma, compare fantasy) and apparition. The term shade in classical mythology translates Greek σκιά,[23] or Latin umbra,[24] in reference to the notion of spirits in the Greek underworld. The term poltergeist is a German word, literally a "noisy ghost", for a spirit said to manifest itself by invisibly moving and influencing objects.[25]

    Wraith is a Scots word for ghost, spectre, or apparition. It appeared in Scottish Romanticist literature, and acquired the more general or figurative sense of portent or omen. In 18th- to 19th-century Scottish literature, it also applied to aquatic spirits. The word has no commonly accepted etymology; the OED notes "of obscure origin" only.[26] An association with the verb writhe was the etymology favored by J. R. R. Tolkien.[27] Tolkien's use of the word in the naming of the creatures known as the Ringwraiths has influenced later usage in fantasy literature. Bogey[28] or bogy/bogie is a term for a ghost, and appears in Scottish poet John Mayne's Hallowe'en in 1780.[29][30]

    A revenant is a deceased person returning from the dead to haunt the living, either as a disembodied ghost or alternatively as an animated ("undead") corpse. Also related is the concept of a fetch, the visible ghost or spirit of a person yet alive.

    https://en.wikipedia.org/wiki/Ghost#Terminology

    A doppelgänger[a] (/ˈdɒpəlɡɛŋər, -ɡæŋər/), sometimes spelled as doppelgaenger or doppelganger, is a biologically unrelated look-alike, or a double, of a living person.

    In fiction and mythology, a doppelgänger is often portrayed as a ghostly or paranormal phenomenon and usually seen as a harbinger of bad luck. Other traditions and stories equate a doppelgänger with an evil twin. In modern times, the term twin stranger is occasionally used.[3] 

    Dante Gabriel Rossetti, How They Met Themselves, watercolor, 1864

    https://en.wikipedia.org/wiki/Doppelg%C3%A4nger

    Spelling

    The word doppelganger is a loanword from the German noun Doppelgänger, literally meaning double-walker.[a] The singular and plural forms are the same in German, but English writers usually prefer the plural "doppelgangers". In German, there is also a female form, "Doppelgängerin" (plural: "Doppelgängerinnen"). The first known use, in the slightly different form Doppeltgänger, occurs in the novel Siebenkäs (1796) by Jean Paul, in which he explains his newly coined word in a footnote; the word Doppelgänger also appears in the novel, but with a different meaning.[4]

    In German, the word is written (as is usual with German nouns) with an initial capital letter: Doppelgänger. In English, the word is generally written with a lower-case letter, and the umlaut on the letter "a" is usually dropped: "doppelganger".[5] 

    https://en.wikipedia.org/wiki/Doppelg%C3%A4nger

    Twin strangers

    With the advent of social media, there have been several reported cases of people finding their "twin stranger" online, a modern term for a doppelgänger.[20][21] There are several websites where users can upload a photo of themselves and facial recognition software attempts to match them with another user of like appearance. Some of these sites report that they have found numerous living doppelgängers.[22][23] 

    https://en.wikipedia.org/wiki/Doppelg%C3%A4nger
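
    The face-matching step mentioned above can be pictured concretely. The following Python sketch is only an illustration under stated assumptions, not any site's actual method: it assumes each uploaded photo has already been reduced to a numeric face-embedding vector, and it ranks candidate users by cosine similarity. All names, vectors, and the embedding size are invented placeholders.

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity between two embedding vectors (1.0 means same direction).
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def closest_match(query: np.ndarray, gallery: dict) -> tuple:
        # Return (name, score) for the gallery embedding most similar to the query.
        return max(
            ((name, cosine_similarity(query, emb)) for name, emb in gallery.items()),
            key=lambda pair: pair[1],
        )

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Hypothetical precomputed 128-dimensional embeddings for five users.
        gallery = {f"user_{i}": rng.normal(size=128) for i in range(5)}
        # A query that is a slightly perturbed copy of user_3, i.e. a near look-alike.
        query = gallery["user_3"] + rng.normal(scale=0.05, size=128)
        name, score = closest_match(query, gallery)
        print(name, round(score, 3))  # expected: user_3 with similarity close to 1.0

    In practice, such a service would presumably derive embeddings from a trained face-recognition model and search a large user base with an approximate nearest-neighbour index rather than the brute-force scan shown here.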

    Scientific applications

    Research has found that people who are "true" look-alikes have more similar genes than people who do not look like each other. They share genes affecting not only the face but also some phenotypes of physique and behavior; the findings also indicate that differences in the epigenome and microbiome contribute only modestly to human variability in facial appearance.[31][32]

    Heautoscopy is a term used in psychiatry and neurology for the hallucination of "seeing one's own body at a distance".[33] It can occur as a symptom in schizophrenia[34] and epilepsy, and is considered a possible explanation for doppelgänger phenomena.[35]

    Criminologists find practical applications for the concepts of facial familiarity and similarity because of instances of wrongful conviction based on eyewitness testimony. In one case, a person spent 17 years behind bars, persistently denying any involvement in the crime of which he was accused. He was finally released after someone was found who bore a striking resemblance to him and shared his first name.[36]

    https://en.wikipedia.org/wiki/Doppelg%C3%A4nger

    https://en.wikipedia.org/wiki/Category:Literary_concepts

    In Finnish folklore, all places and things, and also human beings, have a haltija (a genius, guardian spirit) of their own. One such haltija is called etiäinen—an image, doppelgänger, or just an impression that goes ahead of a person, doing things the person in question later does. For example, people waiting at home might hear the door close or even see a shadow or a silhouette, only to realize that no one has yet arrived. Etiäinen can also refer to some kind of a feeling that something is going to happen. Sometimes it could, for example, warn of a bad year coming.[1]

    In modern Finnish, the term has detached from its shamanistic origins and refers to premonition. Unlike clairvoyance, divination, and similar practices, etiäiset (plural) are spontaneous and can't be induced. Quite the opposite, they may be unwanted and cause anxiety, like ghosts. Etiäiset need not be too dramatic and may concern everyday events, although ones related to e.g. deaths are common. As these phenomena are still reported today, they can be considered a living tradition, as a way to explain the psychological experience of premonition. 

    https://en.wikipedia.org/wiki/Eti%C3%A4inen

    Capgras delusion or Capgras syndrome is a psychiatric disorder in which a person holds a delusion that a friend, spouse, parent, another close family member, or pet has been replaced by an identical impostor. It is named after Joseph Capgras (1873–1950), the French psychiatrist who first described the disorder.

    The Capgras delusion is classified as a delusional misidentification syndrome, a class of delusional beliefs that involves the misidentification of people, places, or objects.[2] It can occur in acute, transient, or chronic forms. Cases in which patients hold the belief that time has been "warped" or "substituted" have also been reported.[3]

    The delusion most commonly occurs in individuals diagnosed with schizophrenia but has also been seen in brain injury,[4] dementia with Lewy bodies,[5] and other dementia.[6] It presents often in individuals with a neurodegenerative disease, particularly at an older age.[7] It has also been reported as occurring in association with diabetes, hypothyroidism, and migraine attacks.[8] In one isolated case, the Capgras delusion was temporarily induced in a healthy subject by the drug ketamine.[9] It occurs more frequently in females, with a female to male ratio of approximately 3∶2.[10] 

    https://en.wikipedia.org/wiki/Capgras_delusion

    The syndrome of subjective doubles is a rare delusional misidentification syndrome in which a person experiences the delusion that they have a double or Doppelgänger with the same appearance, but usually with different character traits, that is leading a life of its own.[1][2] The syndrome is also called the syndrome of doubles of the self,[3] delusion of subjective doubles,[1] or simply subjective doubles.[4] Sometimes, the patient is under the impression that there is more than one double.[1] A double may be projected onto any person, from a stranger to a family member.[4]

    This syndrome is often diagnosed during or after the onset of another mental disorder, such as schizophrenia or other disorders involving psychotic hallucinations.[5] There is no widely accepted method of treatment, as most patients require individualized therapy. The prevalence of the syndrome is relatively low, as few cases have been reported since it was defined in 1978 by the Greek psychiatrist George Nikolaos Christodoulou (b. 1935).[5][6] However, subjective doubles is not clearly defined in the literature,[7] and therefore may be under-reported.[5]

    https://en.wikipedia.org/wiki/Syndrome_of_subjective_doubles

    In mythology, folklore and speculative fiction, shape-shifting is the ability to physically transform oneself through an inherently superhuman ability, divine intervention, demonic manipulation, sorcery, spells or having inherited the ability. The idea of shape-shifting is in the oldest forms of totemism and shamanism, as well as the oldest existent literature and epic poems such as the Epic of Gilgamesh and the Iliad. The concept remains a common literary device in modern fantasy, children's literature and popular culture.  

    https://en.wikipedia.org/wiki/Shapeshifting

    The multiverse is the hypothetical set of all universes.[a] Together, these universes are presumed to comprise everything that exists: the entirety of space, time, matter, energy, information, and the physical laws and constants that describe them. The different universes within the multiverse are called "parallel universes", "other universes", "alternate universes", or "many worlds". One common assumption is that the multiverse is a "patchwork quilt of separate universes all bound by the same laws of physics."[1]

    The concept of multiple universes, or a multiverse, has been discussed throughout history, with origins in ancient Greek philosophy. It has evolved over time and has been debated in various fields, including cosmology, physics, and philosophy. Some physicists argue that the multiverse is a philosophical notion rather than a scientific hypothesis, as it cannot be empirically falsified. In recent years, there have been proponents and skeptics of multiverse theories within the physics community. Although some scientists have analyzed data in search of evidence for other universes, no statistically significant evidence has been found. Critics argue that the multiverse concept lacks testability and falsifiability, which are essential for scientific inquiry, and that it raises unresolved metaphysical issues.

    Max Tegmark and Brian Greene have proposed different classification schemes for multiverses and universes. Tegmark's four-level classification consists of Level I: an extension of our universe, Level II: universes with different physical constants, Level III: many-worlds interpretation of quantum mechanics, and Level IV: ultimate ensemble. Brian Greene's nine types of multiverses include quilted, inflationary, brane, cyclic, landscape, quantum, holographic, simulated, and ultimate. The ideas explore various dimensions of space, physical laws, and mathematical structures to explain the existence and interactions of multiple universes. Some other multiverse concepts include twin-world models, cyclic theories, M-theory, and black-hole cosmology.

    The anthropic principle suggests that the existence of a multitude of universes, each with different physical laws, could explain the fine-tuning of our own universe for conscious life. The weak anthropic principle posits that we exist in one of the few universes that support life. Debates around Occam's razor and the simplicity of the multiverse versus a single universe arise, with proponents like Max Tegmark arguing that the multiverse is simpler and more elegant. The many-worlds interpretation of quantum mechanics and modal realism, the belief that all possible worlds exist and are as real as our world, are also subjects of debate in the context of the anthropic principle.

    https://en.wikipedia.org/wiki/Multiverse

    The anthropic principle, also known as the "observation selection effect",[1] is the hypothesis, first proposed in 1957 by Robert Dicke, that the range of possible observations that we could make about the universe is limited by the fact that observations could only happen in a universe capable of developing intelligent life in the first place.[2] Proponents of the anthropic principle argue that it explains why this universe has the age and the fundamental physical constants necessary to accommodate conscious life, since if either had been different, we would not have been around to make observations. Anthropic reasoning is often used to deal with the notion that the universe seems to be finely tuned for the existence of life.[3]

    There are many different formulations of the anthropic principle. Philosopher Nick Bostrom counts them at thirty, but the underlying principles can be divided into "weak" and "strong" forms, depending on the types of cosmological claims they entail. The weak anthropic principle (WAP), as defined by Brandon Carter, states that the universe's ostensible fine tuning is the result of selection bias (specifically survivorship bias). Most such arguments draw upon some notion of the multiverse for there to be a statistical population of universes to select from. However, a single vast universe is sufficient for most forms of the WAP that do not specifically deal with fine tuning. Carter distinguished the WAP from the strong anthropic principle (SAP), which considers the universe in some sense compelled to eventually have conscious and sapient life emerge within it.[4][5] A form of the latter known as the participatory anthropic principle, articulated by John Archibald Wheeler, suggests on the basis of quantum mechanics that the universe, as a condition of its existence, must be observed, so implying one or more observers. Stronger yet is the final anthropic principle (FAP), proposed by John D. Barrow and Frank Tipler, which views the universe's structure as expressible by bits of information in such a way that information processing is inevitable and eternal.[4] 

    https://en.wikipedia.org/wiki/Anthropic_principle

    The Gothic double is a literary motif which refers to the divided personality of a character. Closely linked to the Doppelgänger, which first appeared in the 1796 novel Siebenkäs by Johann Paul Richter, the double figure emerged in Gothic literature in the late 18th century due to a resurgence of interest in mythology and folklore which explored notions of duality, such as the fetch in Irish folklore which is a double figure of a family member, often signifying an impending death.[1]

    A major shift in Gothic literature occurred in the late 18th and early 19th centuries, when evil was no longer confined to a physical location such as a haunted castle, but expanded to inhabit the mind of characters, often referred to as "the haunted individual."[2] Examples of the Gothic double motif in 19th-century texts include Charlotte Brontë's novel Jane Eyre (1847) and Charlotte Perkins Gilman's short story The Yellow Wallpaper (1892), which use the motif to reflect on gender inequalities in the Victorian era,[3] and, famously, Robert Louis Stevenson's novella Strange Case of Dr Jekyll and Mr Hyde (1886).

    In the early 20th century, the Gothic double motif was featured in new media such as film to explore the emerging fear of technology replacing humanity.[4] A notable example of this is the evil mechanical double depicted in Fritz Lang's German expressionist film Metropolis (1927). Texts in this period also appropriate the Gothic double motif present in earlier literature, such as Daphne du Maurier's Gothic romance novel Rebecca (1938), which appropriates the doubling in Jane Eyre.[5] In the 21st century, the Gothic double motif has further been featured in horror and psychological thriller films such as Darren Aronofsky's Black Swan (2010) and Jordan Peele's Us (2019).[1] In addition, it has been used in 21st-century Anthropocene literature, such as Jeff VanderMeer's Annihilation (2014).

    https://en.wikipedia.org/wiki/Gothic_double

    Cloning is the process of producing individual organisms with identical genomes, either by natural or artificial means. In nature, some organisms produce clones through asexual reproduction. In the field of biotechnology, cloning is the process of creating copies of organisms, cells, or DNA fragments.

    https://en.wikipedia.org/wiki/Cloning

    A changeling, also historically referred to as an auf or oaf, is a human-like creature found in folklore throughout Europe. A changeling was believed to be a fairy that had been left in place of a human (typically a child) stolen by other fairies. 

    https://en.wikipedia.org/wiki/Changeling

    Anthropomorphism is the attribution of human traits, emotions, or intentions to non-human entities.[1] It is considered to be an innate tendency of human psychology.[2]

    Personification is the related attribution of human form and characteristics to abstract concepts such as nations, emotions, and natural forces, such as seasons and weather.

    Both have ancient roots as storytelling and artistic devices, and most cultures have traditional fables with anthropomorphized animals as characters. People have also routinely attributed human emotions and behavioral traits to wild as well as domesticated animals.[3] 

    https://en.wikipedia.org/wiki/Anthropomorphism

    Personation (rather than impersonation) is a primarily legal term, meaning 'to assume the identity of another person with intent to deceive'.[1] It is often used for the kind of voter fraud in which an individual votes in an election whilst pretending to be a different elector. It is also used when charging a person who portrays themselves as a police officer.

    Personation appears as a crime in the Canadian Criminal Code with the meaning simply of impersonation.[2]

    In the U.S., the New York State Penal Law defines the crime of false personation as simply the act of pretending to be another, a Class B misdemeanor; those who assume the identity of another in order to further another crime can be charged with second-degree criminal impersonation, a Class A misdemeanor. Posing as a police officer for any reason, or as a physician in order to forge a prescription or otherwise obtain substances so controlled, is first-degree criminal impersonation, a Class E felony.

    Many jurisdictions allow electors to nominate an individual to vote on their behalf, often known as proxy voting. Whilst voting with an invalid proxy form could be considered personation, it is usual for an intent to deceive to be required for such an act to be considered criminal.

    Personation is an offence in law in England and Wales: see English criminal law#Forgery, personation and cheating 

    https://en.wikipedia.org/wiki/Personation

    Factitious disorder imposed on another (FDIA), also known as fabricated or induced illness by carers (FII), and first named as Munchausen syndrome by proxy (MSbP), is a condition in which a caregiver creates the appearance of health problems in another person, typically their child.[7][8] This may include injuring the child or altering test samples.[7] The caregiver then presents the person as being sick or injured.[5] Permanent injury or death of the victim may occur as a result of the disorder.[7] The behaviour occurs without a specific benefit to the caregiver.[5]

    The cause of FDIA is unknown.[2] The primary motive may be to gain attention and manipulate physicians.[4] Risk factors for FDIA include pregnancy-related complications and a mother who was abused as a child or has factitious disorder imposed on self.[3] Diagnosis is supported when removing the child from the caregiver results in improvement of symptoms, or when video surveillance conducted without the caregiver's knowledge reveals concerns.[4] Those affected by the disorder have been subjected to a form of physical abuse and medical neglect.[1]

    Management of FDIA may require putting the child in foster care.[2][4][9] It is not known how effective therapy is for FDIA; it is assumed it may work for those who admit they have a problem.[4] The prevalence of FDIA is unknown,[5] but it appears to be relatively rare.[4] More than 95% of cases involve a person's mother.[3]

    The prognosis for the caregiver is poor.[4] However, there is a burgeoning literature on possible courses of therapy.[3]

    The condition was first named as "Munchausen syndrome by proxy" in 1977 by British pediatrician Roy Meadow.[4] Some aspects of FDIA may represent criminal behavior.[5] 

    https://en.wikipedia.org/wiki/Factitious_disorder_imposed_on_another

    Other uses

    • Proxy or agent (law), a substitute authorized to act for another entity or a document which authorizes the agent so to act
    • Proxy (climate), a measured variable used to infer the value of a variable of interest in climate research
    • Proxy (statistics), a measured variable used to infer the value of a variable of interest
    • Healthcare proxy, a document used to specify an agent to make medical decisions for a patient in case they are incapacitated
    • Proxy bullying (or vicarious bullying), bullying committed on behalf of somebody else
    • Proxy fight, attempting to influence how company shareholders use their proxy votes
    • Proxy marriage, common amongst European monarchs, in which one party is not present in person at the marriage to the other
    • Proxy murder, a murder committed on behalf of somebody else
    • Proxy statement, information published related to a U.S. stockholders' meeting
    • Proxy voting, a vote cast on behalf of an absent person
    • Proxy war, a war where two powers use third parties as a substitute for fighting each other directly
    • Torture by proxy, torturing someone on somebody else's behalf

    https://en.wikipedia.org/wiki/Proxy

    The Wild Hunt is a folklore motif occurring across various northern European cultures (motif E501 per Thompson).[1] Wild Hunts typically involve a chase led by a mythological figure escorted by a ghostly or supernatural group of hunters engaged in pursuit.[2] The leader of the hunt is often a named figure associated with Odin in Germanic legends,[3][4] but may variously be a historical or legendary figure like Theodoric the Great, the Danish king Valdemar Atterdag, the dragon slayer Sigurd, the Welsh psychopomp Gwyn ap Nudd, biblical figures such as Herod, Cain, Gabriel, or the Devil, or an unidentified lost soul or spirit either male or female. The hunters are generally the souls of the dead or ghostly dogs, sometimes fairies, valkyries, or elves.[5][6][7]

    Seeing the Wild Hunt was thought to forebode some catastrophe such as war or plague, or at best the death of the one who witnessed it.[8] People encountering the Hunt might also be abducted to the underworld or the fairy kingdom.[a] In some instances, it was also believed that people's spirits could be pulled away during their sleep to join the cavalcade.[10]

    The concept was developed by Jacob Grimm in his Deutsche Mythologie (1835) on the basis of comparative mythology. Grimm believed that a group of stories represented a folkloristic survival of Germanic pagan tradition, but comparable folk myths are found throughout Northern, Western and Central Europe.[3] Grimm popularised the term Wilde Jagd ('Wild Hunt') for the phenomenon. 

    https://en.wikipedia.org/wiki/Wild_Hunt

    Odin (/ˈoʊdɪn/;[1] from Old Norse: Óðinn) is a widely revered god in Germanic paganism. Norse mythology, the source of most surviving information about him, associates him with wisdom, healing, death, royalty, the gallows, knowledge, war, battle, victory, sorcery, poetry, frenzy, and the runic alphabet, and depicts him as the husband of the goddess Frigg. In wider Germanic mythology and paganism, the god was also known in Old English as Wōden, in Old Saxon as Uuôden, in Old Dutch as Wuodan, in Old Frisian as Wêda, and in Old High German as Wuotan, all ultimately stemming from the Proto-Germanic theonym *Wōðanaz, meaning 'lord of frenzy', or 'leader of the possessed'.

    Odin appears as a prominent god throughout the recorded history of Northern Europe, from the Roman occupation of regions of Germania (from c. 2 BCE) through movement of peoples during the Migration Period (4th to 6th centuries CE) and the Viking Age (8th to 11th centuries CE). In the modern period, the rural folklore of Germanic Europe continued to acknowledge Odin. References to him appear in place names throughout regions historically inhabited by the ancient Germanic peoples, and the day of the week Wednesday bears his name in many Germanic languages, including in English.

    In Old English texts, Odin holds a particular place as a euhemerized ancestral figure among royalty, and he is frequently referred to as a founding figure among various other Germanic peoples, such as the Langobards, while some Old Norse sources depict him as an enthroned ruler of the gods. Forms of his name appear frequently throughout the Germanic record, though narratives regarding Odin are mainly found in Old Norse works recorded in Iceland, primarily around the 13th century. These texts make up the bulk of modern understanding of Norse mythology.

    Old Norse texts portray Odin as the son of Bestla and Borr along with two brothers, Vili and Vé, and he fathered many sons, most famously the gods Thor (with Jörð) and Baldr (with Frigg). He is known by hundreds of names. Odin is frequently portrayed as one-eyed and long-bearded, wielding a spear named Gungnir or appearing in disguise wearing a cloak and a broad hat. He is often accompanied by his animal familiars—the wolves Geri and Freki and the ravens Huginn and Muninn, who bring him information from all over Midgard—and he rides the flying, eight-legged steed Sleipnir across the sky and into the underworld. In these texts he frequently seeks greater knowledge, most famously by obtaining the Mead of Poetry, and makes wagers with his wife Frigg over his endeavors. He takes part both in the creation of the world by slaying the primordial being Ymir and in giving life to the first two humans Ask and Embla. He also provides mankind knowledge of runic writing and poetry, showing aspects of a culture hero. He has a particular association with the Yule holiday.

    Odin is also associated with the divine battlefield maidens, the valkyries, and he oversees Valhalla, where he receives half of those who die in battle, the einherjar, sending the other half to the goddess Freyja's Fólkvangr. Odin consults the disembodied, herb-embalmed head of the wise Mímir, who foretells the doom of Ragnarök and urges Odin to lead the einherjar into battle before being consumed by the monstrous wolf Fenrir. In later folklore, Odin sometimes appears as a leader of the Wild Hunt, a ghostly procession of the dead through the winter sky. He is associated with charms and other forms of magic, particularly in Old English and Old Norse texts.

    The figure of Odin is a frequent subject of interest in Germanic studies, and scholars have advanced numerous theories regarding his development. Some of these focus on Odin's particular relation to other figures; for example, Freyja's husband Óðr appears to be something of an etymological doublet of the god, while Odin's wife Frigg is in many ways similar to Freyja, and Odin has a particular relation to Loki. Other approaches focus on Odin's place in the historical record, exploring whether Odin derives from Proto-Indo-European mythology or developed later in Germanic society. In the modern period, Odin has inspired numerous works of poetry, music, and other cultural expressions. He is venerated with other Germanic gods in most forms of the new religious movement Heathenry; some branches focus particularly on him. 

    https://en.wikipedia.org/wiki/Odin

    For the majority of Christian denominations, the Holy Spirit, or Holy Ghost, is believed to be the third person of the Trinity,[1] a Triune God manifested as God the Father, God the Son, and God the Holy Spirit, each person itself being God.[2][3][4] Nontrinitarian Christians, who reject the doctrine of the Trinity, differ significantly from mainstream Christianity in their beliefs about the Holy Spirit. In Christian theology, pneumatology is the study of the Holy Spirit. Due to Christianity's historical relationship with Judaism, theologians often identify the Holy Spirit with the concept of the Ruach Hakodesh in Jewish scripture, on the theory that Jesus was expanding upon these Jewish concepts. Similar names, and ideas, include the Ruach Elohim (Spirit of God), Ruach YHWH (Spirit of Yahweh), and the Ruach Hakodesh (Holy Spirit).[5][6] In the New Testament it is identified with the Spirit of Christ, the Spirit of Truth, the Paraclete and the Holy Spirit.[7][8][9]

    The New Testament details a close relationship between the Holy Spirit and Jesus during his earthly life and ministry.[10] The Gospels of Matthew and Luke and the Nicene Creed state that Jesus was "conceived by the Holy Spirit, born of the Virgin Mary".[11] The Holy Spirit descended on Jesus like a dove during his baptism, and in his Farewell Discourse after the Last Supper Jesus promised to send the Holy Spirit to his disciples after his departure.[12][13]

    The Holy Spirit is referred to as "the Lord, the Giver of Life" in the Nicene Creed, which summarises several key beliefs held by many Christian denominations. The participation of the Holy Spirit in the tripartite nature of conversion is apparent in Jesus' final post-resurrection instruction to his disciples at the end of the Gospel of Matthew,[14] "Make disciples of all the nations, baptizing them into the name of the Father and of the Son and of the Holy Spirit."[15] Since the first century, Christians have also called upon God with the trinitarian formula "Father, Son and Holy Spirit" in prayer, absolution and benediction.[16][17] In the book of the Acts of the Apostles the arrival of the Holy Spirit happens fifty days after the resurrection of the Christ, and is celebrated in Christendom with the feast of Pentecost.[18]

    https://en.wikipedia.org/wiki/Holy_Spirit_in_Christianity

    Spirit possession is an unusual or altered state of consciousness and associated behaviors purportedly caused by the control of a human body by spirits, ghosts, demons, or gods.[1] The concept of spirit possession exists in many cultures and religions, including Buddhism, Christianity,[2] Haitian Vodou, Hinduism, Islam, Wicca, and Southeast Asian, African, and Native American traditions. Depending on the cultural context in which it is found, possession may be considered voluntary or involuntary and may be considered to have beneficial or detrimental effects on the host.[3]

    In a 1969 study funded by the National Institute of Mental Health, spirit possession beliefs were found to exist in 74% of a sample of 488 societies in all parts of the world, with the highest numbers of believing societies in Pacific cultures and the lowest incidence among Native Americans of both North and South America.[1][4] As Pentecostal and Charismatic Christian churches move into both African and Oceanic areas, a merger of belief can take place, with "demons" becoming representative of the "old" indigenous religions, which the Christian ministers attempt to exorcise.[5] 

     

    https://en.wikipedia.org/wiki/Spirit_possession

    https://en.wikipedia.org/wiki/Skepticism


    Vitalism is a belief that starts from the premise that "living organisms are fundamentally different from non-living entities because they contain some non-physical element or are governed by different principles than are inanimate things."[1][a] Where vitalism explicitly invokes a vital principle, that element is often referred to as the "vital spark," "energy," or "élan vital," which some equate with the soul. In the 18th and 19th centuries vitalism was discussed among biologists, between those who felt that the known mechanics of physics would eventually explain the difference between life and non-life and vitalists who argued that the processes of life could not be reduced to a mechanistic process. Vitalist biologists such as Johannes Reinke proposed testable hypotheses meant to show inadequacies with mechanistic explanations, but their experiments failed to provide support for vitalism. Biologists now consider vitalism in this sense to have been refuted by empirical evidence, and hence regard it either as a superseded scientific theory,[4] or, since the mid-20th century, as a pseudoscience.[5][6]

    Vitalism has a long history in medical philosophies: many traditional healing practices posited that disease results from some imbalance in vital forces. 

    https://en.wikipedia.org/wiki/Vitalism

    In poetry and literature, a shade (translating Greek σκιά,[1] Latin umbra[2]) is the spirit or ghost of a dead person, residing in the underworld.

    An underworld where the dead live in shadow was common to beliefs in the ancient Near East. In Biblical Hebrew, it was called tsalmaveth (צַלמָוֶת: lit. "death-shadow", "shadow of death") as an alternate term for Sheol.[3][4] The Witch of Endor in the First Book of Samuel notably conjures the ghost (owb[5]) of Samuel.

    Only select individuals were believed to be exempt from the fate of dwelling in shadow after death. They would instead ascend to the divine sphere, as is reflected in the veneration of heroes. Plutarch relates how Alexander the Great was inconsolable after the death of Hephaistion up to the moment he received an oracle of Ammon confirming that the deceased was a hero, i.e. enjoyed the status of a divinity.[6]

    Shades appear in Book Eleven of Homer's Odyssey, when Odysseus descends into Hades, and in Book Six of Virgil's Aeneid, when Aeneas travels to the underworld. In the Divine Comedy by Dante Alighieri, many of the dead are similarly referred to as shades (Italian ombra), including Dante's guide, Virgil.

    The phrase "peace to thy gentle shade [and endless rest]" is sometimes seen in epitaphs, and was used by Alexander Pope in his epitaph for Nicholas Rowe.

    References

  • Liddell & Scott, entry for σκιά.

  • Lewis & Short, entry for umbra.

  • Gesenius, entry for tsalmaveth (צַלמָוֶת).

  • Boustan, Ra'anan S. & Reed, Annette Yoshiko (eds.). Heavenly Realms and Earthly Realities in Late Antique Religions. Cambridge University Press, 2004.

  • Gesenius, entry for owb.

  • Plutarch, Parallel Lives, 72: "Alexander's grief for him exceeded all reasonable measure. He ordered the manes of all the horses and mules to be cut off in sign of mourning, he struck off the battlements of all the neighbouring cities, crucified the unhappy physician, and would not permit the flute or any other musical instrument to be played throughout his camp, until a response came from the oracle of Ammon bidding him honour Hephæstion and offer sacrifice to him as to a hero."

     https://en.wikipedia.org/wiki/Shade_(mythology)

    The Bogeyman (/ˈboʊɡimæn/;[1] also spelled or known as boogeyman, bogyman, bogieman, boogie monster, boogieman, bugaboo, bug-a-boo, boogie woogie, bugabear, or bugbear) is a type of mythic creature used by adults to frighten children into good behavior. Bogeymen have no specific appearance and conceptions vary drastically by household and culture, but they are most commonly depicted as masculine or androgynous monsters that punish children for misbehavior.[2] The Bogeyman or conceptually similar monsters can be found in many cultures around the world. Bogeymen may target a specific act or general misbehaviour, depending on what purpose needs serving, often on the basis of a warning from the child's authority figure. The term is sometimes used as a non-specific personification or metonym for terror, and in some cases, the Devil.[3]

    https://en.wikipedia.org/wiki/Bogeyman

    Purgatory (Latin: purgatorium, borrowed into English via Anglo-Norman and Old French)[1] is, according to the belief of some Christian denominations, an intermediate state after physical death for expiatory purification.[2] The process of purgatory is the final purification of the elect, which is entirely different from the punishment of the damned.[3] Tradition, by reference to certain texts of scripture, sees the process as involving a cleansing fire. Some forms of Western Christianity, particularly within Protestantism, deny its existence. Other strands of Western Christianity see purgatory[4] as a place, perhaps filled with fire. Some concepts of Gehenna in Judaism resemble those of purgatory.

    The word "purgatory" has come to refer to a wide range of historical and modern conceptions of postmortem suffering short of everlasting damnation.[5] English-speakers also use the word in a non-specific sense to mean any place or condition of suffering or torment, especially one that is temporary.[6]

    According to Jacques Le Goff, the conception of purgatory as a physical place came into existence in Western Europe toward the end of the twelfth century.[7] Le Goff states that the concept involves the idea of a purgatorial fire, which he suggests "is expiatory and purifying not punitive like hell fire".[8] At the Second Council of Lyon in 1274, when the Catholic Church defined, for the first time, its teaching on purgatory, the Eastern Orthodox Church did not adopt the doctrine. The council made no mention of purgatory as a third place or as containing fire,[9] which are absent also in the declarations by the Councils of Florence (1431-1449) and of Trent (1545-1563).[10] Popes John Paul II and Benedict XVI have written that the term does not indicate a place, but a condition of existence.[11][12]

    The Church of England, mother church of the Anglican Communion, officially denounces what it calls "the Romish Doctrine concerning Purgatory",[13] but the Eastern Orthodox Church, Oriental Orthodox Churches, and elements of the Anglican, Lutheran, and Methodist traditions hold that for some there is cleansing after death and pray for the dead.[14][15][16][17][18] The Reformed Churches teach that the departed are delivered from their sins through the process of glorification.[19] Rabbinical Judaism also believes in the possibility of after-death purification and may even use the word "purgatory" to describe the similar rabbinical concept of Gehenna, though Gehenna is also sometimes described as more similar to hell or Hades.[20]

    https://en.wikipedia.org/wiki/Purgatory

    Fantasy is a genre of speculative fiction involving magical elements, typically set in a fictional universe and sometimes inspired by mythology and folklore. The term "fantasy" can also be used to describe a "work of this genre",[1] usually literary.

    Its roots are in oral traditions, which then became fantasy literature and drama. From the twentieth century, it has expanded further into various media, including film, television, graphic novels, manga, animations and video games.

    Alternate terms for the genre include phantasy,[2] although this is rarely used, xuanhuan (for Chinese fantasy works with a strong emphasis on magic),[3] and qihuan (for Chinese fantasy novels that take place in fantasy worlds that combine elements from European and Chinese fantasy).[3]

    Fantasy is distinguished from the genres of science fiction and horror by the absence of scientific or macabre themes, although these can occur in fantasy. In popular culture, the fantasy genre predominantly features settings that emulate Earth, but with a sense of otherness.[4] In its broadest sense, however, fantasy consists of works by many writers, artists, filmmakers, and musicians from ancient myths and legends to many recent and popular works. 

    https://en.wikipedia.org/wiki/Fantasy

    Elements of the supernatural and the fantastic have been part of literature since its beginning. The modern genre is distinguished from tales and folklore which contain fantastic elements, first by the acknowledged fictitious nature of the work, and second by the naming of an author. Works in which the marvels were not necessarily believed, or only half-believed, such as the European romances of chivalry and the tales of the Arabian Nights, slowly evolved into works with such traits. Authors like George MacDonald (1824–1905) created the first explicitly fantastic works.

    Later, in the twentieth century, the publication of The Lord of the Rings by J. R. R. Tolkien enormously influenced fantasy writing, establishing the form of epic fantasy. This also did much to establish fantasy as a commercially distinct and viable genre. Today fantasy continues as an expansive, multi-layered milieu encompassing many subgenres, including traditional high fantasy, sword and sorcery, magical realism, fairytale fantasy, and horror-tinged dark fantasy.

    https://en.wikipedia.org/wiki/History_of_fantasy

    https://en.wikipedia.org/wiki/Falsifiability

    https://en.wikipedia.org/wiki/Empirical_research

    https://en.wikipedia.org/wiki/Verification

    https://en.wikipedia.org/wiki/Real

    https://en.wikipedia.org/wiki/Limit

    https://en.wikipedia.org/wiki/Mind

    https://en.wikipedia.org/w/index.php?search=not+real&title=Special%3ASearch&ns0=1

    https://en.wikipedia.org/wiki/Real_World

    https://en.wikipedia.org/wiki/Real_life

    https://en.wikipedia.org/w/index.php?search=mechanical+physics&title=Special%3ASearch&ns0=1

    https://en.wikipedia.org/wiki/Classical_physics

    https://en.wikipedia.org/wiki/Potentiality_and_actuality

    https://en.wikipedia.org/w/index.php?search=actualization&title=Special%3ASearch&ns0=1

    https://en.wikipedia.org/wiki/Imagination


    Imagination is the production or simulation of novel objects, sensations, and ideas in the mind without any immediate input of the senses. Stefan Szczelkun characterises it as the forming of experiences in one's mind, which can be re-creations of past experiences, such as vivid memories with imagined changes, or completely invented and possibly fantastic scenes.[1] Imagination helps make knowledge applicable in solving problems and is fundamental to integrating experience and the learning process.[2][3][4][5] As an approach to build theory, it is called "disciplined imagination".[6] A basic training for imagination is listening to storytelling (narrative),[2][7] in which the exactness of the chosen words is the fundamental factor to "evoke worlds".[8]

    One view of imagination links it with cognition,[9][10][11] seeing imagination as a cognitive process used in mental functioning. It is increasingly used, in the form of visual imagery, by clinicians in psychological treatment.[12] Imaginative thought may, speculatively, become associated with rational thought on the assumption that both activities may involve cognitive processes that "underpin thinking about possibilities".[13] The cognate term "mental imagery" may be used in psychology to denote the process of reviving in the mind recollections of objects formerly given in sense perception. Since this use of the term conflicts with that of ordinary language, some psychologists have preferred to describe this process as "imaging" or "imagery", or to speak of it as "reproductive" as opposed to "productive" or "constructive" imagination. Constructive imagination is further divided into voluntary imagination, driven by the lateral prefrontal cortex (LPFC), and involuntary imagination (LPFC-independent), such as REM-sleep dreaming, daydreaming, hallucinations, and spontaneous insight.[14] The voluntary types of imagination include integration of modifiers and mental rotation. Imagined images, both novel and recalled, are seen with the "mind's eye".

    Imagination, however, is not considered to be exclusively a cognitive activity because it is also linked to the body and place, particularly that it also involves setting up relationships with materials and people, precluding the sense that imagination is locked away in the head.[15]

    Imagination can also be expressed through stories such as fairy tales or fantasies. Children often use such narratives and pretend play in order to exercise their imaginations. When children develop fantasy they play at two levels: first, they use role playing to act out what they have developed with their imagination, and at the second level they play again with their make-believe situation by acting as if what they have developed is an actual reality.[16]

    https://en.wikipedia.org/wiki/Imagination


    https://en.wikipedia.org/wiki/Problem_solving

    https://en.wikipedia.org/wiki/Critical_thinking


    Critical thinking is the analysis of available facts, evidence, observations, and arguments in order to form a judgement by the application of rational, skeptical, and unbiased analysis and evaluation.[1] The application of critical thinking includes self-directed, self-disciplined, self-monitored, and self-corrective habits of mind;[2] thus, a critical thinker is a person who practices the skills of critical thinking or has been trained and educated in its disciplines.[3] Richard W. Paul said that the mind of a critical thinker engages the person's intellectual abilities and personality traits.[4] Critical thinking presupposes assent to rigorous standards of excellence and mindful command of their use in effective communication and problem-solving, and a commitment to overcoming egocentrism and sociocentrism.[5][6]

    https://en.wikipedia.org/wiki/Critical_thinking

    Egocentrism is the inability to differentiate between self and other. More specifically, it is the inability to accurately assume or understand any perspective other than one's own.[1] Egocentrism is found across the life span: in infancy,[2] early childhood,[3][4] adolescence,[5] and adulthood.[3][6] Although egocentric behaviors are less prominent in adulthood, the existence of some forms of egocentrism in adulthood indicates that overcoming egocentrism may be a lifelong development that never achieves completion.[7] Adults appear to be less egocentric than children because they are faster to correct from an initially egocentric perspective than children, not because they are less likely to initially adopt an egocentric perspective.[3] 

    https://en.wikipedia.org/wiki/Egocentrism

    Egotism is defined as the drive to maintain and enhance favorable views of oneself; it generally features an inflated opinion of one's personal features and importance, distinguished by an amplified vision of one's self and self-importance. It often includes intellectual, physical, social, and other overestimations.[1] The egotist has an overwhelming sense of the centrality of the "me" regarding their personal qualities.[2]

    https://en.wikipedia.org/wiki/Egotism

    The Culture of Narcissism: American Life in an Age of Diminishing Expectations is a 1979 book by the cultural historian Christopher Lasch, in which the author explores the roots and ramifications of what he perceives as the normalizing of pathological narcissism in 20th-century American culture using psychological, cultural, artistic and historical synthesis.[1] For the mass-market edition published in September of the same year,[1] Lasch won the 1980 US National Book Award in the category Current Interest (paperback).[2][a] 

    https://en.wikipedia.org/wiki/The_Culture_of_Narcissism

    Walter Map

    The chronicler Walter Map, a Welshman writing during the 12th century, tells of a "wicked man" in Hereford who revived from the dead and wandered the streets of his village at night calling out the names of those who would die of sickness within three days. The response by bishop Gilbert Foliot was "Dig up the body and cut off the head with a spade, sprinkle it with holy water and re-inter it".[19] 

    https://en.wikipedia.org/wiki/Revenant

    In ghostlore, a poltergeist (/ˈpoʊltərˌɡaɪst/ or /ˈpɒltərˌɡaɪst/; German for "rumbling ghost" or "noisy spirit") is a type of ghost or spirit that is responsible for physical disturbances, such as loud noises and objects being moved or destroyed. Most claims or fictional descriptions of poltergeists show them as being capable of pinching, biting, hitting, and tripping people. They are also depicted as capable of the movement or levitation of objects such as furniture and cutlery, or noises such as knocking on doors. Foul smells are also associated with poltergeist occurrences, as well as spontaneous fires and different electrical issues such as flickering lights.[1]

    They have traditionally been described as troublesome spirits who haunt a particular person instead of a specific location. Some variation of poltergeist folklore is found in many different cultures. Early claims of spirits that supposedly harass and torment their victims date back to the 1st century, but references to poltergeists became more common in the early 17th century. 

    https://en.wikipedia.org/wiki/Poltergeist

    A skeleton is a type of physically manifested undead often found in fantasy, gothic and horror fiction, and mythical art. Most are human skeletons, but they can also be from any creature or race found on Earth or in the fantasy world.  

    https://en.wikipedia.org/wiki/Skeleton_(undead)

    A vampire burial or anti-vampire burial is a burial performed in a way that was believed to prevent the deceased from returning (revenance) in the form of a vampire, or to prevent an "actual" vampire from returning. Traditions, known from medieval times, varied.[1][2][3]

    By association, the term "vampire burial" may also refer to burials apparently performed with rituals reflecting beliefs that the buried might arise from the dead or that evil spirits might come out of the grave, rituals intended to prevent this from happening. An example of this is believed to be the mid-5th-century "Children's Necropolis" of Lugnano in Teverina, Italy.[4]

    Vampire burials had other byproducts, whether intentional or not, that also counteracted outside forces acting on a deceased body, such as protecting it from scavengers and erosion damage and preventing it from resurfacing during storms.[5]

    Archaeologists have uncovered a number of burials believed to be of this type:

    • A mid-16th century burial of a woman on the island of Lazzaretto Nuovo in the Venice lagoon, Italy[3]
    • Some interments in a cemetery in Greater Poland, dated 1675–1880.[6]
    • Drawsko cemetery, Poland, dated to the 17th–18th centuries,[7] although the theory about "vampire burials" there has since been contested.[8]
    • Gliwice, Poland, undated[9]
    • Medieval cemetery site in Kałdus, Poland[10]
    • A 17th-century burial of a woman in a graveyard in Pień, Poland. The corpse had a padlock around the toe and a scythe positioned in such a way that if the corpse had risen from the grave, the scythe would have severed its throat.[11]
    • Anti-vampire burial from Sanok

    https://en.wikipedia.org/wiki/Vampire_burial

    Maschalismos (Ancient Greek: μασχαλισμός) is the practice of physically rendering the dead incapable of rising or haunting the living in undead form. The term comes from Ancient Greek and was also used for procedural rules on such matters in later Greek customary law.

    Acts of maschalismos were not limited to preventing folkloric physical risings; they were also meant to allow a murderer to escape, after death, the ill will of those they had wrongfully slain. Sophocles indicates that such treatment was reserved for one's "enemies", since the act both elicited indignation and dishonored the dead.

    In Aeschylus' tragedy Choephori and Sophocles' tragedy Electra, Clytemnestra performs maschalismos on the body of Agamemnon after his murder, to prevent him from taking vengeance on her. In the Argonautica of Apollonius of Rhodes, Jason performs maschalismos on the body of Medea's brother Apsyrtus after treacherously murdering him; in addition to cutting off the extremities, Jason licks the dead man's blood three times and spits it out three times. "The scholiast says that the blood was spat into the mouth of the deceased", according to a footnote in the Loeb edition. 

    https://en.wikipedia.org/wiki/Maschalismos

    A Gjenganger (Norwegian: Gjenganger, Attergangar or Gjenferd; Danish: Genganger or Genfærd; Swedish: Gengångare) in Scandinavian folklore was a term for a revenant, the spirit or ghost of a deceased from the grave.[1]

    https://en.wikipedia.org/wiki/Gjenganger

    In German folklore, a nachzehrer (also spelt nachtzehrer) is a sort of vampire. The word nachzehrer combines nach ("after") and zehren ("to live off"), likely alluding to the creature living on after death or living off humans after death; the choice of nach may also echo Nacht ("night"). The nachzehrer was prominent in the folklore of the northern regions of Germany, but was also known in Silesia and Bavaria, and the word was likewise used to describe a similar creature of the Kashubes of northern Poland.

    Overview

    A Nachzehrer is created most commonly after suicide, and sometimes from an accidental death. According to German lore, a person does not become a nachzehrer from being bitten or scratched; the transformation happens after death and is not communicable. Nachzehrers are also related to sickness and disease. When a large group of people died of the plague, the first person to have died was believed to be a nachzehrer.

    Typically, a nachzehrer devours its family members upon waking. It has also been said that they devour their own bodies, including their funeral shrouds, and the more of themselves they eat, the more of their family they physically drain. It is not unlikely that the idea of the dead eating themselves might have arisen from bodies in open graves that had been partly eaten by scavengers such as rats.

    The nachzehrer was similar to the Slavic vampire in that it was known to be a recently deceased person who returned from the grave to attack family and village acquaintances.

    Some Kashubes believed that the Nachzehrer would leave its grave, shapeshifting into the form of a pig, and pay a visit to its family members to feast on their blood. In addition, the Nachzehrer was said to be able to ascend to a church belfry and ring the bells, bringing death to anyone who heard them. Another lesser-known ability of the Nachzehrer was the power to bring death by causing its shadow to fall upon someone. Those hunting the Nachzehrer in the graveyard would listen for the grunting sounds it made while it munched on its grave clothes.[1]

    It usually originated from an unusual death, such as a person who died by suicide or accident. Nachzehrers were also associated with epidemic sickness: whenever a group of people died from the same disease, the person who died first was labeled the cause of the group's deaths. Another belief was that if a person's name was not removed from his burial clothing, that person would be a candidate for becoming a nachzehrer.

    Such a belief was found even in the Republic of Venice, where the body of a woman with a brick in her mouth was discovered in 2006 in a mass grave of plague victims.[2][3] According to the usual account, a nachzehrer can be killed by placing a coin in its mouth and then chopping off its head. From this it can be inferred that a mere coin in the mouth may cause paralysis, much as some myths say a stake through a vampire's heart does.

    Finding nachzehrer in order to kill them is not difficult; it is characteristic of a nachzehrer to lie in its grave with its thumb in its opposite hand, and its left eye open. Additionally, they are easily found while eating their burial shroud due to the noise they produce doing so.

    See also

    • Draugr (Norse mythology and Scandinavian folklore)
    • Revenant (English folklore)

    References


  • Bunson, Matthew (1993). The Vampire Encyclopedia. Gramercy, pp. 185–186. ISBN 0-517-16206-7.

  • Nuzzolese, E. & Borrini, M. (2010). "Forensic approach to an archaeological casework of 'vampire' skeletal remains in Venice: Odontological and anthropological prospectus". Journal of Forensic Sciences, 55(6), 1634–1637.

  • "Vampire woman of Venice". Archived from the original on 2013-05-30. Retrieved 2012-07-06.

     https://en.wikipedia.org/wiki/Nachzehrer

    German folklore is the folk tradition which has developed in Germany over a number of centuries. Because Germany was divided into numerous polities for most of its history, the term may refer either to the folklore of Germany proper or to that of all German-speaking countries, the wider definition also covering the folklore of Austria and Liechtenstein as well as the German-speaking parts of Switzerland, Luxembourg, and Belgium.

    https://en.wikipedia.org/wiki/German_folklore

    The Nixie, Nixy,[1] Nix,[1] Näcken, Nicor, Nøkk, or Nøkken (German: Nixe; Dutch: nikker, nekker; Danish: nøkke; Norwegian Bokmål: nøkk; Nynorsk: nykk; Swedish: näck; Faroese: nykur; Finnish: näkki; Icelandic: nykur; Estonian: näkk; Old English: nicor; English: neck or nicker) are humanoid, and often shapeshifting water spirits in Germanic mythology and folklore.

    Under a variety of names, they are common to the stories of all Germanic peoples,[2] although they are perhaps best known from Scandinavian folklore. The related English knucker was generally depicted as a worm or dragon, although more recent versions depict the spirits in other forms. Their sex, bynames, and various transformations vary geographically. The German Nix and his Scandinavian counterparts were male. The German Nixe was a female river mermaid.[2] Similar creatures are known from other parts of Europe, such as the Melusine in France, the Xana in Asturias (Spain), and the Slavic water spirits (e.g. the Rusalka) in Slavic countries. 

    https://en.wikipedia.org/wiki/Nixie_(folklore)

    The Feuermann (fire man; pl. Feuermänner), also Brennender, Brünnling,[1] Brünnlinger,[2] Brünnlig[3] (all: burning one), brünnigs Mannli (burning manikin), Züsler (sg., pl.; flickering one or arsonist), and Glühender (glowing one)[1] is a fiery ghost from German folklore distinct from the will-o'-the-wisp (German Irrlicht), the main difference being size: Feuermänner are rather large, Irrlichter rather small.[4] An often recurring term for Feuermänner is that of glühende Männer (pl.; glowing men; die glühenden Männer; in dialect gloinige Männer, glöhnege Männer, glöänige Männer or jlönige Männer).[5]

    https://en.wikipedia.org/wiki/Feuermann_(ghost)

    https://en.wikipedia.org/wiki/Category:Headless_Horseman

    https://en.wikipedia.org/wiki/German_folklore

    https://en.wikipedia.org/wiki/Klagmuhme

    https://en.wikipedia.org/wiki/Buschgro%C3%9Fmutter

    https://en.wikipedia.org/wiki/Wei%C3%9Fe_Frauen

    https://en.wikipedia.org/wiki/Alp_(folklore)

    https://en.wikipedia.org/wiki/Wild_man

    https://en.wikipedia.org/wiki/Peterm%C3%A4nnchen

     

    A household deity is a deity or spirit that protects the home, looking after the entire household or certain key members. It has been a common belief in paganism as well as in folklore across many parts of the world.

    Household deities fit into two types. The first is a specific deity – typically a goddess – often referred to as a hearth goddess or domestic goddess, who is associated with the home and hearth, such as the ancient Greek Hestia.[1]

    The second type is not one singular deity but a class, or species, of animistic deity, usually with lesser powers than major deities. This type was common in the religions of antiquity, such as the lares of ancient Roman religion, the gashin of Korean shamanism, and the cofgodas of Anglo-Saxon paganism. These survived Christianisation as fairy-like creatures existing in folklore, such as the Anglo-Scottish brownie and the Slavic domovoy.

    Household deities were usually worshipped not in temples but in the home, where they would be represented by small idols (such as the teraphim of the Bible, often translated as "household gods" in Genesis 31:19 for example), amulets, paintings, or reliefs. They could also be found on domestic objects, such as cosmetic articles in the case of Tawaret. The more prosperous houses might have a small shrine to the household god(s); the lararium served this purpose in the case of the Romans. The gods would be treated as members of the family and invited to join in meals, or be given offerings of food and drink.

    https://en.wikipedia.org/wiki/Household_deity

    Totem and Taboo: Resemblances Between the Mental Lives of Savages and Neurotics, or Totem and Taboo: Some Points of Agreement between the Mental Lives of Savages and Neurotics (German: Totem und Tabu: Einige Übereinstimmungen im Seelenleben der Wilden und der Neurotiker), is a 1913 book by Sigmund Freud, the founder of psychoanalysis, in which the author applies his work to the fields of archaeology, anthropology, and the study of religion. It is a collection of four essays inspired by the work of Wilhelm Wundt and Carl Jung and first published in the journal Imago (1912–13): "The Horror of Incest", "Taboo and Emotional Ambivalence", "Animism, Magic and the Omnipotence of Thoughts", and "The Return of Totemism in Childhood".

    Though Totem and Taboo has been seen as one of the classics of anthropology, comparable to Edward Burnett Tylor's Primitive Culture (1871) and Sir James George Frazer's The Golden Bough (1890), the work is now hotly debated by anthropologists. The cultural anthropologist Alfred L. Kroeber was an early critic of Totem and Taboo, publishing a critique of the work in 1920. Some authors have seen redeeming value in the work.

    Background

    Freud, who had a longstanding interest in social anthropology and was devoted to the study of archaeology and prehistory, wrote that the work of Wilhelm Wundt and Carl Jung provided him with his "first stimulus" to write the essays included in Totem and Taboo. The work was translated twice into English, first by Abraham Brill and later by James Strachey.[1] Freud was influenced by the work of James George Frazer, including The Golden Bough (1890).[2]

    https://en.wikipedia.org/wiki/Totem_and_Taboo

    Lares Familiares are guardian household deities and tutelary deities in ancient Roman religion. The singular form is Lar Familiaris. Lares were thought to influence all that occurred within their sphere of influence or location. In well-regulated, traditional Roman households, the household Lar or Lares were given daily cult and food-offerings, and were celebrated at annual festivals. They were identified with the home to the extent that a homeward-bound Roman could be described as going ad larem ("to the Lar").  

    https://en.wikipedia.org/wiki/Lares_Familiares

    A brownie or broonie (Scots),[1] also known as a brùnaidh or gruagach (Scottish Gaelic), is a household spirit or hobgoblin from Scottish folklore that is said to come out at night while the owners of the house are asleep and perform various chores and farming tasks. The human owners of the house must leave a bowl of milk or cream or some other offering for the brownie, usually by the hearth. Brownies are described as easily offended and will leave their homes forever if they feel they have been insulted or in any way taken advantage of. Brownies are characteristically mischievous and are often said to punish or pull pranks on lazy servants. If angered, they are sometimes said to turn malicious, like boggarts.

    Brownies originated as domestic tutelary spirits, very similar to the Lares of ancient Roman tradition. Descriptions of brownies vary regionally, but they are usually described as ugly, brown-skinned, and covered in hair. In the oldest stories, they are usually human-sized or larger. In more recent times, they have come to be seen as small and wizened. They are often capable of turning invisible, and they sometimes appear in the shapes of animals. They are always either naked or dressed in rags. If a person attempts to present a brownie with clothing or to baptize him, he will leave forever.

    Although the name brownie originated as a dialectal word used only in the UK, it has since become the standard term for all such creatures throughout the UK and Ireland. Regional variants in England and Scotland include hobs, silkies, and ùruisgs. Variants outside England and Scotland are the Welsh Bwbach and the Manx Fenodyree. Brownies have also appeared outside of folklore, including in John Milton's poem L'Allegro. They became popular in works of children's literature in the late nineteenth century and continue to appear in works of modern fantasy. The Brownies in the Girl Guides are named after a short story by Juliana Horatia Ewing based on brownie folklore. 


    Origin

    Roman Lararium, or household shrine to the Lares, from the House of the Vettii in Pompeii. Brownies bear many similarities to the Roman Lares.[2][3][4]

    Brownies originated as domestic tutelary spirits, very similar to the Lares of ancient Roman tradition, who were envisioned as the protective spirits of deceased ancestors.[2][3][4][5] Brownies and Lares are both regarded as solitary and devoted to serving the members of the house.[6] Both are said to be hairy and dress in rags,[6] and both are said to demand offerings of food or dairy.[6] Like Lares, brownies were associated with the dead,[7][6] and a brownie is sometimes described as the ghost of a deceased servant who once worked in the home.[6] The Cauld Lad of Hilton, for instance, was reputed to be the ghost of a stable boy who was murdered by one of the Lords of Hilton Castle in a fit of passion.[8] Those who saw him described him as a naked boy.[9] He was said to clean up anything that was untidy and make messes of things that were tidy.[9] The Menehune of Hawaiian folklore have been compared to brownies as well, since they too are portrayed as a race of dwarf people who carry out their work at night.[10]

    The family cult of deceased ancestors in ancient times centred around the hearth,[2] which later became the place where offerings would be left for the brownie.[3] The most significant difference between brownies and Lares is that, while Lares were permanently bound to the house in which they lived,[3][6] brownies are seen as more mobile, capable of leaving or moving to another house if they become dissatisfied.[3][6] One story describes a brownie who left the house after a stingy housewife fired all the servants because the brownie was doing all the work; it refused to return until the servants had been re-hired.[3]

    https://en.wikipedia.org/wiki/Brownie_(folklore)

    A tutelary (/ˈtjuːtəlɛri/) (also tutelar) is a deity or a spirit who is a guardian, patron, or protector of a particular place, geographic feature, person, lineage, nation, culture, or occupation. The etymology of "tutelary" expresses the concept of safety and thus of guardianship.

    In late Greek and Roman religion, one type of tutelary deity, the genius, functions as the personal deity or daimon of an individual from birth to death. Another form of personal tutelary spirit is the familiar spirit of European folklore.[1]

    https://en.wikipedia.org/wiki/Tutelary_deity

     05-14-2023-1802 - various words/descriptions {concept illustration, dictionary reference note reference text source citation note missing citation etc.} (term, definition, etc. (allusion) ; explanation, variety type, etc.), etc. (draft) [not formal, casual, informal, error omission components, etc.] (DRAFT) internal page link chain sequence array variety term information variety type vague no class browse

     
