Embodied cognition is the theory that many features of cognition, whether human or otherwise, are shaped by aspects of an organism's entire body. Sensory and motor systems are seen as fundamentally integrated with cognitive processing. The cognitive features include high-level mental constructs (such as concepts and categories) and performance on various cognitive tasks (such as reasoning or judgment). The bodily aspects involve the motor system, the perceptual system, the bodily interactions with the environment (situatedness), and the assumptions about the world built into the organism's functional structure.
The embodied mind thesis challenges other theories, such as cognitivism, computationalism, and Cartesian dualism.[1][2] It is closely related to the extended mind thesis, situated cognition, and enactivism. The modern version draws on recent research in psychology, linguistics, cognitive science, dynamical systems, artificial intelligence, robotics, animal cognition, plant cognition, and neurobiology.
https://en.wikipedia.org/wiki/Embodied_cognition
Embodied energy is the sum of all the energy required to produce any goods or services, considered as if that energy was incorporated or 'embodied' in the product itself. The concept can be useful in determining the effectiveness of energy-producing or energy saving devices, or the "real" replacement cost of a building, and, because energy-inputs usually entail greenhouse gas emissions, in deciding whether a product contributes to or mitigates global warming. One fundamental purpose for measuring this quantity is to compare the amount of energy produced or saved by the product in question to the amount of energy consumed in producing it.
Embodied energy is an accounting method which aims to find the sum total of the energy necessary for an entire product lifecycle. Determining what constitutes this lifecycle includes assessing the relevance and extent of energy into raw material extraction, transport, manufacture, assembly, installation, disassembly, deconstruction and/or decomposition as well as human and secondary resources.
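Treating embodied energy as a lifecycle sum lends itself to a simple accounting sketch. The stage names follow the lifecycle listed above, but every numeric value below is a hypothetical placeholder, not real inventory data:

```python
# Illustrative embodied-energy accounting: sum energy inputs (MJ) over
# lifecycle stages. All stage values are hypothetical placeholders.
LIFECYCLE_STAGES_MJ = {
    "raw_material_extraction": 1200.0,
    "transport": 300.0,
    "manufacture": 2500.0,
    "assembly": 400.0,
    "installation": 150.0,
    "disassembly": 100.0,
    "decomposition": 50.0,
}

def embodied_energy(stages_mj: dict) -> float:
    """Total embodied energy as the sum over all lifecycle stages."""
    return sum(stages_mj.values())

def energy_payback_years(embodied_mj: float, annual_output_mj: float) -> float:
    """Years for a device's energy output/savings to repay its embodied energy."""
    return embodied_mj / annual_output_mj

total = embodied_energy(LIFECYCLE_STAGES_MJ)
print(f"Embodied energy: {total:.0f} MJ")          # Embodied energy: 4700 MJ
print(f"Payback: {energy_payback_years(total, 1400.0):.1f} years")
```

The payback calculation illustrates the comparison described above: energy produced or saved per year versus energy consumed in production.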
https://en.wikipedia.org/wiki/Embodied_energy
Embodiment theory speaks to the ways that experiences are enlivened, materialized, and situated in the world through the body. Embodiment is a relatively amorphous and dynamic conceptual framework in anthropological research that emphasizes possibility and process as opposed to definitive typologies.[1] Margaret Lock identifies the late 1970s as the point at which the social sciences developed a new attentiveness to bodily representation and began a theoretical shift towards an ‘Anthropology of the Body.’[2]
Embodiment-based approaches in anthropology were born of dissatisfaction with dualistic interpretations of humanity that created divisions such as mind/body, nature/culture, and object/subject.[1][2] Within these dichotomies, the physical body was historically confined to the realm of the ‘natural’ sciences and was not considered to be a subject of study in cultural and social sciences. When the body was studied or considered in social science contexts employing these dualistic frameworks, it was treated as a categorizable, ‘natural’ object with little recognition of its dynamic or subjective potentialities.[1]
Embodiment theory has been developed and expanded by the work of many scholars, as opposed to being credited to a single thinker.[1] The work of Thomas Csordas and Margaret Lock marks some of the earliest explicit applications of embodiment theory in anthropology.[2][3][4] More recent edited volumes compiled by Margaret Lock, Judith Farquhar, and Frances Mascia-Lees provide a better window into current applications of embodiment theory in anthropology.[5][6] The theoretical background of embodiment is an amalgamation of phenomenology, practice theory, feminist theory, and post-structuralist thought.[7] Mary Douglas, Marcel Mauss, Pierre Bourdieu, Maurice Merleau-Ponty, Judith Butler, and Michel Foucault are often cited as key precursory conceptual contributors to embodiment theory.[7]
https://en.wikipedia.org/wiki/Embodiment_theory_in_anthropology
One way of attributing greenhouse gas (GHG) emissions is to measure the embedded emissions of goods that are being consumed (also referred to as "embodied emissions", "embodied carbon emissions", or "embodied carbon"). This is different from the question of to what extent the policies of one country to reduce emissions affect emissions in other countries (the "spillover effect" and "carbon leakage" of an emissions reduction policy). The UNFCCC measures emissions according to production, rather than consumption.[1] Consequently, embedded emissions on imported goods are attributed to the exporting, rather than the importing, country. The question of whether to measure emissions on production instead of consumption is partly an issue of equity, i.e., who is responsible for emissions.[2]
The 37 Parties listed in Annex B to the Kyoto Protocol have agreed to legally binding emission reduction commitments. Under the UNFCCC accounting of emissions, their emission reduction commitments do not include emissions attributable to their imports.[3] In a briefing note, Wang and Watson (2007) asked the question, "who owns China's carbon emissions?".[4] In their study, they suggested that nearly a quarter of China's CO2 emissions might be a result of its production of goods for export, primarily to the US but also to Europe. Based on this, they suggested that international negotiations based on within country emissions (i.e., emissions measured by production) may be "[missing] the point".
Recent research confirms that, in 2004, 23% of global emissions were embedded in goods traded internationally, mostly flowing from China and other developing countries, such as Russia and South Africa, to the U.S., Europe and Japan. These states, together with the Middle East, form a group of ten regions that account for 71% of the total difference in regional emissions. In Western Europe the difference between imported and exported emissions is particularly pronounced, with imported emissions making up 20–50% of consumed emissions. The majority of the emissions transferred between these states is contained in the trade of machinery, electronics, chemicals, rubber and plastics.[5]
Research by the Carbon Trust in 2011 revealed that approximately 25% of all CO2 emissions from human activities 'flow' (i.e. are imported or exported) from one country to another. The flow of carbon was found to be roughly 50% emissions associated with trade in commodities such as steel, cement, and chemicals, and 50% in semi-finished/finished products such as motor vehicles, clothing or industrial machinery and equipment.[6]
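The production- versus consumption-based bookkeeping described above reduces to: consumption-based emissions = production emissions − emissions embedded in exports + emissions embedded in imports. A minimal sketch, with country labels and tonnages that are entirely hypothetical:

```python
# Consumption-based emissions = production - embedded exports + embedded imports.
# All figures are hypothetical (Mt CO2), for illustration only.
countries = {
    # name: (production, embedded_in_exports, embedded_in_imports)
    "exporter": (8000.0, 1800.0, 400.0),
    "importer": (5000.0, 300.0, 1700.0),
}

def consumption_emissions(production: float, exports: float, imports: float) -> float:
    """Reattribute emissions embedded in trade to the consuming country."""
    return production - exports + imports

for name, (prod, exp, imp) in countries.items():
    cons = consumption_emissions(prod, exp, imp)
    print(f"{name}: production={prod:.0f} Mt, consumption={cons:.0f} Mt")
```

Under UNFCCC production-based accounting only the first column counts; the consumption column shows how the same global total is redistributed once embedded emissions are reattributed.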
Embodied carbon in construction
The embodied carbon of buildings is estimated to account for 11% of global carbon emissions and 75% of a building's emissions over its entire lifecycle.[7] The World Green Building Council has set a target for all new buildings to have at least 40% less embodied carbon.[8]
A life-cycle assessment for embodied carbon calculates the carbon used throughout each stage of a building's life: construction, use and maintenance, and demolition or disassembly.[9]
Re-use is a key consideration when addressing embodied carbon in construction. The architect Carl Elefante is known for coining the phrase, "The greenest building is the building that is already built."[10] The reason that existing buildings are usually more sustainable than new buildings is that the quantity of carbon emissions which occurs during construction of a new building is large in comparison to the annual operating emissions of the building, especially as operations become more energy efficient and energy supplies transition to renewable generation.[11][8]
Beyond re-use, there are two principal areas of focus in the reduction of embodied carbon in construction. The first is to reduce the quantity of construction material ('construction mass') while the second is the substitution of lower carbon alternative materials. Typically—where reduction of embodied carbon is a goal—both of these are addressed.
Often, the most significant scope for reduction of construction mass is found in structural design, where measures such as reduced beam or slab span (and an associated increase in column density) can yield large carbon savings.[12]
To assist material substitution (with low carbon alternatives), manufacturers of materials such as steel re-bar, glulam, and precast concrete typically provide Environmental Product Declarations (EPDs) which certify the carbon impact as well as the general environmental impacts of their products.[13] Minimizing the use of carbon-intensive materials may mean selecting lower carbon versions of glass and steel products, and products manufactured using low-emissions energy sources. Embodied carbon may be reduced in concrete construction through the use of Portland cement alternatives such as ground granulated blast-furnace slag, recycled aggregates and industry by-products. Carbon-neutral, carbon-positive, and carbon-storing materials include bio-based materials such as timber, bamboo, hemp fibre and hempcrete, wool, dense-pack cellulose insulation, and cork.[14][15][16]
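EPD-driven substitution amounts to comparing declared global-warming potential (GWP) per functional unit and picking the lowest. A sketch with invented product names and GWP figures (not taken from any real EPD):

```python
# Material substitution sketch: choose the product with the lowest
# declared GWP per unit from EPD-style data. All figures hypothetical.
epds = [
    {"product": "CEM I concrete", "gwp_kg_co2e_per_m3": 410.0},
    {"product": "50% GGBS concrete", "gwp_kg_co2e_per_m3": 240.0},
    {"product": "30% fly-ash concrete", "gwp_kg_co2e_per_m3": 300.0},
]

# Lowest-carbon option for the same functional unit (1 m3 of concrete):
best = min(epds, key=lambda e: e["gwp_kg_co2e_per_m3"])
print(best["product"])  # 50% GGBS concrete
```

The comparison is only meaningful when the declarations share a functional unit and system boundary, which is what EPD standards are meant to guarantee.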
A 2021 study focused on "carbon-intensive hotspot materials (e.g., concrete foundations and slab floors, insulated roof and wall panels, and structural framing) in light industrial buildings" estimated that a "sizable reduction (~60%) in embodied carbon is possible in two to three years by bringing readily-available low-carbon materials into wider use".[17]
Embodied carbon policy and legislation
A variety of policies, regulations, and standards exist worldwide with respect to embodied carbon, according to the American Institute of Architects.[18]
Eight states introduced procurement policies related to embodied carbon in 2021: Washington, Oregon, California, Colorado, Minnesota, Connecticut, New York, and New Jersey.[19]
In Colorado, HB21-1303: Global Warming Potential for Public Project Materials (better known as “Buy Clean Colorado”) was signed into law July 6, 2021. The law uses environmental product declarations (EPDs) to help drive the use of low-embodied-carbon materials.[20]
"In Europe, embodied carbon emissions have been limited in the Netherlands since 2018, and this is scheduled to happen in Denmark, Sweden, France and Finland between 2023 and 2027."[21]
"On May 10, 2023, Toronto [began to] to require lower-carbon construction materials, limiting embodied carbon from new [city-owned] municipal building construction. New City-owned buildings must now limit upfront embodied emission intensity — emissions associated with manufacturing, transporting, and constructing major structural and envelope systems — to below 350 kg CO2e/m2."[22]
References
- "Toronto Limits Embodied Carbon in New City Buildings". The Energy Mix. 14 May 2023. Retrieved 15 May 2023.
External links
- Resources: Embodied Carbon in Buildings, Boston Society for Architecture
- Ten Steps Towards Reducing Embodied Carbon, The American Institute of Architects
- Carbon Positive Reset! 1.5° C 2020 Teach-In
- Carbon Smart Materials Palette
https://en.wikipedia.org/wiki/Embedded_emissions
Embodied or embodiment may refer to:
Cognitive science
- Embodied bilingual language, in cognitive science
- Embodied cognition, a theory that many aspects of cognition are shaped by the body
- Embodied cognitive science, which seeks to explain the mechanisms underlying intelligent behavior
- Embodied design, the idea that the actions of the body can play a role in the development of thought and ideas
- Embodied imagination, a therapeutic form of working with dreams and memories
- Embodied knowledge, a.k.a. tacit knowledge
Music and arts
- Embodiment 12:14, a Christian Australian metalcore band
- Embodiment: Collapsing Under the Weight of God, a 2008 album by the band Sculptured
- Embodied music cognition, in musicology
- Embodied writing, practices used by academics and artists to highlight the connection between writing and the body
Religion
- Incarnation
- Anthropopathy in Islam, a doctrine in early Islam similar to the Incarnation
Other
- Personification, an embodiment of an entity, usually in the form of a person
- Embodied energy, required to produce any goods or services
- Embodied water or virtual water, the water used in the production of a good or service
- Claim (patent), in patent law, embodiment refers to implementation of an invention
- Embodied agent, an agent with a physical presence in the world (Robotics)
See also
- Embodyment, American Christian rock band
https://en.wikipedia.org/wiki/Embodiment
Personification is the representation of a thing or abstraction as a person. In the arts, many things are commonly personified. These include numerous types of places, especially cities, countries, and continents, elements of the natural world such as the months or four seasons, four elements,[1] four cardinal winds, five senses,[2] and abstractions such as virtues, especially the four cardinal virtues and seven deadly sins,[3] the nine Muses,[4] or death.
In many polytheistic early religions, deities had a strong element of personification, suggested by descriptions such as "god of". In ancient Greek religion, and the related ancient Roman religion, this was perhaps especially strong, in particular among the minor deities.[5] Many such deities, such as the tyches or tutelary deities for major cities, survived the arrival of Christianity, now as symbolic personifications stripped of religious significance. An exception was the winged goddess of victory, Victoria/Nike, who developed into the visualisation of the Christian angel.[6]
Generally, personifications lack much in the way of narrative myths, although classical myth at least gave many of them parents among the major Olympian deities.[7] The iconography of several personifications "maintained a remarkable degree of continuity from late antiquity until the 18th century".[8] Female personifications tend to outnumber male ones,[9] at least until modern national personifications, many of which are male.
Personifications are very common elements in allegory, and historians and theorists of personification complain that the two have been too often confused, or discussion of them dominated by allegory. Single images of personifications tend to be titled as an "allegory", arguably incorrectly.[10] By the late 20th century personification seemed largely out of fashion, but the semi-personificatory superhero figures of many comic book series came in the 21st century to dominate popular cinema in a number of superhero film franchises.
According to Ernst Gombrich, "we tend to take it for granted rather than to ask questions about this extraordinary predominantly feminine population which greets us from the porches of cathedrals, crowds around our public monuments, marks our coins and our banknotes, and turns up in our cartoons and our posters; these females variously attired, of course, came to life on the medieval stage, they greeted the Prince on his entry into a city, they were invoked in innumerable speeches, they quarrelled or embraced in endless epics where they struggled for the soul of the hero or set the action going, and when the medieval versifier went out on one fine spring morning and lay down on a grassy bank, one of these ladies rarely failed to appear to him in his sleep and to explain her own nature to him in any number of lines".[11]
History
Classical world
Personification as an artistic device is easier to discuss when belief in the personification as an actual spiritual being has died down;[12] this seems to have happened in the ancient Graeco-Roman world, probably even before Christianisation.[13] In other cultures, especially Hinduism and Buddhism, many personification figures still retain their religious significance, which is why they are not covered here. For example, Bharat Mata was devised as a Hindu goddess figure to act as a national personification by intellectuals in the Indian independence movement from the 1870s, but now has some actual Hindu temples.[14]
https://en.wikipedia.org/wiki/Personification
The virtual water trade (also known as embedded or embodied water) is the hidden flow of water in food or other commodities that are traded from one place to another.[1] The virtual water trade is the idea that when goods and services are exchanged, so is virtual water. Virtual water trade allows a new, amplified perspective on water problems: in the framework of water-resources management, it facilitates balancing different perspectives, basic conditions, and interests. Analytically, the concept enables one to distinguish between global, regional, and local levels and their linkages. However, the use of virtual water estimates may offer no guidance for policymakers seeking to ensure that environmental objectives are being met.
For example, cereal grains have been major carriers of virtual water in countries where water resources are scarce. Therefore, cereal imports can play a crucial role in compensating for local water deficits.[2] However, low-income countries may not be able to afford such imports in the future, which could lead to food insecurity and starvation.
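The compensation effect can be sketched as: virtual water imported = tonnes of grain traded × water used per tonne of crop. The per-tonne figure below is a rough illustrative assumption, not a measured value for any particular crop or country:

```python
# Virtual water embedded in a cereal import: tonnes traded times the
# water used per tonne of crop. The per-tonne figure is an assumed,
# purely illustrative value.
WATER_M3_PER_TONNE_WHEAT = 1_300.0  # assumption, not measured data

def virtual_water_km3(tonnes: float, m3_per_tonne: float) -> float:
    """Virtual water in a traded quantity, in cubic kilometres."""
    return tonnes * m3_per_tonne / 1e9  # 1 km3 = 1e9 m3

# Importing 5 million tonnes of wheat "imports" roughly:
print(virtual_water_km3(5_000_000, WATER_M3_PER_TONNE_WHEAT))  # 6.5 km3
```

Each cubic kilometre imported this way is water the importing country does not have to withdraw from its own scarce resources.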
https://en.wikipedia.org/wiki/Virtual_water
Incarnation literally means embodied in flesh or taking on flesh. It refers to the conception and the embodiment of a deity or spirit in some earthly form[1] or the appearance of a god as a human.[2] If capitalized, it is the union of divinity with humanity in Jesus Christ.[1] In its religious context the word is used to mean a god, deity, or divine being in human or animal form on Earth.
Abrahamic religions
Christianity
The incarnation of Christ is the central Christian doctrine that God became flesh, assumed a human nature, and became a man in the form of Jesus, the Son of God and the second person of the Trinity. This foundational Christian position holds that the divine nature of the Son of God was perfectly united with human nature in one divine Person, Jesus, making him both truly God and truly human. The theological term for this is hypostatic union: the second person of the Trinity, God the Son, became flesh when he was miraculously conceived in the womb of the Virgin Mary.[3] Biblical passages traditionally referenced in connection with the doctrine of the Incarnation include John 3:1-21, Colossians 2:9, and Philippians 2:7-8.
Islam
Islam completely rejects the doctrine of the incarnation (Mu'jassimā[4] / (Tajseem) Tajsīm) of God in any form, as the concept is defined as shirk. In Islam, God is one and "neither begets nor is begotten".[5]
Judaism
Mainstream Judaism totally rejects any doctrine of an incarnation of God and absolutely rejects any concept of an incarnation of God in any form.[6] However, some Hasidim believe in a somewhat similar concept. Menachem Mendel Schneerson, a prominent Hasidic leader, said that the Rebbe is God's essence itself put into the body of a tzadik.[7]
Druze faith
Hamza ibn Ali ibn Ahmad is considered the founder of the Druze faith and the primary author of the Druze manuscripts;[8] he proclaimed that God had become human and taken the form of man, al-Hakim bi-Amr Allah.[9][10][11][12][13][14] al-Hakim bi-Amr Allah is an important figure in the Druze faith, whose eponymous founder ad-Darazi proclaimed him as the incarnation of God in 1018.[9][10][15][16]
Baháʼí Faith
In the Baháʼí Faith, God is not seen to be incarnated into this world and is not seen to be part of creation as he cannot be divided and does not descend to the condition of his creatures.[17] The Manifestations of God are also not seen as an incarnation of God, but are instead understood to be like a perfect mirror reflecting the attributes of God onto this material world.[18][19]
Buddhism
Buddhism is a nontheistic religion: it denies the concept of a creator deity or any incarnation of a creator deity. However, Buddhism does teach the rebirth doctrine and asserts that living beings are reborn, endlessly, reincarnating as devas (gods), demi-gods, human beings, animals, hungry ghosts or hellish beings,[20] in a cycle of samsara that stops only for those who reach nirvana (nibbana).[21][22][23]
In Tibetan Buddhism, an enlightened spiritual teacher (lama) is believed to reincarnate, and is called a tulku. According to Tulku Thondup,[24] there are three main types of tulkus: the emanations of buddhas, the manifestations of highly accomplished adepts, and rebirths of highly virtuous teachers or spiritual friends. There are also authentic secondary types, including unrecognized tulkus, blessed tulkus, and tulkus fallen from the path.
Hinduism
In Hinduism, incarnation refers to its rebirth doctrine, and in its theistic traditions to avatar.[25] Avatar literally means "descent, alight, to make one's appearance",[26] and refers to the embodiment of the essence of a superhuman being or a deity in another form.[27] The word also implies "to overcome, to remove, to bring down, to cross something".[26] In Hindu traditions, the "crossing or coming down" is symbolism, states Daniel Bassuk, of the divine descent from "eternity into the temporal realm, from unconditioned to the conditioned, from infinitude to finitude".[28] An avatar, states Justin Edwards Abbott, is a saguna (with form, attributes) embodiment of the nirguna Brahman or Atman (soul).[29]
Neither the Vedas nor the Principal Upanishads ever mention the word avatar as a noun.[28] The verb roots and forms, such as avatarana, do appear in ancient post-Vedic Hindu texts, but as the "action of descending", not as an incarnated person (avatara).[30] The related verb avatarana is, states Paul Hacker, used with a double meaning: one as the action of the divine descending, another as "laying down the burden of man" suffering from the forces of evil.[30]
The term is most commonly found in the context of the Hindu god Vishnu.[26][31] The earliest mentions of Vishnu manifesting in a human form to empower the good and fight evil use other terms, such as the word sambhavāmi in verse 4.6 and the word tanu in verse 9.11 of the Bhagavad Gita,[32] as well as words such as akriti and rupa elsewhere.[33] It is in medieval-era texts, those composed after the sixth century CE, that the noun version of avatar appears, meaning embodiment of a deity.[34] The incarnation idea proliferates thereafter, in the Puranic stories of many deities, and with ideas such as ansha-avatar or partial embodiments.[32][31]
While avatars of other deities such as Ganesha and Shiva are also mentioned in medieval Hindu texts, this is minor and occasional.[35] The incarnation doctrine is one of the important differences between Vaishnavism and Shaivism traditions of Hinduism.[36][37]
Avatar versus incarnation
The translation of avatar as "incarnation" has been questioned by Christian theologians, who state that an incarnation is in flesh and imperfect, while avatar is mythical and perfect.[38][39] The theological concept of Christ as an Incarnation into the womb of the Virgin Mary and by work of the Holy Spirit God, as found in Christology, presents the Christian concept of incarnation. This, state Oduyoye and Vroom, is different from the Hindu concept of avatar because avatars in Hinduism are unreal, a position similar to Docetism.[40] Sheth disagrees and states that this claim is an incorrect understanding of the Hindu concept of avatar.[41][note 1] Avatars are true embodiments of spiritual perfection, driven by noble goals, in Hindu traditions such as Vaishnavism.[41]
Serer religion
The Serer religion of West Africa rejects any notions of incarnation or manifestation of the supreme deity Roog (also called Koox in the Cangin language). However, the reincarnation (ciiɗ)[43] of the ancient Serer saints and ancestral spirits, called Pangool, is a well-held principle in Serer religion. These Pangool (singular : Fangool) act as intermediaries between the living world and the divine. When the Serers speak of incarnation, it is these Pangool they refer to, who are themselves holy by virtue of their intercession with the divine.[43][44][45]
Notes
- Buddha, a real person, is included as an avatar of Vishnu in many Hindu texts.[42]
References
- Gravrand, Henry, La civilisation sereer, Cosaan: les origines, vol.1, Nouvelles Editions africaines (1983), p 33, ISBN 2-7236-0877-8
Bibliography
- Daniélou, Alain (1991) [1964]. The myths and gods of India. Inner Traditions, Vermont, USA. ISBN 0-89281-354-7. pp. 164–187.
- Coleman, T. (2011). "Avatāra". Oxford Bibliographies Online: Hinduism. doi:10.1093/obo/9780195399318-0009. Short introduction and bibliography of sources about Avatāra (subscription required).
- Matchett, Freda (2001). Krishna, Lord or Avatara?: the relationship between Krishna and Vishnu. Routledge. ISBN 978-0700712816.
- Paul Hacker (1978). Lambert Schmithausen (ed.). Zur Entwicklung der Avataralehre (in German). Otto Harrassowitz. ISBN 978-3447048606.
- Sheth, Noel (2002). "Hindu Avatāra and Christian Incarnation: A Comparison". Philosophy East and West. University of Hawai'i Press. 52 (1 (January)): 98–125. doi:10.1353/pew.2002.0005. JSTOR 1400135. S2CID 170278631.
https://en.wikipedia.org/wiki/Incarnation
Effeminacy is the embodiment, in those who are not of the female sex (e.g. boys and men), of traits or expressions often associated with what is generally perceived to be feminine behaviours, mannerisms, styles, or gender roles, rather than with traditionally masculine ones. Effeminacy and other gender expressions are independent of a person's sexuality or sexual identity and are displayed by people of all sexualities and none. Effeminacy is seen in some societies as something embodied by some in the homosexual male community. The embodiment of effeminacy by people in some societies has resulted in prejudice, discrimination, antagonism and insults towards those who display it.
https://en.wikipedia.org/wiki/Effeminacy
Stereotype embodiment theory (SET) is a theoretical model first posited by psychologist Becca Levy to explain the process by which age stereotypes influence the health of older adults.[1] There are multiple well-documented effects of age stereotypes on a number of cognitive and physical outcomes (including memory, cardiovascular reactivity, and longevity).[2][3][4][5]
SET explains these findings according to a three-step process:
- Age stereotypes are internalized from the host culture at a young age.
- At some point, these age stereotypes become "self-stereotypes" about oneself as an aging individual.
- These self-stereotypes are then consciously and unconsciously activated to exert their effects on individual health.
Underlying these three steps are SET's four main theoretical premises. According to Levy (2009): "The theory has four components: The stereotypes (a) become internalized across the lifespan, (b) can operate unconsciously, (c) gain salience from self-relevance, and (d) utilize multiple pathways."[1]
Although this theory was developed to explain the operation of age stereotypes across the lifespan, it may also explain how other types of self-stereotypes operate, such as race stereotypes among African Americans and gender stereotypes among women.
https://en.wikipedia.org/wiki/Stereotype_embodiment_theory
In a patent or patent application, the claims define, in technical terms, the extent, i.e. the scope, of the protection conferred by a patent, or the protection sought in a patent application. In other words, the purpose of the claims is to define which subject-matter is protected by the patent (or sought to be protected by the patent application). This is termed the "notice function" of a patent claim—to warn others of what they must not do if they are to avoid infringement liability.[1] The claims are of the utmost importance both during prosecution and litigation alike.
For instance, a claim could read:
- "An apparatus for catching mice, said apparatus comprising a base, a spring member coupled to the base, and ..."
- "A chemical composition for cleaning windows, said composition substantially consisting of 10–15% ammonia, ..."
- "Method for computing future life expectancies, said method comprising gathering data including X, Y, Z, analyzing the data, comparing the analyzed data results..."
https://en.wikipedia.org/wiki/Patent_claim
Embodiment
The sense of embodiment is critical to a person's conception of self. Embodiment is the understanding of the physical body and its relation to oneself.[5] The study of human embodiment currently has a large impact on the study of human cognition as a whole. Current research on embodiment suggests that sensory input and experiences shape humans' overall perception. This somewhat challenges previous accounts of human cognition because it challenges the idea that the human mind is innate.[6]
There are two portions of the brain that have recently been found to have a large importance on a person's perception of self. The temporoparietal junction, located in the cortex, is one of these brain regions. The temporoparietal junction is thought to integrate sensory information. The second portion of the brain thought to be involved in perception of embodiment is the extrastriate body area, located in the lateral occipitotemporal cortex. When people are shown images of body parts, the extrastriate body area is activated. The temporoparietal junction is involved in sensory integration processes, while the extrastriate body area deals mainly with thoughts of and exposure to human body parts. It has been found that the brain responds to stimuli that involve embodiment differently from stimuli that involve localization. During task performance tests, a person's body position (whether he or she is sitting or lying face up) affects how the extrastriate body area is activated. The temporoparietal junction, however, is not affected by a person's particular body position. The temporoparietal junction deals with disembodied rather than embodied self-location, explaining why a person's physical position does not affect its activation. Self-location as related to a person's sense of embodiment is related to his or her actual location in space.[5]
Autobiographical memories
The information people remember as autobiographical memory is essential to their perception of self. These memories form the way people feel about themselves. The left dorsolateral prefrontal cortex and posterior cingulate cortex are involved in the memory of autobiographical information.[7]
Morality
Morality is an extremely important defining factor for humans. It often defines or contributes to people's choices or actions, defining who a person is. Making moral decisions, much like other neural processes, has a clear biological basis. The anterior and medial prefrontal cortex and the superior temporal sulcus are activated when people feel guilt, compassion, or embarrassment. Guilt and compassion activate the mesolimbic pathway, while indignation and disgust activate the amygdala. There is clearly a network involved in the ideas of morality.[8]
https://en.wikipedia.org/wiki/Neural_basis_of_self#Embodiment
Embodiment of possessiveness
The Tolkien scholar Verlyn Flieger, discussing the splintering of the original created light of Middle-earth, likens Melkor/Morgoth's response to the Silmarils to that of Fëanor, who had created those jewels. She states that the central temptation is the desire to possess, and that possessiveness itself is the "great transgression" in Tolkien's created world. She observes that the commandment "Love not too well the work of thy hands and the devices of thy heart" is stated explicitly in The Silmarillion. Flieger compares Tolkien's descriptions of the two characters: "the heart of Fëanor was fast bound to these things that he himself had made", followed at once by "Melkor lusted for the Silmarils, and the very memory of their radiance was a gnawing fire in his heart". She writes that it is appropriately ironic that Melkor and Fëanor, one the greatest of the Ainur, the other the most subtle and skilful of the creative Noldor among the Elves – should "usher in the darkness".[8]
https://en.wikipedia.org/wiki/Morgoth#Embodiment_of_possessiveness
Re-embodiment of amoral aristocratic values
For Rüdiger Safranski, the Übermensch represents a higher biological type reached through artificial selection and at the same time is also an ideal for anyone who is creative and strong enough to master the whole spectrum of human potential, good and "evil", to become an "artist-tyrant". In Ecce Homo, Nietzsche vehemently denied any idealistic, democratic or humanitarian interpretation of the Übermensch: "The word Übermensch [designates] a type of supreme achievement, as opposed to 'modern' men, 'good' men, Christians, and other nihilists ... When I whispered into the ears of some people that they were better off looking for a Cesare Borgia than a Parsifal, they did not believe their ears."[11] Safranski argues that the combination of ruthless warrior pride and artistic brilliance that defined the Italian Renaissance embodied the sense of the Übermensch for Nietzsche. According to Safranski, Nietzsche intended the ultra-aristocratic figure of the Übermensch to serve as a Machiavellian bogeyman of the modern Western middle class and its pseudo-Christian egalitarian value system.[12]
https://en.wikipedia.org/wiki/%C3%9Cbermensch#Re-embodiment_of_amoral_aristocratic_values
Posthuman or post-human is a concept originating in the fields of science fiction, futurology, contemporary art, and philosophy that refers to a person or entity that exists in a state beyond being human.[1] The concept aims at addressing a variety of questions, including ethics and justice, language and trans-species communication, social systems, and the intellectual aspirations of interdisciplinarity.
Posthumanism is not to be confused with transhumanism (the biotechnological enhancement of human beings) and narrow definitions of the posthuman as the hoped-for transcendence of materiality.[2] The notion of the posthuman comes up both in posthumanism as well as transhumanism, but it has a special meaning in each tradition. In 2017, Penn State University Press in cooperation with Stefan Lorenz Sorgner and James Hughes established the Journal of Posthuman Studies,[3] in which all aspects of the concept "posthuman" can be analysed.[4]
https://en.wikipedia.org/wiki/Posthuman
Notes from Underground (pre-reform Russian: Записки изъ подполья; post-reform Russian: Записки из подполья, Zapíski iz podpólʹya; also translated as Notes from the Underground or Letters from the Underworld) is a novella by Fyodor Dostoevsky, first published in the journal Epoch in 1864. It is a first-person narrative in the form of a "confession": the work was originally announced by Dostoevsky in Epoch under the title "A Confession".[2]
The novella presents itself as an excerpt from the memoirs of a bitter, isolated, unnamed narrator (generally referred to by critics as the Underground Man), who is a retired civil servant living in St. Petersburg. Although the first part of the novella has the form of a monologue, the narrator's form of address to his reader is acutely dialogized. According to Mikhail Bakhtin, in the Underground Man's confession "there is literally not a single monologically firm, undissociated word". The Underground Man's every word anticipates the words of an other, with whom he enters into an obsessive internal polemic.[3]
The Underground Man attacks contemporary Russian philosophy, especially Nikolay Chernyshevsky's What Is to Be Done?[4] More generally, the work can be viewed as an attack on and rebellion against determinism: the idea that everything, including the human personality and will, can be reduced to the laws of nature, science and mathematics.[5]
https://en.wikipedia.org/wiki/Notes_from_Underground
The last man (German: Letzter Mensch) is a term used by the philosopher Friedrich Nietzsche in Thus Spoke Zarathustra to describe the antithesis of his theorized superior being, the Übermensch, whose imminent appearance is heralded by Zarathustra. The last man is the archetypal passive nihilist. He is tired of life, takes no risks, and seeks only comfort and security. Therefore, the last man is unable to build and act upon a self-actualized ethos.
https://en.wikipedia.org/wiki/Last_man
"God is dead" (German: Gott ist tot (help·info); also known as the death of God) is a statement made by the German philosopher Friedrich Nietzsche. The first instance of this statement in Nietzsche's writings is in his 1882 The Gay Science, where it appears three times.[note 1] The phrase also appears in Nietzsche's Thus Spoke Zarathustra.
The meaning of this statement is that since, as Nietzsche says, "the belief in the Christian God has become unbelievable", everything that was "built upon this faith, propped up by it, grown into it", including "the whole [...] European morality", is bound to "collapse".[1]
Other philosophers had previously discussed the concept, including Philipp Mainländer and Georg Wilhelm Friedrich Hegel. The phrase is also discussed in the Death of God theology.
https://en.wikipedia.org/wiki/God_is_dead
The will to power (German: der Wille zur Macht) is a concept in the philosophy of Friedrich Nietzsche. The will to power describes what Nietzsche may have believed to be the main driving force in humans. However, the concept was never systematically defined in Nietzsche's work, leaving its interpretation open to debate.[1] Usage of the term by Nietzsche can be summarized as self-determination, the concept of actualizing one's will onto one's self or one's surroundings, and coincides heavily with egoism.[2]
Alfred Adler incorporated the will to power into his individual psychology. This can be contrasted to the other Viennese schools of psychotherapy: Sigmund Freud's pleasure principle (will to pleasure) and Viktor Frankl's logotherapy (will to meaning). Each of these schools advocates and teaches a very different essential driving force in human beings.
https://en.wikipedia.org/wiki/Will_to_power
Eternal return (or eternal recurrence) is a philosophical concept which states that time repeats itself in an infinite loop, and that exactly the same events will continue to occur in exactly the same way, over and over again, for eternity.
In ancient Greece, the concept of eternal return was most prominently associated with Stoicism, the school of philosophy founded by Zeno of Citium. The Stoics believed that the universe is periodically destroyed and reborn, and that each universe is exactly the same as the one before. This doctrine was fiercely refuted by Christian authors such as Augustine, who saw in it a fundamental denial of free will and of the possibility of salvation. The global spread of Christianity therefore brought an end to classical theories of eternal return.
The concept was revived in the 19th century by German philosopher Friedrich Nietzsche. Having briefly presented the idea as a thought experiment in The Gay Science, he explored it more thoroughly in his novel Thus Spoke Zarathustra, in which the protagonist learns to overcome his horror of the thought of eternal return. It is not known whether Nietzsche believed in the literal truth of eternal return, or, if he did not, what he intended to demonstrate by it.
Nietzsche's ideas were subsequently taken up and re-interpreted by other writers, such as Russian esotericist P. D. Ouspensky, who argued that it was possible to break the cycle of return.
https://en.wikipedia.org/wiki/Eternal_return
Ergodic theory (Greek: ἔργον ergon "work", ὁδός hodos "way") is a branch of mathematics that studies statistical properties of deterministic dynamical systems; it is the study of ergodicity. In this context, statistical properties means properties which are expressed through the behavior of time averages of various functions along trajectories of dynamical systems. The notion of deterministic dynamical systems assumes that the equations determining the dynamics do not contain any random perturbations, noise, etc. Thus, the statistics with which we are concerned are properties of the dynamics.
Ergodic theory, like probability theory, is based on general notions of measure theory. Its initial development was motivated by problems of statistical physics.
A central concern of ergodic theory is the behavior of a dynamical system when it is allowed to run for a long time. The first result in this direction is the Poincaré recurrence theorem, which claims that almost all points in any subset of the phase space eventually revisit the set. Systems for which the Poincaré recurrence theorem holds are conservative systems; thus all ergodic systems are conservative.
More precise information is provided by various ergodic theorems which assert that, under certain conditions, the time average of a function along the trajectories exists almost everywhere and is related to the space average. Two of the most important theorems are those of Birkhoff (1931) and von Neumann which assert the existence of a time average along each trajectory. For the special class of ergodic systems, this time average is the same for almost all initial points: statistically speaking, the system that evolves for a long time "forgets" its initial state. Stronger properties, such as mixing and equidistribution, have also been extensively studied.
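The time-average/space-average relationship described above can be written out explicitly. In the standard statement of the Birkhoff pointwise ergodic theorem, for a measure-preserving transformation T of a probability space (X, Σ, μ) and an integrable function f, one has:

```latex
% Birkhoff pointwise ergodic theorem (standard statement).
% T : X -> X measure-preserving on a probability space (X, \Sigma, \mu),
% f \in L^1(\mu).
\hat{f}(x) \;=\; \lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} f\!\left(T^{k} x\right)
\quad \text{exists for } \mu\text{-almost every } x \in X.
% When T is ergodic, the time average coincides with the space average:
\hat{f}(x) \;=\; \int_X f \, d\mu \quad \text{for } \mu\text{-a.e. } x.
```

The first limit is the time average along the trajectory of x; the second identity is the "forgetting of the initial state" mentioned above, since the limit no longer depends on the starting point.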
The problem of metric classification of systems is another important part of the abstract ergodic theory. An outstanding role in ergodic theory and its applications to stochastic processes is played by the various notions of entropy for dynamical systems.
The concepts of ergodicity and the ergodic hypothesis are central to applications of ergodic theory. The underlying idea is that for certain systems the time average of their properties is equal to the average over the entire space. Applications of ergodic theory to other parts of mathematics usually involve establishing ergodicity properties for systems of special kind. In geometry, methods of ergodic theory have been used to study the geodesic flow on Riemannian manifolds, starting with the results of Eberhard Hopf for Riemann surfaces of negative curvature. Markov chains form a common context for applications in probability theory. Ergodic theory has fruitful connections with harmonic analysis, Lie theory (representation theory, lattices in algebraic groups), and number theory (the theory of diophantine approximations, L-functions).
https://en.wikipedia.org/wiki/Ergodic_theory
The wheel of time or wheel of history (also known as Kalachakra) is a concept found in several religious traditions and philosophies, notably religions of Indian origin such as Hinduism, Jainism, Sikhism, and Buddhism, which regard time as cyclical and consisting of repeating ages. Many other cultures contain belief in a similar concept: notably, the Q'ero Natives of Peru, as well as the Hopi Natives of Arizona.
https://en.wikipedia.org/wiki/Wheel_of_time
https://en.wikipedia.org/wiki/concrete
https://en.wikipedia.org/wiki/concretization
Embodiment of prestige or power
When the Milgram experimenters were interviewing potential volunteers, the participant selection process itself revealed several factors that affected obedience, outside of the actual experiment.
Interviews for eligibility were conducted in an abandoned complex in Bridgeport, Connecticut.[2][21] Despite the dilapidated state of the building, the researchers found that the presence of a Yale professor, as stipulated in the advertisement, affected the number of people who obeyed. This was not researched further to test obedience without a Yale professor present, because Milgram had not intentionally staged the interviews to discover factors that affected obedience.[2] A similar conclusion was reached in the Stanford prison experiment.[21]
In the actual experiment, prestige or the appearance of power was a direct factor in obedience—particularly the presence of men dressed in gray laboratory coats, which gave the impression of scholarship and achievement and was thought to be the main reason why people complied with administering what they thought was a painful or dangerous shock.[2]
Raj Persaud, in an article in the BMJ,[22] comments on Milgram's attention to detail in his experiment:
The research was also conducted with amazing verve and subtlety—for example, Milgram ensured that the "experimenter" wear a grey lab coat rather than a white one, precisely because he did not want subjects to think that the "experimenter" was a medical doctor and thereby limit the implications of his findings to the power of physician authority.
Although prestige is often thought of as a separate factor, it is in fact merely a subset of power as a factor. Thus, the prestige conveyed by a Yale professor in a laboratory coat is only a manifestation of the experience and status associated with that role and/or the social status afforded by such an image.
Agentic state and other factors
According to Milgram, "the essence of obedience consists in the fact that a person comes to view himself as the instrument for carrying out another person's wishes, and he therefore no longer sees himself as responsible for his actions. Once this critical shift of viewpoint has occurred in the person, all of the essential features of obedience follow." Thus, "the major problem for the subject is to recapture control of his own regnant processes once he has committed them to the purposes of the experimenter."[23] Besides this hypothetical agentic state, Milgram proposed the existence of other factors accounting for the subject's obedience: politeness, awkwardness of withdrawal, absorption in the technical aspects of the task, the tendency to attribute impersonal quality to forces that are essentially human, a belief that the experiment served a desirable end, the sequential nature of the action, and anxiety.
Belief perseverance
Another explanation of Milgram's results invokes belief perseverance as the underlying cause. What "people cannot be counted on is to realize that a seemingly benevolent authority is in fact malevolent, even when they are faced with overwhelming evidence which suggests that this authority is indeed malevolent. Hence, the underlying cause for the subjects' striking conduct could well be conceptual, and not the alleged 'capacity of man to abandon his humanity ... as he merges his unique personality into larger institutional structures."'[24]
https://en.wikipedia.org/wiki/Obedience_(human_behavior)#Embodiment_of_prestige_or_power
The Prince of Darkness is a term used in John Milton's poem Paradise Lost referring to Satan as the embodiment of evil. It is an English translation of the Latin phrase princeps tenebrarum, which occurs in the Acts of Pilate, written in the 4th century, in the Historia Francorum by Gregory of Tours (6th century),[1] in the 11th century hymn Rhythmus de die mortis by Pietro Damiani,[2] and in a sermon by Bernard of Clairvaux[3] from the 12th century.
https://en.wikipedia.org/wiki/Prince_of_Darkness_(Satan)
A victory column, or monumental column or triumphal column, is a monument in the form of a column, erected in memory of a victorious battle, war, or revolution. The column typically stands on a base and is crowned with a victory symbol, such as a statue. The statue may represent the goddess Victoria; in Germany, Germania, the female embodiment of the nation; in the United States, Liberty or Columbia, female embodiments of the nation; in the United Kingdom, the female embodiment Britannia; or an eagle or a war hero.
https://en.wikipedia.org/wiki/Victory_column
A role-playing game (sometimes spelled roleplaying game,[1][2] RPG) is a game in which players assume the roles of characters in a fictional setting. Players take responsibility for acting out these roles within a narrative, either through literal acting or through a process of structured decision-making regarding character development.[3] Actions taken within many games succeed or fail according to a formal system of rules and guidelines.[4]
https://en.wikipedia.org/wiki/Role-playing_game
GNS theory is an informal field of study developed by Ron Edwards which attempts to create a unified theory of how role-playing games work. Focused on player behavior, in GNS theory participants in role-playing games organize their interactions around three categories of engagement: Gamism, Narrativism and Simulation.
The theory focuses on player interaction rather than statistics, encompassing game design beyond role-playing games. Analysis centers on how player behavior fits the above parameters of engagement and how these preferences shape the content and direction of a game. GNS theory is used by game designers to dissect the elements which attract players to certain types of games.
https://en.wikipedia.org/wiki/GNS_theory
Suspension of disbelief is the avoidance—often described as willing—of critical thinking and logic in understanding something that is unreal or impossible in reality, such as something in a work of speculative fiction, in order to believe it for the sake of enjoying its narrative.[1] Historically, the concept originates in the Greco-Roman principles of theater, wherein the audience ignores the unreality of fiction to experience catharsis from the actions and experiences of characters.[2]
https://en.wikipedia.org/wiki/Suspension_of_disbelief
Aesthetics (also esthetics in American English) is a branch of philosophy that deals with the nature of beauty and taste, as well as the philosophy of art (its own area of philosophy that comes out of aesthetics).[1] It examines aesthetic values, often expressed through judgments of taste.[2]
https://en.wikipedia.org/wiki/Aesthetics
Make believe, also known as pretend play, is a loosely structured form of play that generally includes role-play, object substitution and nonliteral behavior.[1] What separates play from other daily activities is its fun and creative aspect: it is not performed for the sake of survival or necessity.[2] Children engage in make believe for a number of reasons. It provides the child with a safe setting to express fears and desires.[3] When children participate in pretend play, they are integrating and strengthening previously acquired knowledge.[1] Children who have better pretense and fantasy abilities also show better social competence, cognitive capabilities, and an ability to take the perspective of others.[2] For an activity to be referred to as pretend play, the individual must be intentionally diverting from reality, and must be aware of the contrast between the real situation and the make believe situation.[2] If the child believes that the make believe situation is reality, then they are misinterpreting the situation rather than pretending. Pretend may or may not include action, depending on whether the child chooses to project their imagination onto reality or not.[4]
https://en.wikipedia.org/wiki/Make_believe
Aspects
Gamism
A gamist makes decisions to satisfy predefined goals in the face of adversity: to win. Edwards wrote,
I might as well get this over with now: the phrase "Role-playing games are not about winning" is the most widespread example of synecdoche in the hobby. Potential Gamist responses, and I think appropriately, include:
"Eat me,"
(upon winning) "I win," and
"C'mon, let's play without these morons."[6]
These decisions are most common in games pitting characters against successively tougher challenges and opponents, and may not consider why the characters are facing them in the first place. Gamist RPG design emphasizes parity; all player characters should be equally strong and capable of dealing with adversity.
Combat and diversified options for short-term problem solving (for example, lists of specific spells or combat techniques) are frequently emphasized. Randomization provides a gamble, allowing players to risk more for higher stakes rather than modelling probability. Examples include Magic: The Gathering, chess and most computer games.
Narrativism
Narrativism relies on outlining (or developing) character motives, placing characters into situations where those motives conflict and making their decisions the driving force. For example, a samurai sworn to honor and obey his lord might be tested when directed to fight his rebellious son; a compassionate doctor might have his charity tested by an enemy soldier under his care; or a student might have to decide whether to help her best friend cheat on an exam.
This has two major effects. Characters usually change and develop over time, and attempts to impose a fixed storyline are impossible or counterproductive. Moments of drama (the characters' inner conflict) make player responses difficult to predict, and the consequences of such choices cannot be minimized. Revisiting character motives or underlying emotional themes often leads to escalation: asking variations of the same "question" at higher intensity levels.
Simulationism
Simulationism is a playing style recreating, or inspired by, a genre or source. Its major concerns are internal consistency, analysis of cause and effect and informed speculation. Characterized by physical interaction and details of setting, simulationism shares with narrativism a concern for character backgrounds, personality traits and motives to model cause and effect in the intellectual and physical realms.
Simulationist players consider their characters independent entities, and behave accordingly; they may be reluctant to have their character act on the basis of out-of-character information. Similar to the distinction between actor and character in a film or play, character generation and the modeling of skill growth and proficiency can be complex and detailed.
Many simulationist RPGs encourage illusionism (manipulation of in-game probability and environmental data to point to predefined conclusions) to create a story. Call of Cthulhu recreates the horror and humanity's cosmic insignificance in the Cthulhu Mythos, using illusionism to craft grisly fates for the players' characters and maintain consistency with the source material.
Simulationism maintains a self-contained universe operating independent of player will; events unfold according to internal rules. Combat may be broken down into discrete, semi-randomised steps for modeling attack skill, weapon weight, defense checks, armor, body parts and damage potential. Some simulationist RPGs explore different aspects of their source material, and may have no concern for realism; Toon, for example, emulates cartoon hijinks. Role-playing game systems such as GURPS and Fudge use a somewhat-realistic core system which can be modified with sourcebooks or special rules.
https://en.wikipedia.org/wiki/GNS_theory
Character creation (also character generation / character design) is the process of defining a game character or other character. Typically, characters possess individual strengths and weaknesses represented by a set of statistics. Games with a fictional setting may include traits such as race, class, or species. Games with a more contemporary or narrower setting may limit customization to physical and personality traits. This is usually used in role-playing games.
https://en.wikipedia.org/wiki/Character_creation
A role-playing game system is a set of game mechanics used in a tabletop role-playing game (TTRPG) to determine the outcome of a character's in-game actions.
History
By the late 1970s, the Chaosium staff realized that Steve Perrin's RuneQuest system had the potential to become a "house system", where one set of game mechanics could be used for multiple games; Greg Stafford and Lynn Willis proved that theory by boiling down the RuneQuest rules into the thin 16-page Basic Role-Playing (1980).[1]: 85 Hero Games used their Champions rules as the basis for their Hero System.[1]: 146 The Pacesetter house system centered on a universal "action table" that used one chart to resolve all game actions.[1]: 197 Steve Jackson became interested in publishing a new roleplaying system, designed by himself, with three goals: that it be detailed and realistic; logical and well-organized; and adaptable to any setting and any level of play; this system was eventually released as GURPS (1986).[1]: 104–107 The D&D-derived Palladium house system ultimately encompassed all of the Palladium Books titles.[1]: 60 Mekton II (1987) by R. Talsorian Games revealed for the first time the full-fledged Interlock System.[1]: 208
In 1990, Game Designers' Workshop released the Twilight: 2000 second edition game system, and decided to turn it into their house system, an umbrella under which all future games would be designed.[1]: 60 TSR's Amazing Engine was a universal game system, a simple beginner's system.[1]: 27 In 1996, Hero Games partnered with R. Talsorian and decided to create a new, simpler rules system to attract new players, merging it with the Interlock game system and calling it Fuzion.[1]: 150 Dragonlance: Fifth Age (1996) was built on TSR's new SAGA storytelling game system, which centered on resource management (through cards) rather than die rolls.[1]: 29 TSR published Alternity (1997), another universal system, this one directed only toward science-fiction games.[1]: 29 West End Games' MasterBook system had failed to catch on as a house system, so they decided to publish another, the D6 System, based on their most well-known and well-tested game system, Star Wars RPG.[1]: 194
Development
While early role-playing games relied heavily on group consensus, on the judgement of a single player (the "Dungeon Master" or Game Master), or on randomizers such as dice, later generations of narrativist games allow role-playing to influence the creative input and output of the players, so both acting out roles and employing rules take part in shaping the outcome of the game.
An RPG system also affects the game environment, which can take any of several forms. Generic role-playing game systems, such as Basic Role-Playing, GURPS, and Fate, are not tied to a specific storytelling genre or campaign setting and can be used as a framework to play many different types of RPG. Others, such as Dungeons & Dragons, are designed to depict a specific genre or style of play, and still others, such as Paranoia, are not only genre-specific but come bundled with a specific campaign setting to which the game mechanics are inseparably tied. In fact, in more psychological games such as Call of Cthulhu, King Arthur Pendragon, Unknown Armies, and Don't Rest Your Head, aspects of the game system are designed to reinforce psychological or emotional dynamics that evoke a game world's specific atmosphere.
Many role-playing game systems involve the generation of random numbers by which success or failure of an action is determined. This can be done using dice (probably the most common method) or cards (as in Castle Falkenstein), but other methods may be used depending on the system. The random result is added to an attribute which is then compared to a difficulty rating, although many variations on this game mechanic exist among systems. Some (such as the Storyteller/Storytelling System and the One-Roll Engine) use dice pools instead of individual dice to generate a series of random numbers, some of which may be discarded or used to determine the magnitude of the result. However, some games (such as the Amber Diceless Roleplaying Game and Nobilis) use no random factor at all. These instead use direct comparison of character ability scores to difficulty values, often supplemented with points from a finite but renewable pool. These "resource points" represent a character's additional effort or luck, and can be used strategically by the player to influence the success of an action.
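As a rough illustration of the two resolution mechanics just described — a single random roll added to an attribute and compared against a difficulty rating, and a dice pool whose successes are counted — the following sketch uses invented function names and numbers rather than any specific published system:

```python
import random

def roll_check(attribute: int, difficulty: int, sides: int = 20) -> bool:
    """Single-roll mechanic: random roll + attribute compared to a difficulty rating."""
    roll = random.randint(1, sides)
    return roll + attribute >= difficulty

def dice_pool_check(pool_size: int, target: int = 7, sides: int = 10) -> int:
    """Dice-pool mechanic: roll several dice, count how many meet or beat a target number.
    The count of successes can then determine the magnitude of the result."""
    return sum(1 for _ in range(pool_size)
               if random.randint(1, sides) >= target)

# A capable character (attribute 5) attempts a moderately hard task (difficulty 15):
# the check succeeds whenever the d20 shows 10 or more, i.e. with probability 11/20.
succeeded = roll_check(attribute=5, difficulty=15)

# A character rolls a pool of six ten-sided dice against a target number of 7.
successes = dice_pool_check(pool_size=6)
```

A diceless system of the kind mentioned above would instead skip the random draw entirely and compare `attribute` (plus any spent resource points) directly against `difficulty`.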
References
- Shannon Appelcline (2011). Designers & Dragons. Mongoose Publishing. ISBN 978-1-907702-58-7.
https://en.wikipedia.org/wiki/Role-playing_game_system
Varieties
Role-playing games are played in a wide variety of formats, ranging from discussing character interaction in tabletop form, to physically acting out characters in LARP, to playing characters virtually in digital media.[17] There is also a great variety of rule systems and game settings. Games that emphasize plot and character interaction over game mechanics and combat sometimes prefer the name storytelling game. These types of games tend to reduce or eliminate the use of dice and other randomizing elements. Some games are played with characters created before the game by the GM, rather than those created by the players. This type of game is typically played at gaming conventions, or in standalone games that do not form part of a campaign.
https://en.wikipedia.org/wiki/Role-playing_game
Improvisational theatre, often called improvisation or improv, is the form of theatre, often comedy, in which most or all of what is performed is unplanned or unscripted, created spontaneously by the performers. In its purest form, the dialogue, action, story, and characters are created collaboratively by the players as the improvisation unfolds in present time, without use of an already prepared, written script.
Improvisational theatre exists in performance as a range of styles of improvisational comedy as well as some non-comedic theatrical performances. It is sometimes used in film and television, both to develop characters and scripts and occasionally as part of the final product.
Improvisational techniques are often used extensively in drama programs to train actors for stage, film, and television and can be an important part of the rehearsal process. However, the skills and processes of improvisation are also used outside the context of performing arts. This practice, known as applied improvisation, is used in classrooms as an educational tool and in businesses as a way to develop communication skills, creative problem solving, and supportive team-work abilities that are used by improvisational, ensemble players.[1] It is sometimes used in psychotherapy as a tool to gain insight into a person's thoughts, feelings, and relationships.
https://en.wikipedia.org/wiki/Improvisational_theatre
A wargame is a strategy game in which two or more players command opposing armed forces in a realistic simulation of an armed conflict.[1] Wargaming may be played for recreation, to train military officers in the art of strategic thinking, or to study the nature of potential conflicts. Many wargames recreate specific historic battles, and can cover either whole wars, or any campaigns, battles, or lower-level engagements within them. Many simulate land combat, but there are wargames for naval and air combat as well.
Generally, activities where the participants actually perform mock combat actions (e.g. friendly warships firing dummy rounds at each other) are not considered wargames. Some writers may refer to a military's field training exercises as "live wargames", but certain institutions such as the US Navy do not accept this.[2] Likewise, activities like paintball are sports rather than wargames. Wargames are a mental activity.
Modern wargaming was invented in Prussia in the early 19th century, and eventually the Prussian military adopted wargaming as a tool for training their officers and developing doctrine. After Prussia defeated France in the Franco-Prussian War, wargaming was widely adopted by military officers in other countries. Civilian enthusiasts also played wargames for fun, but this was a niche hobby until the development of consumer electronic wargames in the 1990s.
https://en.wikipedia.org/wiki/Wargame
A live action role-playing game (LARP) is a form of role-playing game where the participants physically portray their characters.[1] The players pursue goals within a fictional setting represented by real-world environments while interacting with each other in character. The outcome of player actions may be mediated by game rules or determined by consensus among players. Event arrangers called gamemasters decide the setting and rules to be used and facilitate play.
The first LARPs were run in the late 1970s, inspired by tabletop role-playing games and genre fiction. The activity spread internationally during the 1980s and has diversified into a wide variety of styles. Play may be very game-like or may be more concerned with dramatic or artistic expression. Events can also be designed to achieve educational or political goals. The fictional genres used vary greatly, from realistic modern or historical settings to fantastic or futuristic eras. Production values are sometimes minimal, but can involve elaborate venues and costumes. LARPs range in size from small private events lasting a few hours, to large public events with thousands of players lasting for days.
https://en.wikipedia.org/wiki/Live_action_role-playing_game
A stagehand is a person who works backstage or behind the scenes in theatres, film, television, or location performance. Their work includes setting up the scenery, lights, sound, props, rigging, and special effects for a production.
https://en.wikipedia.org/wiki/Stagehand
A prop, formally known as (theatrical) property,[1] is an object used on stage or screen by actors during a performance or screen production.[2] In practical terms, a prop is considered to be anything movable or portable on a stage or a set, distinct from the actors, scenery, costumes, and electrical equipment.[3][4][5]
https://en.wikipedia.org/wiki/Prop
Filmmaking or film production is the process by which a motion picture is produced. Filmmaking involves a number of complex and discrete stages, starting with an initial story, idea, or commission. It then continues through screenwriting, casting, pre-production, shooting, sound recording, post-production, and screening the finished product before an audience, which may result in a film release and exhibition. Filmmaking occurs in a variety of economic, social, and political contexts around the world. It uses a variety of technologies and cinematic techniques.
Although filmmaking originally involved the use of film, most film productions are now digital.[1] Today, filmmaking refers to the process of crafting an audio-visual story commercially for distribution or broadcast.
Production stages
Film production consists of five major stages:[2]
- Development: Ideas for the film are created, rights to existing intellectual properties are purchased, etc., and the screenplay is written. Financing for the project is sought and obtained.
- Pre-production: Arrangements and preparations are made for the shoot, such as hiring cast and film crew, selecting locations and constructing sets.
- Production: The raw footage and other elements of the film are recorded during the film shoot, including principal photography.
- Post-production: The images, sound, and visual effects of the recorded film are edited and combined into a finished product.
- Distribution: The completed film is distributed, marketed, and screened in cinemas or released to home video to be viewed.
Development
The development stage contains both general and specific components. Each film studio has a yearly retreat where their top creative executives meet and interact on a variety of areas and topics they wish to explore through collaborations with producers and screenwriters, and then ultimately, directors, actors, and actresses. They choose trending topics from the media and real life, as well as many other sources, to determine their yearly agenda. For example, in a year when action is popular, they may wish to explore that topic in one or more movies. Sometimes, they purchase the rights to articles, bestselling novels, plays, remakes of older films, stories with some basis in real life through a person or event, video games, fairy tales, comic books, or graphic novels. Likewise, research through surveys may inform their decisions. They may have had blockbusters the previous year and wish to explore a sequel. They may also acquire a completed, independently financed and produced film; notable examples include Little Miss Sunshine, The English Patient, and Roma.
Studios hold general meetings with producers and screenwriters about original story ideas. "In my decade working as a writer, I knew of only a few that were sold and fewer that made it to the screen," relays writer Wayne Powers. Alan Watt, writer-director and Founder of The LA Writer's Lab, confirmed that completed original screenplays, referred to as "specs", make big news when they sell, but these make up a very small portion of movies that are ultimately given the green light to be produced by the president of a studio.
The executives return from the retreat with fairly well-established instructions. They spread these concepts through the industry community, especially to producers they have deals with (traditional studios will have those producers in offices on their lots). Also, agents for screenwriters are made aware. This results in a pairing of producers with writers, where they develop a "take", a basic story idea that utilizes the concept given by studio executives. Often it is a competition with several pairings meeting with studio executives and "pitching" their "take". Very few writing jobs are from original ideas brought to studios by producers or writers. Perhaps one movie a year will be a "spec" script that was purchased.
Once the producer and writer have sold their approach to the desired subject matter, they begin to work. However, many writers and producers typically come and go before a particular concept is realized in a form that is awarded a green light to production. Production of Unforgiven, which earned Oscars for its director and star Clint Eastwood, as well as recognition for its screenwriter, David Webb Peoples, required fifteen years. Powers related that The Italian Job took approximately eight years from concept to screen, which, as Powers added, "is average." And most concepts turned into paid screenplays wind up gathering dust on some executive's shelf, never to see production.
Writers have different styles and creative processes; some have stronger track records than others. Because of this, how the development process proceeds from there, and how much detail a writer divulges to the studio before beginning writing, can vary greatly. Screenwriters are often protected by their union, the Writers Guild of America (WGA). The WGA allows a screenwriter to contract for one draft, one revision, and one polish. Bob Eisle, a writer and member of the Guild board, states, "Additional writing requires extension of contracts and payment for additional work." Writers are paid 80% of their fee after the first draft. Preliminary discussions are minimal with studio executives but might be quite detailed with the producer.
Next, a screenwriter writes a screenplay over a period of several months, or however long it takes. Deadlines are in their contracts but there is no pressure to adhere to them. Again, every writer's process and speed varies. The screenwriter may rewrite the script several times to improve dramatization, clarity, structure, characters, dialogue, and overall style.
Script coverage, freelance work often done by recent university graduates, does not feed scripts into the system that are ready for production or already produced. "Coverage" is a way for young screenwriters to get read; their ideas might make their way up to an executive or famous producer and result in "meet and greets" where relationships with up-and-comers can be formed. But it has not historically yielded ideas that studios pursue into production.
The studio is the film distributor, which at an early stage attempts to choose a slate of concepts likely to have market appeal and potential financial success. Hollywood distributors consider factors such as the film genre, the target audience, the historical success of similar films, the actors who might appear in the film, and potential directors. All these factors imply a certain appeal of the film to a possible audience. Not all films make a profit from the theatrical release alone; the studio mainly targets the opening weekend and the second weekend to make most domestic profits. Occasionally, a so-called "word of mouth film" is not marketed strongly, but its success spreads by word of mouth and it slowly gains its audience. These are special circumstances: such films may remain in theaters for five months, while a typical film run is closer to five weekends. Further earnings from pay television purchases, foreign market purchases, and DVD sales establish a film's worldwide gross.
Once a screenplay is "green-lit", directors and actors are attached and the film proceeds into the pre-production stage, although sometimes development and pre-production stages will overlap. Projects which fail to obtain a green light may have protracted difficulties in making the transition to pre-production and enter a phase referred to as development hell for an extended period of time, or until developmental turnaround.
As in almost any business venture, financing a film project concerns the management and procurement of investments. It includes the dynamics of the assets required to fund the filmmaking and the liabilities incurred during it, over the period from early development through the management of profits and losses after distribution, under conditions of varying degrees of uncertainty and risk. The practical side of film finance can also be described as the money management of all phases involved in filmmaking. Film finance aims to price assets based on their risk level and their expected rate of return, based upon anticipated profits and protection against losses.
Pre-production
In pre-production, every step of actually creating the film is carefully designed and planned. This is the phase where one would narrow down all the options of the production. It is where all the planning takes place before the camera rolls, and it sets the overall vision of the project. The production company is created and a production office established. The film is pre-visualized by the director and may be storyboarded with the help of illustrators and concept artists. A production budget is drawn up to plan expenditures for the film. For major productions, insurance is procured to protect against accidents. Pre-production also includes working out the shoot location and the casting process. The producer hires a line producer or production manager to create the schedule and budget for the film.
The nature of the film, and the budget, determine the size and type of crew used during filmmaking. Many Hollywood blockbusters employ a cast and crew of hundreds, while a low-budget, independent film may be made by a "skeleton crew" of eight or nine (or fewer). These are typical crew positions:
- Storyboard artist: creates visual images to help the director and production designer communicate their ideas to the production team.
- Director: is primarily responsible for the storytelling, creative decisions and acting of the film.
- Assistant director (AD): manages the shooting schedule and logistics of the production, among other tasks. There are several types of AD, each with different responsibilities.
- Film producer: hires the film's crew.
- Unit production manager: manages the production budget and production schedule. They also report, on behalf of the production office, to the studio executives or financiers of the film.
- Location manager: finds and manages film locations. Nearly all pictures feature segments that are shot in the controllable environment of a studio sound stage, while outdoor sequences call for filming on location.
- Production designer: the one who creates the visual conception of the film, working with the art director, who manages the art department which makes production sets.[3]
- Costume designer: creates the clothing for the characters in the film working closely with the actors, as well as other departments.
- Makeup and hair designer: works closely with the costume designer in order to create a certain look for a character.
- Casting director: finds actors to fill the parts in the script. This normally requires that actors partake in an audition, either live in front of the casting director or in front of one or more cameras.
- Choreographer: creates and coordinates the movement and dance – typically for musicals. Some films also credit a fight choreographer.
- Director of photography (DOP): the head of the photography of the entire film, supervises all cinematographers and camera operators.
- Production sound mixer: the head of the sound department during the production stage of filmmaking. They record and mix the audio on set – dialogue, presence and sound effects in monaural and ambience in stereo.[4][5] They work with the boom operator, Director, DA, DP, and First AD.
- Sound designer: creates the aural conception of the film,[3] working with the supervising sound editor. On Bollywood-style Indian productions the sound designer plays the role of a director of audiography.[6]
- Composer: creates new music for the film (usually not hired until post-production).
Production
In production, the film is created and shot. In this phase, it is key to keep planning ahead of the daily shoot. The primary aim is to stick to the budget and schedule, which requires constant vigilance. More crew will be recruited at this stage, such as the property master, script supervisor, assistant directors, stills photographer, picture editor, and sound editors. These are the most common roles in filmmaking; the production office will be free to create any unique blend of roles to suit the various responsibilities needed during the production of a film. Communication is key between the location, set, office, production company, distributors and all other parties involved.
A typical day shooting begins with the crew arriving on the set/location by their call time. Actors usually have their own separate call times. Since set construction, dressing and lighting can take many hours or even days, they are often set up in advance.
The grip, electric and production design crews are typically a step ahead of the camera and sound departments: for efficiency's sake, while a scene is being filmed, they are already preparing the next one.
While the crew prepares their equipment, the actors do their costumes and attend the hair and make-up departments. The actors rehearse the script and blocking with the director, and the camera and sound crews rehearse with them and make final tweaks. Finally, the action is shot in as many takes as the director wishes. Most American productions follow a specific procedure:
The assistant director (AD) calls "picture is up!" to inform everyone that a take is about to be recorded, and then "quiet, everyone!" Once everyone is ready to shoot, the AD calls "roll sound" (if the take involves sound), and the production sound mixer will start their equipment, record a verbal slate of the take's information, and announce "sound speed", or just "speed", when they are ready. The AD follows with "roll camera", answered by "speed!" by the camera operator once the camera is recording. The clapper loader, who is already in front of the camera with the clapperboard, calls "marker!" and slaps it shut. If the take involves extras or background action, the AD will cue them ("action background!"), and last is the director, telling the actors "action!". The AD may echo "action" louder on large sets.
A take is over when the director calls "Cut!" and the camera and sound stop recording. The script supervisor will note any continuity issues, and the sound and camera teams log technical notes for the take on their respective report sheets. If the director decides additional takes are required, the whole process repeats. Once satisfied, the crew moves on to the next camera angle or "setup," until the whole scene is "covered." When shooting is finished for the scene, the assistant director declares a "wrap" or "moving on," and the crew will "strike," or dismantle, the set for that scene.
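The call sequence described above behaves like a small fixed-order protocol. As a purely illustrative sketch (the class, method, and call-phrase strings below are invented for this example, not an industry convention, and the sound/extras branches are omitted), it can be modeled as a state machine that rejects out-of-order calls:

```python
# Illustrative sketch of the American shooting-call sequence described
# above, simplified to a fixed ordering with no optional branches.
CALL_ORDER = ["picture is up", "roll sound", "roll camera", "marker", "action"]

class TakeProtocol:
    """Enforces the fixed ordering of calls before a take is recorded."""

    def __init__(self):
        self.step = 0  # index of the next expected call

    def call(self, phrase):
        # Each call must be the next one in the fixed order; anything
        # else is out of sequence and rejected.
        expected = CALL_ORDER[self.step]
        if phrase != expected:
            raise ValueError(f"'{phrase}' is out of order; expected '{expected}'")
        self.step += 1
        return phrase

take = TakeProtocol()
for phrase in CALL_ORDER:
    take.call(phrase)  # proceeds through the full sequence without error
```

The point of the sketch is only that each call is gated on the previous one, mirroring how the AD, sound mixer, and camera operator confirm readiness in turn before "action" is given.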
At the end of the day, the director approves the next day's shooting schedule and a daily progress report is sent to the production office. This includes the report sheets from the continuity, sound, and camera teams. Call sheets are distributed to the cast and crew to tell them when and where to turn up the next shooting day. Later on, the director, producer, other department heads, and, sometimes, the cast may gather to watch that day's or the previous day's footage, called dailies, and review their work.
With workdays often lasting fourteen or eighteen hours in remote locations, film production tends to create a team spirit. When the entire film is "in the can", at the completion of the production phase, it is customary for the production office to arrange a wrap party to thank all the cast and crew for their efforts.
For the production phase on live-action films, synchronizing work schedules of key cast and crew members is very important, since for many scenes, several cast members and most of the crew must be physically present at the same place at the same time (and bankable stars may need to rush from one project to another). Animated films have different workflow at the production phase, in that voice actors can record their takes in the recording studio at different times and may not see one another until the film's premiere.[7] Animated films also have different crew, since most physical live-action tasks are either unnecessary or are simulated by various types of animators.
Post-production
This stage is usually thought of as starting when principal photography ends, but they may overlap. The bulk of post-production consists of the film editor reviewing the footage with the director and assembling the film out of selected takes. The production sound (dialogue) is also edited; music tracks and songs are composed and recorded if a film is intended to have a score; sound effects are designed and recorded. Any computer-generated visual effects are digitally added by an artist. Finally, all sound elements are mixed down into "stems", which are synchronized to the images on the screen, and the film is fully completed ("locked").
Distribution
Distribution is the last stage, where the film is released to cinemas or, occasionally, directly to consumer media (VHS, VCD, DVD, Blu-ray) or direct download from a digital media provider. The film is duplicated as required (either onto film or hard disk drives) and distributed to cinemas for exhibition (screening). Press kits, posters, and other advertising materials are published, and the film is advertised and promoted. A B-roll clip may be released to the press based on raw footage shot for a "making of" documentary, which may include making-of clips as well as on-set interviews separate from those of the production company or distributor. For major films, key personnel are often contractually required to participate in promotional tours in which they appear at premieres and festivals and sit for interviews with many TV, print, and online journalists. The largest productions may require more than one promotional tour, in order to rejuvenate audience demand at each release window.
Since the advent of home video in the late 1970s, most major films have followed a pattern of having several distinct release windows. A film may first be released to a few select cinemas, or if it tests well enough, may go directly into wide release. Next, it is released, normally at different times several weeks (or months) apart, into different market segments like rental, retail, pay-per-view, in-flight entertainment, cable television, satellite television, or free-to-air broadcast television. The distribution rights for the film are also usually sold for worldwide distribution. The distributor and the production company share profits and manage losses.
Independent filmmaking
Filmmaking also takes place outside of the mainstream and is commonly called independent filmmaking. Since the introduction of DV technology, the means of production have become more democratized and economically viable. Filmmakers can conceivably shoot and edit a film, create and edit the sound and music, and mix the final cut on a home computer. However, while the means of production may be democratized, financing, traditional distribution, and marketing remain difficult to accomplish outside the traditional system. In the past, most independent filmmakers have relied on film festivals (such as the Sundance Film Festival, Venice Film Festival, Cannes Film Festival, and Toronto International Film Festival) to get their films noticed and sold for distribution and production. However, the internet has allowed for the relatively inexpensive distribution of independent films on websites such as YouTube. As a result, several companies have emerged to assist filmmakers in getting independent movies seen and sold via mainstream internet marketplaces, often adjacent to popular Hollywood titles. With internet movie distribution, independent filmmakers who choose to forego a traditional distribution deal now have the ability to reach global audiences.
See also
- 35 mm film
- 3D film
- Audiography
- Cinematic techniques
- Digital cinema
- Experimental filmmaking
- Film colorization
- Film industry
- Filmmaking technique in Kurosawa
- Filmmaking technique of Luis Buñuel
- Film poster
- Film school
- Film studies
- Film title design
- Film trailer
- First-look deal
- Glossary of motion picture terms
- Housekeeping deal
- List of film topics
- Motion Picture Association of America
- Motion picture content rating system
- Movie production incentives in the United States
- Movie theater
- Outline of film
- Television
- Video production
References
- Hayes, Derek; Webster, Chris (2013). Acting and Performance for Animation. New York and London: Focal Press. p. 176. ISBN 9781136135989.
External links
- Filmmaking at Curlie
https://en.wikipedia.org/wiki/Filmmaking#Production
A non-player character (NPC), or non-playable character, is any character in a game that is not controlled by a player.[1] The term originated in traditional tabletop role-playing games where it applies to characters controlled by the gamemaster or referee rather than by another player. In video games, this usually means a character controlled by the computer (instead of a player) that has a predetermined set of behaviors that potentially will impact gameplay, but will not necessarily be the product of true artificial intelligence.
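The distinction drawn above, between a predetermined set of behaviors and true artificial intelligence, can be made concrete with a minimal sketch (the class and the example character are invented here for illustration, not taken from any particular game):

```python
# Minimal sketch of a video-game NPC: its behavior is a fixed,
# predetermined mapping from events to scripted reactions.
class NonPlayerCharacter:
    """A computer-controlled character with a predetermined behavior set."""

    def __init__(self, name, responses):
        self.name = name
        self.responses = responses  # event -> scripted reaction

    def react(self, event):
        # Unknown events fall back to a default line. Nothing is learned
        # or improvised, which is what separates scripted NPC behavior
        # from genuine artificial intelligence.
        return self.responses.get(event, "...")

guard = NonPlayerCharacter("gate guard", {
    "greet": "Halt! Who goes there?",
    "bribe": "I didn't see anything.",
})
print(guard.react("greet"))  # scripted reaction
print(guard.react("dance"))  # unscripted event falls back to the default
```

In a tabletop game the same role is filled by the gamemaster improvising the character's reactions, whereas the computer-controlled version can only select from the table it was given.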
https://en.wikipedia.org/wiki/Non-player_character
A backstory, background story, back-story, or background is a set of events invented for a plot, presented as preceding and leading up to that plot. It is a literary device of a narrative history all chronologically earlier than the narrative of primary interest.
In acting, it is the history of the character before the drama begins, and is created during the actor's preparation.
It is the history of characters and other elements that underlie the situation existing at the main narrative's start. Even a purely historical work selectively reveals backstory to the audience.[1][2]
https://en.wikipedia.org/wiki/Backstory
A fictional universe, or fictional world, is a self-consistent setting with events, and often other elements, that differ from the real world. It may also be called an imagined, constructed, or fictional realm or world. Fictional universes may appear in novels, comics, films, television shows, video games, and other creative works.
The subject is most commonly addressed in reference to fictional universes that differ markedly from the real world, such as those that introduce entire fictional cities, countries, or even planets; those that contradict commonly known facts about the world and its history; or those that feature fantasy or science fiction concepts such as magic or faster than light travel—and those in which the deliberate development of the setting is a substantial focus of the work. When a large franchise of related works has two or more somewhat different fictional universes that are each internally consistent but not consistent with each other, such as a distinct plotline and set of characters in a comics version versus a television adaptation, each universe is often referred to as a continuity, though the term continuity as a mass noun usually has a broader meaning in fiction.
https://en.wikipedia.org/wiki/Fictional_universe
Usage
As a literary device, backstory is often employed to lend depth or believability to the main story. The usefulness of having a dramatic revelation was recognized by Aristotle, in Poetics.
Backstories are usually revealed, partially or in full, chronologically or otherwise, as the main narrative unfolds. However, a story creator may also create portions of a backstory or even an entire backstory that is solely for their own use.[3]
Backstory may be revealed by various means, including flashbacks, dialogue, direct narration, summary, recollection, and exposition. The original Star Wars film and its first two sequels are examples of works with a preconceived backstory, which was later dramatized in the second, "prequel" set of three films.
Recollection
Recollection is the fiction-writing mode whereby a character calls something to mind, or remembers it. A character's memory plays a role for conveying backstory, as it allows a fiction-writer to bring forth information from earlier in the story or from before the beginning of the story. Although recollection is not widely recognized as a distinct fiction-writing mode, recollection is commonly used by authors of fiction.
For example, Orson Scott Card observes that "If it's a memory the character could have called to mind at any point, having her think about it just in time to make a key decision may seem like an implausible coincidence . . . ." Furthermore, "If the memory is going to prompt a present decision, then the memory in turn must have been prompted by a recent event."[4]
In a shared universe more than one author may shape the same backstory. The later creation of a backstory that conflicts with a previously written main story may require the adjustment device known as retroactive continuity, informally known as "retcon".
Acting
Actors may create their own backstories for characters, going beyond the sometimes meager information in a script. Filling in details helps an actor interpret the script and create fully imagined characters.[5]
See also
References
- Homan, Sidney; Rhinehart, Brian (2018). "3". Comedy Acting for Theatre: The Art and Craft of Performing in Comedies. Bloomsbury Publishing. ISBN 9781350012783. Retrieved 26 November 2018.
https://en.wikipedia.org/wiki/Backstory
Retroactive continuity, or retcon for short, is a literary device in which facts in the world of a fictional work which have been established through the narrative itself are adjusted, ignored, supplemented, or contradicted by a subsequently published work which recontextualizes or breaks continuity with the former.[2]
There are various motivations for applying retroactive continuity, including:
- To accommodate desired aspects of sequels or derivative works which would otherwise be ruled out.
- To respond to negative fan reception of previous stories.
- To correct and overcome errors or problems identified in the prior work since its publication.
- To change or clarify how the prior work should be interpreted.
- To match reality, when assumptions or projections of the future are later proven wrong.[Note 1]
Retcons are used by authors to increase their creative freedom, on the assumption that the changes are unimportant to the audience compared to the new story which can be told. Retcons can be diegetic or nondiegetic. For instance, by using time travel or parallel universes, an author may diegetically reintroduce a popular character they had previously killed off. More subtle, nondiegetic methods include ignoring or expunging minor plot points to remove narrative elements the author has no interest in writing.
Retcons are common in pulp fiction, and especially in comic books published by long-established publishers such as DC and Marvel.[4] The long history of popular titles and the number of writers who contribute stories can often create situations that demand clarification or revision. Retcons also often appear in manga, soap operas, serial dramas, movie sequels, cartoons, professional wrestling angles, video games, radio series, and other forms of serial fiction.
https://en.wikipedia.org/wiki/Retroactive_continuity
A shared universe or shared world is a fictional universe from a set of creative works where more than one writer (or other artist) independently contributes a work that can stand alone but fits into the joint development of the storyline, characters, or world of the overall project. It is common in genres like science fiction.[1] It differs from collaborative writing in which multiple artists are working together on the same work and from crossovers where the works and characters are independent except for a single meeting.
The term shared universe is also used within comics to reflect the overall milieu created by the comic book publisher in which characters, events, and premises from one product line appear in other product lines in a media franchise. A specific kind of shared universe that is published across a variety of media (such as novels and films), each of them contributing to the growth, history, and status of the setting is called an "imaginary entertainment environment."[2]
The term has also been used in a wider, non-literary sense to convey interdisciplinary[3] or social commonality,[4] often in the context of a "shared universe of discourse".[5]
https://en.wikipedia.org/wiki/Shared_universe
A crossover is the placement of two or more otherwise discrete fictional characters, settings, or universes into the context of a single story. They can arise from legal agreements between the relevant copyright holders, unofficial efforts by fans, or common corporate ownership.
https://en.wikipedia.org/wiki/Crossover_(fiction)
The Banishing is a 2020 British horror film directed by Christopher Smith, starring Jessica Brown Findlay, John Heffernan, John Lynch and Sean Harris. It premiered at the Sitges Film Festival and London FrightFest Film Festival in October 2020, before being released digitally in the United Kingdom on 26 March 2021. The film is a haunted-house horror story set in the lead-up to World War II.[3]
https://en.wikipedia.org/wiki/The_Banishing
The public domain (PD) consists of all the creative work to which no exclusive intellectual property rights apply. Those rights may have expired,[1] been forfeited,[2] expressly waived, or may be inapplicable.[3] Because no one holds the exclusive rights, anyone can legally use or reference those works without permission.
https://en.wikipedia.org/wiki/Public_domain
Alternate history (also alternative history, allohistory,[1] althist, AH) is a genre of speculative fiction of stories in which one or more historical events occur and are resolved differently than in real life.[2][3][4][5] As conjecture based upon historical fact, alternate history stories propose What if? scenarios about crucial events in human history, and present outcomes very different from the historical record. Alternate history also is a subgenre of literary fiction, science fiction, and historical fiction; as literature, alternate history uses the tropes of the genre to answer the What if? speculations of the story.
Since the 1950s, alternate history stories, as a subgenre of science fiction, have featured the tropes of time travel between histories, psychic awareness by the inhabitants of a given universe of the existence of an alternative universe, and time travel that divides history into various timestreams.[1]
In Spanish, French, German, Portuguese, Italian, Catalan, and Galician, the terms uchronie, ucronia, and ucronía identify the alternate history genre; from these derives the English term uchronia, composed of the Greek prefix οὐ- ("not", "not any", "no") and the Greek word χρόνος (chronos, "time"), describing a story that occurs "[in] no time", analogous to a story that occurs in utopia, "[in] no place". Uchronia is also the name of a list of alternate-history books, Uchronia: The Alternate History List.[6]
https://en.wikipedia.org/wiki/Alternate_history
Counterfactual history (also virtual history) is a form of historiography that attempts to answer the What if? questions that arise from counterfactual conditions.[1] As a method of intellectual enquiry, counterfactual history explores history and historical incidents by extrapolating a timeline in which key historical events either did not occur or had an outcome different from the actual historical outcome. Counterfactual history proceeds by "conjecturing on what did not happen, or what might have happened, in order to understand what did happen."[2] It has produced a literary genre which is variously called alternate history, speculative history, allohistory, and hypothetical history.[3][4]
https://en.wikipedia.org/wiki/Counterfactual_history
Speculative fiction is a category of fiction that, in its broadest sense, encompasses the genres that depart from reality,[1] such as in the context of supernatural, futuristic, and other imaginative realms.[2] This umbrella category includes, but is not limited to, science fiction, fantasy, horror, superhero fiction, alternate history, utopian and dystopian fiction, and supernatural fiction, as well as combinations thereof (for example, science fantasy).[3] The term has been used with a variety of meanings for works of literature.[1]
https://en.wikipedia.org/wiki/Speculative_fiction
Pigeonholing is a process that attempts to classify disparate entities into a limited number of categories (usually, mutually exclusive ones).
The term usually carries connotations of criticism, implying that the classification scheme referred to inadequately reflects the entities being sorted, or that it is based on stereotypes.[1]
When considering various classification schemes, one must be aware of the following pitfalls:
- Using categories that are poorly defined (e.g., because they are subjective).
- Entities may be suited to more than one category. Example: rhubarb is both "poisonous" and "edible".
- Entities may not fit into any available category. Example: asking somebody from Washington, D.C. which state they live in.
- Entities may change over time, so they no longer fit the category in which they have been placed. Example: certain species of fish may change from male to female during their life.
- Discretizing properties that would be better viewed as a continuum must be done with caution. Example: when sorting people into "introverted" and "extroverted", one must keep in mind that most people exhibit both traits to some degree.[citation needed]
An example of pigeonholing in everyday conversation occurs when a person making an apolitical or barely political comment is assumed to have a certain political belief, without ascertaining their political stance. Such a designation is especially erroneous when assigned to people who live in places where the left–right dichotomy is not present.[2]
https://en.wikipedia.org/wiki/Pigeonholing
The timestream or time stream is a metaphorical conception of time as a stream, a flowing body of water. In Brave New Words: The Oxford Dictionary of Science Fiction, the term is more narrowly defined as: "the series of all events from past to future, especially when conceived of as one of many such series".[1] The timestream is the normal passage or flow of time and its historical developments within a given dimension of reality. The concept of the time stream, and the ability to travel within and around it, are the fundamentals of a genre of science fiction.
This conception has been widely used in mythology and in fiction.
This analogy is useful in several ways:
- Streams flow only one way. Time moves only forward.
- Streams flow constantly. Time never stops.
- People can stand in a stream, but will be pulled along by it. People exist within time, but move with it.[2]
- Streams can converge and also diverge. Some physicists and science fiction writers have speculated that time likewise branches into alternate universes (see many-worlds interpretation).
Science fiction scholar Andrew Sawyer writes, "The paradoxes of time—do we move in time, or does it move by us? Does it exist or is it merely an illusion of our limited perception?—are puzzles that exercise both physicists and philosophers..."[3]
https://en.wikipedia.org/wiki/Timestream
Thomas Sawyer (/ˈsɔːjər/) is the title character of the Mark Twain novel The Adventures of Tom Sawyer (1876). He appears in three other novels by Twain: Adventures of Huckleberry Finn (1884), Tom Sawyer Abroad (1894), and Tom Sawyer, Detective (1896).
Sawyer also appears in at least three unfinished Twain works: Huck and Tom Among the Indians, Schoolhouse Hill, and Tom Sawyer's Conspiracy. While all three uncompleted works were posthumously published, only Tom Sawyer's Conspiracy has a complete plot, as Twain abandoned the other two after finishing only a few chapters. The stories are set in the 1840s along the Mississippi River.
https://en.wikipedia.org/wiki/Tom_Sawyer
Whitewashing is the act of glossing over or covering up vices, crimes or scandals or exonerating by means of a perfunctory investigation or biased presentation of data with the intention to improve one's reputation.[1]
Etymology
The first known use of the term is from 1591 in England.[1][2] Whitewash is a cheap white paint or coating of chalked lime that was used to quickly give a uniform clean appearance to a wide variety of surfaces, such as the interior of a barn.[citation needed]
https://en.wikipedia.org/wiki/Whitewashing_(censorship)
Censorship is the suppression of speech, public communication, or other information. This may be done on the basis that such material is considered objectionable, harmful, sensitive, or "inconvenient".[2][3][4] Censorship can be conducted by governments,[5] private institutions and other controlling bodies.
Governments[5] and private organizations may engage in censorship. Other groups or institutions may propose and petition for censorship.[6] When an individual such as an author or other creator engages in censorship of their own works or speech, it is referred to as self-censorship. General censorship occurs in a variety of media, including speech, books, music, films and other arts, the press, radio, television, and the Internet. Claimed justifications include national security; controlling obscenity, pornography, and hate speech; protecting children or other vulnerable groups; promoting or restricting political or religious views; and preventing slander and libel.
Direct censorship may or may not be legal, depending on the type, location, and content. Many countries provide strong legal protections against censorship, but none of these protections are absolute, and a claim of necessity to balance conflicting rights is frequently made in order to determine what can and cannot be censored. There are no laws against self-censorship.
https://en.wikipedia.org/wiki/Censorship
Self-censorship is the act of censoring or classifying one's own discourse. This is done out of fear of, or deference to, the sensibilities or preferences (actual or perceived) of others and without overt pressure from any specific party or institution of authority. Self-censorship is often practiced by film producers, film directors, publishers, news anchors, journalists, musicians, and other kinds of authors including individuals who use social media.
Article 19 of the Universal Declaration of Human Rights guarantees freedom of speech from all forms of censorship. Article 19 explicitly states that "everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers."[1]
The practice of self-censorship, like that of censorship itself, has a long history.[2][3][4]
https://en.wikipedia.org/wiki/Self-censorship
Uchronia: The Alternate History List is an online general-interest book database containing a bibliography of alternate history novels, stories, essays and other printed material. It is owned and operated by Robert B. Schmunk. Uchronia was twice selected as the Sci Fi Channel's "Sci Fi Site of the Week."[1][2]
https://en.wikipedia.org/wiki/Uchronia:_The_Alternate_History_List
Historiography is the study of the methods of historians in developing history as an academic discipline, and by extension is any body of historical work on a particular subject. The historiography of a specific topic covers how historians have studied that topic by using particular sources, techniques, and theoretical approaches. Scholars discuss historiography by topic—such as the historiography of the United Kingdom, that of WWII, the pre-Columbian Americas, early Islam, and China—and different approaches and genres, such as political history and social history. Beginning in the nineteenth century, with the development of academic history, there developed a body of historiographic literature. The extent to which historians are influenced by their own groups and loyalties—such as to their nation state—remains a debated question.[1][2]
https://en.wikipedia.org/wiki/Historiography
Counterfactual conditionals (also subjunctive or X-marked) are conditional sentences which discuss what would have been true under different circumstances, e.g. "If Peter believed in ghosts, he would be afraid to be here." Counterfactuals are contrasted with indicatives, which are generally restricted to discussing open possibilities. Counterfactuals are characterized grammatically by their use of fake tense morphology, which some languages use in combination with other kinds of morphology including aspect and mood.
Counterfactuals are one of the most studied phenomena in philosophical logic, formal semantics, and philosophy of language. They were first discussed as a problem for the material conditional analysis of conditionals, which treats them all as trivially true. Starting in the 1960s, philosophers and linguists developed the now-classic possible world approach, in which a counterfactual's truth hinges on its consequent holding at certain possible worlds where its antecedent holds. More recent formal analyses have treated them using tools such as causal models and dynamic semantics. Other research has addressed their metaphysical, psychological, and grammatical underpinnings, while applying some of the resultant insights to fields including history, marketing, and epidemiology.
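The classic possible world approach described above can be stated compactly. As a sketch of one common Lewis-style formulation (notation and details vary across authors, and this assumes a similarity ordering on worlds), a counterfactual is evaluated at the nearest worlds where its antecedent holds:

```latex
% Lewis-style truth conditions for a counterfactual conditional (sketch):
% the counterfactual "if A were the case, C would be the case" is true at
% world w iff C holds at every A-world most similar to w.
w \models A \,\square\!\!\rightarrow C
\quad\text{iff}\quad
C \text{ holds at every } A\text{-world most similar to } w.
```

On this reading, "If Peter believed in ghosts, he would be afraid to be here" is true just in case, at the worlds closest to ours in which Peter believes in ghosts, he is afraid to be here.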
https://en.wikipedia.org/wiki/Counterfactual_conditional
In entertainment, an origin story is an account or backstory revealing how a character or group of characters became a protagonist or antagonist; it adds to the overall interest and complexity of a narrative, often giving reasons for their intentions.
In American comic books, it also refers to how characters gained their superpowers and/or the circumstances under which they became superheroes or supervillains. In order to keep their characters current, comic book companies, as well as cartoon companies, game companies, children's show companies, and toy companies, frequently rewrite the origins of their oldest characters. These revisions range from adding details that do not contradict earlier facts to creating a totally new origin that makes the character seem altogether different.
A pourquoi story, also dubbed an "origin story", is also used in mythology, referring to narratives of how a world began, how creatures and plants came into existence, and why certain things in the cosmos have certain yet distinct qualities.
https://en.wikipedia.org/wiki/Origin_story
A prequel is a literary, dramatic or cinematic work whose story precedes that of a previous work, by focusing on events that occur before the original narrative.[1] A prequel is a work that forms part of a backstory to the preceding work.
The term "prequel" is a 20th-century neologism from the prefix "pre-" (from Latin prae, "before") and "sequel".[2][3]
Like sequels, prequels may or may not concern the same plot as the work from which they are derived. More often they explain the background that led to the events in the original, but sometimes the connections are not completely explicit. Sometimes prequels play on the audience's knowledge of what will happen next, using deliberate references to create dramatic irony.
https://en.wikipedia.org/wiki/Prequel
Category:Plot (narrative)
Articles relating to plot.
Subcategories
This category has the following 3 subcategories, out of 3 total.
- Elements of fiction (21 C)
- Frame stories (3 C, 47 P)
- Interactive narrative (3 C, 11 P)
Pages in category "Plot (narrative)"
The following 58 pages are in this category, out of 58 total. This list may not reflect recent changes.
https://en.wikipedia.org/wiki/Category:Plot_(narrative)