The beasts of battle is a poetic trope in Old English and Old Norse literature, in which the wolf, the raven, and the eagle follow warriors into battle to feast on the bodies of the slain.[1] It occurs in eight Old English poems and in the Old Norse Poetic Edda.
History of the term
The term originates with Francis Peabody Magoun, who first used it in 1955, although the combination of the three animals was first considered a theme by Maurice Bowra, in 1952.[2]
History, content
The beasts of battle presumably date from an earlier, Germanic tradition; the animals are well known for eating carrion. A mythological connection may be presumed as well, though it is clear that at the time that the Old English manuscripts were produced, in a Christianized England, there was no connection between for instance the raven and Huginn and Muninn or the wolf and Geri and Freki. This mythological and/or religious connection survived for much longer in Scandinavia.[3] Their literary pedigree is unknown. John D. Niles points out that they possibly originate in the wolf and the raven as animals sacred to Wōden; their role as eaters of the fallen victims certainly, he says, accords with the fondness of Old English poets for litotes, or deliberate understatement, giving "ironic expression to the horror of warfare as seen from the side of the losers."[4]
While the beasts have no connection to pagan mythology and theology in the Old English poems they inhabit, such a connection returns, oddly enough, in Christian hagiography: in Ælfric of Eynsham's Passio Sancti Edmundi Regis (11th century) a wolf guards the head of Saint Edmund the Martyr, and in John Lydgate's The Life of Saint Alban and Saint Amphibal (15th century), "the wolf and also the eagle, upon the explicit command of Christ, protect the bodies of the martyrs from all the other carrion beasts."[5]
Occurrences in Old English poetry
- Battle of Brunanburh (61–65)
- The Battle of Maldon (106–107)
- Beowulf (3024–27)
- Elene (52–53; 110–113)
- Exodus (162–167)
- The Fight at Finnsburgh (5–7)
- Genesis A (1983–1985)
- Judith (204–212; 292–296)
- The Wanderer (80–83)
References
Notes
- Honegger 290–91.
Bibliography
- Herring, Scott (2008). "A Hawk from a Handsaw: A Note on the Beasts of 'The Battle of Brunanburh'". American Notes and Queries. 21 (1): 9–11. doi:10.3200/anqq.21.1.9-11. S2CID 162360209.
- Honegger, Thomas (1998). "Form and function: The beasts of battle revisited". English Studies. 79 (4): 289–98. doi:10.1080/00138389808599134.
- Magoun, Francis Peabody (1955). "The Theme of the Beasts of Battle in Anglo-Saxon Poetry". Neuphilologische Mitteilungen. 56: 81–90.
- Niles, John D. (2007). "Pagan survivals and popular belief". In Malcolm Godden (ed.). The Cambridge Companion to Old English Literature. Michael Lapidge. Cambridge UP. pp. 126–41. ISBN 978-0-521-37794-2.
https://en.wikipedia.org/wiki/Beasts_of_battle
The drugstore beetle (Stegobium paniceum), also known as the bread beetle, biscuit beetle, and misnamed as the biscuit weevil (despite not being a true weevil), is a tiny, brown beetle that can be found infesting a wide variety of dried plant products, where it is among the most common non-weevils to be found. It is the only living member of the genus Stegobium. It belongs to the family Ptinidae, which also includes the deathwatch beetle and furniture beetle.
The drugstore beetle has a worldwide distribution though it is more common in warmer climates. It is similar in appearance to the cigarette beetle (Lasioderma serricorne), but is slightly larger (adults can be up to 3.5 mm in length). Additionally, drugstore beetles have antennae ending in 3-segmented clubs, while cigarette beetles have serrated antennae (notched like teeth of a saw). The drugstore beetle also has grooves running longitudinally along the elytra, whereas the cigarette beetle is smooth.
Drugstore beetle
Scientific classification
Kingdom: Animalia
Phylum: Arthropoda
Class: Insecta
Order: Coleoptera
Family: Ptinidae
Subfamily: Anobiinae
Tribe: Stegobiini
Genus: Stegobium (Motschulsky, 1860)
Species: S. paniceum
Binomial name: Stegobium paniceum
As its name suggests, the drugstore beetle has a tendency to feed on pharmacological products. This stems from its preference for dried herbs and plant material sometimes used as drugs; for example, drugstore beetles have been known to feed on strychnine, a highly toxic herbal extract. It can also feed on a diverse range of dried foods and spices, as well as hair, leather, books, and museum specimens. The drugstore beetle is also known as the biscuit or bread beetle since it can live on biscuit or bread crumbs.
The oldest known member of the genus is Stegobium raritanensis, from New Jersey amber of Late Cretaceous age (Turonian, c. 94–90 million years ago).[5] Another fossil species, Stegobium defunctus, is known from the Eocene-aged Green River Formation of Wyoming. The oldest records of the beetle as a pest come from the Bronze Age settlement of Akrotiri on Santorini, Greece, around 1500 BC, where it was found associated with stored pulses.[6]
https://en.wikipedia.org/wiki/Drugstore_beetle
Evil, in a general sense, is defined as the opposite or absence of good. It can be an extremely broad concept, although in everyday usage it is often used more narrowly to denote profound wickedness and acts against the common good. It is generally seen as taking multiple possible forms, such as the form of personal moral evil commonly associated with the word, or impersonal natural evil (as in the case of natural disasters or illnesses), and in religious thought, the form of the demonic or supernatural/eternal.[1] While some religions, world views, and philosophies focus on "good versus evil", others deny evil's existence and usefulness in describing people.
Evil can denote profound immorality,[2] but typically not without some basis in the understanding of the human condition, where strife and suffering (cf. Hinduism) are the true roots of evil. In certain religious contexts, evil has been described as a supernatural force.[2] Definitions of evil vary, as does the analysis of its motives.[3] Elements that are commonly associated with personal forms of evil involve unbalanced behavior including anger, revenge, hatred, psychological trauma, expediency, selfishness, ignorance, destruction and neglect.[4]
In some forms of thought, evil is also sometimes perceived as the dualistic antagonistic binary opposite to good,[5] in which good should prevail and evil should be defeated.[6] In cultures with Buddhist spiritual influence, both good and evil are perceived as part of an antagonistic duality that itself must be overcome through achieving Nirvana.[6] The ethical questions regarding good and evil are subsumed into three major areas of study:[7] meta-ethics concerning the nature of good and evil, normative ethics concerning how we ought to behave, and applied ethics concerning particular moral issues. While the term is applied to events and conditions without agency, the forms of evil addressed in this article presume one or more evildoers.
Etymology
The modern English word evil (Old English yfel) and its cognates such as the German Übel and Dutch euvel are widely considered to come from a Proto-Germanic reconstructed form of *ubilaz, comparable to the Hittite huwapp- ultimately from the Proto-Indo-European form *wap- and suffixed zero-grade form *up-elo-. Other later Germanic forms include Middle English evel, ifel, ufel, Old Frisian evel (adjective and noun), Old Saxon ubil, Old High German ubil, and Gothic ubils.[8]
The root meaning of the word is of obscure origin though shown to be akin to modern German übel (noun: Übel, although the noun evil is normally translated as "das Böse") with the basic idea of social or religious transgression.[citation needed]
https://en.wikipedia.org/wiki/Evil
Black-and-white (B&W or B/W) images combine black and white in a continuous spectrum, producing a range of shades of grey.
Media
The history of various visual media began in black and white and, as technology improved, shifted to color. However, there are exceptions to this rule, including black-and-white fine art photography, as well as many motion pictures and art films.
Photography
Contemporary use
Since the late 1960s, few mainstream films have been shot in black-and-white. The reasons are frequently commercial, as it is difficult to sell a film for television broadcasting if the film is not in color. 1961 was the last year in which the majority of Hollywood films were released in black and white.[1]
Computing
In computing terminology, black-and-white is sometimes used to refer to a binary image consisting solely of pure black pixels and pure white ones; what would normally be called a black-and-white image, that is, an image containing shades of gray, is referred to in this context as grayscale.[2]
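The distinction can be illustrated with a short, self-contained Python sketch; the threshold value of 128 and the function name here are illustrative assumptions, not drawn from any particular imaging library:

```python
# A grayscale pixel holds an intensity in 0-255; a binary ("black-and-white"
# in the computing sense) pixel is strictly 0 (black) or 255 (white).

def to_binary(grayscale, threshold=128):
    """Reduce a grayscale image (list of rows of 0-255 ints) to a binary one."""
    return [[255 if px >= threshold else 0 for px in row] for row in grayscale]

grayscale = [
    [0, 64, 128],
    [192, 255, 100],
]
binary = to_binary(grayscale)
print(binary)  # [[0, 0, 255], [255, 255, 0]]
```

Real image-processing tools typically choose the threshold adaptively (e.g. from the image histogram) rather than using a fixed cutoff.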
See also
- dr5 chrome
- List of black-and-white films produced since 1966
- Monochromatic color
- Panchromatic film
- Selective color
References
- Renner, Honey (2011). Fifty Shades of Greyscale: A History of Greyscale Cinema, p. 13. Knob Publishers, Nice.
https://en.wikipedia.org/wiki/Black-and-white
Monochrome photography is photography where each position on an image can record and show a different amount of light, but not a different hue. It includes all forms of black-and-white photography, which produce images containing shades of neutral grey ranging from black to white.[1] Other hues besides grey, such as sepia, cyan, blue, or brown can also be used in monochrome photography.[2] In the contemporary world, monochrome photography is mostly used for artistic purposes and certain technical imaging applications, rather than for visually accurate reproduction of scenes.
Description
Although methods for photographing in color emerged slowly starting in the 1850s, monochrome imagery dominated photography until the mid-twentieth century. From the start, photographic recording processes such as the daguerreotype, the paper negative and the glass collodion negative did not render the color of light (although they were sensitive to some colors more than others). The result was a monochrome image.
Until the 1880s, photographic processes used to print negatives — such as calotype, ambrotype, tintype, salt print and the albumen print — generally produced images with a variety of brown or sepia tones. Later processes moved toward a black-and-white image, although photographers have used toning solutions to convert silver in the image to silver sulphide, imparting a brown or sepia tone. Similarly, selenium toner produces a blue-black or purple image by converting silver into more stable silver selenide.[3] Cyanotypes use iron salts rather than silver salts, producing blue images.[2]
Most modern black-and-white films, called panchromatic films, record the entire visible spectrum.[1]: 157 Some films are orthochromatic, recording visible light wavelengths shorter than 590 nanometers,[1]: 158 in the blue to green range of the spectrum and are less sensitive to the longer wavelength range (i.e. orange-red) of the visible spectrum.[4]
Modern techniques and uses
Black-and-white photography is considered by some to be more subtle and interpretive, and less realistic than color photography.[1]: 5 Monochrome images are not direct renditions of their subjects, but are abstractions from reality, representing colors in shades of grey. In computer terms, this is often called greyscale.[5] Black-and-white photography is considered by some to add a more emotional touch to the subject, compared with the original colored photography.[6]
Monochrome images may be produced in a number of ways. Finding and capturing a scene having only variants of a certain hue, while difficult and uncommon in practice, will result in an image that technically qualifies as a monochrome photo.[7] One can also artificially limit the range of color in a photo to those within a certain hue by using black-and-white film or paper, or by manipulating color images using computer software.
Color images can be converted to black and white on the computer using several methods, including desaturating the existing color RGB image so that no color remains visible (which still allows color channels to be manipulated to alter tones such as darkening a blue sky), or by converting the image to a greyscale version (which eliminates the colors permanently), using software programs like Photoshop.[8] After software conversion to a monochrome image, one or more hues can replace the grey tones to emulate duotones, sepia, selenium or gold toned images or cyanotype, calotype or albumen prints.[2][9]
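The greyscale conversion described above can be sketched without an imaging library. This illustrative Python snippet uses the widely published ITU-R BT.601 luminance weights; the function name and data layout are assumptions made for the example, not any particular program's internals:

```python
# Convert an RGB image (rows of (r, g, b) tuples) to greyscale by luminance
# weighting, so green contributes most to the perceived brightness.

LUMA = (0.299, 0.587, 0.114)  # ITU-R BT.601 weights for R, G, B

def to_greyscale(rgb_image):
    """Replace each (r, g, b) pixel with a single 0-255 grey value."""
    return [
        [round(sum(w * c for w, c in zip(LUMA, px))) for px in row]
        for row in rgb_image
    ]

image = [[(255, 0, 0), (0, 255, 0), (0, 0, 255)]]  # pure red, green, blue
print(to_greyscale(image))  # [[76, 150, 29]]
```

Image editors apply a similar weighted average when converting to greyscale, though the exact coefficients vary between color standards.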
Digital black-and-white cameras
Leica M Monochrom is a digital camera in Leica Camera AG's rangefinder M series, and features a monochrome sensor. The camera was announced in May 2012.
Fujifilm X-Pro1-M is a cheaper option than the Leica M model. It is a digital camera with its color filter array removed so that it captures monochromatic photographs. The camera was released in March 2012.
Phase One IQ3 100MP Achromatic is a digital medium format camera with an ISO rating of up to 51,200. The camera was released in 2017.[10]
Monochromatic modifiers
The use of the following modifiers can add a different aesthetic to images without software manipulation, each serving its own purpose:[11][12]
- Color Filters
- Neutral Density Filters (Gradual or Standard ND Filters)
- Polarizing Filters
- Infrared Filter
Astrophotography applications
Monochrome imaging for astrophotography is a popular technique among amateur astrophotographers. Modern monochrome cameras omit the color Bayer matrix that sits in front of the sensor. This allows specialized narrowband filters to be used, so the entire sensor area can be utilized for specific wavelengths of light emitted by many deep-space objects. Hydrogen-alpha, a commonly used wavelength, is red in color, so in a color camera only the red pixels, approximately 25% of the sensor, will detect this light. In a monochrome camera, the whole sensor can be used to detect this signal. Monochrome photography is also useful in areas of high light pollution.[13]
Image gallery
See also
- List of photographs considered the most important
- Ruh khitch
- Black and white
- Cyanotype
- Ambrotype
- Calotype
References
- Morison, Ian (2017). "The use of narrow band filters such as S II, H-alpha and O III to eliminate light pollution and produce images using the Hubble Palette". Cambridge University Press: 191–198.
https://en.wikipedia.org/wiki/Monochrome_photography
Splitting (also called black-and-white thinking, thinking in extremes or all-or-nothing thinking) is the failure in a person's thinking to bring together the dichotomy of both perceived positive and negative qualities of something into a cohesive, realistic whole. It is a common defense mechanism[1] wherein the individual tends to think in extremes (e.g., an individual's actions and motivations are all good or all bad with no middle ground). This kind of dichotomous interpretation is contrasted by an acknowledgement of certain nuances known as "shades of gray".[2]
Splitting was first described by Ronald Fairbairn in his formulation of object relations theory;[3] it begins as the inability of the infant to combine the fulfilling aspects of the parents (the good object) and their unresponsive aspects (the unsatisfying object) into the same individuals, instead seeing the good and bad as separate. In psychoanalytic theory this functions as a defense mechanism.[4]
https://en.wikipedia.org/wiki/Splitting_(psychology)
The contrast of white and black (light and darkness, day and night) has a long tradition of metaphorical usage, traceable to the Ancient Near East, and explicitly in the Pythagorean Table of Opposites. In Western culture as well as in Confucianism, the contrast symbolizes the moral dichotomy of good and evil.
https://en.wikipedia.org/wiki/Black-and-white_dualism
A false dilemma, also referred to as false dichotomy or false binary, is an informal fallacy based on a premise that erroneously limits what options are available. The source of the fallacy lies not in an invalid form of inference but in a false premise. This premise has the form of a disjunctive claim: it asserts that one among a number of alternatives must be true. This disjunction is problematic because it oversimplifies the choice by excluding viable alternatives, presenting the viewer with only two absolute choices when in fact, there could be many.
For example, a false dilemma is committed when it is claimed that "Stacey spoke out against capitalism; therefore, she must be a communist". One of the options excluded is that Stacey may be neither communist nor capitalist.
False dilemmas often have the form of treating two contraries, which may both be false, as contradictories, of which one is necessarily true. Various inferential schemes are associated with false dilemmas, for example, the constructive dilemma, the destructive dilemma or the disjunctive syllogism. False dilemmas are usually discussed in terms of deductive arguments, but they can also occur as defeasible arguments.
The tendency to commit false dilemmas may stem from the inclination to simplify reality by ordering it through either-or statements, which is to some extent already built into language. This may also be connected to the tendency to insist on clear distinctions while denying the vagueness of many common expressions.
https://en.wikipedia.org/wiki/False_dilemma
Compartmentalization is a form of psychological defense mechanism in which thoughts and feelings that seem to conflict are kept separated or isolated from each other in the mind.[1] Those with post-traumatic stress disorder may use compartmentalization to separate positive and negative self-aspects.[2] It may be a form of mild dissociation; example scenarios that suggest compartmentalization include acting in an isolated moment in a way that logically defies one's own moral code, or dividing one's unpleasant work duties from one's desires to relax.[3] Its purpose is to avoid cognitive dissonance, or the mental discomfort and anxiety caused by a person having conflicting values, cognitions, emotions, beliefs, etc. within themselves.
Compartmentalization allows these conflicting ideas to co-exist by inhibiting direct or explicit acknowledgement and interaction between separate compartmentalized self-states.[4]
https://en.wikipedia.org/wiki/Compartmentalization_(psychology)
Dehumanization is the denial of full humanness in others along with the cruelty and suffering that accompany it.[1][2][3] A practical definition refers to it as the viewing and the treatment of other people as though they lack the mental capacities that are commonly attributed to human beings.[4] In this definition, every act or thought that regards a person as "less than" human is dehumanization.[5]
Dehumanization is one form of incitement to genocide.[6] It has also been used to justify war, judicial and extrajudicial killing, slavery, the confiscation of property, denial of suffrage and other rights, and to attack enemies or political opponents.
https://en.wikipedia.org/wiki/Dehumanization
Ambivalence[1] is a state of having simultaneous conflicting reactions, beliefs, or feelings towards some object.[2][3][4][5] Stated another way, ambivalence is the experience of having an attitude towards someone or something that contains both positively and negatively valenced components.[6] The term also refers to situations where "mixed feelings" of a more general sort are experienced, or where a person experiences uncertainty or indecisiveness.
Although attitudes tend to guide attitude-relevant behavior, those held with ambivalence tend to do so to a lesser extent. The less certain an individual is in their attitude, the more impressionable it becomes, hence making future actions less predictable and/or less decisive.[7] Ambivalent attitudes are also more susceptible to transient information (e.g., mood), which can result in a more malleable evaluation.[7][8] However, since ambivalent people think more about attitude-relevant information, they also tend to be more persuaded by (compelling) attitude-relevant information than less-ambivalent people.[9]
Explicit ambivalence may or may not be experienced as psychologically unpleasant when the positive and negative aspects of a subject are both present in a person's mind at the same time.[10][11] Psychologically uncomfortable ambivalence, also known as cognitive dissonance, can lead to avoidance, procrastination, or to deliberate attempts to resolve the ambivalence.[12] People experience the greatest discomfort from their ambivalence at the time when the situation requires a decision to be made.[13] People are aware of their ambivalence to varying degrees, so the effects of an ambivalent state vary across individuals and situations. For this reason, researchers have considered two forms of ambivalence, only one of which is subjectively experienced as a state of conflict.[4]
https://en.wikipedia.org/wiki/Ambivalence
In social psychology, collective narcissism (or group narcissism) is the tendency to exaggerate the positive image and importance of a group to which one belongs.[1][2] The group may be defined by ideology, race, political beliefs/stance, religion, sexual orientation, social class, language, nationality, employment status, education level, cultural values, or any other ingroup.[1][2] While the classic definition of narcissism focuses on the individual, collective narcissism extends this concept to similar excessively high opinions of a person's social group, and suggests that a group can function as a narcissistic entity.[1]
Collective narcissism is related to ethnocentrism. While ethnocentrism is an assertion of the ingroup's supremacy, collective narcissism is a self-defensive tendency to invest unfulfilled self-entitlement into a belief in an ingroup's uniqueness and greatness. Thus, the ingroup is expected to become a vehicle of actualisation of frustrated self-entitlement.[2] In addition, ethnocentrism primarily focuses on self-centeredness at an ethnic or cultural level, while collective narcissism is extended to any type of ingroup.[1][3]
Collective narcissism is associated with intergroup hostility.[2]
https://en.wikipedia.org/wiki/Collective_narcissism
Psychoanalytic theory posits that an individual unable to integrate difficult feelings mobilizes specific defenses to overcome these feelings, which the individual perceives to be unbearable. The defense that effects (brings about) this process is called splitting. Splitting is the tendency to view events or people as either all bad or all good.[1] When viewing people as all good, the individual is said to be using the defense mechanism idealization: a mental mechanism in which the person attributes exaggeratedly positive qualities to the self or others. When viewing people as all bad, the individual employs devaluation: attributing exaggeratedly negative qualities to the self or others[citation needed].
In child development, idealization and devaluation are quite normal. During the childhood development stage, individuals become capable of perceiving others as complex structures, containing both good and bad components. If the development stage is interrupted (by early childhood trauma, for example), these defense mechanisms may persist into adulthood.
https://en.wikipedia.org/wiki/Idealization_and_devaluation
Secondary narcissism
According to Freud, secondary narcissism occurs when the libido withdraws from objects outside the self, above all the mother, producing a relationship to social reality that includes the potential for megalomania. 'This megalomania has no doubt come into being at the expense of object-libido....This leads us to look upon the narcissism which arises through the drawing on of object-cathexes as a secondary one, superimposed upon a primary narcissism'.[20] For Freud, while both primary and secondary narcissism emerge in normal human development, problems in the transition from one to the other can lead to pathological narcissistic disorders in adulthood.
"This state of secondary narcissism constituted object relations of the narcissistic type", according to Freud, something he went on to explore further in Mourning and Melancholia—considered one of Freud's most profound contributions to object relations theory, and constituting the overall principles of object relations and narcissism as concepts.[21]
https://en.wikipedia.org/wiki/History_of_narcissism#Secondary_narcissism
Karen Horney saw narcissism quite differently from Freud, Kohut and other mainstream psychoanalytic theorists in that she did not posit a primary narcissism but saw the narcissistic personality as the product of a certain kind of early environment acting on a certain kind of temperament. For her, narcissistic needs and tendencies are not inherent in human nature.
Narcissism is different from Horney's other major defensive strategies or solutions in that it is not compensatory. Self-idealization is compensatory in her theory, but it differs from narcissism. All the defensive strategies involve self-idealization, but in the narcissistic solution it tends to be the product of indulgence rather than of deprivation. The narcissist's self-esteem is not strong, however, because it is not based on genuine accomplishments.[22]
https://en.wikipedia.org/wiki/History_of_narcissism#Secondary_narcissism
Objectivism is a philosophical system developed by Russian-American writer and philosopher Ayn Rand. She described it as "the concept of man as a heroic being, with his own happiness as the moral purpose of his life, with productive achievement as his noblest activity, and reason as his only absolute".[1]
https://en.wikipedia.org/wiki/Objectivism
Malignant narcissism is a psychological syndrome comprising an extreme mix of narcissism, antisocial behavior, aggression, and sadism.[1] Grandiose, and always ready to raise hostility levels, the malignant narcissist undermines families and organizations in which they are involved, and dehumanizes the people with whom they associate.[2]
Malignant narcissism is not a diagnostic category, but a subcategory of narcissism. Narcissistic personality disorder (NPD) is found in the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR), while malignant narcissism is not. Malignant narcissism could include aspects of NPD alongside a mix of antisocial, paranoid and sadistic personality disorder traits. The importance of malignant narcissism and of projection as a defense mechanism has been confirmed in paranoia, as well as "the patient's vulnerability to malignant narcissistic regression".[3] A person with malignant narcissism exhibits paranoia in addition to the symptoms of narcissistic personality disorder. Because a malignant narcissist's personality cannot tolerate any criticism, being mocked typically causes paranoia.[4]
https://en.wikipedia.org/wiki/Malignant_narcissism
Narcissistic mortification is "the primitive terror of self dissolution, triggered by the sudden exposure of one's sense of a defective self ... it is death by embarrassment".[1] Narcissistic mortification is a term first used by Sigmund Freud in his last book, Moses and Monotheism,[2] with respect to early injuries to the ego/self. The concept has been widely employed in ego psychology and also contributed to the roots of self psychology.
When narcissistic mortification is experienced for the first time, it may be defined as a sudden loss of control over external or internal reality, or both. This produces strong emotions of terror while at the same time narcissistic libido (also known as ego-libido) or destrudo is built up.[3] Narcissistic libido or ego-libido is the concentration of libido on the self. Destrudo is the opposite of libido and is the impulse to destroy oneself and everything associated with oneself.
https://en.wikipedia.org/wiki/Narcissistic_mortification
In law, an entitlement is a provision made in accordance with a legal framework of a society. Typically, entitlements are based on concepts of principle ("rights") which are themselves based in concepts of social equality or enfranchisement.
In psychology, entitlement mentality is defined as a sense of deservingness or being owed a favor when little or nothing has been done to deserve special treatment.[1]
https://en.wikipedia.org/wiki/Entitlement
Hubris (/ˈhjuːbrɪs/; from Ancient Greek ὕβρις (húbris) 'pride, insolence, outrage'), or less frequently hybris (/ˈhaɪbrɪs/),[1] describes a personality quality of extreme or excessive pride[2] or dangerous overconfidence,[3] often in combination with (or synonymous with) arrogance.[4] The term arrogance comes from the Latin adrogare, meaning "to feel that one has a right to demand certain attitudes and behaviors from other people". To arrogate means "to claim or seize without justification... To make undue claims to having",[5] or "to claim or seize without right... to ascribe or attribute without reason".[6] The term pretension is also associated with the term hubris, but is not synonymous with it.[7][need quotation to verify]
According to studies, hubris, arrogance, and pretension are related to the need for victory (even if it does not always mean winning) instead of reconciliation, which "friendly" groups might promote.[8] Hubris is usually perceived as a characteristic of an individual rather than a group, although the group the offender belongs to may suffer collateral consequences from wrongful acts. Hubris often indicates a loss of contact with reality and an overestimation of one's own competence, accomplishments, or capabilities. The adjectival form of the noun hubris/hybris is hubristic/hybristic.[1]
The term hubris originated in Ancient Greek,[9] where it had several different meanings depending on the context. In legal usage, it meant assault or sexual crimes and theft of public property,[10] and in religious usage it meant transgression against a god.[11]
https://en.wikipedia.org/wiki/Hubris
The Oxford English Dictionary defines "arrogance" in terms of "high or inflated opinion of one's own abilities, importance, etc., that gives rise to presumption or excessive self-confidence, or to a feeling or attitude of being superior to others [...]."[24] Adrian Davies sees arrogance as more generic and less severe than hubris.[25]
https://en.wikipedia.org/wiki/Hubris
Elitism is the belief or notion that individuals who form an elite—a select group of people perceived as having an intrinsic quality, high intellect, wealth, power, notability, special skills, or experience—are more likely to be constructive to society as a whole, and therefore deserve influence or authority greater than that of others.[1] The term elitism may be used to describe a situation in which power is concentrated in the hands of a limited number of people. Beliefs that are in opposition to elitism include egalitarianism, anti-intellectualism, populism, and the political theory of pluralism.
Elite theory is the sociological or political science analysis of elite influence in society: elite theorists regard pluralism as a utopian ideal.
Elitism is closely related to social class and to what sociologists term "social stratification". In modern Western societies, social stratification is typically defined in terms of three distinct social classes: the upper class, the middle class, and the lower class.[2]
Some synonyms for "elite" might be "upper-class" or "aristocratic", indicating that the individual in question has a relatively large degree of control over a society's means of production. This includes those who gain this position due to socioeconomic means and not personal achievement. However, these terms are misleading when discussing elitism as a political theory, because they are often associated with negative "class" connotations and fail to appreciate a more unbiased exploration of the philosophy.[3]
https://en.wikipedia.org/wiki/Elitism
Social stratification refers to a society's categorization of its people into groups based on socioeconomic factors like wealth, income, race, education, ethnicity, gender, occupation, social status, or derived power (social and political). As such, stratification is the relative social position of persons within a social group, category, geographic region, or social unit.[1][2][3]
https://en.wikipedia.org/wiki/Social_stratification
A caste is a fixed social group into which an individual is born within a particular system of social stratification: a caste system. Within such a system, individuals are expected to: marry exclusively within the same caste (endogamy), follow lifestyles often linked to a particular occupation, hold a ritual status observed within a hierarchy, and interact with others based on cultural notions of exclusion, with certain castes considered as either more pure or more polluted than others.[1][2][3] Its paradigmatic ethnographic example is the division of India's Hindu society into rigid social groups, with roots in south Asia's ancient history and persisting to the present time.[1][4] However, the economic significance of the caste system in India has been declining as a result of urbanisation and affirmative action programs. A subject of much scholarship by sociologists and anthropologists, the Hindu caste system is sometimes used as an analogical basis for the study of caste-like social divisions existing outside Hinduism and India. The term "caste" is also applied to morphological groupings in eusocial insects such as ants, bees, and termites.[5]
[Image caption: The Basor weaving bamboo baskets, from a 1916 book. The Basor are a Scheduled Caste found in the state of Uttar Pradesh in India.]
https://en.wikipedia.org/wiki/Caste
Legal anthropology, also known as the anthropology of law, is a sub-discipline of anthropology that takes an interdisciplinary approach and specializes in "the cross-cultural study of social ordering".[1] The questions that legal anthropologists seek to answer include: How is law present in cultures? How does it manifest? How may anthropologists contribute to understandings of law?
https://en.wikipedia.org/wiki/Legal_anthropology
| Author | Henry Maine |
|---|---|
| Country | United Kingdom |
| Language | English |
| Genre | Law, History |
| Publisher | John Murray |
| Publication date | 1861 |
| Media type | |
| Pages | 260 |
| ISBN | 978-1596052260 |
Ancient Law is a book by Henry James Sumner Maine. It was first published in octavo in 1861.[1] The book went through twelve editions during the lifetime of the author.[2] The twelfth edition was published in 1888.[3] A new edition, with notes by Frederick Pollock, was published in octavo in 1906.[4][5][6]
Lectures delivered by Maine for the Inns of Court were the groundwork for Ancient Law. Its object, as stated in the preface, was "to indicate some of the earliest ideas of mankind, as they are reflected in ancient law, and to point out the relation of those ideas to modern thought."[7]
https://en.wikipedia.org/wiki/Ancient_Law
Legal pluralism is the existence of multiple legal systems within one society and/or geographical area. Plural legal systems are particularly prevalent in former colonies, where the law of a former colonial authority may exist alongside more traditional legal systems (customary law). In postcolonial societies a recognition of pluralism may be viewed as a roadblock to nation-building and development. Anthropologists view legal pluralism in the light of historical struggles over sovereignty, nationhood and legitimacy.[1]
https://en.wikipedia.org/wiki/Legal_pluralism
Scholars of the sociology of knowledge note that social and power relations can both be created by the definition of knowledge, and influence how knowledge is created.
Scholars have argued that law provides a set of categories and relations through which to see the social world.[9]: 54 [14]: 8 Individuals themselves (rather than legal professionals) will try to frame their problems in legalistic terms to resolve them.[14]: 130 Boaventura de Sousa Santos argues that these legal categories can distort reality, while Yngvesson argues that the definitions themselves can create power imbalances.[9]: 64
https://en.wikipedia.org/wiki/Legal_anthropology
Regarding law, in anthropology's characteristically self-conscious manner, the comparative analysis inherent to legal anthropology has been scrutinized and most famously debated by Paul Bohannan and Max Gluckman. The discourse highlights one of the primary differences between British and American anthropology regarding fieldwork approaches, and concerns the imposition of Western terminology as ethnological categories for differing societies.[15]
Each author uses the case study approach; however, how the data should be presented to achieve comparability is a point of contention between them.
Paul Bohannan promotes the use of native terminology presented with ethnographic meaning, as opposed to universal categories, which he sees as barriers to understanding the true nature of a culture's legal system.
Advocating that it is better to appreciate native terms in their own medium, Bohannan critiques Gluckman's work for its inherent bias.
Gluckman has argued that Bohannan's excessive use of native terminology creates barriers when attempting comparative analysis. He has suggested in turn that, to further the cross-cultural comparative study of law, we should use English terms and concepts of law, which will aid in the refinement of dispute facts and interrelations.[16] Thus, all native terms should be described and translated into an Anglo-American conceptual equivalent for the purpose of comparison.
https://en.wikipedia.org/wiki/Legal_anthropology
Forensic anthropology is the application of the anatomical science of anthropology and its various subfields, including forensic archaeology and forensic taphonomy,[1] in a legal setting. A forensic anthropologist can assist in the identification of deceased individuals whose remains are decomposed, burned, mutilated or otherwise unrecognizable, as might happen in a plane crash. Forensic anthropologists are also instrumental in the investigation and documentation of genocide and mass graves. Along with forensic pathologists, forensic dentists, and homicide investigators, forensic anthropologists commonly testify in court as expert witnesses. Using physical markers present on a skeleton, a forensic anthropologist can potentially determine a person's age, sex, stature, and race. In addition to identifying physical characteristics of the individual, forensic anthropologists can use skeletal abnormalities to potentially determine cause of death, past trauma such as broken bones or medical procedures, as well as diseases such as bone cancer.
The methods used to identify a person from a skeleton rely on the past contributions of various anthropologists and the study of human skeletal differences. Through the collection of thousands of specimens and the analysis of differences within a population, estimations can be made based on physical characteristics. Through these, a set of remains can potentially be identified. The field of forensic anthropology grew during the twentieth century into a fully recognized forensic specialty involving trained anthropologists as well as numerous research institutions gathering data on decomposition and the effects it can have on the skeleton.
https://en.wikipedia.org/wiki/Forensic_anthropology
A mass grave is a grave containing multiple human corpses, which may or may not be identified prior to burial. The United Nations has defined a criminal mass grave as a burial site containing three or more victims of execution,[1] although an exact definition is not unanimously agreed upon.[2][3][4] Mass graves are usually created after many people die or are killed, and there is a desire to bury the corpses quickly for sanitation concerns. Although mass graves can be used during major conflicts such as war and crime, in modern times they may be used after a famine, epidemic, or natural disaster. In disasters, mass graves are used for infection and disease control. In such cases, there is often a breakdown of the social infrastructure that would enable proper identification and disposal of individual bodies.[5]
https://en.wikipedia.org/wiki/Mass_grave
Genocide is the intentional destruction of a people[a] in whole or in part. Raphael Lemkin coined the term in 1944,[1][2] combining the Greek word γένος (genos, "race, people") with the Latin suffix -caedo ("act of killing").[3]
In 1948, the United Nations Genocide Convention defined genocide as any of five "acts committed with intent to destroy, in whole or in part, a national, ethnical, racial or religious group." These five acts were: killing members of the group, causing them serious bodily or mental harm, imposing living conditions intended to destroy the group, preventing births, and forcibly transferring children out of the group. Victims are targeted because of their real or perceived membership of a group, not randomly.[4][5]
The Political Instability Task Force estimated that 43 genocides occurred between 1956 and 2016, resulting in about 50 million deaths.[6] The UNHCR estimated that a further 50 million had been displaced by such episodes of violence up to 2008.[6] Genocide, especially large-scale genocide, is widely considered to signify the epitome of human evil.[7] As a label, it is contentious because it is moralizing,[8] and has been used as a type of moral category since the late 1990s.[9]
https://en.wikipedia.org/wiki/Genocide
Taphonomy is the study of how organisms decay and become fossilized or preserved in the paleontological record. The term taphonomy (from Greek táphos, τάφος 'burial' and nomos, νόμος 'law') was introduced to paleontology in 1940[1] by Soviet scientist Ivan Efremov to describe the study of the transition of remains, parts, or products of organisms from the biosphere to the lithosphere.[2][3]
The term taphomorph is used to describe fossil structures that represent poorly-preserved, deteriorated remains of a mixture of taxonomic groups, rather than of a single one.
https://en.wikipedia.org/wiki/Taphonomy
Archaeology or archeology[a] is the study of human activity through the recovery and analysis of material culture. The archaeological record consists of artifacts, architecture, biofacts or ecofacts, sites, and cultural landscapes. Archaeology can be considered both a social science and a branch of the humanities.[1][2] It is usually considered an independent academic discipline, but may also be classified as part of anthropology (in North America – the four-field approach), history or geography.[3]
https://en.wikipedia.org/wiki/Archaeology
The determination of an individual's age by anthropologists depends on whether the individual was an adult or a child. The determination of the age of children, under the age of 21, is usually performed by examining the teeth.[37] When teeth are not available, children can be aged based on which growth plates are sealed. The tibia plate seals around age 16 or 17 in girls and around 18 or 19 in boys. The clavicle is the last bone to complete growth, and its plate is sealed around age 25.[38] In addition, if a complete skeleton is available, anthropologists can count the number of bones. While adults have 206 bones, the bones of a child have not yet fused, resulting in a much higher number.
The aging of adult skeletons is not as straightforward as aging a child's skeleton as the skeleton changes little once adulthood is reached.[39] One possible way to estimate the age of an adult skeleton is to look at bone osteons under a microscope. New osteons are constantly formed by bone marrow even after the bones stop growing. Younger adults have fewer and larger osteons while older adults have smaller and more osteon fragments.[38] Another potential method for determining the age of an adult skeleton is to look for arthritis indicators on the bones. Arthritis will cause noticeable rounding of the bones.[40] The degree of rounding from arthritis coupled with the size and number of osteons can help an anthropologist narrow down a potential age range for the individual.
https://en.wikipedia.org/wiki/Forensic_anthropology
The estimation of stature by anthropologists is based on a series of formulas that have been developed over time through the examination of many skeletons from a multitude of different regions and backgrounds. Stature is given as a range of possible values, in centimeters, and is typically computed by measuring the bones of the leg. The three bones that are used are the femur, the tibia, and the fibula.[33] In addition to the leg bones, the bones of the arm (the humerus, ulna, and radius) can be used.[34] The formulas that are used to determine stature rely on various information regarding the individual. Sex, ancestry, and age should be determined before attempting to ascertain height, if possible. This is due to the differences that occur between populations, sexes, and age groups.[35] By knowing all the variables associated with height, a more accurate estimate can be made. For example, a male formula for stature estimation using the femur is 2.32 × femur length + 65.53 ± 3.94 cm. A female of the same ancestry would use the formula 2.47 × femur length + 54.10 ± 3.72 cm.[36] It is also important to note an individual's approximate age when determining stature. This is due to the shrinkage of the skeleton that naturally occurs as a person ages. After age 30, a person loses approximately one centimeter of height every decade.[33]
https://en.wikipedia.org/wiki/Forensic_anthropology
The history of anthropometry includes its use as an early tool of anthropology, use for identification, use for the purposes of understanding human physical variation in paleoanthropology and in various attempts to correlate physical with racial and psychological traits. At various points in history, certain anthropometrics have been cited by advocates of discrimination and eugenics often as part of novel social movements or based upon pseudoscience.
https://en.wikipedia.org/wiki/History_of_anthropometry