Surviving World War III
The explosion of the atom bomb over Nagasaki in 1945 brought a dramatic end to World War II and, in so doing, highlighted the role of science and technology in the victory. Following this, the age-old ritual of beating swords into plowshares turned scientific and technical advances to peaceful ends. As the scope of applied and basic entomology grew under the stimulus of post-World War II goals, the two primary entomological societies in the United States, the American Association of Economic Entomologists (AAEE) and the ESA, recognized two common needs. They needed professional...

In medical and veterinary entomology, the post-World War II experience with the miracle insecticides paralleled the experience with agricultural pests. First, there was euphoria following the miraculous effectiveness of the insecticides. So promising were the prospects that in 1955 the World Health Organization (WHO) proposed global eradication of malaria. However, the development of resistant strains of vectors...
The arrangement involved the agricultural constituency, which lent political support to the agricultural colleges in return for their services. The colleges in turn aided the chemical industry by testing its products and giving them a stamp of approval, which enhanced their marketability. A grateful chemical industry provided grants to the entomology departments, which were always short of operational funds. The deans at the agricultural colleges had the difficult task of acting as broker between the college faculty, with its leaning toward basic research, and the farm constituency seeking low-risk pest control programs. The arrangement was an American innovation that seemed to please everyone. Furthermore, the chemical industry was greatly stimulated by the economic and political activities of World War I. Food and fiber production was given high priority, and new discoveries advanced the pesticide industry. The period from 1880 to 1940 witnessed the maturing of the Agricultural Experiment...
The American Board of Professional Psychology (ABPP) is an independent credentialing organization that certifies psychologists as having met advanced standards for specialty practice in professional psychology. The ABPP (originally called the American Board of Examiners in Professional Psychology) was developed in 1947 because psychological services were needed for veterans following World War II, and it was thought that some means should be established to indicate to the public which psychologists were qualified as practitioners. Psychological licensure by state psychology boards did not become common until the 1950s and 1960s. Although the ABPP Diploma could have been the credential upon which to base standards for state licensure, it became a credential used to certify psychologists at a high level of practice (Pryzwansky, 1998). The ABPP grants diplomas in 13 areas of practice: Behavioral, Business and Consulting, Clinical, Clinical Child, Clinical Health, Clinical Neuropsychology,...
Aerobic fermentation in surface culture. Continuous demands for increased productivity of aerobic industrial fermentations for baker's yeast and organic acid production led to the use of deep tank vessels sparged with large quantities of sterile air. Development of the penicillin process during the Second World War reinvigorated industrial fermentation. However, the postwar period brought a temporary setback due to the growth of the petrochemical industry and the availability of inexpensive chemicals for cheaper production of organic acids and solvents by chemical synthesis. Toward the end of the twentieth century, a better understanding of molecular biology led to the development of efficient fermentation systems for production of biochemicals. Presently, fermentation technology has gained increased credence due to the power of microbe-design biotechnology, the perceived hazards to people and the environment of chemical synthesis, and better economics from use of...
Traditional microbial biotechnology began during the First World War, when the acetone, butanol, and glycerol fermentations were developed (149). Microbial primary metabolites used in the food and feed industries include alcohols (ethanol), amino acids (monosodium glutamate, lysine, threonine, phenylalanine, and tryptophan), flavor nucleotides (5'-guanylic acid, 5'-inosinic acid), organic acids (acetic, propionic, succinic, fumaric, and lactic), polyols (glycerol, mannitol, erythritol, and xylitol), polysaccharides (xanthan and gellan), sugars (fructose, ribose, and sorbose), and vitamins (riboflavin, cyanocobalamin, and biotin). The group of microbially produced secondary metabolites important for health and nutrition includes antibiotics, other medicinals, toxins, biopesticides, and animal and plant growth factors (150). The targets of antibiotics (the best-known group of secondary fermentation metabolites) include DNA replication (actinomycin, bleomycin, and...
Normally, the process of separating the cells from the environmental or clinical matrix is conducted in a laboratory. This step is important, both because major enzymatic inhibitors can be located in the matrix (14) and because sensitivity and specificity are lost if the DNA is isolated directly from the matrix. The sensitivity issue is of particular importance in monitoring or diagnosis of harmful or pathogenic bacteria. Microorganisms may form biofilms that are tightly attached to a surface, and separating the organisms from such a matrix is a critical step. For soil samples, the separation of the microorganisms from the matrix can be a particular problem. The microbial cells may be tightly associated with the soil matrix, as is the case for clay particles, where the microorganisms may be bound to the particles through ionic interaction (15). Most of the methods for sample preparation from soil are thus based on direct lysis approaches (16). Recently, there has been an...
The new definition of randomness has its heritage in information theory, the science, developed mainly since World War II, that studies the transmission of messages. Suppose you have a friend who is visiting a planet in another galaxy, and sending him telegrams is very expensive. He forgot to take along his tables of trigonometric functions and he has asked you to supply them. You could simply translate the numbers into an appropriate code (such as the binary numbers) and transmit them directly, but even the most modest tables of the six functions have a few thousand digits, so the cost would be high. A much cheaper way to convey the
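The idea sketched above, that a highly regular sequence can be conveyed by a short generating program rather than by its digits, can be illustrated with a few lines of Python. This is a hypothetical illustration, not drawn from the source: it builds a sine table as a string of digits, compresses it generically, and compares both sizes with the length of a short program that regenerates the table exactly.

```python
import math
import zlib

# The "expensive" message: sine values to six decimal places,
# one entry per degree -- thousands of characters of digits.
raw_table = "".join(f"{math.sin(math.radians(d)):.6f}" for d in range(360))

# A short program that regenerates exactly the same table.
program = (
    "import math\n"
    "print(''.join(f'{math.sin(math.radians(d)):.6f}' for d in range(360)))"
)

# Generic compression already shortens the regular sequence considerably,
# but the generating program is far shorter still.
compressed = zlib.compress(raw_table.encode())

print(len(raw_table), len(compressed), len(program))
```

The point is the ordering of the three lengths: the raw digits are longest, a generic compressor does better, and the description "the program that computes the table" is shortest of all, which is the intuition behind defining randomness as incompressibility.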
Broca's and Wernicke's seminal studies led to the golden age of the study of brain-behavior relationships called neuropsychology. This golden age lasted until the First World War, and then there was a shift in position to the mass action or nonlocalization hypothesis. The reason for the decline of the localizationist approach is not fully known, but there were probably two major factors. The first was a change in the political-philosophical Zeitgeist. Most of the early localizationist work was done on the European continent, primarily in France and Germany. After the First World War these continental European powers lost much of their power and their influence on Western thought, but the English-speaking countries such as the United States and the United Kingdom flourished. The Anglo-American social and political systems were strongly influenced by the philosophic writings of John Locke, who proposed that the brain was like a tabula rasa or a blank wax tablet. Unlike the modularity...
Dengue fever is a very old disease; the earliest record of a dengue-like illness found to date is in a Chinese encyclopedia of disease symptoms and remedies, first published during the Chin Dynasty (265-420 AD) and formally edited in 610 AD (Tang Dynasty) and again in 992 during the Northern Sung Dynasty. There are reports of epidemics of dengue-like illnesses in the French West Indies in 1635 and in Panama in 1699. By the late 1700s, the disease had a worldwide distribution in the tropics, with epidemics of a clinically compatible disease occurring in 1779 in Batavia (Jakarta), Indonesia, and Cairo, Egypt, and in 1780 in Philadelphia, Pennsylvania, USA. From the late 1700s to World War II, repeated epidemics of dengue-like illness occurred in most tropical and subtropical regions of the world at 10- to 30-year intervals. There is no documentation, however, that dengue viruses were responsible for all of these epidemics because diagnosis was based only on clinical reports. Clinical...
International organization is the key to many of these elements. There would be value in designing a single overarching global organization to deal with the emerging and re-emerging viral disease problems of humans and the similar problems of interest in animal agriculture (the many viral diseases of livestock, poultry, fish and shellfish), crop agriculture (the many viral diseases of commercially grown food and fiber plants) and national and international security and law enforcement agencies (the viral diseases involved in biological warfare threats from rogue governments and bioterrorism from the same sources as well as amateur groups).
...human rights principles into the implementation of the regime. The classical regime originally developed before the post-World War II human rights revolution in international law and international relations. Revisions of the classical regime in the form of the IHR in 1951 and 1969 did not, however, link international control of infectious disease with the growing body of international human rights law. The IHR 2005 achieve this linkage. The IHR 2005 state that "the implementation of these Regulations shall be with full respect for the dignity, human rights and fundamental freedoms of persons" (Article 3.1), and the regulations contain provisions that apply human rights principles to actions states parties might take.
Extensive research on World War II and Korean War casualties (32) has enabled investigators to develop methods of estimating stature based on measurements of long bones. The length of the femur is the most reliable basis for calculating stature (28). The tibia is also useful, but there has been some controversy over the accuracy of tibial measurements, particularly the most appropriate location of the more distal measuring point. Apparently, the plafond of the tibia is the preferred site of measurement rather than the tip of the medial malleolus (33). The tables and equations furnished to estimate stature from long-bone measurements are based on direct measurements of defleshed or skeletonized specimens (Table 2). However, measurements from radiographs can be
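As an illustration of how such long-bone regressions are applied, one widely cited equation of this type (attributed to the Trotter-Gleser work on the World War II and Korean War series; the exact coefficients vary by sex and ancestry group, so treat the numbers here as an assumed example rather than the definitive formula) can be written as a small function:

```python
def estimate_stature_from_femur(femur_length_cm: float) -> float:
    """Estimate living stature (cm) from maximum femur length (cm).

    Coefficients follow one commonly cited Trotter-Gleser equation
    (American white males); the standard error is roughly +/- 3.3 cm,
    and different equations apply to other sex and ancestry groups.
    """
    return 2.38 * femur_length_cm + 61.41

# A 45 cm femur suggests a stature of about 168.5 cm.
print(round(estimate_stature_from_femur(45.0), 1))
```

In practice, an investigator would select the equation matching the estimated sex and ancestry of the remains and report the stature as an interval (estimate plus or minus the standard error), not a single value.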
Health insurance developed in the pre- and post-World War II period well before medical research began generating a continuing stream of new medical interventions. In recent years, insurers' response to new treatments has become a continuing challenge. Medical necessity provisions not only have anchored that response but also have revealed major problems with that reliance. Bergthold (1995) described medical necessity as rarely defined, largely unexamined, generally misunderstood, and idiosyncratically applied in medical and insurance practice (p. 181).
The pale, cold skin characteristic of hypovolemic shock reflects a rise in cutaneous vascular resistance that appears to help support arterial blood pressure. During World War I, it was noticed that men rescued quickly and warmed in blankets (producing cutaneous vasodilation)
Expansion rates for saprotrophic basidiomycetes with non-unit-restricted growth are on the order of 0.3-1.5 m per year (Hansen and Hamelin, 1999). To get a reliable measure of the expansion rate, an independent measure of age is needed. This is hard to achieve from markers within the mycelium and has to be sought from known historical events. Normally the maximum age can be calculated, but there are also a number of cases where minimum age has been inferred from forestry operations or the construction of roads, etc. For example, genets of Armillaria were estimated to be older than a road that crossed through their distribution (Kile, 1983; Lygis et al., 2005), and the age of H. annosum was dated from the onset of thinning operations in previously untouched stands (Bendz-Hellgren et al., 1999). Another possibility may be to use the spike in 14C from nuclear weapons tests to date biological material to certain years (Levin and Kromer, 2004). This requires that mycelial structures are built during...
In Germany, on the eve of World War II, Ruska and his colleagues had produced electron microscopic photographs of the particles of TMV, bacteriophages and poxviruses, noting the extraordinary 'sperm-shaped' particles of bacteriophage. Technical improvements in instrumentation after the war, and the introduction of negative staining for studying the structure of TMV and other viruses by the Cambridge scientists Huxley in 1957 and Brenner and Horne in 1959, resulted in photographs of the particles of virtually every known kind of virus. These demonstrated the variety of their size, shape and structure, and the presence of common features such as the helical particles of many plant viruses and the icosahedral symmetry of many other viruses.
At the same time that they provide beneficial genetic counseling to patients and their families, professionals providing such a service must have a full understanding of the dangers of eugenics. The abuse of genetic information has led to many atrocities in the past. In Germany, the Nazis murdered nearly 7 million people deemed genetically "defective" during World War II and forcibly sterilized nearly half a million others, all in the name of eugenics, a policy that calls for the systematic elimination of "unfit" members of the population. The United States also has a checkered past with respect to eugenics. In the early twentieth century, the United States passed laws allowing sterilization of the mentally handicapped and limiting immigration by ethnic groups deemed genetically inferior.
It is essential, nevertheless, to recognize that deep ambivalence exists within medicine, including oncology, toward randomized clinical trials. Harry Marks (1997) provided a historical account of the struggle to establish the importance of randomized clinical trials in medicine after World War II. Barron Lerner (2001)
The Nuremberg trials and subsequent Nuremberg Code on Medical Intervention and Experimentation focused professional and public attention on the issue of informed consent. The trials revealed the horrific and inhumane practices of many health care professionals during World War II under the guise of treatment and research
This virus was discovered in Europe when corn hybrids were introduced from the US after World War II. In 1949, in Italy, a severe outbreak threatened maize cultivation, lowering yields by 40%. Later, MRDV was reported in several European countries, where it had the potential to be economically damaging. In China, a similar disease was found to be caused by rice black-streaked dwarf virus (RBSDV), not by MRDV. In young field-grown corn, symptoms caused by MRDV are dark green color of leaves, stunting, and irregular swellings of veins (enations) along the lower surfaces of leaves and sometimes also of leaf sheaths, ligules, and husks. The enations are rough to the touch, hence the disease name. Plants are stunted with increased girth, giving the plant a 'leek' aspect (Figure 1(b)). Short chlorotic streaks develop on mature leaves and coalesce into yellowish green stripes parallel to the veins. Later the leaves can turn reddish. Tassels are sterile. The root system is reduced, roots...
One of the greatest successes of vaccination has been with smallpox. This once devastating disease killed millions in the Old World and scarred countless others. It was then brought to the New World, where it killed millions of Native Americans, in some cases wiping out entire cultures. In fact, the infection was spread intentionally to some tribes by European invaders who distributed infected blankets (an early example of biological warfare). Vaccination with material from the pustules (pox) of victims was practiced in Asia for at least several centuries, but sometimes led to serious infection. In 1798, Edward Jenner reported on his experiments and observations in England involving cowpox, a related but mild disease of cows and milkmaids. He developed a vaccine based on this virus that provided immunity to smallpox. Vaccination was so successful that by 1966, the World Health Organization undertook a program to eradicate smallpox worldwide. In part because humans are the only known...
At the heart of his theory of cognitive development was the notion that the child passed through a set of ordered, qualitatively different stages. Intellectually, the child was not seen as a young adult, but rather, as one employing very different cognitive structures and processes. The stage theory was first expressed in a series of lectures presented to French scholars during the Second World War (Brainerd, 2003, p. 257) and subsequently in a series of publications, most notably, The Psychology of Intelligence, published in 1950.
An indefatigable animal advocate and campaigner whose activism dominated the British scene during the first half of the 20th century, Emilia Augusta Louise Lind-af-Hageby (1878-1963) stood at the center of one of the most contentious episodes in the history of antivivisectionism, the Brown Dog Incident. In 1901, Lind-af-Hageby and her friend Leisa Schartau enrolled at the London School of Medicine for Women to seek medical degrees in order to fight vivisection. The two recorded their experiences in diaries and later exposed the fact that a brown terrier dog had, in contravention of the Cruelty to Animals Act, been vivisected, revived, and used in another procedure. Lind-af-Hageby and her codefendant Stephen Coleridge lost the court case stemming from the publication of her work The Shambles of Science, but their efforts galvanized a coalition of antivivisectionists, trade unionists, and suffragettes who confronted medical students in the streets of Battersea, where a statue...
Crick was born in Northampton, England, in 1916. He studied physics at University College in London until the outbreak of the Second World War. He then joined the British Admiralty Research Laboratory, where he contributed to the development of radar for tracking enemy planes, and magnetic mines used in naval warfare.
The original impetus for the Human Genome Project came almost a decade earlier, however, from the U.S. Department of Energy (DOE) shortly after World War II. The atomic bombs that were dropped on Hiroshima and Nagasaki, Japan, left many survivors who had been exposed to high levels of radiation. The survivors of the bomb were stigmatized in Japan. They were considered poor marriage prospects because of the potential for carrying mutations, and the rest of Japanese society often ostracized them. In 1946 the famous geneticist and Nobel laureate Hermann J. Muller wrote in the New York Times that if they could foresee the results of mutations among their descendants 1,000 years from now . . . , they might consider themselves more fortunate if the bomb had killed them.
One technology that is only 'emerging' because of its limited application in the market is that of irradiation. Treatment of foods with ionizing radiation has been researched for decades. A report by the World Health Organization concluded that food irradiated to any dose to achieve the intended technological objective is safe to consume and nutritionally adequate (WHO, 1999). However, consumers associate the process with the negative effects of radiation on humans resulting from atomic bombs and the fear of nuclear war and accidents at nuclear power facilities such as those at Chernobyl and Three Mile Island. Activists have viewed the process as a way to mask contamination, and they claim it destroys nutrients and creates harmful chemicals. Consumer misconceptions about the
Hantavirus infections are not new to humankind. The first description of a hemorrhagic fever with renal syndrome (HFRS)-like disease can be found in a Chinese medical account written in about AD 960, and the earliest definite description of HFRS comes from Far East Russian clinical records dating back to 1913. During World Wars I and II, HFRS became an important military problem; for example, 'field nephritis' in Flanders during World War I may well have been caused by a hantavirus. In Manchuria in the mid-1930s, 12 000 Japanese soldiers caught the disease, and military researchers were investigating its cause, sometimes using prisoners of war in infection experiments. Finnish and German soldiers encountered an HFRS-like epidemic in Finnish Lapland in 1943-44. During the Korean conflict in 1950-53, the disease again gained much attention when about 3000 United Nations troops contracted it - since then known as Korean hemorrhagic fever - with a 5-10% case-fatality rate. The...
20th century, job-focused vocational education became a popular target for federal dollars (Smith-Hughes Act, 1917; George-Barden Act, 1947), providing courses in agriculture, industry, and home economics to high school students across the nation. Federal spending expanded further following World War II, when approximately eight million veterans attended college under the GI Bill of Rights. Officially known as the Servicemen's Readjustment Act of 1944, this legislation provided, among other benefits, books, tuition, educational supplies, and counseling to veterans wishing to continue their education after returning from military service (Schugurensky; U.S. Department of Education).
The beginning student often erroneously believes that the heart is the sole determinant of cardiac output. In Chapter 14, we learned that the peripheral circulation also plays an important governing role in the circulation. The peripheral circulation controls the filling pressure of the heart as well as its afterload, two major determinants of the stroke volume. This interaction between the heart and the periphery is vividly illustrated by the shock syndrome. In its simplest form, circulatory shock can occur whenever the cardiac output is inadequate to meet the needs of the periphery. If this continues, the peripheral tissues will incur ischemic injury that will trigger a vicious cycle of events leading to circulatory collapse; this sequence of events is outlined in Fig. 4. Although war has been the scourge of mankind over its entire recorded existence, one benefit of war has been to provide physicians with large numbers of patients in hemorrhagic and traumatic shock to study. As a...
After World War II, Carl Wiggers simulated hemorrhagic shock in dogs by connecting a large bottle to the femoral artery of an anesthetized dog with a length of flexible tubing. If the bottle was suspended 136 cm above the heart, the dog's blood pressure (100 mm Hg) would be just enough to push blood up the tubing to the reservoir, but little would enter. If the reservoir was lowered to 54 cm above the heart, blood quickly flowed out of the dog and into the reservoir. This continued until enough blood had been shed to lower the mean arterial pressure to 40 mm Hg. It was noted that after several hours of this hypotension the blood would begin to flow out of the bottle and back into the dog. When all the blood had returned, the arterial pressure then began to decline and the dog quickly died. This progression of injury in which the animal required more and more volume just to maintain a pressure of 40 mm Hg was referred to as decompensation. The decompensation phase is due to injury to...
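The reservoir heights in Wiggers' preparation follow directly from hydrostatics: 1 mmHg corresponds to roughly 1.36 cm of water-equivalent fluid column, since mercury is about 13.6 times denser than water. A quick check of the numbers in the passage can be sketched in Python (treating blood as having approximately the density of water, which is the assumption the round figures imply):

```python
CM_FLUID_PER_MMHG = 1.36  # 1 mmHg ~ 1.36 cm of water-equivalent column

def column_height_to_mmhg(height_cm: float) -> float:
    """Convert a fluid-column height (cm) to the pressure it balances, in mmHg."""
    return height_cm / CM_FLUID_PER_MMHG

# 136 cm above the heart balances ~100 mmHg arterial pressure;
# a reservoir at 54 cm lets blood escape until MAP falls to ~40 mmHg.
print(round(column_height_to_mmhg(136)))  # 100
print(round(column_height_to_mmhg(54)))   # 40
```

Blood is in fact slightly denser than water (about 1.05 g/mL), so the exact equivalent heights would be a few centimeters lower, but the round numbers in the original description match the water-column approximation.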
The development of a microbial means of producing acetone was vital to the Allied effort in the First World War. Acetone was a crucial precursor in explosives manufacture, and the demands of war soon outstripped supply by traditional methods. The problem was solved when Chaim Weizmann isolated a strain of Clostridium acetobutylicum that could ferment molasses to acetone and butanol (another industrially useful product). Nowadays, acetone is made more cheaply from petrochemicals.
The development in the 1830s of the Wardian case, a simple to extravagantly elaborate terrarium-like structure, offered plants protection from the suffocating pollution of Victorian England and gave ferns a new and elevated status. Thus began the collecting epidemic known as the Victorian Fern Craze. In short, ferns were trendy and in high demand. Varieties with slight or major irregularities were especially prized, collected, and either proudly displayed or sold for a tidy sum. Thus Polystichum setiferum, the common native, at one time yielded some 366 of these varieties. I will not go into them all here. Some have persisted through time (although, regrettably, a private collection of prizes went unrecognized and was torched at the end of World War II to make way for a vegetable garden). Cultivars have an ardent group of fans, especially in Britain, and as this goes to press the British Pteridological Society has just published an excellent and helpful compilation, Polystichum...
Seaweeds have been commercially exploited for hydrocolloids since 1658, when the gelling properties of agar, extracted with hot water from red seaweed, were first discovered in Japan. Various red and brown seaweeds are used to produce three hydrocolloids, namely agar, alginate, and carrageenan (Table 19.10). A hydrocolloid is a noncrystalline substance with very large molecules which dissolves in water to give a thickened (viscous) solution. Industrial uses of seaweed extracts expanded rapidly after the Second World War but were subsequently limited by the availability of raw materials. Research into seaweed life cycles has led to the development of cultivation industries that now supply a high proportion of the raw material for some hydrocolloids. Today, approximately 1 million tons of wet seaweed are harvested and extracted to produce these three hydrocolloids. Total hydrocolloid production is about 55,000 tons, with a value of 585 million (181).
In 1991, the archives of the Soviet Union and its satellites began to open and to inform us in the West about events that we had perceived only dimly. This history was meticulously stored in endless files, and as these came under scrutiny, we came to realize that truth does not just spring forth. It requires sifting, close reading, and interpretation by linguists and historians. Was so-and-so a spy? What actually happened during the Cuban Missile Crisis? How extensive was the Soviet biological warfare effort? Slowly, new perceptions formed and we came to understand events differently or in greater detail. You may think that this is a strange way to introduce a chapter on an ameba, but we will see how far this metaphor, the opening of an archive, carries us. The opening of an archive and the sequencing of a genome are similar in the sense that at one moment you do not know something and then, within a very short time, you do. They are parallel in that dramas of the past, some completely...
Worldwide, involuntary starvation is the commonest cause of reduced reproductive ability, resulting in delayed pubertal growth and menarche in adolescents (51) and infertility in adults. Acute malnutrition, as seen in famine conditions and during and after the Second World War, has profound effects on fertility and fecundity (48). Ovulatory function usually returns quickly on restoration of adequate nutrition. Chronic malnutrition, common in developing countries, has less profound effects on fertility but is associated with small and premature babies.
OD is a relatively young field, with some of the earliest efforts not emerging until after World War II. In his seminal article, "Toward a General Theory for the Behavioral Sciences," Miller (1955) proposed a systems theory as a way to understand the interconnectedness of all living things. In the years since Miller first advanced the idea, systems theory has come to serve as a cornerstone for the field of organization development. From this perspective, an organization is viewed as being composed of parts that are organized in a purposeful way in order to achieve its goals. There is thought to be reciprocal influence among the parts on each other, as well as on the organization as a whole, and vice versa (i.e., the organization
Prior to the 1940s, acetabular fractures were relatively uncommon injuries. World War II, however, brought an increase in the number of these injuries. Young servicemen traveling at relatively high speeds in military jeeps and other vehicles accounted for relatively large numbers of these types of injuries. Two reports are particularly notable from this era. Armstrong et al. reported on the experience of the Royal Air Force (12). The classification used in their report consisted of four types of injuries
As noted in Table 18.1, the benefits of ionizing radiation have been known since 1905. In addition to its other potential applications, irradiation can be used to eliminate pests such as the screw worm fly, which preys on cattle, the Mediterranean fruit fly, and the tsetse fly, through the release of sterile insects. Worries about nuclear weapons, combined with an antiprogress ideology, began to hinder food irradiation research after the war. Although there was at that time an adequate supply of gamma rays, the high-energy, short-wavelength rays given off by radionuclides, the antitechnology faction convinced the Congress to control the development of nuclear technology for treating foods.
[Table: injury data by conflict - World War II, Korean War, Vietnam War (1960s), Northern Ireland (1970s), Balkan War (1990s), Gulf War (1990s), Israel (terrorist-related)]

Blast injury causes injuries to the torso in 38% of cases; only one-third of them are isolated, whereas the others are abdominal injuries combined with head, chest, or extremity injuries (Peleg et al. 2003). Gas-containing organs are the most vulnerable to primary blast effect, though injuries to solid organs such as the kidneys are also encountered as a result of acceleration and deceleration forces. At exploration, this injury usually takes the form of hemorrhage beneath the visceral peritoneum that extends into the mesentery, possibly associated with perforation of the bowel or rupture, infarction, ischemia, or hemorrhage of solid organs, including the genitourinary system (Centers for Disease Control 2006; DePalma et al. 2005; Stein and Hirshberg 1999). During warfare, the proportion of abdominal injuries with involvement of the kidneys and ureters is...
...of its activity in wounds would await the work of Carrel and Dakin. The conditions of trench warfare during the First World War resulted in large numbers of casualties with wounds contaminated by soil and human and animal excrement. These conditions led to a high incidence of wound infection and gangrene (4). Existing antimicrobial compounds such as phenol, mercuric chloride and tincture of iodine proved to be unsuitable for antiseptic treatment of large traumatic wounds. These compounds could not be used in the volume necessary to debride and disinfect the wounds without producing toxic or highly irritating effects (5). To combat the high mortality that resulted from the wound infections of war, Nobel Laureate Dr. Alexis Carrel enlisted the aid of a noted chemist, Henry Dakin, to formulate a non-irritating solution that had significant antiseptic effect (2). Dakin examined over 200 substances in his search for a solution that met Carrel's requirements (6). Among the substances examined...
Before 1947, few synthetic pesticides were used in crops. Most available materials were stomach poisons based on heavy metals such as lead and arsenic, which kill only if eaten. Some botanical extracts, such as rotenone and pyrethrum, both of which quickly degrade in the environment, were also used. After World War II, a business revolution occurred when it became recognized that a variety of compounds that could be artificially synthesized in laboratories were highly effective in killing insects by mere physical contact. Beginning with DDT in 1947, many types of chemicals were marketed to kill insects. One of the undesirable consequences of this change in farming practice was the mass destruction of beneficial insects in crops, resulting in a substantial decrease in natural control. Indeed, insecticides often killed natural enemies more efficiently than they killed the target pest. This unintended consequence was due to the smaller body size, greater relative surface area, and lower...
How did the nutrient recommendations originate? Concerned with the need to provide proper nutrition for newly drafted World War II soldiers, many of whom were undernourished, the Department of Defense commissioned the first set of nutrient recommendations (called the Recommended Dietary Allowances) in 1941. Since then, nutrient recommendations
In conclusion, MHC developed after World War II in Israel as a pragmatic response to a difficult set of mental health care circumstances. It is a conceptually rich model of consultation that seems particularly valuable when consultee and organizational-setting issues are central to understanding problems within the context of consultation.
This brief history traces the interactions of humans and insects dating from the adoption of agriculture and its inherent ecological disruptions. Humankind's early preoccupation with survival focused on insects as relentless pests, competitors for food and fiber, threats to health and comfort. The high hopes following World War II for relief from the bondage of insects through the use of chemical insecticides such as DDT proved unrealistic. The reassessment that followed
After World War II, the chemical industry began the rapid development and marketing of chemicals to control pest insects by poisoning them. Pesticides became very popular and were used on a large scale in the second half of the twentieth century, such that the frequent application of insect-killing poisons to crops became routine. Widespread pesticide use led to a substantial reduction in the level of natural control provided by predators and parasitoids of pest insects, necessitating the further use of pesticides to suppress pest insect populations. However, many pests became resistant to one or more pesticides. This resistance sparked an interest in restoring natural control by reducing the use of insecticides in crops and making their use less damaging to natural enemies by manipulating their timing, placement, or formulation. The effort to restore natural control while making judicious use of pesticides formed the basis of the integrated pest management (IPM) movement in the late...
Indeed, over the past millennia, microbial disease has proven to be a formidable adversary, one that has the potential to decimate the human population if left unchecked. During the Middle Ages and extending into the nineteenth century, diseases such as bubonic plague, cholera, and typhoid swept through Europe, causing massive mortality. The influenza pandemic at the end of World War I, for example, killed more people than the war itself.
To the Americas by the Spaniards and cultivated in Mexico, the West Indies, some Central American countries, and the Florida Keys. The lime became popular in the West as a preventive and treatment for scurvy among British sailors. For the same reason, its popularity rose further in the United States during the California Gold Rush of 1849 and the construction of the transcontinental railroad. Four decades later, lime production ceased after a damaging freeze in the 1890s but underwent a resurgence after World War I.
A human encephalitis syndrome was recognized in the far eastern provinces of the former Soviet Union at least as far back as the late nineteenth century, and has been known under a variety of names including Russian spring-summer encephalitis and Far Eastern tick-borne encephalitis. The causative agent, now named tick-borne encephalitis virus-Far Eastern subtype (TBEV-FE), was isolated from human patients in 1937 and later from its Ixodes persulcatus tick vectors. The virus occurs in the Primorsky, Khabarovsk, Krasnoyarsk, Altai, Tomsk, Omsk, Kemerovo, Western Siberia, Ural and Priural regions of the Russian Federation, China, and eastern Europe. After World War II the existence of a similar disease was recognized in several countries in Central Europe and adjacent parts of the Soviet Union, and a second virus, tick-borne encephalitis virus-European subtype (TBEV-Eu), was isolated from humans and Ix. ricinus ticks. Slight antigenic differences were shown to occur between the two...
Nuclear Weapons Fallout Above-ground nuclear weapons testing in Nevada between 1951 and 1963 released radioactive particles into the atmosphere that exposed thousands of individuals across the continental United States to radioiodine fallout. Exposure was highest for children who drank milk from a backyard cow or goat that had ingested grass contaminated with radioiodine. A study from the National Cancer Institute (34) concluded that nuclear weapons fallout had resulted in an average cumulative thyroid dose of 0.02 Gy (2 rad) in Americans, but for those under age 20 it was 0.1 Gy (10 rad), within the range known to cause thyroid cancer in children (30). The National Academy of Sciences (35) reported that about 50,000 excess cases of thyroid cancer would result from these exposures, but these are highly uncertain estimates. Another study established an association between thyroid cancer and radioiodine fallout among children exposed at less than 1 yr of age and those born between 1950...
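The dose figures above mix SI and older units; the relation between them is fixed by definition (1 Gy = 100 rad). A minimal sketch of the conversion, using only that definition and the two doses cited in the text:

```python
# Conversion between gray (Gy) and rad, the two absorbed-dose units
# quoted in the fallout estimates above. By definition, 1 Gy = 100 rad.
RAD_PER_GY = 100.0

def gy_to_rad(gy: float) -> float:
    """Convert an absorbed dose from gray to rad."""
    return gy * RAD_PER_GY

def rad_to_gy(rad: float) -> float:
    """Convert an absorbed dose from rad to gray."""
    return rad / RAD_PER_GY

# The two thyroid doses cited in the text:
print(gy_to_rad(0.02))  # average cumulative thyroid dose, in rad
print(gy_to_rad(0.1))   # dose for those under age 20, in rad
```

This confirms the pairings given in the passage: 0.02 Gy corresponds to 2 rad, and 0.1 Gy to 10 rad.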
Enteritis necroticans (also known as necrotizing enteritis, Darmbrand, or pigbel) caused by C. perfringens type C isolates is a rare but potentially lethal enteric disease of humans. This illness was first recognized after World War II in Germany, where it was called Darmbrand, and later in Papua New Guinea. 20 The β-toxin produced by type C isolates is considered the primary virulence factor, based on the relative efficacy of a β-toxoid vaccine. 20 Although the incidence of enteritis necroticans is low, risk factors include reduced intestinal motility and/or low intestinal trypsin levels (the β-toxin is trypsin-sensitive) because of preexisting conditions created by a protein-poor diet, helminthic coinfection, and/or pancreatic disease. 20 The offending type C isolates can be introduced exogenously by ingestion of undercooked meat products (commonly pork).
One argument would suggest that a very early stage is involved. I am thinking of the delay of a generation or so between the increase in smoking in men around the First World War and the rise in lung cancer mortality rates that was so marked 20 or 30 years later; and, similarly, the increase in cigarette smoking among women about the time of the Second World War and the rise in lung cancer rates for females that has become so noticeable in the last few years. This long delay is what one would expect if a very early part of the process were involved rather than a very recent one.
Previously, most publications put so little emphasis on science that they did not need a specialized reporter. More and more, though, news organizations regard scientific results as necessary information, part of the everyday reporting of news. After all, science and technology have radically altered the way we live. For example, consider the development of antibiotics and vaccines, nuclear weapons, computer technologies, lasers, and fiber optics. Advances in science and technology will undoubtedly continue to occur in ways that we cannot fully predict. The Human Genome Project, with all its promise and ethical unknowns, illustrates this perfectly.
Cancer chemotherapy originated with nitrogen mustard, a derivative of the poison gas yperite, developed during World War II. The pharmacological action of nitrogen mustard consists of cytotoxic effects on the organism (e.g., leukopenia, diarrhea, and stomatitis), and attempts were made to harness these toxicities to obtain anticancer activity. That is, the modality amounted to cancer therapy that exploited the toxicities inherent to nitrogen mustard. From the standpoint of establishing cancer chemotherapy ideally based on the premise that only the tumor should be attacked, with the least damage to the organism, we cannot but regard the approach as the tail wagging the dog (misdirected rescuing). The concept of high-dose chemotherapy, i.e., that an anticancer agent fails to be effective unless it provokes considerable adverse reactions, still persists now that half a century has elapsed since the introduction of nitrogen mustard.
Other risks exist in the uses of biotechnology. From the late nineteenth century until World War II, a school of thought called eugenics suggested that the methods of genetics should be turned to improving the human gene pool. This idea led to forced sterilization, first of various criminal populations and eventually of alcoholics and epileptics. Eugenic policies were also used to restrict immigration of certain Asian and European populations that were termed genetically inferior. Eugenics had its ultimate expression when it provided the scientific basis for the racial policies of the Nazis before and during World War II. Where the capability exists, so will the temptation. Will parents seek to amplify the gene for human growth hormone in their offspring so that their children could become heftier football linemen or taller basketball players? The ability to select the gender of one's offspring by amniocentesis and abortion is already causing problems in some cultures.
Even after Harlow's dramatic description of the effects of frontal lobe injury, not much research was performed on the functions of the frontal lobes until 1934, when Kleist had the opportunity to examine many of the soldiers who had injured their frontal lobes during the First World War. He noted that these veterans also were apathetic and abulic, with a loss of drive and initiative. We still do not fully understand why the frontal lobes are so important for goal-oriented behavior; however, Nauta (1971), a Dutch neuroanatomist who worked at MIT, provided us with one of the best explanations. He noted that information from the outside world is first transmitted to the primary sensory areas. As I mentioned, the auditory system projects to the superior portion of the temporal lobes, touch projects to the anterior portions of the parietal lobes, and vision projects to the occipital lobes (see Figure 3.6). These primary sensory areas perform elementary sensory analyses. Each of these primary...
The diuretic effect of acidifying salts was described at the end of World War I. The two principal agents used were calcium chloride and ammonium chloride. Calcium chloride was the weaker of the two; when it was shown that its intravenous use resulted in calcification of the heart and soft tissues, it quickly fell
Because of their safety and effectiveness as oral agents in the management of edema and hypertension, as well as their usefulness in the exploration of renal function in the laboratory, the sulfonamyl diuretics can easily be ranked among the most important pharmacological discoveries and, alongside the antibiotics, among the most noteworthy drugs of post-World War II medicine.
Eugenics is commonly associated with the Nazi racial hygiene program that began in 1933 and ended in May 1945, with Germany's defeat near the end of World War II. Although the German eugenics movement existed long before the Nazis came to power, scholars have shown that Nazi eugenicists were inspired by American eugenic studies and sterilization programs, as well as by American antimiscegenation and immigration restriction laws.
For environmental analyses, handheld equipment that can be brought into the field is currently being developed (47). Because of the fear of biological warfare, the US Army is a driving force in these developments (17). Advances also have been made in the field of pathogen control in animals used for food production (48). Future developments will integrate all steps into a single apparatus, as in the concept of the lab-on-a-chip. The current focus for lab-on-a-chip devices has shifted from expensive silica-based chips to cheap plastic ones (49). These chips are gaining acceptance, mainly because they are affordable and because the liquid volumes that can be processed are in a practical range for most applications.
Surgeons dominated the treatment of cancer until recent decades. Radiologists came to play a supporting role after World War II. Hematologists developed a similar role for leukemias and lymphomas. Only in the 1970s did medical oncology emerge as the primary cancer-treating specialty. Each group had its own treatment technology (surgery, radiotherapy, and chemotherapy) that depended partly on the prevailing concept of cancer and partly on the empirical outcomes of treatment. As oncology developed as a specialty, it confronted the challenge of differentiating itself from both surgery and radiation therapy. Within internal medicine, it faced hostility from chairs of major departments and from the subspecialty of hematology. Chemotherapy was viewed by many with great skepticism as little more than the administration of toxic chemicals to patients with a barely understood disease.
L. (Emmarel) Freshel (1867-1948) was the founder of the Millennium Guild, the first American animal rights* organization. Founded in 1912, the guild published Freshel's Golden Rule Cook Book (first published in 1907) and Selections from Three Essays by Richard Wagner with Comment on a Subject of Such Importance to the Moral Progress of Humanity That It Constitutes an Issue in Ethics and Religion (1933), an impassioned attack on vivisection. An associate of Mary Baker Eddy, founder of the Christian Science Church, Freshel resigned from the church after it expressed support for the entry of the United States into World War I. Through the Millennium Guild, she promoted alternative fur fabrics and vegetarianism* and spoke out against all forms of animal exploitation. After her death, control of the Millennium Guild fell to her husband Curtis. After his death, the organization was directed by New York radio personality Pegeen Fitzgerald.
Although generally perceived by the public to be a recurring, endemic, non-life-threatening problem, viral influenza ranks high on the list of diseases with epidemic potential. During years with high levels of influenza incidence, tens of thousands of fatalities may occur in the United States. During the pandemic of 1918, at the end of World War I, an estimated 40 to 50 million people died worldwide, more than died directly from the war itself
However, because the causative agent, Bacillus anthracis, is a bacterial endospore-former and easily grown in culture in the laboratory, anthrax has been of increased concern as a biological warfare agent. If dispersed in the air and inhaled in high amounts, virulent strains cause a respiratory disease that is rapidly fatal in virtually 100% of unvaccinated victims.
Especially in virology, the application of both qualitative and quantitative nucleic acid detection techniques has had a major impact on diagnostics. 26 Point-of-care testing for biological warfare agents by 'nontrained' personnel using devices such as the RAPID (Idaho Technology, Salt Lake City, USA), and handheld instruments such as the RAZOR (Idaho Technology) or the Bio-Seeq (Smiths Detection, Edgewood, USA), is already a reality. 27 It seems to be just a matter of time before this kind of technology becomes available to primary-care providers for routine microbiological testing in the point-of-care setting.
A major benefit lies in the rapidity with which results can be obtained. This is of major importance when detecting bacterial pathogens, as it allows a specific and timely application of antibiotics. Real-time assays are ideal for distinguishing between different serotypes of a single bacterial species, 25 for detecting and monitoring drug resistance among clinical isolates, 26 for detecting pathogens in food, 27 and not least for identifying microbes used as agents of biological warfare. 28
(Juniperus virginiana) Sometimes known as Pencil Juniper or Pencil Cedar - no other wood has been found that has just the right physical properties for the casing of lead pencils (Harper). But by the end of World War II, it had become extremely scarce, so it had to be replaced as pencil wood by Incense Cedar (Calocedrus decurrens) (Lewington). Clothes chests are made of it, too, for the smell of the wood repels moths. Smoking crushed juniper berries is an American domestic medicine for catarrh (H M Hyatt), and earlier, Indian peoples had used it for a variety of ailments. Both leaves and berries boiled together were taken for coughs. Twigs were burned and the smoke inhaled for a cold in the head (Gilmore). The Kiowa chewed the berries as a remedy for canker sores in the mouth (Vestal & Schultes), while the Natchez used it in some way for mumps (Weiner).
Pesticides can be classified by function and divided into subclasses by structure (Table 21.3). However, some of the structural types are used for several functions. For example, carbamates are used as both insecticides and herbicides, as is the organochlorine hexachlorobenzene. Before the development of synthetic pesticides during World War II, other compounds were used, such as arsenicals and nicotine. What they all have in common is that they are intended to control or eliminate undesirable organisms. The emphasis here will be on the insecticides and herbicides because of their wider distribution in the environment. The structures of some of these compounds are shown in Figures 21.1 and 21.2.
As a follow-up to the conference Dr White Franklin invited some of us whom he knew to be sympathetic to his overall approach to form a committee. The committee members were Dr Macdonald Critchley, Professor Oliver Zangwill, Professor Patrick Meredith, Maisie Holt and myself. We were later joined by Dr Mia Kellmer Pringle, a much respected figure in the world of education. Dr Critchley was to become President of the World Federation of Neurology. Professor Zangwill, who was Professor of Psychology at Cambridge University, was sympathetic to the venture primarily, I think, because of his experiences with brain-damaged patients at the end of the Second World War; the idea that there could be developmental anomalies not unlike those found in brain-damaged patients was one which obviously made sense to him. Professor Patrick Meredith was Professor of Psychology at the University of Leeds. Although considered by some to be rather eccentric, he was in his own way a highly imaginative and...
GARLIC has always been used as an antiseptic, though its original use relied on the Doctrine of Signatures, its signature being the shape of its leaf. The word garlic is OE garleac, where 'gar' means spear, a recognition of the taper-leaved or spear-shaped outline. So it soon became used to combat wounds inflicted by spears (Storms). This use as a wound herb, for which there are sound medical reasons, continued into the 20th century. It has always been applied externally as an antiseptic, and during World War I the raw juice was put on sterilized swabs to apply to wounds to prevent their turning septic. LEEKS too, surprisingly, enjoyed an early reputation as a wound herb. A Middle English medical treatise claimed that, with salt, they 'helpe a wounde to close some' (I B Jones), and the Physicians of Myddfai included a prescription 'to restrain bleeding from recent wounds'. TUTSAN owes its inclusion here to the doctrine of signatures. The dark red juice that exudes from the bruised capsules...
Another case was first described by Gonthier et al. (2004). Using a phylogenetic approach, H. annosum of the North American clade was detected west of Rome, Italy, centred round a military camp established during the Second World War. The fungus subsequently spread to some of the surrounding Pinus pinea forests, probably by spore infections in thinning operations.
After World War II, women did not return to their traditional role as homemakers; instead, they continued to enter the workforce in droves. Between 1940 and 1976, maternal employment increased fivefold. The 1970s saw a trend of separation and divorce that has continued into the 21st century. This trend has forced a growing number of mothers to become the main providers for their families. Among married couples, dual-earner households are now the norm. The shifting gender roles in our society have substantially increased the number of latchkey children (Lamorey & colleagues, 1999). This in turn has created serious concerns for parents, educators, politicians, and communities regarding the well-being of unattended children during after-school hours.
Anthrax is a historically important infection, thought to be the fifth and sixth plagues of ancient Egypt, brought by Moses. It was the cause of several disastrous animal plagues in Europe in the eighteenth and nineteenth centuries. In 1877, Robert Koch cultured Bacillus anthracis, the first proof of a microbial agent causing human disease (1). This discovery supported germ theory and gave birth to the science of modern microbiology. Subsequently, Pasteur and Greenfield successfully developed the first vaccine, composed of attenuated B. anthracis (2). Anthrax has been explored as an agent of biological warfare because of its exceptional virulence and capability to create an aerosol of odorless, invisible spores. Its spores could potentially be dispersed over densely populated areas, and generate disease in a multitude of people with high mor
Dengue viruses are believed to have originated in tropical forested habitats, moved from there to rural environments, and finally invaded urban centers. The word dengue most likely originated from Swahili, and following a series of modifications in pronunciation and spelling the word evolved to its present form. The earliest recorded epidemics of a dengue-like illness were in China during the Chin dynasty (265-420 AD). During the 18th and 19th centuries sporadic epidemics were reported in Asia and the Americas. Following World War II, the pattern changed from one of periodic outbreaks to one of continuous transmission of multiple virus serotypes in Asian cities. It was from that situation that DHF/DSS surfaced in 1954 in the Philippines.
Naturally occurring or synthetic pesticides are a diverse group of chemical agents used to control undesirable pests. The introduction of synthetic pesticides had an enormous impact on agriculture and human health. Dramatic increases in agricultural production, the so-called green revolution, in the U.S. in the past decades were achieved by the use of synthetic pesticides, herbicides, and fungicides. Insecticides and fungicides have been used and are being used to reduce postharvest losses of crops. Synthetic compounds that have been used for farming and food production have contributed to agriculture's success and affected our perception of the environment. For example, although current thinking has played down the usefulness of DDT (dichlorodiphenyltrichloroethane), many lives were saved in Europe and Asia during and after World War II by controlling the mosquito vector of malaria transmission.
Numerous reports have suggested that nutritional deficiencies in general can cause adverse birth outcomes. As an example, a Dutch midwife found an increase in NTD in 1722 and 1732, two years linked with poor crops. She also noted that the children with NTD came from the poorest homes in urban areas (1). A similar observation was made in children who were exposed in utero to severe food shortage during the Second World War in Holland. In addition to a significant decrease in birth weight, there was also a significant increase in the rate of NTD (2).
Twin registries have also been assembled from among special populations. Examples in the United States are registries assembled from military records (the World War II Veteran Twins Registry and the Vietnam Era Twin Registry) and from Medicare files (the U.S. Registry of Elderly African-American Twins). In these registries, likely adult twins were identified by searching records for individuals with identical dates of birth, birthplaces, and surnames. These individuals were then contacted to
The notion of triage is somewhat problematic and debatable, as well as impregnated with difficult ethical and moral questions. During World War II, a battlefield nurse was given the responsibility of triage, i.e., dividing patients into three groups: patients with minor (non-life- or non-limb-threatening) injuries who do not need immediate attention, patients in critical condition who can most benefit from immediate care, and patients beyond hope who will not be treated (Frykberg 2002). Similar principles are applied in modern medicine for disaster triage, with emphasis on the fact that the essence of triage is to identify the few critically injured who can be saved by immediate intervention among the many others with non-life-threatening injuries, for whom treatment can be delayed.
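The three-way split attributed to Frykberg above can be sketched as a simple decision rule. This is an illustrative toy, not a clinical protocol: the category names and the two boolean inputs are assumptions made purely for the example.

```python
from enum import Enum

class TriageCategory(Enum):
    DELAYED = "minor injuries; treatment can wait"
    IMMEDIATE = "critical but salvageable; treat first"
    EXPECTANT = "beyond hope; not treated"

def triage(life_threatening: bool, salvageable: bool) -> TriageCategory:
    """Toy three-group battlefield triage rule (illustrative only)."""
    if not life_threatening:
        # Non-life- or non-limb-threatening injuries: no immediate attention needed.
        return TriageCategory.DELAYED
    # Critical patients: treat those who can benefit; the rest are expectant.
    return TriageCategory.IMMEDIATE if salvageable else TriageCategory.EXPECTANT
```

The essence of the passage is visible in the rule's ordering: scarce immediate care is reserved for the one group that can actually be saved by it.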
Making whereof I commit to the cunning cookes, and teeth to eat them in the rich man's mouth (Gerard). The hips contain large amounts of Vitamin C, and they were systematically gathered during World War 2 so that the vitamin content could be exploited. The hips have always been used as a medicine in one way or another. The conserve is of some efficacy against coughs (Hill), or a tea made from them was taken for fatigue and dropsy, among other complaints (Fluck), including the common cold (Thomson. 1978). To prevent a wound going bad, hips of Dog Rose were chewed and then let drop on the wound (Cockayne). The leaves, too, were used to put on a cut in Essex (V G Hatfield), and a charm from Ireland to cure a stye required the stye to be touched nine times with a rose thorn (Buckley).
(Aphanes arvensis) Parsley (Piert) refers to the form of the leaves, not any relationship to parsley. The common name is from French perce-pierre, meaning breakstone (Prior), and it is actually called Parsley Breakstone (Grigson. 1955) (cf SAXIFRAGE). By sympathy, it was much used against stone in the bladder. Gypsies use an infusion of the dried herb for gravel and other bladder troubles (Vesey-Fitzgerald). It was well-known as a powerful diuretic in Camden's time, and it was in great demand during World War II, being used for bladder and kidney troubles; it is also valuable for jaundice (Brownlow). A decoction with sanicle was used for stomach complaints, but it was especially recommended, powdered and with a little cochineal, for bowel complaints, especially bowel-hive, an inflammation of the bowel occurring in children. It was even called Bowel-hive, or Bowel-hive Grass (Britten & Holland), once. Colicwort is another relevant name, from Herefordshire (Grigson. 1955).
A root infusion of the African tree CATCHTHORN (Zizyphus abyssinica) is taken for dysentery. There is a lot of tannin in the bark, which is probably the reason for this treatment (Palgrave & Palgrave). Maoris set great store by HEBE, particularly Hebe salicifolia, for curing diarrhoea and dysentery, so much so that the young leaf tips, the astringent part used, were collected and sent out to Maori troops in the Middle East during World War II (C Macdonald). GREAT BURNET root is used in Chinese medicine for the complaint (Geng Junying), as well as for haemorrhages and other conditions.
(Cichorium intybus) It was cultivated on the Continent up to World War II, for the root, which is used as a substitute for coffee, and is sometimes mixed with real coffee as an adulterant. It is very bitter, and contains no caffeine or tannin (Sanford). It was introduced as a coffee substitute in the 18th century, but it was actually banned by a law of 1832, repealed in 1840, though it has now virtually died out as a coffee substitute (Brouk). Another use for the foliage, which is edible once the bitter principle is removed by twice boiling, was for a blue dye (Hemphill).
Says that colocynth was the base for general issue purgative pills in the British army in the First World War. Pomet mentions the practice of confectioners who 'cover these Seeds with Sugar, and sell them to catch or delude Children with, and People of Quality upon extraordinary Occasions'.
The first chemical to be recognized as a mutagen was mustard gas, which had been developed during World War I but not tested until World War II by Auerbach and Robson at the University of Edinburgh. Since then a wide variety of chemicals have been discovered that are also mutagenic. Some induce mutations at any point in the cell cycle, by disrupting DNA structure. Others only act during DNA replication. Called base
Alkylating agents have been widely used ever since their anticancer properties were recognized in the 1940s, following observations of the effects of the mustard agents deployed as chemical weapons. They are still being used, alone or in combination with other agents, in myelogenic leukemia, Hodgkin disease, lung, testicle, ovarian and breast cancer, as well as in several lymphomas 1 . Unfortunately, their remarkable effectiveness is moderated by the serious side effects that accompany the therapy. The biochemical target of alkylating agents is DNA; however, they cannot distinguish between tumor cells and normal cells. Aiming to minimize their toxicity via more specialized targeting, many researchers have chemically combined different alkylating agents with a diversity of carrier molecules such as amino acids, chloroquine and quinacrine, hydrocarbons, purines and pyrimidines, steroids 2-4 and others, with diverse effectiveness.
It was another 20 years before another significant antimicrobial drug was developed, when the German chemist Gerhard Domagk showed a synthetic dye, Prontosil, to be active against a range of Gram-positive bacteria. The active component of Prontosil was shown soon afterwards to be sulphanilamide. In the following decade, numerous derivatives of sulphanilamide were synthesised, many of which were more potent antimicrobial agents than the parent molecule. This class of compounds is known collectively as the sulphonamides, or sulfa drugs (Box 14.1). In the years leading up to the Second World War, sulphonamides dramatically reduced mortality from pneumonia and puerperal fever.