Apollo 10 (May 18–26, 1969) was the fourth human spaceflight in the United States' Apollo program and the second to orbit the Moon. NASA, the mission's operator, described it as a "dress rehearsal" for the first Moon landing (Apollo 11, two months later). It was designated an "F" mission, intended to test all spacecraft components and procedures short of actual descent and landing.
After the spacecraft reached lunar orbit, astronaut John Young remained in the Command and Service Module (CSM) while astronauts Thomas Stafford and Gene Cernan flew the Apollo Lunar Module (LM) down to the altitude above the lunar surface at which powered descent for landing would begin on a landing mission. They then rejoined Young in the CSM and, after the CSM completed its 31st orbit of the Moon, returned safely to Earth.
While NASA had considered attempting the first crewed lunar landing on Apollo 10, mission planners ultimately decided that it would be prudent to have a practice flight to hone the procedures and techniques. The crew encountered some problems during the flight: pogo oscillations during the launch phase and a brief, uncontrolled tumble of the LM ascent stage in lunar orbit during its solo flight. However, the mission accomplished its major objectives. Stafford and Cernan observed and photographed Apollo 11's planned landing site in the Sea of Tranquility. Apollo 10 spent about eight days in space in total, including 61 hours and 37 minutes orbiting the Moon; for about eight hours of that time, Stafford and Cernan flew the LM apart from Young in the CSM. Additionally, Apollo 10 set the record for the highest speed attained by a crewed vehicle: 39,897 km/h (11.08 km/s or 24,791 mph) on May 26, 1969, during the return from the Moon.
The mission's call signs were the names of the "Peanuts" characters Charlie Brown for the CSM and Snoopy for the LM, who became Apollo 10's semi-official mascots. "Peanuts" creator Charles Schulz also drew mission-related artwork for NASA.
Framework.
Background.
By 1967, NASA had devised a list of mission types, designated by letters, that needed to be flown before a landing attempt, which would be the "G" mission. The early uncrewed flights were considered "A" or "B" missions, while Apollo 7, the crewed-flight test of the Command and Service Module (CSM), was the "C" mission. The first crewed orbital test of the Lunar Module (LM) was accomplished on Apollo 9, the "D" mission. Apollo 8, flown to lunar orbit without an LM, was considered a "C-prime" mission, but its success gave NASA the confidence to skip the "E" mission, which would have tested the full Apollo spacecraft in medium or high Earth orbit. Apollo 10, the dress rehearsal for the lunar landing, was to be the "F" mission.
NASA considered skipping the "F" mission as well and attempting the first lunar landing on Apollo 10. Some within the agency advocated this, feeling it senseless to bring astronauts so close to the lunar surface, only to turn away. Although the lunar module intended for Apollo 10 was too heavy to perform the lunar mission, the one intended for Apollo 11 could be substituted by delaying Apollo 10 a month from its planned May 1969 launch. NASA official George Mueller, known for his aggressive approach to moving the Apollo program forward, favored attempting a landing on Apollo 10. However, Director of Flight Operations Christopher C. Kraft and others opposed this, feeling that new procedures would have to be developed for a rendezvous in lunar orbit and that NASA had incomplete information regarding the Moon's mass concentrations, which might throw off the spacecraft's trajectory. Lieutenant General Sam Phillips, the Apollo Program Manager, listened to the arguments on both sides and decided that having a dress rehearsal was crucial.
Crew and key Mission Control personnel.
On November 13, 1968, NASA announced the crew members of Apollo 10. Thomas P. Stafford, the commander, was 38 years old at the time of the mission. A 1952 graduate of the Naval Academy, he was commissioned in the Air Force. Selected for the second group of astronauts in 1962, he flew as pilot of Gemini 6A (1965) and command pilot of Gemini 9A (1966). John Young, the command module pilot, was 38 years old and a commander in the Navy at the time of Apollo 10. A 1952 graduate of Georgia Tech who entered the Navy after graduation and became a test pilot in 1959, he was selected as a Group 2 astronaut alongside Stafford. He flew in Gemini 3 with Gus Grissom in 1965, becoming the first American not of the Mercury Seven to fly in space. Young thereafter commanded Gemini 10 (1966), flying with Michael Collins. Eugene Cernan, the lunar module pilot, was a commander in the Navy at the time of Apollo 10. A 1956 graduate of Purdue University, he entered the Navy after graduation. Selected for the third group of astronauts in 1963, Cernan flew with Stafford on Gemini 9A before his assignment to Apollo 10. With five prior flights among them, the Apollo 10 crew was the most experienced to reach space until the Space Shuttle era, and Apollo 10 was the first American space mission whose crew were all spaceflight veterans.
The backup crew for Apollo 10 was L. Gordon Cooper Jr. as commander, Donn F. Eisele as command module pilot, and Edgar D. Mitchell as lunar module pilot. By the normal crew rotation in place during Apollo, Cooper, Eisele, and Mitchell would have flown on Apollo 13, but Cooper and Eisele never flew again. Deke Slayton, Director of Flight Crew Operations, felt that Cooper did not train as hard as he could have. Eisele was blackballed because of incidents during Apollo 7, which he had flown as CMP and which had seen conflict between the crew and ground controllers; he had also been involved in a messy divorce. Slayton had assigned the two as backups only because he had few veteran astronauts available, and he replaced Cooper and Eisele with Alan Shepard and Stuart Roosa, respectively, when forming the Apollo 13 crew. George Mueller rejected that crew for Apollo 13, feeling it needed additional training time, and it was switched to Apollo 14, which saw Shepard and Mitchell walk on the Moon.
For projects Mercury and Gemini, a prime and a backup crew had been designated, but for Apollo, a third group of astronauts, known as the support crew, was also designated. Slayton created the support crews early in the Apollo program on the advice of James McDivitt, who would command Apollo 9. McDivitt believed that, with preparation going on in facilities across the U.S., meetings that needed a member of the flight crew would otherwise be missed. Support crew members were to assist as directed by the mission commander. Usually low in seniority, they assembled the mission's rules, flight plan, and checklists, and kept them updated. For Apollo 10, they were Joe Engle, James Irwin, and Charles Duke.
Flight directors were Gerry Griffin, Glynn Lunney, Milt Windler, and Pete Frank. Flight directors during Apollo had a one-sentence job description: "The flight director may take any actions necessary for crew safety and mission success." CAPCOMs were Duke, Engle, Jack Lousma, and Bruce McCandless II.
Call signs and mission insignia.
The command module was given the call sign "Charlie Brown" and the lunar module the call sign "Snoopy", after the characters Charlie Brown and Snoopy from the comic strip "Peanuts". These names were chosen by the astronauts with the approval of Charles Schulz, the strip's creator, who was uncertain it was a good idea, since Charlie Brown was always a failure. The choice of names was deemed undignified by some at NASA, as were the choices for Apollo 9's CM and LM ("Gumdrop" and "Spider"), and public relations chief Julian Scheer urged a change for the lunar landing mission. But for Apollo 10, according to Cernan, "The P.R.-types lost this one big-time, for everybody on the planet knew the klutzy kid and his adventuresome beagle, and the names were embraced in a public relations bonanza." Apollo 11's call signs were "Columbia" for the command module and "Eagle" for the lunar module.
Snoopy, Charlie Brown's dog, was chosen for the call sign of the lunar module since it was to "snoop" around the landing site, with Charlie Brown given to the command module as Snoopy's companion. Snoopy had been associated for some time with the space program, with workers who performed in an outstanding manner awarded silver "Snoopy pins", and Snoopy posters were seen at NASA facilities, with the cartoon dog having traded in his World War I aviator's headgear for a space helmet. Stafford stated that, given the pins, "the choice of Snoopy [as call sign] was a way of acknowledging the contributions of the hundreds of thousands of people who got us there". The use of the dog was also appropriate since, in the comic strip, Snoopy had journeyed to the Moon the year before, thus defeating, according to Schulz, "the Americans, the Russians, and that stupid cat next door".
The shield-shaped mission insignia shows a large, three-dimensional Roman numeral X sitting on the Moon's surface, in Stafford's words, "to show that we had left our mark". Although it did not land on the Moon, the prominence of the number represents the contributions the mission made to the Apollo program. A CSM circles the Moon as an LM ascent stage flies up from its low pass over the lunar surface with its engine firing. The Earth is visible in the background. On the mission patch, a wide, light blue border carries the word APOLLO at the top and the crew names around the bottom. The patch is trimmed in gold. The insignia was designed by Allen Stevens of Rockwell International.
Training and preparation.
Apollo 10, the "F" mission or dress rehearsal for the lunar landing, had as its primary objectives to demonstrate crew, space vehicle, and mission support facilities performance during a crewed mission to lunar orbit, and to evaluate the performance of the lunar module there. In addition, it was to attempt photography of Apollo Landing Site 2 (ALS-2) in the Sea of Tranquility, the contemplated landing site for Apollo 11. According to Stafford, "Our flight was to take the first lunar module to the moon. We would take the lunar module, go down to within about ten miles above the moon, nine miles above the mountains, radar map, photo map, pick out the first landing site, do the first rendezvous around the moon, pick out some future landing sites, and come home."
Apollo 10 was to adhere as closely as possible to the plans for Apollo 11, including its trajectory to and from lunar orbit, the timeline of mission events, and even the angle of the Sun at ALS-2. However, no landing was to be attempted. ALS-1, given that number because it was the furthest to the east of the candidate sites, and also located in the Sea of Tranquility, had been extensively photographed by Apollo 8 astronauts; at the suggestion of scientist-astronaut Harrison Schmitt, the launch of Apollo 10 had been postponed a day so ALS-2 could be photographed under proper conditions. ALS-2 was chosen as the lunar landing site since it was relatively smooth, of scientific interest, and ALS-1 was deemed too far to the east. Thus, when Apollo 10's launch date was announced on January 10, 1969, it was shifted from its placeholder date of May 1 to May 17, rather than to May 16. On March 17, 1969, the launch was slipped one day to May 18, to allow for a better view of ALS-3, to the west of ALS-2. Another deviation from the plans for Apollo 11 was that Apollo 10 was to spend an additional day in lunar orbit once the CSM and LM rendezvoused; this was to allow time for additional testing of the LM's systems, as well as for photography of possible future Apollo landing sites.
The Apollo 10 astronauts undertook five hours of formal training for each hour of the mission's eight-day duration. This was in addition to the normal mission preparations such as technical briefings, pilot meetings and study. They took part in the testing of the CSM at the Downey, California, facility of its manufacturer, North American Rockwell, and of the LM at Grumman in Bethpage, New York. They visited Cambridge, Massachusetts, for briefings on the Apollo Guidance Computer at the Massachusetts Institute of Technology Instrumentation Laboratory. They each spent more than 300 hours in simulators of the CM or LM at the Manned Spacecraft Center (MSC) in Houston and at Kennedy Space Center (KSC) in Florida. To train for the high-acceleration conditions they would experience in returning to Earth's atmosphere, they endured MSC's centrifuge.
Lunar landing capability.
While Apollo 10 was meant to follow the procedures of a lunar landing mission to the point of powered descent, its LM was not capable of landing and returning to lunar orbit. The ascent stage was loaded with the amount of fuel and oxidizer it would have had remaining if it had lifted off from the surface and reached the altitude at which the Apollo 10 ascent stage actually fired; this was only about half the total amount required for liftoff and rendezvous with the CSM. As a result, the mission-loaded LM was lighter than the Apollo 11 LM which made the first landing. Additionally, the software necessary to guide the LM to a landing was not available at the time of Apollo 10.
Craig Nelson wrote in his book "Rocket Men" that NASA took special precautions to ensure Stafford and Cernan would not attempt to make the first landing. Nelson quoted Cernan as saying, "A lot of people thought about the kind of people we were: 'Don't give those guys an opportunity to land, 'cause they might!' So the ascent module, the part we lifted off the lunar surface with, was short-fueled. The fuel tanks weren't full. So had we literally tried to land on the Moon, we couldn't have gotten off." Mueller, NASA's Associate Administrator for Manned Space Flight, stated, "There had been some speculation about whether or not the crew might have landed, having gotten so close. They might have wanted to, but it was impossible for that lunar module to land. It was an early design that was too heavy for a lunar landing, or, to be more precise, too heavy to be able to complete the ascent back to the command module. It was a test module, for the dress rehearsal only, and that was the way it was used."
Equipment.
The descent stage of the LM was delivered to KSC on October 11, 1968, and the ascent stage arrived five days later. They were mated on November 2. The Service Module (SM) and Command Module (CM) arrived on November 24 and were mated two days later. Portions of the Saturn V launch vehicle arrived during November and December 1968, and the complete launch vehicle was erected in the Vehicle Assembly Building (VAB) on December 30. After being tested in an altitude chamber, the CSM was placed atop the launch vehicle on February 6, 1969. The completed space vehicle was rolled out to Launch Complex 39B on March 11, 1969. Because it had been assembled in the VAB's High Bay 2, used for the first time, the crawler had to exit the rear of the VAB and loop around the building to join the main crawlerway to the launch pad. This rollout, using Mobile Launch Platform-3 (MLP-3), happened eight days after the launch of Apollo 9, while that mission was still in orbit.
The launch vehicle for Apollo 10 was a Saturn V, designated AS-505, the fifth flight-ready Saturn V to be launched and the third to take astronauts to orbit. The Saturn V differed from that used on Apollo 9 in having a lower dry weight (without propellant) in its first two stages, with a significant reduction to the interstage joining them. Although the S-IVB third stage was slightly heavier, all three stages could carry a greater weight of propellant, and the S-II second stage generated more thrust than that of Apollo 9.
The Apollo spacecraft for the Apollo 10 mission was composed of Command Module 106 (CM-106), Service Module 106 (SM-106, together with the CM known as CSM-106), Lunar Module 4 (LM-4), a spacecraft-lunar module adapter (SLA), numbered as SLA-13A, and a launch escape system. The SLA was a mating structure joining the Instrument Unit on the S-IVB stage of the Saturn V launch vehicle and the CSM, and acted as a housing for the LM, while the Launch Escape System (LES) contained rockets to propel the CM to safety if there was an aborted launch. At about 76.99 metric tons, Apollo 10 would be the heaviest spacecraft to reach orbit to that point.
Mission highlights.
Launch and outbound trip.
Apollo 10 launched from KSC on May 18, 1969, at 12:49:00 EDT (16:49:00 UT), at the start of a 4.5-hour launch window. The launch window was timed to secure optimal lighting conditions at Apollo Landing Site 2 at the time of the LM's closest approach to the site days later. The launch followed a countdown that had begun at 21:00:00 EDT on May 16 (01:00:00 UT on May 17). Because preparations for Apollo 11 had already begun at Pad 39A, Apollo 10 launched from Pad 39B, becoming the only Apollo flight to launch from that pad and the only one to be controlled from its Firing Room 3.
Problems that arose during the countdown were dealt with during the built-in holds, and did not delay the mission. On the day before launch, Cernan had been stopped for speeding while returning from a final visit with his wife and child. Lacking identification and under orders to tell no one who he was, Cernan later attested in his autobiography that he had feared being arrested. Launch pad leader Gunther Wendt, who had pulled over nearby after recognizing Cernan, explained the situation to the police officer, who then released Cernan despite the officer's skepticism that Cernan was an astronaut.
The crew experienced a somewhat rough ride on the way to orbit due to pogo oscillations. About 12 minutes after liftoff, the spacecraft entered its planned low Earth parking orbit. All appeared normal during the systems review period in Earth orbit, and the crew restarted the S-IVB third stage to achieve trans-lunar injection (TLI) and send them towards the Moon. The vehicle shook again while executing the TLI burn, causing Cernan to be concerned that they might have to abort; however, the burn was completed without incident. Young then performed the transposition, docking, and extraction maneuver, separating the CSM from the S-IVB stage, turning it around, and docking its nose to the top of the lunar module (LM) before separating the combined spacecraft from the S-IVB. Apollo 10 was the first mission to carry a color television camera inside the spacecraft, and mission controllers in Houston watched as Young performed the maneuver. Soon thereafter, the large television audience was treated to color views of the Earth. One problem encountered was that the mylar cover of the CM's hatch had pulled loose, spilling quantities of fiberglass insulation into the tunnel and then into both the CM and LM. The S-IVB was fired by ground command and sent into solar orbit with a period of 344.88 days.
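As a rough illustration of what that 344.88-day period implies, Kepler's third law gives the approximate size of the stage's heliocentric orbit. This calculation is not from the source; it assumes a simple two-body model and uses only the period quoted above:

```python
# Illustrative sketch: estimate the semi-major axis of the S-IVB's heliocentric
# orbit from its quoted 344.88-day period using Kepler's third law
# (a^3 = T^2, with a in astronomical units and T in Julian years).
T_years = 344.88 / 365.25        # orbital period, converted to Julian years
a_au = T_years ** (2 / 3)        # semi-major axis in AU
print(f"semi-major axis ≈ {a_au:.3f} AU")  # ≈ 0.962 AU, slightly inside Earth's orbit
```

A period shorter than one year corresponds to an orbit slightly smaller than Earth's, which is consistent with the stage drifting ahead of Earth over time.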
The crew settled in for the voyage to the Moon. They had a light workload and spent much of their time studying the flight plan or sleeping. They made five more television broadcasts back to Earth and were informed that more than a billion people had watched some part of their activities. In June 1969, the crew would accept a special Emmy Award on behalf of the first four Apollo crews for their television broadcasts from space. One slight course correction was necessary; it occurred at 26:32:56.8 into the mission and lasted 7.1 seconds, aligning Apollo 10 with the trajectory Apollo 11 was expected to take. One issue the crew encountered was bad-tasting food: Stafford had apparently used a double dose of chlorine in their drinking water, which also had to be added to their dehydrated food to reconstitute it.
Lunar orbit.
Arrival and initial operations.
At 75:55:54 into the mission, above the far side of the Moon, the CSM's service propulsion system (SPS) engine was fired for 356.1 seconds to slow the spacecraft into an elliptical lunar orbit. This was followed, after two orbits of the Moon, by a 13.9-second firing of the SPS at 80:25:08.1 to circularize the orbit. Within the first couple of hours after the initial lunar orbit insertion burn and following the circularization burn, the crew turned to tracking planned landmarks on the surface below to record observations and take photographs. In addition to ALS-1, ALS-2, and ALS-3, the crew of Apollo 10 observed and photographed features on the near and far sides of the Moon, including the craters Coriolis, King, and Papaleksi. Shortly after the circularization burn, the crew took part in a scheduled half-hour color-television broadcast with descriptions and video transmissions of views of the lunar surface below.
About an hour after the second burn, the LM crew of Stafford and Cernan entered the LM to check out its systems. They were met with a blizzard of fiberglass particles from the earlier problem, which they cleaned up with a vacuum cleaner as best they could. Stafford had to help Cernan remove smaller bits from his hair and eyebrows. Stafford later commented that Cernan looked like he just came out of a chicken coop, and that the particles made them itch and got into the air conditioning system, and they were scraping it off the filter screens for the rest of the mission. This was merely an annoyance, but the particles may have gotten into the docking ring joining the two craft and caused it to misalign slightly. Mission Control determined that this was still within safe limits.
The flight of "Snoopy".
After Stafford and Cernan checked out "Snoopy", they returned to "Charlie Brown" for a rest. Then they re-entered "Snoopy" and undocked it from the CSM at 98:29:20. Young, who remained in the CSM, became the first person to fly solo in lunar orbit. After undocking, Stafford and Cernan deployed the LM's landing gear and inspected the LM's systems. The CSM performed an 8.3-second burn with its RCS thrusters to separate itself from the LM by about 30 feet, after which Young visually inspected the LM from the CSM. The CSM then performed another separation burn, putting greater distance between the two spacecraft. The LM crew then performed the descent orbit insertion maneuver by firing their descent engine for 27.4 seconds at 99:46:01.6, and tested their craft's landing radar as they approached the altitude where the subsequent Apollo 11 mission would begin powered descent to land on the Moon. Previously, the LM's landing radar had been tested only under terrestrial conditions. While the LM executed these maneuvers, Young monitored its location and status from the CSM, standing by to rescue its crew if necessary. Cernan and Stafford surveyed ALS-2 from low altitude at a point 15 degrees to its east, then performed a phasing burn at 100:58:25.93, thrusting for just under 40 seconds to allow a second pass at ALS-2, during which the craft made its closest approach to the Moon. Reporting on his observations of the site from the LM's low passes, Stafford indicated that ALS-2 seemed smoother than he had expected, describing its appearance as similar to the desert surrounding Blythe, California, though he observed that Apollo 11 could face rougher terrain downrange if it approached off-target. Based upon Apollo 10's observations from relatively low altitude, NASA mission planners became comfortable enough with ALS-2 to confirm it as the target site for Apollo 11.
The next action was to prepare to separate the LM ascent stage from the descent stage, to jettison the descent stage, and fire the Ascent Propulsion System to return the ascent stage towards the CSM. As Stafford and Cernan prepared to do so, the LM began to gyrate out of control. Alarmed, Cernan exclaimed, "Son of a bitch!" into a hot mic being broadcast live, which, combined with other language used by the crew during the mission, generated some complaints back on Earth. Stafford discarded the descent stage about five seconds after the tumbling began and fought to regain control manually, suspecting that there might have been an "open thruster", or a thruster stuck firing. He did so in time to orient the spacecraft to rejoin "Charlie Brown". The problem was traced to a switch controlling the mode of the abort guidance system; it was to be moved as part of the procedure, but both of the crew members switched it, thus returning it to the original position. Had they fired "Snoopy" in the wrong direction, they might have missed the rendezvous with "Charlie Brown" or crashed into the Moon. Once Stafford had regained control of the LM ascent stage, which took about eight seconds, the pair fired the ascent engine at the lowest point of the LM's orbit, mimicking the orbital insertion maneuver after launch from the lunar surface in a later landing mission. "Snoopy" coasted on that trajectory for about an hour before firing the engine once more to further fine-tune its approach to "Charlie Brown".
"Snoopy" rendezvoused with and re-docked with "Charlie Brown" at 106:22:02, just under eight hours after undocking. The docking was telecast live in color from the CSM. Once Cernan and Stafford had re-entered "Charlie Brown", "Snoopy" was sealed off and separated from "Charlie Brown." The rest of the LM's ascent-stage engine fuel was burned to send it on a trajectory past the Moon and into a heliocentric orbit.
It was the only Apollo LM to meet this fate. Apollo 11's ascent stage would be left in lunar orbit to eventually crash, while most later ascent stages were deliberately steered into the Moon to obtain readings from seismometers placed on the surface. The exceptions were Apollo 13's ascent stage, which the crew used as a "life boat" to get safely back to Earth before releasing it to burn up in Earth's atmosphere, and Apollo 16's, which NASA lost control of after jettison.
Return to Earth.
After ejecting the LM ascent stage, the crew slept and performed photography and observation of the lunar surface from orbit. Though the crew located 18 landmarks on the surface and took photographs of various surface features, crew fatigue necessitated the cancellation of two scheduled television broadcasts. Thereafter, the main Service Propulsion System engine of the CSM re-ignited for about 2.5 minutes to set Apollo 10 on a trajectory towards Earth, achieving such a trajectory at 137:39:13.7. As it departed lunar orbit, Apollo 10 had orbited the Moon 31 times over the span of about 61 hours and 37 minutes.
During their journey back to Earth, the crew performed some observational activities which included star-Earth horizon sightings for navigation. The crew also performed a scheduled test to gauge the reflectivity of the CSM's high-gain antenna and broadcast six television transmissions of varying durations to show views inside the spacecraft and of the Earth and Moon from the crew's vantage point. Cernan reported later that he and his crewmates became the first to "successfully shave in space" during the return trip, using a safety razor and thick shaving gel, as such items had been deemed a safety hazard and prohibited on earlier flights. The crew fired the engine of the CSM for the only mid-course-correction burn required during the return trip at 188:49:58, a few hours before separation of the CM from the SM. The burn lasted about 6.7 seconds.
As the spacecraft rapidly approached Earth on the final day of the mission, the Apollo 10 crew traveled faster than any humans before or since, relative to Earth: 39,897 km/h (11.08 km/s or 24,791 mph). This is because the return trajectory was designed to take only 42 hours rather than the normal 56. The Apollo 10 crew also traveled farther from their Houston homes than any humans before or since (though the Apollo 13 crew was about 200 km farther from Earth itself). While most Apollo missions orbited the Moon at similar altitudes above the lunar surface, the distance between the Earth and Moon varies between perigee and apogee throughout each lunar month, and the Earth's rotation makes the distance to Houston vary by up to an Earth diameter each day. The Apollo 10 crew reached the farthest point in their orbit around the far side of the Moon at about the same time Earth's rotation put Houston nearly a full Earth diameter farther away.
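The record speed is quoted in three units, and the equivalences can be sanity-checked with a short conversion snippet. This is illustrative only: the 39,897 km/h figure comes from the text above, and the conversion factors are the standard definitions (3600 s/h; 1 statute mile = 1.609344 km):

```python
# Sanity check of the record-speed unit conversions quoted in the text.
SPEED_KMH = 39_897.0                 # record speed from the mission account, km/h
km_per_s = SPEED_KMH / 3600.0        # kilometres per second
mph = SPEED_KMH / 1.609344           # statute miles per hour
print(f"{km_per_s:.2f} km/s")        # prints "11.08 km/s"
print(f"{mph:,.0f} mph")             # prints "24,791 mph"
```

Both results round to the figures given in the text, so the three quoted values are mutually consistent.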
At 191:33:26, the CM (which contained the crew) separated from the SM in preparation for reentry, which began about 15 minutes later at 191:48:54.5. Splashdown of the CM occurred in the Pacific Ocean east of American Samoa about 15 minutes after reentry, on May 26, 1969, at 16:52:23 UTC and mission elapsed time 192:03:23. The astronauts were recovered by the prime recovery ship and spent about four hours aboard, during which they took a congratulatory phone call from President Richard Nixon. As they had not made contact with the lunar surface, Apollo 10's crew were not required to quarantine as the first landing crews would be. They were flown to Pago Pago International Airport in Tafuna for a greeting reception before boarding a C-141 cargo plane to Ellington Air Force Base near Houston.
Aftermath.
Orbital operations and the solo maneuvering of the LM in partial descent to the lunar surface paved the way for the successful Apollo 11 lunar landing by demonstrating the capabilities of the mission hardware and systems. The crew demonstrated that the checkout procedures of the LM and initial descent and rendezvous could be accomplished within the allotted time, that the communication systems of the LM were sufficient, that the rendezvous and landing radars of the LM were operational in lunar orbit, and that the two spacecraft could be adequately monitored by personnel on Earth. Additionally, the precision of lunar orbital navigation improved with Apollo 10 and, combined with data from Apollo 8, NASA expected that it had achieved a level of precision sufficient to execute the first crewed lunar landing. After about two weeks of Apollo 10 data analysis, a NASA flight readiness team cleared Apollo 11 to proceed with its scheduled July 1969 flight. On July 16, 1969, the next Saturn V to launch carried the astronauts of Apollo 11: Neil Armstrong, Buzz Aldrin, and Michael Collins. On July 20, Armstrong and Aldrin landed on the Moon, and four days later the three astronauts returned to Earth, fulfilling John F. Kennedy's challenge to Americans to land astronauts on the Moon and return them safely to Earth by the end of the 1960s.
In July 1969, Stafford replaced Alan Shepard as Chief Astronaut, and then became deputy director of Flight Crew Operations under Deke Slayton. In his memoirs, Stafford wrote that he could have put his name back in the flight rotation, but wanted managerial experience. In 1972, Stafford was promoted to brigadier general and assigned to command the American portion of the Apollo–Soyuz Test Project, which flew in July 1975. He commanded the Air Force Flight Test Center at Edwards Air Force Base in California, and retired in November 1979 as a lieutenant general. Young commanded the Apollo 16 lunar landing mission flown in April 1972, served as Chief Astronaut from 1974 to 1987, commanded the STS-1 and STS-9 Space Shuttle missions in April 1981 and November 1983, respectively, and retired from NASA's Astronaut Corps in 2004. Gene Cernan commanded the final Apollo lunar mission, Apollo 17, flown in December 1972. Cernan retired from NASA and the Navy as a captain in 1976.
Hardware disposition.
The Smithsonian has been responsible for the command module "Charlie Brown" since 1970. The spacecraft was displayed in several countries until it was placed on loan to the London Science Museum in 1978. "Charlie Brown"'s SM was jettisoned just before re-entry and burned up in the Earth's atmosphere, its remnants scattering in the Pacific Ocean.
After translunar injection, the Saturn V's S-IVB third stage was accelerated past Earth escape velocity to become space debris; it remains in a heliocentric orbit.
The ascent stage of the Lunar Module "Snoopy" was jettisoned into a heliocentric orbit. Its orbit was not tracked after 1969, and its whereabouts were long unknown. In 2011, a group of amateur astronomers in the UK started a project to search for it. In June 2019, the Royal Astronomical Society announced a possible rediscovery of "Snoopy", determining that the small Earth-crossing asteroid 2018 AV2 is likely to be the spacecraft, with 98% certainty. It is the only once-crewed spacecraft known to still be in outer space without a crew.
"Snoopy's" descent stage was jettisoned in lunar orbit; its current location is unknown, though it may have eventually crashed into the Moon as a result of orbital decay. Phil Stooke, a planetary scientist who studied the lunar crash sites of LM ascent stages, wrote that the descent stage "crashed at an unknown location", and another source stated that the descent stage "eventually impact(ed) within a few degrees of the equator on the near side". Richard Orloff and David M. Harland, in their sourcebook on Apollo, stated that "the descent stage was left in the low orbit, but perturbations by 'mascons' would have caused this to decay, sending the stage to crash onto the lunar surface".
Apollo 12 (November 14–24, 1969) was the sixth crewed flight in the United States Apollo program and the second to land on the Moon. It was launched on November 14, 1969, by NASA from the Kennedy Space Center, Florida. Commander Charles "Pete" Conrad and Lunar Module Pilot Alan L. Bean performed just over one day and seven hours of lunar surface activity while Command Module Pilot Richard F. Gordon remained in lunar orbit.
Apollo 12 would have attempted the first lunar landing had Apollo 11 failed, but after the success of Neil Armstrong's mission, Apollo 12 was postponed by two months, and other Apollo missions were also put on a more relaxed schedule. More time was allotted for geologic training in preparation for Apollo 12 than for Apollo 11, with Conrad and Bean making several geology field trips in preparation for their mission. Apollo 12's spacecraft and launch vehicle were almost identical to Apollo 11's. One addition was hammocks to allow Conrad and Bean to rest more comfortably on the Moon.
Shortly after being launched on a rainy day at Kennedy Space Center, Apollo 12 was twice struck by lightning, causing instrumentation problems but little damage. Switching to the auxiliary power supply resolved the data relay problem, saving the mission. The outward journey to the Moon otherwise saw few problems. On November 19, Conrad and Bean achieved a precise landing at their expected location within walking distance of the Surveyor 3 robotic probe, which had landed on April 20, 1967. In making a pinpoint landing, they showed that NASA could plan future missions in the expectation that astronauts could land close to sites of scientific interest. Conrad and Bean carried the Apollo Lunar Surface Experiments Package, a group of nuclear-powered scientific instruments, as well as the first color television camera taken by an Apollo mission to the lunar surface, but transmission was lost after Bean accidentally pointed the camera at the Sun and its sensor was destroyed. On the second of two moonwalks, they visited Surveyor 3 and removed parts for return to Earth.
Lunar Module "Intrepid" lifted off from the Moon on November 20 and docked with the command module, which subsequently traveled back to Earth. The Apollo 12 mission ended on November 24 with a successful splashdown.
Crew and key Mission Control personnel.
The commander of the all-Navy Apollo 12 crew was Charles "Pete" Conrad, who was 39 years old at the time of the mission. After receiving a bachelor's degree in aeronautical engineering from Princeton University in 1953, he became a naval aviator, and completed United States Naval Test Pilot School at Patuxent River Naval Air Station. He was selected in the second group of astronauts in 1962, and flew on Gemini 5 in 1965 and as command pilot of Gemini 11 in 1966. Command Module Pilot Richard "Dick" Gordon, 40 years old at the time of Apollo 12, also became a naval aviator in 1953, following graduation from the University of Washington with a degree in chemistry, and completed test pilot school at Patuxent River. Selected as a Group 3 astronaut in 1963, he flew with Conrad on Gemini 11.
The original Lunar Module pilot assigned to work with Conrad was Clifton C. Williams Jr., who was killed in October 1967 when the T-38 he was flying crashed near Tallahassee. When forming his crew, Conrad had wanted Alan L. Bean, a former student of his at the test pilot school, but had been told by Director of Flight Crew Operations Deke Slayton that Bean was unavailable due to an assignment to the Apollo Applications Program. After Williams's death, Conrad asked for Bean again, and this time Slayton yielded. Bean, 37 years old when the mission flew, had graduated from the University of Texas in 1955 with a degree in aeronautical engineering. Also a naval aviator, he was selected alongside Gordon in 1963, and first flew in space on Apollo 12. The three Apollo 12 crew members had backed up Apollo 9 earlier in 1969.
The Apollo 12 backup crew was David R. Scott as commander, Alfred M. Worden as Command Module pilot, and James B. Irwin as Lunar Module pilot. They became the crew of Apollo 15. For Apollo, a third crew of astronauts, known as the support crew, was designated in addition to the prime and backup crews used on projects Mercury and Gemini. Slayton created the support crews because James McDivitt, who would command Apollo 9, believed that, with preparation going on in facilities across the US, meetings that needed a member of the flight crew would be missed. Support crew members were to assist as directed by the mission commander. Usually low in seniority, they assembled the mission's rules, flight plan, and checklists, and kept them updated. For Apollo 12, they were Gerald P. Carr, Edward G. Gibson and Paul J. Weitz. Flight directors were Gerry Griffin, first shift, Pete Frank, second shift, Clifford E. Charlesworth, third shift, and Milton Windler, fourth shift. Flight directors during Apollo had a one-sentence job description: "The flight director may take any actions necessary for crew safety and mission success." Capsule communicators (CAPCOMs) were Scott, Worden, Irwin, Carr, Gibson, Weitz and Don Lind.
Preparation.
Site selection.
The landing site selection process for Apollo 12 was greatly informed by the site selection for Apollo 11. There were rigid standards for the possible Apollo 11 landing sites, in which scientific interest was not a major factor: they had to be close to the lunar equator and not on the periphery of the portion of the lunar surface visible from Earth; they had to be relatively flat and without major obstructions along the path the Lunar Module (LM) would fly to reach them, their suitability confirmed by photographs from Lunar Orbiter probes. Also desirable was the presence of another suitable site further west in case the mission was delayed and the sun would have risen too high in the sky at the original site for desired lighting conditions. The need for three days to recycle if a launch had to be scrubbed meant that only three of the five suitable sites found were designated as potential landing sites for Apollo 11, of which the Apollo 11 landing site in the Sea of Tranquility was the easternmost. Since Apollo 12 was to attempt the first lunar landing if Apollo 11 failed, both sets of astronauts trained for the same sites.
With the success of Apollo 11, it was initially contemplated that Apollo 12 would land at the next site to the west of the Sea of Tranquility, in Sinus Medii. However, NASA planning coordinator Jack Sevier and engineers at the Manned Spacecraft Center in Houston argued for a landing close enough to the crater in which the Surveyor 3 probe had landed in 1967 to allow the astronauts to cut parts from it for return to Earth. The site was otherwise suitable, and had scientific interest. Given that Apollo 11 had landed several miles off-target, though, some NASA administrators feared Apollo 12 would land far enough away that the astronauts could not reach the probe, and the agency would be embarrassed. Nevertheless, the ability to perform pinpoint landings was essential if Apollo's exploration program was to be carried out, and on July 25, 1969, Apollo Program Manager Samuel Phillips designated what became known as Surveyor crater as the landing site, despite the unanimous opposition of members of two site selection boards.
Training and preparation.
The Apollo 12 astronauts spent five hours in mission-specific training for every hour they expected to spend in flight on the mission, a total exceeding 1,000 hours per crew member. Conrad and Bean received more mission-specific training than Apollo 11's Neil Armstrong and Buzz Aldrin had. This was in addition to the 1,500 hours of training they received as backup crew members for Apollo 9. The Apollo 12 training included over 400 hours per crew member in simulators of the Command Module (CM) and of the LM. Some of the simulations were linked in real time to flight controllers in Mission Control. To practice landing on the Moon, Conrad flew the Lunar Landing Training Vehicle (LLTV), training in which continued to be authorized even though Armstrong had been forced to bail out of a similar vehicle in 1968, just before it crashed.
Soon after being assigned as Apollo 12 crew commander, Conrad met with NASA geologists and told them that the training for lunar surface activities would be conducted much as Apollo 11's had been, but with no publicity or involvement by the media. Conrad felt he had been abused by the press during Gemini, and the sole Apollo 11 geology field trip had turned into a near-fiasco, with a large media contingent present, some getting in the way—the astronauts had trouble hearing each other due to a hovering press helicopter. After the successful return of Apollo 11 in July 1969, more time was allotted for geology, but the astronauts' focus was on getting time in the simulators without being pre-empted by the Apollo 11 crew. On the six Apollo 12 geology field trips, the astronauts would practice as if on the Moon, collecting samples and documenting them with photographs, while communicating with a CAPCOM and geologists who were out of sight in a nearby tent. Afterwards, the astronauts' performance in choosing samples and taking photographs would be critiqued. To the frustration of the astronauts, the scientists kept changing the photo documentation procedures; after the fourth or fifth such change, Conrad required that there be no more. After the return of Apollo 11, the Apollo 12 crew was able to view the lunar samples, and be briefed on them by scientists.
As Apollo 11 was targeted for an ellipse-shaped landing zone rather than a specific point, there was no advance planning of geology traverses; the designated tasks were to be done at sites of the crew's choosing. For Apollo 12, before the mission, some of NASA's geology team met with the crew, and Conrad suggested they lay out possible routes for him and Bean. The result was four traverses, based on four potential landing points for the LM. This was the start of geology traverse planning, which on later missions became a considerable effort involving several organizations.
The stages of the lunar module, LM–6, were delivered to Kennedy Space Center (KSC) on March 24, 1969, and were mated to each other on April 28. Command module CM–108 and service module SM–108 were delivered to KSC on March 28, and were mated to each other on April 21. Following installation of gear and testing, the launch vehicle, with the spacecraft atop it, was rolled out to Launch Complex 39A on September 8, 1969. The training schedule was complete, as planned, by November 1, 1969; activities after that date were intended as refreshers. The crew members felt that the training, for the most part, was adequate preparation for the Moon mission.
Hardware.
Launch vehicle.
There were no significant changes to the Saturn V launch vehicle used on Apollo 12, SA–507, from that used on Apollo 11. There were another 17 instrumentation measurements in the Apollo 12 launch vehicle, bringing the number to 1,365. The entire vehicle, including the spacecraft, weighed at launch, an increase from Apollo 11's . Of this figure, the spacecraft weighed , up from on Apollo 11.
Third stage trajectory.
After LM separation, the third stage of the Saturn V, the S-IVB, was intended to fly into solar orbit. The S-IVB auxiliary propulsion system was fired, with the intent that the Moon's gravity slingshot the stage into solar orbit. Due to an error, the S-IVB flew past the Moon at too high an altitude to achieve Earth escape velocity. It remained in a semi-stable Earth orbit until it finally escaped Earth orbit in 1971, but briefly returned to Earth orbit 31 years later, in 2002. It was discovered by amateur astronomer Bill Yeung, who gave it the temporary designation J002E3 before it was determined to be an artificial object. Again in solar orbit as of 2021, it may again be captured by Earth's gravity, but not at least until the 2040s. The S-IVBs used on later lunar missions were deliberately crashed into the Moon to create seismic events that would register on the seismometers left on the Moon and provide data about the Moon's structure.
Spacecraft.
The Apollo 12 spacecraft consisted of Command Module 108 and Service Module 108 (together Command and Service Modules 108, or CSM–108), Lunar Module 6 (LM–6), a Launch Escape System (LES), and Spacecraft-Lunar Module Adapter 15 (SLA–15). The LES contained three rocket motors to propel the CM to safety in the event of an abort shortly after launch, while the SLA housed the LM and provided a structural connection between the Saturn V and the LM. The SLA was identical to Apollo 11's, while the LES differed only in the installation of a more reliable motor igniter.
The CSM was given the call sign "Yankee Clipper", while the LM had the call sign "Intrepid". These sea-related names were selected by the all-Navy crew from several thousand proposed names submitted by employees of the prime contractors of the respective modules. George Glacken, a flight test engineer at North American Aviation, builder of the CSM, proposed "Yankee Clipper" as such ships had "majestically sailed the high seas with pride and prestige for a new America". "Intrepid" was from a suggestion by Robert Lambert, a planner at Grumman, builder of the LM, as evocative of "this nation's resolute determination for continued exploration of space, stressing our astronauts' fortitude and endurance of hardship".
The differences between the CSM and LM of Apollo 11, and those of Apollo 12, were few and minor. A hydrogen separator was added to the CSM to stop the gas from entering the potable water tank—Apollo 11 had had one, though mounted on the water dispenser in the CM's cabin. Gaseous hydrogen in the water had given the Apollo 11 crew severe flatulence. Other changes included the strengthening of the recovery loop attached following splashdown, meaning that the swimmers recovering the CM would not have to attach an auxiliary loop. LM changes included a structural modification so that scientific experiment packages could be carried for deployment on the lunar surface. Two hammocks were added for greater comfort of the astronauts while resting on the Moon, and a color television camera substituted for the black and white one used on the lunar surface during Apollo 11.
ALSEP.
The Apollo Lunar Surface Experiments Package, or ALSEP, was a suite of scientific instruments designed to be emplaced on the lunar surface by the Apollo astronauts, and thereafter operate autonomously, sending data to Earth. Development of the ALSEP was part of NASA's response to some scientists who opposed the crewed lunar landing program (they felt that robotic craft could explore the Moon more cheaply) by demonstrating that some tasks, such as deployment of the ALSEP, required humans. In 1966, a contract to design and build the ALSEPs was awarded to the Bendix Corporation. Due to the limited time the Apollo 11 crew would have on the lunar surface, a smaller suite of experiments was flown, known as the Early Apollo Surface Experiment Package (EASEP). Apollo 12 was the first mission to carry an ALSEP; one would be flown on each of the subsequent lunar landing missions, though the components that were included would vary. Apollo 12's ALSEP was to be deployed at least away from the LM to protect the instruments from the debris that would be generated when the ascent stage of the LM took off to return the astronauts to lunar orbit.
Apollo 12's ALSEP included a Lunar Surface Magnetometer (LSM), to measure the magnetic field at the Moon's surface, a Lunar Atmosphere Detector (LAD, also known as the Cold Cathode Ion Gauge Experiment), intended to measure the density and temperature of the thin lunar atmosphere and how it varies, a Lunar Ionosphere Detector (LID, also known as the Charged Particle Lunar Environment Experiment, or CPLEE), intended to study the charged particles in the lunar atmosphere, and the Solar Wind Spectrometer, to measure the strength and direction of the solar wind at the Moon's surface—the free-standing Solar Wind Composition Experiment, to measure what makes up the solar wind, would be deployed and then brought back to Earth by the astronauts. A Dust Detector was used to measure the accumulation of lunar dust on the equipment. Apollo 12's Passive Seismic Experiment (PSE), a seismometer, would measure moonquakes and other movements in the Moon's crust, and would be calibrated by the nearby planned impact of the ascent stage of Apollo 12's LM, an object of known mass and velocity hitting the Moon at a known location, and projected to be equivalent to the explosive force of one ton of TNT.
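The "one ton of TNT" figure quoted for the PSE calibration impact can be sanity-checked from the ascent stage's approximate impact parameters. The mass and speed below are assumed round numbers (roughly 2,400 kg at about 1.7 km/s), not values given in this article:

```python
# Rough check of the LM ascent stage impact energy against the quoted
# "one ton of TNT" figure. Mass and speed are assumed approximations.
MASS_KG = 2400.0               # approximate impact mass (assumption)
SPEED_M_S = 1700.0             # approximate impact speed (assumption)
TNT_JOULES_PER_TON = 4.184e9   # standard TNT energy equivalent

kinetic_energy = 0.5 * MASS_KG * SPEED_M_S ** 2  # E = 1/2 m v^2
tnt_tons = kinetic_energy / TNT_JOULES_PER_TON

print(f"Impact energy: {kinetic_energy:.2e} J ≈ {tnt_tons:.2f} t TNT")
```

With these round inputs the kinetic energy comes out near 0.8 t of TNT, the same order of magnitude as the figure projected by mission planners.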
The ALSEP experiments left on the Moon by Apollo 12 were connected to a Central Station, which contained a transmitter, receiver, timer, data processor, and equipment for power distribution and control of the experiments. The equipment was powered by SNAP-27, a radioisotope thermoelectric generator (RTG) developed by the Atomic Energy Commission. Containing plutonium, the RTG flown on Apollo 12 was the first use of atomic energy on a crewed NASA spacecraft—some NASA and military satellites had previously used similar systems. The plutonium core was brought from Earth in a cask attached to an LM landing leg, a container designed to survive re-entry in the event of an aborted mission, something NASA considered unlikely. The cask would survive re-entry on Apollo 13, sinking in the Tonga Trench of the Pacific Ocean, apparently without radioactive leakage.
The Apollo 12 ALSEP experiments were activated from Earth on November 19, 1969. The LAD returned only a small amount of useful data due to the failure of its power supply soon after activation. The LSM was deactivated on June 14, 1974, as was the other LSM deployed on the Moon, from Apollo 15. All powered ALSEP experiments that remained active were deactivated on September 30, 1977, principally because of budgetary constraints.
Mission highlights.
Launch.
With President Richard Nixon in attendance, the first time a current U.S. president had witnessed a crewed space launch, as well as Vice President Spiro Agnew, Apollo 12 launched as planned at 11:22:00 on November 14, 1969 (16:22:00 UT) from Kennedy Space Center. This was at the start of a launch window of three hours and four minutes to reach the Moon with optimal lighting conditions at the planned landing point. There were completely overcast rainy skies, and the vehicle encountered winds of during ascent, the strongest of any Apollo mission. There was a NASA rule against launching into a cumulonimbus cloud; this had been waived and it was later determined that the launch vehicle never entered such a cloud. Had the mission been postponed, it could have been launched on November 16 with landing at a backup site where there would be no Surveyor, but since time pressure to achieve a lunar landing had been removed by Apollo 11's success, NASA might have waited until December for the next opportunity to go to the Surveyor crater.
Lightning struck the Saturn V 36.5 seconds after lift-off, triggered by the vehicle itself. The static discharge caused a voltage transient that knocked all three fuel cells offline, meaning the spacecraft was being powered entirely from its batteries, which could not supply enough current to meet demand. A second strike at 52 seconds knocked out the "8-ball" attitude indicator. The telemetry stream at Mission Control was garbled, but the Saturn V continued to fly normally; the strikes had not affected the Saturn V instrument unit guidance system, which functioned independently from the CSM. The astronauts unexpectedly had a board red with caution and warning lights, but could not tell exactly what was wrong.
The Electrical, Environmental and Consumables Manager (EECOM) in Mission Control, John Aaron, remembered the telemetry failure pattern from an earlier test when a power loss caused a malfunction in the CSM signal conditioning electronics (SCE), which converted raw signals from instrumentation to data that could be displayed on Mission Control's consoles, and knew how to fix it. Aaron made a call, "Flight, EECOM. Try SCE to Aux", to switch the SCE to a backup power supply. The switch was fairly obscure, and neither Flight Director Gerald Griffin, CAPCOM Gerald P. Carr, nor Conrad knew what it was; Bean, who as LMP was the spacecraft's engineer, knew where to find it and threw the switch, after which the telemetry came back online, revealing no significant malfunctions. Bean put the fuel cells back online, and the mission continued. Once in Earth parking orbit, the crew carefully checked out their spacecraft before re-igniting the S-IVB third stage for trans-lunar injection. The lightning strikes caused no serious permanent damage.
Initially, it was feared that the lightning strike could have damaged the explosive bolts that opened the Command Module's parachute compartment. The decision was made not to share this with the astronauts and to continue with the flight plan, since they would die if the parachutes failed to deploy, whether following an Earth-orbit abort or upon a return from the Moon, so nothing was to be gained by aborting. The parachutes deployed and functioned normally at the end of the mission.
Outward journey.
After systems checks in Earth orbit, performed with great care because of the lightning strikes, the trans-lunar injection burn, made with the S-IVB, took place at 02:47:22.80 into the mission, setting Apollo 12 on course for the Moon. An hour and twenty minutes later, the CSM separated from the S-IVB, after which Gordon performed the transposition, docking and extracting maneuver to dock with the LM and separate the combined craft from the S-IVB, which was then sent on an attempt to reach solar orbit. The stage fired its engines to leave the vicinity of the spacecraft, a change from Apollo 11, where the SM's Service Propulsion System (SPS) engine was used to distance it from the S-IVB.
As there were concerns the LM might have been damaged by the lightning strikes, Conrad and Bean entered it on the first day of flight to check its status, earlier than planned. They found no issues. At 30:52:44.36, the only necessary midcourse correction during the translunar coast was made, placing the craft on a hybrid, non-free-return trajectory. Previous crewed missions to lunar orbit had taken a free-return trajectory, allowing an easy return to Earth if the craft's engines did not fire to enter lunar orbit. Apollo 12 was the first crewed spacecraft to take a hybrid trajectory, which would require another burn to return to Earth, but one that could be executed by the LM's Descent Propulsion System (DPS) if the SPS failed. The use of a hybrid trajectory allowed more flexibility in mission planning; for example, it allowed Apollo 12 to launch in daylight and reach the planned landing spot on schedule. Use of a hybrid trajectory meant that Apollo 12 took 8 hours longer to go from trans-lunar injection to lunar orbit.
Lunar orbit and Moon landing.
Apollo 12 entered a lunar orbit of with an SPS burn of 352.25 seconds at mission time 83:25:26.36. On the first lunar orbit, there was a television transmission that resulted in good-quality video of the lunar surface. On the third lunar orbit, there was another burn to circularize the craft's orbit to , and on the next revolution, preparations began for the lunar landing. The CSM and LM undocked at 107:54:02.3; a half hour later there was a burn by the CSM to separate them. The 14.4 second burn by some of the CSM's thrusters meant that the two craft would be apart when the LM began the burn to move to a lower orbit in preparation for landing on the Moon.
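The cadence of these events, with maneuvers spaced roughly one revolution apart, follows from the period of a low lunar orbit. A minimal sketch using Kepler's third law, assuming a circular orbit near 110 km altitude (a round figure assumed here, not taken from this article):

```python
import math

MU_MOON = 4.9048e12   # lunar gravitational parameter, m^3/s^2
R_MOON = 1.7374e6     # mean lunar radius, m
ALTITUDE = 110e3      # assumed circular-orbit altitude, m

a = R_MOON + ALTITUDE                              # orbit radius (semi-major axis)
period = 2 * math.pi * math.sqrt(a**3 / MU_MOON)   # Kepler's third law
speed = math.sqrt(MU_MOON / a)                     # circular speed, v = sqrt(mu/a)

print(f"Period ≈ {period / 60:.0f} min, speed ≈ {speed:.0f} m/s")
```

This gives a period of about two hours per revolution, consistent with the spacing of the undocking, separation, and descent events logged above.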
The LM's Descent Propulsion System began a 29-second burn at 109:23:39.9 to move the craft to the lower orbit, from which the 717-second powered descent to the lunar surface began at 110:20:38.1. Conrad had trained to expect a pattern of craters known as "the Snowman" to be visible when the craft underwent "pitchover", with the Surveyor crater in its center, but had feared he would see nothing recognizable. He was astonished to see the Snowman right where it should be, meaning they were directly on course. He took over manual control, planning to land the LM, as he had in simulations, in an area near the Surveyor crater that had been dubbed "Pete's Parking Lot", but found it rougher than expected. He had to maneuver, and landed the LM at 110:32:36.2 (06:54:36 UT on November 19, 1969), just from the Surveyor probe. This achieved one objective of the mission: to perform a precision landing near the Surveyor craft.
The lunar coordinates of the landing site were 3.01239° S latitude, 23.42157° W longitude. The landing caused high-velocity sandblasting of the Surveyor probe. It was later determined that the sandblasting removed more dust than it delivered onto the Surveyor, because the probe was covered by a thin layer of dust that gave it a tan hue as observed by the astronauts, and every portion of the surface exposed to the direct sandblasting was lightened back toward the original white color through the removal of lunar dust.
Lunar surface activities.
When Conrad, the shortest man of the initial groups of astronauts, stepped onto the lunar surface his first words were "Whoopie! Man, that may have been a small one for Neil, but that's a long one for me." This was not an off-the-cuff remark: Conrad had made a bet with reporter Oriana Fallaci he would say these words, after she had queried whether NASA had instructed Neil Armstrong what to say as he stepped onto the Moon. Conrad later said he was never able to collect the money.
To improve the quality of television pictures from the Moon, a color camera was carried on Apollo 12 (unlike the monochrome camera on Apollo 11). When Bean carried the camera to the place near the LM where it was to be set up, he inadvertently pointed it directly into the Sun, destroying the Secondary Electron Conduction (SEC) tube. Television coverage of this mission was thus terminated almost immediately.
After raising a U.S. flag on the Moon, Conrad and Bean devoted much of the remainder of the first EVA to deploying the Apollo Lunar Surface Experiments Package (ALSEP). There were minor difficulties with the deployment. Bean had trouble extracting the RTG's plutonium fuel element from its protective cask, and the astronauts had to resort to the use of a hammer to hit the cask and dislodge the fuel element. Some of the ALSEP packages proved hard to deploy, though the astronauts were successful in all cases. With the PSE able to detect their footprints as they headed back to the LM, the astronauts secured a core tube full of lunar material, and collected other samples. The first EVA lasted 3 hours, 56 minutes and 3 seconds.
Four possible geologic traverses had been planned, the variable being where the LM might set down. Conrad had landed it between two of these potential landing points, and during the first EVA and the rest break that followed, scientists in Houston combined two of the traverses into one that Conrad and Bean could follow from their landing point. The resultant traverse resembled a rough circle, and when the astronauts emerged from the LM some 13 hours after ending the first EVA, the first stop was Head crater, some from the LM. There, Bean noticed that Conrad's footprints showed lighter material underneath, indicating the presence of ejecta from Copernicus crater, to the north, something that scientists examining overhead photographs of the site had hoped to find. After the mission, samples from Head allowed geologists to date the impact that formed Copernicus—according to initial dating, some 810,000,000 years ago.
The astronauts proceeded to Bench crater and Sharp crater and past Halo crater before arriving at Surveyor crater, where the Surveyor 3 probe had landed. Fearing treacherous footing or that the probe might topple on them, they approached Surveyor cautiously, descending into the shallow crater some distance away and then following a contour to reach the craft, but found the footing solid and the probe stable. They collected several pieces of Surveyor, including the television camera, as well as rocks that had been imaged by the probe's television camera. Conrad and Bean had procured an automatic timer for their Hasselblad cameras, and had brought it with them without telling Mission Control, hoping to take a selfie of the two of them with the probe, but when the time came to use it, could not locate it among the lunar samples they had already placed in their Hand Tool Carrier. Before returning to the LM's vicinity, Conrad and Bean went to Block crater, within Surveyor crater. The second EVA lasted 3 hours, 49 minutes, 15 seconds, during which they traveled . During the EVAs, Conrad and Bean went as far as from the LM, and collected of samples.
Lunar orbit solo activities.
After the LM's departure, Gordon had little to say as Mission Control focused on the lunar landing. Once that was accomplished, Gordon sent his congratulations and, on the next orbit, was able to spot both the LM and the Surveyor on the ground and convey their locations to Houston. During the first EVA, Gordon prepared for a plane change maneuver, a burn to alter the CSM's orbit to compensate for the rotation of the Moon, though at times he had difficulty communicating with Houston since Conrad and Bean were using the same communications circuit. Once the two moonwalkers had returned to the LM, Gordon executed the burn, which ensured he would be in the proper position to rendezvous with the LM when it launched from the Moon.
While alone in orbit, Gordon performed the Lunar Multispectral Photography Experiment, using four Hasselblad cameras arranged in a ring and aimed through one of the CM's windows. With each camera having a different color filter, simultaneous photos would be taken by each, showing the appearance of lunar features at different points on the spectrum. Analysis of the images might reveal colors not visible to the naked eye or detectable with ordinary color film, and information could be obtained about the composition of sites that would not soon be visited by humans. Among the sites studied were contemplated landing points for future Apollo missions.
Return.
LM "Intrepid" lifted off from the Moon at mission time 143:03:47.78, or 14:25:47 UT on November 20, 1969; after several maneuvers, CSM and LM docked three and a half hours later. At 147:59:31.6, the LM ascent stage was jettisoned, and shortly thereafter the CSM maneuvered away. Under control from Earth, the LM's remaining propellant was depleted in a burn that caused it to impact the Moon from the Apollo 12 landing point. The seismometer the astronauts had left on the lunar surface registered the resulting vibrations for more than an hour.
The crew stayed another day in lunar orbit taking photographs of the surface, including of candidate sites for future Apollo landings. A second plane change maneuver was made at 159:04:45.47, lasting 19.25 seconds.
The trans-Earth injection burn, to send the CSM "Yankee Clipper" towards home, was conducted at 172:27:16.81 and lasted 130.32 seconds. Two short midcourse correction burns were made en route. A final television broadcast was made, the astronauts answering questions submitted by the media. There was ample time for rest on the way back to Earth. One event was the photography of a solar eclipse that occurred when the Earth came between the spacecraft and the Sun; Bean described it as the most spectacular sight of the mission.
Splashdown.
"Yankee Clipper" returned to Earth on November 24, 1969, at 20:58 UT (3:58 pm Eastern Time, 10:58 am HST), in the Pacific Ocean. The landing was hard, resulting in a camera becoming dislodged and striking Bean in the forehead. After recovery by "Hornet", the astronauts entered the Mobile Quarantine Facility (MQF), while lunar samples and Surveyor parts were sent ahead by air to the Lunar Receiving Laboratory (LRL) in Houston. Once the "Hornet" docked in Hawaii, the MQF was offloaded and flown to Ellington Air Force Base near Houston on November 29, from where it was taken to the LRL, where the astronauts remained until released from quarantine on December 10.
Mission insignia.
The Apollo 12 mission patch reflects the crew's naval background; all three astronauts at the time of the mission were U.S. Navy commanders. It features a clipper ship arriving at the Moon, representing the CM "Yankee Clipper". The ship trails fire and flies the flag of the United States. The mission name APOLLO XII and the crew names are on a wide gold border, with a small blue trim. Blue and gold are traditional U.S. Navy colors. The patch has four stars on it – one each for the three astronauts who flew the mission and one for Clifton Williams, the original LMP on Conrad's crew who was killed in 1967 and would have flown the mission. The star was placed there at the suggestion of his replacement, Bean.
The insignia was designed by the crew with the aid of several employees of NASA contractors. The Apollo 12 landing area on the Moon is within the portion of the lunar surface shown on the insignia, based on a photograph of a globe of the Moon, taken by engineers. The clipper ship was based on photographs of such a ship obtained by Bean.
Aftermath and spacecraft location.
After the mission, Conrad urged his crewmates to join him in the Skylab program, seeing in it the best chance of flying in space again. Bean did so—Conrad commanded Skylab 2, the first crewed mission to the space station, while Bean commanded Skylab 3. Gordon, though, still hoped to walk on the Moon and remained with the Apollo program, serving as backup commander of Apollo 15. He was the likely commander of Apollo 18, but that mission was canceled and he did not fly in space again.
The Apollo 12 command module "Yankee Clipper" was displayed at the Paris Air Show and was then placed at NASA's Langley Research Center in Hampton, Virginia; ownership was transferred to the Smithsonian in July 1971. It is on display at the Virginia Air and Space Center in Hampton.
Mission Control had remotely fired the service module's thrusters after jettison, hoping to have it skip off the atmosphere and enter a high-apogee orbit, but the lack of tracking data confirming this led controllers to conclude that it most likely burned up in the atmosphere at the time of CM re-entry. The S-IVB is in a solar orbit that is occasionally perturbed by Earth's gravity.
The ascent stage of LM "Intrepid" impacted the Moon November 20, 1969, at 22:17:17.7 UT (5:17pm EST). In 2009, the Lunar Reconnaissance Orbiter (LRO) photographed the Apollo 12 landing site, where the descent stage, ALSEP, Surveyor 3 spacecraft, and astronaut footpaths remain. In 2011, the LRO returned to the landing site at a lower altitude to take higher resolution photographs.
|
1968 | Apollo 14 | Apollo 14 (January 31 – February 9, 1971) was the eighth crewed mission in the United States Apollo program, the third to land on the Moon, and the first to land in the lunar highlands. It was the last of the "H missions", landings at specific sites of scientific interest on the Moon for two-day stays with two lunar extravehicular activities (EVAs or moonwalks).
The mission was originally scheduled for 1970, but was postponed because of the investigation following the failure of Apollo 13 to reach the Moon's surface, and the need for modifications to the spacecraft as a result. Commander Alan Shepard, Command Module Pilot Stuart Roosa, and Lunar Module Pilot Edgar Mitchell launched on their nine-day mission on Sunday, January 31, 1971, at 4:03:02 p.m. EST. En route to the lunar landing, the crew overcame malfunctions that might have resulted in a second consecutive aborted mission, and possibly the premature end of the Apollo program.
Shepard and Mitchell made their lunar landing on February 5 in the Fra Mauro formation – originally the target of Apollo 13. During the two walks on the surface, they collected of Moon rocks and deployed several scientific experiments. To the dismay of some geologists, Shepard and Mitchell did not reach the rim of Cone crater as had been planned, though they came close. In Apollo 14's most famous event, Shepard hit two golf balls he had brought along, using a makeshift club.
While Shepard and Mitchell were on the surface, Roosa remained in lunar orbit aboard the Command and Service Module, performing scientific experiments and photographing the Moon, including the landing site of the future Apollo 16 mission. He took several hundred seeds on the mission, many of which were germinated on return, resulting in the so-called Moon trees, which were widely distributed in the following years. After liftoff from the lunar surface and a successful docking, the spacecraft was flown back to Earth, where the three astronauts splashed down safely in the Pacific Ocean on February 9.
Astronauts and key Mission Control personnel.
The mission commander of Apollo 14, Alan Shepard, one of the original Mercury Seven astronauts, became the first American to enter space with a suborbital flight on May 5, 1961. Thereafter, he was grounded by Ménière's disease, a disorder of the ear, and served as Chief Astronaut, the administrative head of the Astronaut Office. He had experimental surgery in 1968 which was successful and allowed his return to flight status. Shepard, at age 47, was the oldest U.S. astronaut to fly when he made his trip aboard Apollo 14, and he is the oldest person to walk on the Moon.
Apollo 14's Command Module Pilot (CMP), Stuart Roosa, aged 37 when the mission flew, had been a smoke jumper before joining the Air Force in 1953. He became a fighter pilot and then in 1965 successfully completed Aerospace Research Pilot School (ARPS) at Edwards Air Force Base in California prior to his selection as a Group 5 astronaut the following year. He served as a capsule communicator (CAPCOM) for Apollo 9. The Lunar Module Pilot (LMP), Edgar Mitchell, aged 40 at the time of Apollo 14, joined the Navy in 1952 and served as a fighter pilot, beginning in 1954. He was assigned to squadrons aboard aircraft carriers before returning to the United States to further his education while in the Navy, also completing the ARPS prior to his selection as a Group 5 astronaut. He served on the support crew for Apollo 9 and was the LMP of the backup crew for Apollo 10.
Shepard and his crew had originally been designated by Deke Slayton, Director of Flight Crew Operations and one of the Mercury Seven, as the crew for Apollo 13. NASA's management felt that Shepard needed more time for training given he had not flown in space since 1961, and chose him and his crew for Apollo 14 instead. The crew originally designated for Apollo 14, Jim Lovell as the commander, Ken Mattingly as CMP and Fred Haise as LMP, all of whom had backed up Apollo 11, was made the prime crew for Apollo 13 instead.
Mitchell's commander on the Apollo 10 backup crew had been another of the original seven, Gordon Cooper, who had tentatively been scheduled to command Apollo 13, but according to author Andrew Chaikin, his casual attitude toward training resulted in his not being selected. Also on that crew, but excluded from further flights, was Donn Eisele, likely because of problems aboard Apollo 7, which he had flown, and because he had been involved in a messy divorce.
Apollo 14's backup crew was Eugene A. Cernan as commander, Ronald E. Evans Jr. as CMP and Joe H. Engle as LMP. The backup crew, with Harrison Schmitt replacing Engle, would become the prime crew of Apollo 17. Schmitt flew instead of Engle because there was intense pressure on NASA to fly a scientist to the Moon (Schmitt was a geologist) and Apollo 17 was the last lunar flight. Engle, who had flown the X-15 to the edge of outer space, flew into space for NASA in 1981 on STS-2, the second Space Shuttle flight.
During projects Mercury and Gemini, each mission had a prime and a backup crew. Apollo 9 commander James McDivitt believed meetings that required a member of the flight crew were being missed, so for Apollo a third crew of astronauts was added, known as the support crew. Usually low in seniority, support crew members assembled the mission's rules, flight plan, and checklists, and kept them updated; for Apollo 14, they were Philip K. Chapman, Bruce McCandless II, William R. Pogue and C. Gordon Fullerton. CAPCOMs, the individuals in Mission Control responsible for communications with the astronauts, were Evans, McCandless, Fullerton and Haise. A veteran of Apollo 13, which had aborted before reaching the Moon, Haise put his training for that mission to use, especially during the EVAs, since both missions were targeted at the same place on the Moon. Had Haise walked on the Moon, he would have been the first Group 5 astronaut to do so, an honor that went to Mitchell.
The flight directors during Apollo had a one-sentence job description, "The flight director may take any actions necessary for crew safety and mission success." For Apollo 14, they were: Pete Frank, Orange team; Glynn Lunney, Black team; Milt Windler, Maroon team and Gerry Griffin, Gold team.
Preparation and training.
Prime and backup crews for both Apollo 13 and 14 were announced on August 6, 1969. Apollo 14 was scheduled for July 1970, but in January of that year, due to budget cuts that saw the cancellation of Apollo 20, NASA decided there would be two Apollo missions per year, with Apollo 13 flying in April 1970 and Apollo 14 likely in October or November.
The investigation into the accident which caused an abort of Apollo 13 delayed Apollo 14. On May 7, 1970, NASA Administrator Thomas O. Paine announced that Apollo 14 would launch no earlier than December 3, and the landing would be close to the site targeted by Apollo 13. The Apollo 14 astronauts continued their training. On June 30, 1970, following the release of the accident report and a NASA review of what changes to the spacecraft would be necessary, NASA announced that the launch would slip to no earlier than January 31, 1971.
The crew of Apollo 14 trained together for 19 months after assignment to the mission, longer than any other Apollo crew to that point. In addition to the normal training workload, they had to supervise the changes to the command and service module (CSM) made as a result of the Apollo 13 investigation, much of which was delegated by Shepard to Roosa. Mitchell later stated, "We realized that if our mission failed—if we had to turn back—that was probably the end of the Apollo program. There was no way NASA could stand two failures in a row. We figured there was a heavy mantle on our shoulders to make sure we got it right."
Before the abort of the Apollo 13 mission, the plan was to have Apollo 14 land near Littrow crater, in Mare Serenitatis, where there are features that were thought to be volcanic. After Apollo 13 returned, it was decided that its landing site, near Cone crater in the Fra Mauro formation, was scientifically more important than Littrow. The Fra Mauro formation is composed of ejecta from the impact event that formed Mare Imbrium, and scientists hoped for samples that originated deep under the Moon's surface. Cone crater was the result of a young, deep impact, and large enough to have torn through whatever debris was deposited since the Imbrium Event, which geologists hoped to be able to date. Landing at Fra Mauro would also allow orbital photography of another candidate landing site, the Descartes Highlands, which became the landing site for Apollo 16. Although Littrow went unvisited, a nearby area, Taurus-Littrow, was the landing site for Apollo 17. Apollo 14's landing site was located slightly closer to Cone crater than the point designated for Apollo 13.
The change in landing site from Littrow to Fra Mauro affected the geological training for Apollo 14. Before the switch, the astronauts had been taken to volcanic sites on Earth; afterwards, they visited crater sites, such as the Ries Crater in West Germany and an artificial crater field created for astronaut training in Arizona's Verde Valley. The effectiveness of the training was limited by a lack of enthusiasm shown by Shepard, which set the tone for Mitchell. Harrison Schmitt suggested that the commander had other things on his mind, such as overcoming a ten-year absence from spaceflight and ensuring a successful mission after the near-disaster of Apollo 13.
Roosa undertook training for his period alone in lunar orbit, when he would make observations of the Moon and take photographs. He had been impressed by the training given to Apollo 13 prime crew CMP Mattingly by geologist Farouk El-Baz and got El-Baz to agree to undertake his training. The two men pored over lunar maps depicting the areas the CSM would pass over. When Shepard and Mitchell were on their geology field trips, Roosa would be overhead in an airplane taking photographs of the site and making observations. El-Baz had Roosa make observations while flying his T-38 jet at a speed and altitude that simulated the rate at which the lunar surface would pass beneath the CSM.
Another issue that had marked Apollo 13 was the last-minute change of crew due to exposure to communicable disease. To prevent another such occurrence, for Apollo 14 NASA instituted what was called the Flight Crew Health Stabilization Program. Beginning 21 days before launch, the crew lived in quarters at the launch site, Florida's Kennedy Space Center (KSC), with their contacts limited to their spouses, the backup crew, mission technicians, and others directly involved in training. Those individuals were given physical examinations and immunizations, and crew movements were limited as much as possible at KSC and nearby areas.
The Command and Service Modules were delivered to KSC on November 19, 1969; the ascent stage of the LM arrived on November 21 with the descent stage three days later. Thereafter, checkout, testing and equipment installation proceeded. The launch vehicle stack, with the spacecraft on top, was rolled out from the Vehicle Assembly Building to Pad 39A on November 9, 1970.
Hardware.
Spacecraft.
The Apollo 14 spacecraft consisted of Command Module (CM) 110 and Service Module (SM) 110 (together CSM-110), called "Kitty Hawk", and Lunar Module 8 (LM-8), called "Antares". Roosa had chosen the CSM's call sign after the town in North Carolina where, in 1903, the Wright Brothers first flew their "Wright Flyer" airplane (also known as "Kitty Hawk"). Antares was named by Mitchell after the star in the constellation Scorpius that the astronauts in the LM would use to orient the craft for its lunar landing. Also considered part of the spacecraft were a Launch Escape System and a Spacecraft/Launch Vehicle Adapter, numbered SLA-17.
The changes to the Apollo spacecraft between Apollo 13 and 14 were more numerous than with earlier missions, not only because of the problems with Apollo 13, but because of the more extensive lunar activities planned for Apollo 14. The Apollo 13 accident had been caused by the explosive failure of an oxygen tank after heating of the tank contents before launch damaged the insulation of its internal wiring. No one had realized the oxygen had gotten hot enough to damage the insulation, because the protective thermostatic switches had failed: through an error, they had not been designed to handle the voltage applied during ground testing. The explosion damaged the other tank or its tubing, causing its contents to leak away.
The changes in response included a redesign of the oxygen tanks, with the thermostats being upgraded to handle the proper voltage. A third tank was also added, placed in Bay 1 of the SM, on the side opposite the other two, and was given a valve that could isolate it in an emergency, and allow it to feed the CM's environmental system only. The quantity probe in each tank was upgraded from aluminum to stainless steel.
Also in response to the Apollo 13 accident, the electrical wiring in Bay 4 (where the explosion had happened) was sheathed in stainless steel. The fuel cell oxygen supply valves were redesigned to isolate the Teflon-coated wiring from the oxygen. The spacecraft and Mission Control monitoring systems were modified to give more immediate and visible warnings of anomalies. The Apollo 13 astronauts had suffered shortages of water and of power after the accident. Accordingly, an emergency supply of of water was stored in Apollo 14's CM, and an emergency battery, identical to those that powered the LM's descent stage, was placed in the SM. The LM was modified to make the transfer of power from LM to CM easier.
Other changes included the installation of anti-slosh baffles in the LM descent stage's propellant tanks. This would prevent the low fuel light from coming on prematurely, as had happened on Apollo 11 and 12. Structural changes were made to accommodate the equipment to be used on the lunar surface, including the Modular Equipment Transporter.
Launch vehicle.
The Saturn V used for Apollo 14 was designated SA-509, and was similar to those used on Apollo 8 through 13. At , it was the heaviest vehicle yet flown by NASA, heavier than the launch vehicle for Apollo 13.
A number of changes were made to avoid the pogo oscillations that had caused an early shutdown of the center J-2 engine on Apollo 13's S-II second stage. These included a helium gas accumulator installed in the liquid oxygen (LOX) line of the center engine, a backup cutoff device for that engine, and a simplified 2-position propellant utilization valve on each of the five J-2 engines.
ALSEP and other lunar surface equipment.
The Apollo Lunar Surface Experiments Package (ALSEP) array of scientific instruments carried by Apollo 14 consisted of the Passive Seismic Experiment (PSE), Active Seismic Experiment (ASE), Suprathermal Ion Detector (SIDE), Cold Cathode Ion Gauge (CCIG), and Charged Particle Lunar Environmental Experiment (CPLEE). Two additional lunar surface experiments not part of the ALSEP were also flown, the Laser Ranging Retro-Reflector (LRRR or LR3), to be deployed in the ALSEP's vicinity, and the Lunar Portable Magnetometer (LPM), to be used by the astronauts during their second EVA. The PSE had been flown on Apollo 12 and 13, the ASE on Apollo 13, the SIDE on Apollo 12, the CCIG on Apollo 12 and 13, and the LRRR on Apollo 11. The LPM was new, but resembled equipment flown on Apollo 12. The ALSEP components flown on Apollo 13 were destroyed when its LM burned up in Earth's atmosphere.
Deployment of the ALSEP, and of the other instruments, each formed one of Apollo 14's mission objectives.
The PSE was a seismometer, similar to one left on the Moon by Apollo 12, and was to measure seismic activity in the Moon. The Apollo 14 instrument would be calibrated by the impact, after being jettisoned, of the LM's ascent stage, since an object of known mass and velocity would be impacting at a known location on the Moon. The Apollo 12 instrument would also register the impact of the spent Apollo 14 S-IVB booster, which would strike the Moon after the mission entered lunar orbit. The two seismometers would, in combination with those left by later Apollo missions, constitute a network of such instruments at different locations on the Moon.
The ASE would also measure seismic waves. It consisted of two parts. In the first, one of the crew members would deploy three geophones at distances up to from the ALSEP's Central Station, and on his way back from the furthest, fire thumpers every . The second consisted of four mortars (with their launch tubes), of different properties and set to impact at different distances from the experiment. It was hoped that the waves generated from the impacts would provide data about seismic wave transmission in the Moon's regolith. The mortar shells were not to be fired until the astronauts had returned to Earth, and in the event were never fired for fear they would damage other experiments. A similar experiment was successfully deployed, and the mortars launched, on Apollo 16.
The LPM was to be carried during the second EVA and used to measure the Moon's magnetic field at various points.
The SIDE measured ions on the lunar surface, including from the solar wind. It was combined with the CCIG, which was to measure the lunar atmosphere and detect if it varied over time. The CPLEE measured the particle energies of protons and electrons generated by the Sun that reached the lunar surface. The LRRR acts as a passive target for laser beams, allowing the measurement of the Earth/Moon distance and how it changes over time. The LRRRs from Apollo 11, 14 and 15 are the only experiments left on the Moon by the Apollo astronauts that are still returning data.
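The principle behind the LRRR measurement is straightforward timing arithmetic. The sketch below is purely illustrative (it is not NASA or observatory software) and shows how a measured laser round-trip time converts to an Earth/Moon distance; the function name and the sample timing value are this example's own assumptions.

```python
# Illustrative sketch of lunar laser ranging arithmetic: a pulse fired at an
# Apollo retroreflector returns after time t, so the one-way distance is c*t/2.
# Real measurements also correct for the atmosphere, station position, and
# relativistic effects, all ignored here.

C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s


def distance_from_round_trip(t_seconds: float) -> float:
    """Return the one-way Earth-Moon distance in km implied by a
    laser round-trip time of t_seconds."""
    return C_KM_PER_S * t_seconds / 2.0


# A round trip to the Apollo reflectors takes roughly 2.5-2.6 seconds,
# corresponding to the roughly 384,000 km mean Earth-Moon distance.
d = distance_from_round_trip(2.564)
```

Because timing can be done to high precision, repeated measurements like this are what reveal the slow changes in the Earth/Moon distance that the LRRR experiments continue to track.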
Flown for the first time on Apollo 14 was the Buddy Secondary Life Support System (BSLSS), a set of flexible hoses that would enable Shepard and Mitchell to share cooling water should one of their Primary Life Support System (PLSS) backpacks fail. In such an emergency, the astronaut with the failed equipment would get oxygen from his Oxygen Purge System (OPS) backup cylinder, but the BSLSS would ensure he did not have to use oxygen for cooling, extending the life of the OPS. The OPSs used on Apollo 14 were modified from those used on previous missions in that the internal heaters were removed as unnecessary.
Water bags were also taken to the lunar surface, dubbed "Gunga Dins", for insertion in the astronauts' helmets, allowing them sips of water during the EVAs. These had been flown on Apollo 13, but Shepard and Mitchell were the first to use them on the Moon. Similarly, Shepard was the first on the lunar surface to wear a spacesuit with commander's stripes: red stripes on arms, legs, and on the helmet, though one had been worn by Lovell on Apollo 13. These were instituted because of the difficulty in telling one spacesuited astronaut from the other in photographs.
Modular Equipment Transporter.
The Modular Equipment Transporter (MET) was a two-wheeled handcart, used only on Apollo 14, intended to allow the astronauts to take tools and equipment with them, and store lunar samples, without needing to carry them. On later Apollo program missions, the self-propelled Lunar Roving Vehicle (LRV) was flown instead.
The MET, when deployed for use on the lunar surface, was about long, wide and high. It had pressurized rubber tires wide and in diameter, containing nitrogen and inflated to about . The first use of tires on the Moon, these were developed by Goodyear and were dubbed their XLT (Experimental Lunar Tire) model. Fully loaded, the MET weighed about . Two legs combined with the wheels to provide four-point stability when at rest.
Mission highlights.
Launch and flight to lunar orbit.
Apollo 14 launched from Launch Complex 39-A at KSC at 4:03:02 pm (21:03:02 UTC), January 31, 1971, after a weather delay of 40 minutes and 2 seconds, the first launch delay in the Apollo program. The original planned time, 3:23 pm, was at the very start of a launch window of just under four hours; had Apollo 14 not launched during it, it could not have departed until March. Apollo 12 had launched during poor weather and been struck twice by lightning, as a result of which the rules had been tightened. Among those present to watch the launch were U.S. Vice President Spiro T. Agnew and the Prince of Spain, the future King Juan Carlos I. To make up the lost time, the mission flew a faster trajectory to the Moon than planned; once it had done so, just over two days after launch, the mission timers were put ahead by 40 minutes and 3 seconds so that later events would take place at the times scheduled in the flight plan.
After the vehicle reached orbit, the S-IVB third stage shut down, and the astronauts performed checks of the spacecraft before restarting the stage for translunar injection (TLI), the burn that placed the vehicle on course for the Moon. After TLI, the CSM separated from the S-IVB, and Roosa performed the transposition maneuver, turning it around in order to dock with the LM before the entire spacecraft separated from the stage. Roosa, who had practiced the maneuver many times, hoped to break the record for the least amount of propellant used in docking. But when he gently brought the modules together, the docking mechanism would not activate. He made several attempts over the next two hours, as mission controllers huddled and sent advice. If the LM could not be extracted from its place on the S-IVB, no lunar landing could take place, and with consecutive failures, the Apollo program might end. Mission Control proposed that they try it again with the docking probe retracted, hoping the contact would trigger the latches. This worked, and within an hour the joined spacecraft had separated from the S-IVB. The stage was set on a course to impact the Moon, which it did just over three days later, causing the Apollo 12 seismometer to register vibrations for over three hours.
The crew settled in for its voyage to Fra Mauro. At 60:30 Ground Elapsed Time, Shepard and Mitchell entered the LM to check its systems; while there they photographed a wastewater dump from the CSM, part of a particle contamination study in preparation for Skylab. Two midcourse corrections were performed on the translunar coast, with one burn lasting 10.19 seconds and one lasting 0.65 seconds.
Lunar orbit and descent.
At 81:56:40.70 into the mission (February 4 at 1:59:43 am EST; 06:59:43 UTC), the Service Propulsion System engine in the SM was fired for 370.84 seconds to send the craft into a lunar orbit with apocynthion of and pericynthion of . A second burn, at 86:10:52 mission time, sent the spacecraft into an orbit of by . This was done in preparation for the release of the LM "Antares". Apollo 14 was the first mission on which the CSM propelled the LM to the lower orbit—though Apollo 13 would have done so had the abort not already occurred. This was done to increase the amount of hover time available to the astronauts, a safety factor since Apollo 14 was to land in rough terrain.
After separating from the command module in lunar orbit, the LM "Antares" had two serious problems. First, the LM computer began getting an ABORT signal from a faulty switch. NASA believed the computer might be getting erroneous readings like this if a tiny ball of solder had shaken loose and was floating between the switch and the contact, closing the circuit. The immediate solution – tapping on the panel next to the switch – did work briefly, but the circuit soon closed again. If the problem recurred after the descent engine fired, the computer would think the signal was real and would initiate an auto-abort, causing the ascent stage to separate from the descent stage and climb back into orbit. NASA and the software teams at the Massachusetts Institute of Technology scrambled to find a solution. The software was hard-wired, preventing it from being updated from the ground. The fix made it appear to the system that an abort had already happened, and it would ignore incoming automated signals to abort. This would not prevent the astronauts from piloting the ship, though if an abort became necessary, they might have to initiate it manually. Mitchell entered the changes with minutes to go until planned ignition.
A second problem occurred during the powered descent, when the LM landing radar failed to lock automatically onto the Moon's surface, depriving the navigation computer of vital information on the vehicle's altitude and vertical descent speed. After the astronauts cycled the landing radar breaker, the unit successfully acquired a signal near . Mission rules required an abort if the landing radar was out at , though Shepard might have tried to land without it. With the landing radar, Shepard steered the LM to the landing closest to its intended target of any of the six missions that landed on the Moon.
Lunar surface operations.
Shepard stated, after stepping onto the lunar surface, "And it's been a long way, but we're here." The first EVA began at 9:42 am EST (14:42 UTC) on February 5, 1971, delayed by a problem with the communications system that pushed its start to five hours after landing. The astronauts devoted much of the first EVA to equipment offloading, deployment of the ALSEP and the US flag, as well as setting up and loading the MET. These activities were televised back to Earth, though the picture tended to degenerate during the latter portion of the EVA. Mitchell deployed the ASE's geophone lines, unreeling and emplacing the two lines leading out from the ALSEP's Central Station. He then fired the thumper explosives, vibrations from which would give scientists back on Earth information about the depth and composition of the lunar regolith. Of the 21 thumpers, five failed to fire. On the way back to the LM, the astronauts collected and documented lunar samples, and took photographs of the area. The first EVA lasted 4 hours, 47 minutes, 50 seconds.
The astronauts had been surprised by the undulating ground, expecting flatter terrain in the area of the landing, and this became an issue on the second EVA, as they set out, MET in tow, for the rim of Cone crater. The craters that Shepard and Mitchell planned to use for navigational landmarks looked very different on the ground than on the maps they had, which were based on overhead shots taken from lunar orbit. Additionally, they consistently overestimated the distance they travelled. Mission Control and the CAPCOM, Fred Haise, could see nothing of this, as the television camera remained near the LM, but they worried as the clock ticked on the EVA, and monitored the heavy breathing and rapid heartbeats of the astronauts. The astronauts topped one ridge that they expected to be the crater rim, only to view more such terrain beyond. Although Mitchell strongly suspected the rim was nearby, they had become physically exhausted from the effort. They were then instructed by Haise to sample where they were and then start moving back towards the LM. Later analysis using the pictures they took determined that they had come within about of the crater's rim. Images from the Lunar Reconnaissance Orbiter (LRO) show that the tracks of the astronauts and the MET come to within 30 m of the rim. The difficulties faced by Shepard and Mitchell emphasized the need for a means of transportation on the lunar surface with a navigation system, a need met by the Lunar Roving Vehicle, already planned to fly on Apollo 15.
Once the astronauts returned to the vicinity of the LM and were again within view of the television camera, Shepard performed a stunt he had been planning for years in the event he reached the Moon, and which is probably what Apollo 14 is best remembered for. Shepard brought along a Wilson six iron golf club head, which he had modified to attach to the handle of the contingency sample tool, and two golf balls. Shepard took several one-handed swings (due to the limited flexibility of the EVA suit) and exuberantly exclaimed that the second ball went "miles and miles and miles" in the low lunar gravity. Mitchell then threw a lunar scoop handle as if it were a javelin. The "javelin" and one of the golf balls wound up in a crater together, with Mitchell's projectile a bit further. In an interview with Ottawa Golf, Shepard stated the other landed near the ALSEP. The second EVA lasted 4 hours, 34 minutes, 41 seconds. Shepard brought back the club, gave it to the USGA Museum in New Jersey, and had a replica made which he gave to the National Air and Space Museum. In February 2021, to commemorate Apollo 14's 50th anniversary, imaging specialist Andy Saunders, who had previously worked to produce the clearest image of Neil Armstrong on the Moon, produced new, digitally enhanced images that were used to estimate the final resting places of the two balls that Shepard hit: the first landed approximately 24 yards from the "tee", while the second managed 40 yards.
Some geologists were pleased enough with the close approach to Cone crater to send a case of scotch to the astronauts while they were in post-mission quarantine, though their enthusiasm was tempered by the fact that Shepard and Mitchell had documented few of the samples they brought back, making it hard and sometimes impossible to discern where they came from. Others were less happy; Don Wilhelms wrote in his book on the geological aspects of Apollo, "the golf game did not set well with most geologists in light of the results at Cone crater. The total haul from the rim-flank of Cone ... was 16 Hasselblad photographs (out of a mission total of 417), six rock-size samples heavier than 50 g, and a grand total of 10 kg of samples, 9 kg of which are in one rock (sample 14321 [i.e., Big Bertha]). That is to say, apart from 14321 we have less than 1 kg of rock—962 g to be exact—from what in my opinion is the most important single point reached by astronauts on the Moon." Geologist Lee Silver stated, "The Apollo 14 crews did not have the right attitude, did not learn enough about their mission, had the burden of not having the best possible preflight photography, and they weren't ready." In their sourcebook on Apollo, Richard W. Orloff and David M. Harland doubted that if Apollo 13 had reached the Moon, Lovell and Haise, given a more distant landing point, could have got as close to Cone crater as Shepard and Mitchell did.
Lunar samples.
A total of of Moon rocks, or lunar samples, were brought back from Apollo 14. Most are breccias, which are rocks composed of fragments of other, older rocks. Breccias form when the heat and pressure of meteorite impacts fuse small rock fragments together. A few basalts were also collected on this mission, in the form of clasts (fragments) in breccia. The Apollo 14 basalts are generally richer in aluminum and sometimes richer in potassium than other lunar basalts. Most lunar mare basalts collected during the Apollo program were formed from 3.0 to 3.8 billion years ago. The Apollo 14 basalts were formed 4.0 to 4.3 billion years ago, older than the volcanism known to have occurred at any of the mare locations reached during the Apollo program.
In January 2019 research showed that Big Bertha, which weighs , has characteristics that make it likely to be a terrestrial (Earth) meteorite. Granite and quartz, which are commonly found on Earth but very rarely found on the Moon, were confirmed to exist on Big Bertha. To find the sample's age, the research team from Curtin University looked at bits of the mineral zircon embedded in its structure. "By determining the age of zircon found in the sample, we were able to pinpoint the age of the host rock at about four billion years old, making it similar to the oldest rocks on Earth," researcher Alexander Nemchin said, adding that "the chemistry of the zircon in this sample is very different from that of every other zircon grain ever analyzed in lunar samples, and remarkably similar to that of zircons found on Earth." This would mean Big Bertha is both the first discovered terrestrial meteorite and the oldest known Earth rock.
Lunar orbit operations.
Roosa spent almost two days alone aboard "Kitty Hawk", performing the first intensive program of scientific observation from lunar orbit, much of which was intended to have been done by Apollo 13. After "Antares" separated and its crew began preparations to land, Roosa in "Kitty Hawk" performed an SPS burn to send the CSM to an orbit of approximately , and later a plane change maneuver to compensate for the rotation of the Moon.
Roosa took pictures from lunar orbit. The Lunar Topographic Camera, also known as the Hycon camera, was supposed to be used to image the surface, including the Descartes Highlands site being considered for Apollo 16, but it quickly developed a fault with the shutter that Roosa could not fix despite considerable help from Houston. Although about half of the photographic targets had to be scrubbed, Roosa was able to obtain photographs of Descartes with a Hasselblad camera and confirm that it was a suitable landing point. Roosa also used the Hasselblad to take photographs of the impact point of Apollo 13's S-IVB near Lansburg B crater. After the mission, troubleshooting found a tiny piece of aluminum contaminating the shutter control circuit, which caused the shutter to operate continuously.
Roosa was able to see sunlight glinting off "Antares" and view its lengthy shadow on the lunar surface on Orbit 17; on Orbit 29 he could see the sun reflecting off the ALSEP. He also took astronomical photographs, of the Gegenschein, and of the Lagrangian point of the Sun-Earth system that lies beyond the Earth (L2), testing the theory that the Gegenschein is generated by reflections off particles at L2. Performing the bistatic radar experiment, he also focused "Kitty Hawk"'s VHF and S-band transmitters at the Moon so that they would bounce off and be detected on Earth in an effort to learn more about the depth of the lunar regolith.
Return, splashdown and quarantine.
"Antares" lifted off from the Moon at 1:48:42 pm EST (18:48:42 UTC) on February 6, 1971. Following the first direct (first orbit) rendezvous on a lunar landing mission, docking took place an hour and 47 minutes later. Despite concerns based on the docking problems early in the mission, the docking was successful on the first attempt, though the LM's Abort Guidance System, used for navigation, failed just before the two craft docked. After crew, equipment, and lunar samples were transferred to "Kitty Hawk", the ascent stage was jettisoned, and impacted the Moon, setting off waves registered by the seismometers from Apollo 12 and 14.
A trans-Earth injection burn took place on February 6 at 8:39:04 pm (February 7 at 01:39:04 UTC), taking 350.8 seconds, during "Kitty Hawk"'s 34th lunar revolution. During the trans-Earth coast, two tests of the oxygen system were performed, one to ensure the system would operate properly with low densities of oxygen in the tanks, the second to operate the system at a high flow rate, as would be necessary for the in-flight EVAs scheduled for Apollo 15 and later. Additionally, a navigation exercise was done to simulate a return to Earth following a loss of communications. All were successful. During his rest periods on the voyage, Mitchell conducted ESP experiments without NASA's knowledge or sanction, attempting by prearrangement to send images of cards he had brought with him to four people on Earth. He stated after the mission that two of the four had gotten 51 out of 200 correct (the others were less successful), whereas random chance would have dictated 40.
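The chance baseline Mitchell cited can be checked with a simple binomial model. A minimal sketch, assuming each guess had a 1-in-5 chance of matching (a five-symbol, Zener-style deck; this assumption is consistent with the quoted expectation of 40 correct out of 200):

```python
from math import comb

n, p = 200, 1 / 5  # assumed: 200 trials, five equally likely symbols per guess

# Expected number correct under pure chance.
expected = n * p
print(expected)  # 40.0

# Probability of scoring 51 or more by luck alone (binomial upper tail).
p_tail = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(51, n + 1))
print(f"P(X >= 51) ~ {p_tail:.3f}")
```

A score of 51 sits roughly two standard deviations above the chance mean, so it is unusual but far from impossible under luck alone.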
On the final evening in space, the crew conducted a press conference, with the questions submitted to NASA in advance and read to the astronauts by the CAPCOM.
The command module "Kitty Hawk" splashed down in the South Pacific Ocean on February 9, 1971, at 21:05 UTC, approximately south of American Samoa. After recovery by the ship USS "New Orleans", the crew was flown to Pago Pago International Airport in Tafuna, then to Honolulu, then to Ellington Air Force Base near Houston in a plane containing a Mobile Quarantine Facility trailer before they continued their quarantine in the Lunar Receiving Laboratory. They remained there until their release from quarantine on February 27, 1971. The Apollo 14 astronauts were the last lunar explorers to be quarantined on their return from the Moon. They were the only Apollo crew to be quarantined both before and after the flight.
Roosa, who worked in forestry in his youth, took several hundred tree seeds on the flight. These were germinated after the return to Earth, and were widely distributed around the world as commemorative Moon trees. Some seedlings were given to state forestry associations in 1975 and 1976 to mark the United States Bicentennial.
Mission insignia.
The mission insignia is an oval depicting the Earth and the Moon, with an astronaut pin, drawn with a comet trail, leaving Earth and approaching the Moon. A gold band around the edge includes the mission and astronaut names. The designer was Jean Beaulieu, who based it on a sketch by Shepard, who had been head of the Astronaut Office and meant the pin to symbolize that through him, the entire corps was in spirit flying to the Moon.
The backup crew spoofed the patch with its own version, with revised artwork showing a Wile E. Coyote cartoon character depicted as gray-bearded (for Shepard, who was 47 at the time of the mission and the oldest man on the Moon), pot-bellied (for Mitchell, who had a pudgy appearance) and red-furred (for Roosa's red hair), still on the way to the Moon, while Road Runner (for the backup crew) is already on the Moon, holding a U.S. flag and a flag labelled "1st Team". The flight name is replaced by "BEEP BEEP" and the backup crew's names are given. Several of these patches were hidden by the backup crew and found during the flight by the crew in notebooks and storage lockers in both the CSM "Kitty Hawk" and the LM "Antares", and one patch was stored in the MET lunar handcart. One patch, attached to Shepard's PLSS, was worn on the lunar surface, and, mounted on a plaque, was presented by him to Cernan after the mission.
Spacecraft locations.
The Apollo 14 command module "Kitty Hawk" is on display at the Apollo/Saturn V Center at the Kennedy Space Center Visitor Complex after being on display at the United States Astronaut Hall of Fame near Titusville, Florida, for several years. At the time of its transfer of ownership from NASA to the Smithsonian in July 1977, it was on display at the facilities of North American Rockwell (the company that had constructed it) in Downey, California. The SM reentered Earth's atmosphere and was destroyed, though it was not tracked or sighted.
The S-IVB booster impacted the Moon on February 4 at . The ascent stage of lunar module "Antares" impacted the Moon on February 7, 1971, at 00:45:25.7 UT (February 6, 7:45 pm EST), at . "Antares"' descent stage and the mission's other equipment remain at Fra Mauro at .
Photographs taken in 2009 by the Lunar Reconnaissance Orbiter were released on July 17, and the Fra Mauro equipment was the most visible Apollo hardware at that time, owing to particularly good lighting conditions. In 2011, the LRO returned to the landing site at a lower altitude to take higher resolution photographs.
Apollo 15 (July 26 – August 7, 1971) was the ninth crewed mission in the United States' Apollo program and the fourth to land on the Moon. It was the first J mission, with a longer stay on the Moon and a greater focus on science than earlier landings. Apollo 15 saw the first use of the Lunar Roving Vehicle.
The mission began on July 26 and ended on August 7, with the lunar surface exploration taking place between July 30 and August 2. Commander David Scott and Lunar Module Pilot James Irwin landed near Hadley Rille and explored the local area using the rover, allowing them to travel further from the lunar module than had been possible on previous missions. They spent 18 hours on the Moon's surface on four extravehicular activities (EVA), and collected of surface material.
At the same time, Command Module Pilot Alfred Worden orbited the Moon, operating the sensors in the scientific instrument module (SIM) bay of the service module. This suite of instruments collected data on the Moon and its environment using a panoramic camera, a gamma-ray spectrometer, a mapping camera, a laser altimeter, a mass spectrometer, and a lunar subsatellite deployed at the end of the moonwalks. The lunar module returned safely to the command module and, at the end of Apollo 15's 74th lunar orbit, the engine was fired for the journey home. During the return trip, Worden performed the first spacewalk in deep space. The Apollo 15 mission splashed down safely on August 7 despite the loss of one of its three parachutes.
The mission accomplished its goals but was marred by negative publicity the following year when it emerged that the crew had carried unauthorized postal covers to the lunar surface, some of which were sold by a West German stamp dealer. The members of the crew were reprimanded for poor judgment, and did not fly in space again. The mission also saw the collection of the Genesis Rock, thought to be part of the Moon's early crust, and Scott's use of a hammer and a feather to validate Galileo's theory that when there is no air resistance, objects fall at the same rate due to gravity regardless of their mass.
Background.
In 1962, NASA contracted for the construction of fifteen Saturn V rockets to achieve the Apollo program's goal of a crewed landing on the Moon by 1970; at the time no one knew how many missions this would require. Since success was obtained in 1969 with the sixth Saturn V on Apollo 11, nine rockets remained available for a hoped-for total of ten landings. These plans included a heavier, extended version of the Apollo spacecraft to be used in the last five missions (Apollo 16 through 20). The revamped lunar module would be capable of up to a 75-hour stay, and would carry a Lunar Roving Vehicle to the Moon's surface. The service module would house a package of orbital experiments to gather data on the Moon. In the original plan, Apollo 15 was to be the last of the non-extended missions, landing in Censorinus crater. But in anticipation of budget cuts, NASA cancelled three landing missions by September 1970. Apollo 15 became the first of three extended missions, known as J missions, and the landing site was moved to Hadley Rille, originally planned for Apollo 19.
Crew and key Mission Control personnel.
Crew.
Scott was born in 1932 in San Antonio, Texas, and, after spending his freshman year at the University of Michigan on a swimming scholarship, transferred to the United States Military Academy, from which he graduated in 1954. Serving in the Air Force, Scott had received two advanced degrees from MIT in 1962 before being selected as one of the third group of astronauts the following year. He flew in Gemini 8 in 1966 alongside Neil Armstrong and as command module pilot of Apollo 9 in 1969. Worden was born in 1932 in Jackson, Michigan, and like his commander, had attended West Point (class of 1955) and served in the Air Force. Worden earned two master's degrees in engineering from Michigan in 1963. Irwin had been born in 1930 in Pittsburgh, and had attended the United States Naval Academy, graduating in 1951 and serving in the Air Force, receiving a master's degree from Michigan in 1957. Both Worden and Irwin were selected in the fifth group of astronauts (1966), and Apollo 15 would be their only spaceflight. All three future astronauts had attended Michigan, and two had taken degrees from there; it had been the first university to offer an aeronautical engineering program.
The backup crew was Richard F. Gordon Jr. as commander, Vance D. Brand as command module pilot and Harrison H. Schmitt as lunar module pilot. By the usual rotation of crews, the three would most likely have flown Apollo 18, which was canceled. Brand flew later on the Apollo–Soyuz Test Project and on STS-5, the first operational Space Shuttle mission. With NASA under intense pressure to send a professional scientist to the Moon, Schmitt, a geologist, was selected as LMP of Apollo 17 instead of Joe Engle.
Apollo 15's support crew consisted of astronauts Joseph P. Allen, Robert A. Parker and Karl G. Henize. All three were scientist-astronauts, selected in 1967, as the prime crew felt they needed more assistance with the science than with the piloting. None of the support crew would fly during the Apollo program, waiting until the Space Shuttle program to go into space.
Mission Control.
The flight directors for Apollo 15 were as follows:
During a mission the capsule communicators (CAPCOMs), always fellow astronauts, were the only people who normally would speak to the crew. For Apollo 15, the CAPCOMs were Allen, Brand, C. Gordon Fullerton, Gordon, Henize, Edgar D. Mitchell, Parker, Schmitt and Alan B. Shepard.
Planning and training.
Schmitt and other scientist-astronauts advocated for a greater place for science on the early Apollo missions. They were often met with disinterest from other astronauts, or found science displaced by higher priorities. Schmitt realized that what was needed was an expert teacher who could fire the astronauts' enthusiasm, and contacted Caltech geologist Lee Silver, whom Schmitt introduced to Apollo 13's commander, Jim Lovell, and to its lunar module pilot, Fred Haise, then in training for their mission. Lovell and Haise were willing to go on a field expedition with Silver, and geology became a significant part of their training. Geologist Farouk El-Baz trained the prime crew's command module pilot, Ken Mattingly, to inform his planned observations from lunar orbit. The crew's newly acquired skills mostly went unused due to the explosion that damaged the Apollo 13 spacecraft and forced an abort of the mission. Apollo 14's CMP, Stuart Roosa, was enthusiastic about geology, but the mission commander, Shepard, was less so.
Already familiar with the spacecraft as the backup crew for Apollo 12, Scott, Worden and Irwin could devote more of their training time as prime crew for Apollo 15 to geology and sampling techniques. Scott was determined that his crew bring back the maximum amount of scientific data possible, and met with Silver in April 1970 to begin planning the geological training. Schmitt's assignment as Apollo 15's backup LMP made him an insider, and allowed him to spark competition between the prime and backup crews. The cancellation of two Apollo missions in September 1970 transformed Apollo 15 into a J mission, with a longer stay on the lunar surface, and the first Lunar Roving Vehicle (LRV). This change was welcomed by Scott, who according to David West Reynolds in his account of the Apollo program, was "something more than a hotshot pilot. Scott had the spirit of a true explorer", one determined to get the most from the J mission. The additional need for communications, including from planned experiments and the rover, required the near-rebuilding of the Honeysuckle Creek Tracking Station in Australia.
Geology field trips took place about once a month throughout the crew's 20 months of training. At first, Silver would take the commanders and LMPs from the prime and backup crews to geological sites in Arizona and New Mexico as if for a normal field geology lesson, but closer to launch, these trips became more realistic. Crews began to wear mock-ups of the backpacks they would carry while hiking near the Rio Grande Gorge, and communicate using walkie-talkies to a CAPCOM in a tent. The CAPCOM was accompanied by a geologist unfamiliar with the area who would rely on the astronauts' descriptions to interpret the findings, and familiarized the crew members with describing landscapes to people who could not see them. Considering himself a serious amateur, Scott came to enjoy field geology.
The decision to land at Hadley came in September 1970. The Site Selection Committee had narrowed the field down to two sites—Hadley Rille, a deep channel on the edge of Mare Imbrium close to the Apennine mountains, or the crater Marius, near which was a group of low, possibly volcanic domes. Although not ultimately his decision, the commander of a mission always held great sway. To David Scott the choice was clear, as Hadley "had more variety. There is a certain intangible quality which drives the spirit of exploration and I felt that Hadley had it. Besides it looked beautiful and usually when things look good they are good." The selection of Hadley was made although NASA lacked high-resolution images of the landing site; none had been made as the site was considered too rough to risk one of the earlier Apollo missions. The proximity of the Apennine mountains to the Hadley site required a landing approach trajectory of 26 degrees, far steeper than the 15 degrees in earlier Apollo landings.
The expanded mission meant that Worden spent much of his time at North American Rockwell's facilities at Downey, California, where the command and service module (CSM) was being built. He undertook a different kind of geology training. Working with El-Baz, he studied maps and photographs of the craters he would pass over while orbiting alone in the CSM. As El-Baz listened and gave feedback, Worden learned how to describe lunar features in a way that would be useful to the scientists who would listen to his transmissions back on Earth. Worden found El-Baz to be an enjoyable and inspiring teacher. Worden usually accompanied his crewmates on their geology field trips, though he was often in an airplane overhead, describing features of the landscape as the plane simulated the speed at which the lunar landscape would pass below the CSM.
The demands of the training strained both Worden's and Irwin's marriages; each sought Scott's advice, fearing a divorce might endanger their places on the mission as not projecting the image NASA wanted for the astronauts. Scott consulted Director of Flight Crew Operations Deke Slayton, their boss, who stated what was important was that the astronauts do their jobs. Although the Irwins overcame their marital difficulties, the Wordens divorced before the mission.
Hardware.
Spacecraft.
Apollo 15 used command and service module CSM-112, which was given the call sign "Endeavour", named after HMS "Endeavour", and lunar module LM-10, call sign "Falcon", named after the United States Air Force Academy mascot. Scott explained the choice of the name "Endeavour" on the grounds that its captain, James Cook, had commanded the first purely scientific sea voyage, and Apollo 15 was the first lunar landing mission on which there was a heavy emphasis on science. Apollo 15 took with it a small piece of wood from Cook's ship, while "Falcon" carried two falcon feathers to the Moon in recognition of the crew's service in the Air Force. Also part of the spacecraft were a Launch Escape System and a Spacecraft-Lunar Module Adapter, numbered SLA-19.
Technicians at the Kennedy Space Center had some problems with the instruments in the service module's scientific instrument module (SIM) bay. Some instruments were late in arriving, and principal investigators or representatives of NASA contractors sought further testing or to make small changes. Mechanical problems came from the fact the instruments were designed to operate in space, but had to be tested on the surface of the Earth. As such, things like the 7.5 m (24 ft) booms for the mass and gamma ray spectrometers could be tested only using equipment that tried to mimic the space environment, and, in space, the mass spectrometer boom several times did not fully retract.
On the lunar module, the fuel and oxidizer tanks were enlarged on both the descent and ascent stages, and the engine bell on the descent stage was extended. Batteries and solar cells were added for increased electrical power. In all, this increased the weight of the lunar module to , heavier than previous models.
If Apollo 15 had flown as an H mission, it would have been with CSM-111 and LM-9. That CSM was used by the Apollo–Soyuz Test Project in 1975, but the lunar module went unused and is now at the Kennedy Space Center Visitor Complex. "Endeavour" is on display at the National Museum of the United States Air Force at Wright-Patterson Air Force Base in Dayton, Ohio, following its transfer of ownership from NASA to the Smithsonian in December 1974.
Launch vehicle.
The Saturn V that launched Apollo 15 was designated SA-510, the tenth flight-ready model of the rocket. As the payload of the rocket was greater, changes were made to the rocket and to its launch trajectory. It was launched in a more southerly direction (80–100 degrees azimuth) than previous missions, and the Earth parking orbit was lowered to . These two changes meant more could be launched. The propellant reserves were reduced, and the number of retrorockets on the S-IC first stage (used to separate the spent first stage from the S-II second stage) was reduced from eight to four. The four outboard engines of the S-IC would burn longer, as would the center engine. Changes were also made to the S-II to dampen pogo oscillations.
Once all major systems were installed in the SaturnV, it was moved from the Vehicle Assembly Building to the launch site, Launch Complex 39A. During late June and early July 1971, the rocket and Launch Umbilical Tower (LUT) were struck by lightning at least four times. There was no damage to the vehicle, and only minor damage to ground support equipment.
Space suits.
The Apollo 15 astronauts wore redesigned space suits. On all previous Apollo flights, including the non-lunar flights, the commander and lunar module pilot had worn suits with the life support, liquid cooling, and communications connections in two parallel rows of three. On Apollo 15, the new suits, dubbed the "A7LB", had the connectors situated in triangular pairs. This new arrangement, along with the relocation of the entry zipper (which went in an up-down motion on the old suits), to run diagonally from the right shoulder to the left hip, aided in suiting and unsuiting in the cramped confines of the spacecraft. It also allowed for a new waist joint, letting the astronauts bend completely over, and sit on the rover. Upgraded backpacks allowed for longer-duration moonwalks. As in all missions from and after Apollo 13, the commander's suit bore a red stripe on the helmet, arms and legs.
Worden wore a suit similar to those worn by the Apollo 14 astronauts, but modified to interface with Apollo 15's equipment. Gear needed only for lunar surface EVAs, such as the liquid cooling garment, was not included with Worden's suit, as the only EVA he was expected to do was one to retrieve film cartridges from the SIM bay on the flight home.
Lunar Roving Vehicle.
A vehicle that could operate on the surface of the Moon had been considered by NASA since the early 1960s. An early version was called MOLAB, which had a closed cabin and would have massed about ; some scaled-down prototypes were tested in Arizona. As it became clear NASA would not soon establish a lunar base, such a large vehicle seemed unnecessary. Still, a rover would enhance the J missions, which were to concentrate on science, though its mass was limited to about and it was not then clear that so light a vehicle could be useful. NASA did not decide to proceed with a rover until May 1969, as Apollo 10, the dress rehearsal for the Moon landing, made its way home from lunar orbit. Boeing received the contract for three rovers on a cost-plus basis; overruns (especially in the navigation system) meant the three vehicles eventually cost a total of $40 million. These cost overruns gained considerable media attention at a time of greater public weariness with the space program, when NASA's budget was being cut.
The Lunar Roving Vehicle could be folded into a space 5 ft by 20 in (1.5 m by 0.5 m). Unloaded, it weighed 460 lb (209 kg) and when carrying two astronauts and their equipment, 1500 lb (700 kg). Each wheel was independently driven by a horsepower (200 W) electric motor. Although it could be driven by either astronaut, the commander always drove. Travelling at speeds up to 6 to 8 mph (10 to 12 km/h), the rover meant that for the first time the astronauts could travel far afield from their lander and still have enough time to do some scientific experiments. The Apollo 15 rover bore a plaque, reading: "Man's First Wheels on the Moon, Delivered by Falcon, July 30, 1971". During pre-launch testing, the LRV was given additional bracing, lest it collapse if someone sat on it under Earth conditions.
Particles and Fields Subsatellite.
The Apollo 15 Particles and Fields Subsatellite (PFS-1) was a small satellite released into lunar orbit from the SIM bay just before the mission left orbit to return to Earth. Its main objectives were to study the plasma, particle, and magnetic field environment of the Moon and map the lunar gravity field. Specifically, it measured plasma and energetic particle intensities and vector magnetic fields, and facilitated tracking of the satellite velocity to high precision. A basic requirement was that the satellite acquire fields and particle data everywhere on the orbit around the Moon. As well as measuring magnetic fields, the satellite contained sensors to study the Moon's mass concentrations, or mascons. The satellite orbited the Moon and returned data from August 4, 1971, until January 1973, when, following multiple failures of the subsatellite's electronics, ground support was terminated. It is believed to have crashed into the Moon sometime thereafter.
Mission highlights.
Launch and outbound trip.
Apollo 15 was launched on July 26, 1971, at 9:34 am EDT from the Kennedy Space Center at Merritt Island, Florida. The time of launch was at the very start of the two-hour, 37-minute launch window, which would allow Apollo 15 to arrive at the Moon with the proper lighting conditions at Hadley Rille; had the mission been postponed beyond another window on July 27, it could not have been rescheduled until late August. The astronauts had been awakened five and a quarter hours before launch by Slayton, and after breakfast and suiting up, had been taken to Pad 39A, launch site of all seven attempts at crewed lunar landing, and entered the spacecraft about three hours before launch. There were no unplanned delays in the countdown.
At 000:11:36 into the mission, the S-IVB engine shut down, leaving Apollo 15 in its planned parking orbit in low Earth orbit. The mission remained there for 2 hours and 40 minutes, allowing the crew (and Houston, via telemetry) to check the spacecraft's systems. At 002:50:02.6 into the mission, the S-IVB was restarted for trans-lunar injection (TLI), placing the craft on a path to the Moon. Before TLI, the craft had completed 1.5 orbits around the Earth.
The command and service module (CSM) and the lunar module remained attached to the nearly-exhausted S-IVB booster. Once trans-lunar injection had been achieved, placing the spacecraft on a trajectory towards the Moon, explosive cords separated the CSM from the booster as Worden operated the CSM's thrusters to push it away. Worden then maneuvered the CSM to dock with the LM (mounted on the end of the S-IVB), and the combined craft was then separated from the S-IVB by explosives. After Apollo 15 separated from the booster, the S-IVB maneuvered away, and, as planned, impacted the Moon about an hour after the crewed spacecraft entered lunar orbit, though due to an error the impact was away from the intended target. The booster's impact was detected by the seismometers left on the Moon by Apollo 12 and Apollo 14, providing useful scientific data.
There was a malfunctioning light on the craft's service propulsion system (SPS); after considerable troubleshooting, the astronauts did a test burn of the system that also served as a midcourse correction. This occurred about 028:40:00 into the mission. Fearing that the light meant the SPS might unexpectedly fire, the astronauts avoided using the control bank with the faulty light, bringing it online only for major burns, and controlling it manually. After the mission returned, the malfunction proved to be caused by a tiny bit of wire trapped within the switch.
After purging and renewing the LM's atmosphere to eliminate any contamination, the astronauts entered the LM about 34 hours into the mission, needing to check the condition of its equipment and move in items that would be required on the Moon. Much of this work was televised back to Earth, the camera operated by Worden. The crew discovered a broken outer cover on the Range/Range Rate tapemeter. This was a concern not only because an important piece of equipment, providing information on distance and rate of approach, might not work properly, but because bits of the glass cover were floating around "Falcon"'s interior. The tapemeter was supposed to be in a helium atmosphere, but due to the breakage, it was in the LM's oxygen atmosphere. Testing on the ground verified the tapemeter would still work properly, and the crew removed most of the glass using a vacuum cleaner and adhesive tape.
As yet, there had been only minor problems, but at about 61:15:00 mission time (the evening of July 28 in Houston), Scott discovered a leak in the water system while preparing to chlorinate the water supply. The crew could not tell where it was coming from, and the issue had the potential to become serious. The experts in Houston found a solution, which was successfully implemented by the crew. The water was mopped up with towels, which were then put out to dry in the tunnel between the command module (CM) and lunar module—Scott stated it looked like someone's laundry.
At 073:31:14 into the mission, a second midcourse correction, with less than a second of burn, was made. Although there were four opportunities to make midcourse corrections following TLI, only two were needed. Apollo 15 approached the Moon on July 29, and the lunar orbit insertion (LOI) burn had to be made using the SPS, on the far side of the Moon, out of radio contact with Earth. If no burn occurred, Apollo 15 would emerge from behind the Moon and regain radio contact sooner than expected; the continued lack of communication allowed Mission Control to conclude that the burn had taken place. When contact resumed, Scott did not immediately give the particulars of the burn, but spoke admiringly of the beauty of the Moon, causing Alan Shepard, the Apollo 14 commander, who was awaiting a television interview, to grumble, "To hell with that shit, give us details of the burn." The 398.36-second burn took place at 078:31:46.7 into the mission at an altitude of above the Moon, and placed Apollo 15 in an elliptical lunar orbit of .
Lunar orbit and landing.
On Apollo 11 and 12, the lunar module decoupled from the CSM and descended to a much lower orbit from which the lunar landing attempt commenced; to save fuel in an increasingly heavy lander, beginning with Apollo 14, the SPS in the service module made that burn, known as descent orbit insertion (DOI), with the lunar module still attached to the CSM. The initial orbit Apollo 15 was in had its apocynthion, or high point, over the landing site at Hadley; a burn at the opposite point in the orbit was performed, with the result that Hadley would now be under the craft's pericynthion, or low point. The DOI burn was performed at 082:39:49.09 and took 24.53 seconds; the result was an orbit with apocynthion of and pericynthion of . Overnight between July 29 and 30, as the crew rested, it became apparent to Mission Control that mass concentrations in the Moon were making Apollo 15's orbit increasingly elliptical—pericynthion was by the time the crew was awakened on July 30. This, and uncertainty as to the exact altitude of the landing site, made it desirable that the orbit be modified, or trimmed. Using the craft's RCS thrusters, this took place at 095:56:44.70, lasting 30.40 seconds, and raised the pericynthion to and the apocynthion to .
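The effect of a maneuver like descent orbit insertion can be sketched with the standard relation for the period of an elliptical orbit, T = 2π√(a³/μ). The altitudes below are assumed round numbers for illustration only, not Apollo 15's actual orbital parameters:

```python
# Illustrative sketch (not mission data): period of an elliptical lunar orbit
# with a given apocynthion (high point) and pericynthion (low point).
import math

MU_MOON = 4.9048e12   # m^3/s^2, lunar gravitational parameter
R_MOON = 1_737_400.0  # m, mean lunar radius

def orbital_period(apo_alt_m: float, peri_alt_m: float) -> float:
    """Period in seconds for given apocynthion/pericynthion altitudes."""
    semi_major = R_MOON + (apo_alt_m + peri_alt_m) / 2.0
    return 2.0 * math.pi * math.sqrt(semi_major**3 / MU_MOON)

# Hypothetical post-DOI orbit: 100 km apocynthion, 15 km pericynthion.
t = orbital_period(100e3, 15e3)
print(f"Period: {t / 60:.1f} minutes")
```

Because the period depends only on the semi-major axis, trimming the pericynthion upward with the RCS thrusters (as the crew did on July 30) slightly lengthens the orbit's period as well as raising its low point.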
As well as preparing the lunar module for its descent, the crew continued observations of the Moon (including of the landing site at Hadley) and provided television footage of the surface. Then, Scott and Irwin entered the lunar module in preparation for the landing attempt. Undocking was planned for 100:13:56, over the far side of the Moon, but nothing happened when separation was attempted. After analyzing the problem, the crew and Houston decided the probe instrumentation umbilical was likely loose or disconnected; Worden went into the tunnel connecting the command and lunar modules and determined this was so, seating it more firmly. With the problem resolved, "Falcon" separated from "Endeavour" at 100:39:16.2, about 25 minutes late, at an altitude of . Worden in "Endeavour" executed a SPS burn at 101:38:58.98 to send "Endeavour" to an orbit of by in preparation for his scientific work.
Aboard "Falcon", Scott and Irwin prepared for powered descent initiation (PDI), the burn that was to place them on the lunar surface, and, after Mission Control gave them permission, they initiated PDI at 104:30:09.4 at an altitude of , slightly higher than planned. During the first part of the descent, "Falcon" was aligned so the astronauts were on their backs and thus could not see the lunar surface below them, but after the craft made a pitchover maneuver, they were upright and could see the surface in front of them. Scott, who as commander performed the landing, was confronted with a landscape that did not at first seem to resemble what he had seen during simulations. Part of this was due to an error in the landing path of some , of which CAPCOM Ed Mitchell informed the crew prior to pitchover; part because the craters Scott had relied on in the simulator were difficult to make out under lunar conditions, and he initially could not see Hadley Rille. He concluded that they were likely to overshoot the planned landing site, and, once he could see the rille, started maneuvering the vehicle to move the computer's landing target back towards the planned spot, and looked for a relatively smooth place to land.
Below about , Scott could see nothing of the surface because of the quantities of lunar dust being displaced by "Falcon"'s exhaust. "Falcon" had a larger engine bell than previous LMs, in part to accommodate a heavier load, and the importance of shutting down the engine at initial contact rather than risk "blowback", the exhaust reflecting off the lunar surface and going back into the engine (possibly causing an explosion), had been impressed on the astronauts by mission planners. Thus, when Irwin called "Contact", indicating that one of the probes on the landing leg extensions had touched the surface, Scott immediately shut off the engine, letting the lander fall the remaining distance to the surface. Already moving downward at about per second, "Falcon" dropped from a height of . Scott's speed resulted in what was likely the hardest lunar landing of any of the crewed missions, at about per second, causing a startled Irwin to yell "Bam!" Scott had landed "Falcon" on the rim of a small crater he could not see, and the lander settled tilted back at an angle of 6.9 degrees and to the left at 8.6 degrees. Irwin described it in his autobiography as the hardest landing he had ever been in, and he feared that the craft would keep tipping over, forcing an immediate abort.
"Falcon" landed at 104:42:29.3 (22:16:29 GMT on July 30), with approximately 103 seconds of fuel remaining, about from the planned landing site. After Irwin's exclamation, Scott reported, "Okay, Houston. The "Falcon" is on the Plain at Hadley." Once within the planned landing zone, the increased mobility provided by the Lunar Roving Vehicle made unnecessary any further maneuvering.
Lunar surface.
Stand-up EVA and first EVA.
With "Falcon" due to remain on the lunar surface for almost three days, Scott deemed it important to maintain the circadian rhythm they were used to, and as they had landed in the late afternoon, Houston time, the two astronauts were to sleep before going onto the surface. But the time schedule allowed Scott to open the lander's top hatch (usually used for docking) and spend a half hour looking at their surroundings, describing them, and taking photographs. Lee Silver had taught him the importance of going to a high place to survey a new field site, and the top hatch served that purpose. Deke Slayton and other managers were initially opposed due to the oxygen that would be lost, but Scott got his way. During the only stand-up extravehicular activity (EVA) ever performed through the LM's top hatch on the lunar surface, Scott was able to make plans for the following day's EVA. He offered Irwin a chance to look out as well, but this would have required rearranging the umbilicals connecting Irwin to "Falcon"'s life support system, and he declined. After repressurizing the spacecraft, Scott and Irwin removed their space suits for sleep, becoming the first astronauts to doff their suits while on the Moon.
Throughout the sleep period Mission Control in Houston monitored a slow but steady oxygen loss. Scott and Irwin eventually were awakened an hour early, and the source of the problem was found to be an open valve on the urine transfer device. In post-mission debriefing, Scott recommended that future crews be woken at once under similar circumstances. After the problem was solved, the crew began preparation for the first Moon walk.
After donning their suits and depressurizing the cabin, Scott and Irwin began their first full EVA, becoming the seventh and eighth humans, respectively, to walk on the Moon. They began deploying the lunar rover, stored folded up in a compartment of "Falcon"'s descent stage, but this proved troublesome due to the slant of the lander. The experts in Houston suggested lifting the front end of the rover as the astronauts pulled it out, and this worked. Scott began a system checkout. One of the batteries gave a zero voltage reading, but this was only an instrumentation problem. A greater concern was that the front wheel steering would not work. However, the rear wheel steering was sufficient to maneuver the vehicle. Completing his checkout, Scott said "Okay. Out of detent; we're moving", maneuvering the rover away from "Falcon" in mid-sentence. These were the first words uttered by a human while driving a vehicle on the Moon. The rover carried a television camera, controlled remotely from Houston by NASA's Ed Fendell. The resolution was not high compared to the still photographs that would be taken, but the camera allowed the geologists on Earth to indirectly participate in Scott and Irwin's activities.
The rille was not visible from the landing site, but as Scott and Irwin drove over the rolling terrain, it came into view. They were able to see Elbow crater, and they began to drive in that direction. Reaching Elbow, a known location, allowed Mission Control to backtrack and get closer to pinpointing the location of the lander. The astronauts took samples there, and then drove to another crater on the flank of Mons Hadley Delta, where they took more. After concluding this stop, they returned to the lander to drop off their samples and prepare to set up the Apollo Lunar Surface Experiments Package (ALSEP), the scientific instruments that would remain when they left. Scott had difficulty drilling the holes required for the heat flow experiment, and the work was not completed when they had to return to the lander. The first EVA lasted 6 hours and 32 minutes.
Second and third EVAs.
The rover's front steering, inoperative during the first EVA, worked during the second and third ones. The target of the second EVA, on August 1, was the slope of Mons Hadley Delta, where the pair sampled boulders and craters along the Apennine Front. They spent an hour at Spur crater, during which the astronauts collected a sample dubbed the Genesis Rock. This rock, an anorthosite, is believed to be part of the early lunar crust—the hope of finding such a specimen had been one reason the Hadley area had been chosen. Once back at the landing site, Scott continued to try to drill holes for experiments at the ALSEP site, with which he had struggled the day before. After conducting soil-mechanics experiments and raising the U.S. flag, Scott and Irwin returned to the LM. EVA 2 lasted 7 hours and 12 minutes.
Although Scott had eventually been successful at drilling the holes, he and Irwin had been unable to retrieve a core sample, and this was an early order of business during EVA 3, their third and final moonwalk. Time that could have been devoted to geology ticked away as Scott and Irwin attempted to pull it out. Once it had been retrieved, more time passed as they attempted to break the core into pieces for transport to Earth. Hampered by an incorrectly mounted vise on the rover, they eventually gave up on this—the core would be transported home with one segment longer than planned. Scott wondered if the core was worth the amount of time and effort invested, and the CAPCOM, Joe Allen, assured him it was. The core proved one of the most important items brought back from the Moon, revealing much about its history, but the expended time meant the planned visit to a group of hills known as the North Complex had to be scrubbed. Instead, the crew again ventured to the edge of Hadley Rille, this time to the northwest of the immediate landing site.
Once the astronauts were beside the LM, Scott used a kit provided by the Postal Service to cancel a first day cover of two stamps being issued on August 2, the current date. Scott then performed an experiment in view of the television camera, using a falcon feather and hammer to demonstrate Galileo's theory that all objects in a given gravity field fall at the same rate, regardless of mass, in the absence of aerodynamic drag. He dropped the hammer and feather at the same time; because of the negligible lunar atmosphere, there was no drag on the feather, which hit the ground at the same time as the hammer. This was Joe Allen's idea (he also served as CAPCOM during it) and was part of an effort to find a memorable popular science experiment to do on the Moon along the lines of Shepard's hitting of golf balls. The feather was most likely from a female gyrfalcon (a type of falcon), a mascot at the United States Air Force Academy.
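The physics behind the demonstration can be sketched with the drag-free fall-time formula t = √(2h/g). The drop height below is a hypothetical value for illustration; only the lunar and terrestrial gravity constants are standard figures:

```python
# Illustrative sketch (not from the source): free-fall time and impact speed
# for a hammer-and-feather drop, assuming no atmospheric drag. The drop
# height of 1.2 m is an assumed value, not a figure from the mission.
import math

G_MOON = 1.62   # m/s^2, standard lunar surface gravity
G_EARTH = 9.81  # m/s^2, for comparison
HEIGHT = 1.2    # m, hypothetical drop height

def fall_time(height_m: float, g: float) -> float:
    """Time to fall height_m from rest with no drag: t = sqrt(2h/g)."""
    return math.sqrt(2.0 * height_m / g)

t_moon = fall_time(HEIGHT, G_MOON)
# With no atmosphere, the feather and the hammer share this time exactly,
# which is the point of Galileo's claim that Scott demonstrated.
print(f"Moon:  {t_moon:.2f} s, impact speed {G_MOON * t_moon:.2f} m/s")
print(f"Earth (drag-free): {fall_time(HEIGHT, G_EARTH):.2f} s")
```

On Earth a real feather would fall far more slowly than this drag-free figure; on the Moon the negligible atmosphere makes the idealized formula essentially exact, which is why both objects struck the surface together.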
Scott then drove the rover to a position away from the LM, where the television camera could be used to observe the lunar liftoff. Near the rover, he left a small aluminum statuette called "Fallen Astronaut", along with a plaque bearing the names of 14 known American astronauts and Soviet cosmonauts who had died in the furtherance of space exploration. The memorial was left while the television camera was turned away; he told Mission Control he was doing some cleanup activities around the rover. Scott disclosed the memorial in a post-flight news conference. He also placed a Bible on the control panel of the rover before leaving it for the last time to enter the LM.
The EVA lasted 4 hours, 49 minutes and 50 seconds. In total, the two astronauts spent 18 hours outside the LM and collected approximately of lunar samples.
Command module activities.
After the departure of "Falcon", Worden in "Endeavour" executed a burn to take the CSM to a higher orbit. While "Falcon" was on the Moon, the mission effectively split, Worden and the CSM being assigned their own CAPCOM and flight support team.
Worden got busy with the tasks that were to occupy him for much of the time he spent in space alone: photography and operating the instruments in the SIM bay. The door to the SIM bay had been explosively jettisoned during the translunar coast. Filling previously-unused space in the service module, the SIM bay contained a gamma-ray spectrometer mounted on the end of a boom, an X-ray spectrometer, and a laser altimeter, which failed partway through the mission. Two cameras, a stellar camera and a metric camera, together comprised the mapping camera, which was complemented by a panoramic camera, derived from spy technology. The altimeter and cameras permitted the exact time and location from which pictures were taken to be determined. Also present were an alpha particle spectrometer, which could be used to detect evidence of lunar volcanism, and a mass spectrometer, also mounted on a boom in the hope that it would be unaffected by contamination from the spacecraft. The boom would prove troublesome, as Worden would not always be able to get it to retract.
"Endeavour" was slated to pass over the landing site at the moment of planned landing, but Worden could not see "Falcon" and did not spot it until a subsequent orbit. He also exercised to avoid muscle atrophy, and Houston kept him up to date on Scott and Irwin's activities on the lunar surface. The panoramic camera did not operate perfectly, but provided enough images that no special adjustment was made. Worden took many photographs through the command module's windows, often with shots taken at regular intervals. His task was complicated by the lack of a working mission timer in the Lower Equipment Bay of the command module, as its circuit breaker had popped en route to the Moon. Worden's observations and photographs would inform the decision to send Apollo 17 to Taurus-Littrow to search for evidence of volcanic activity. There was a communications blackout when the CSM passed over the far side of the Moon from Earth; Worden greeted each resumption of contact with the words, "Hello, Earth. Greetings from "Endeavour"", expressed in different languages. Worden and El-Baz had come up with the idea, and the geology instructor had aided the astronaut in accumulating translations.
Results from the SIM bay experiments would include the conclusion, from data gathered by the X-ray spectrometer, that there was greater fluorescent X-ray flux than anticipated, and that the lunar highlands were richer in aluminum than were the mares. "Endeavour" was in a more inclined orbit than previous crewed missions, and Worden saw features that were not known previously, supplementing photographs with thorough descriptions.
By the time Scott and Irwin were ready to take off from the lunar surface and return to "Endeavour", the CSM's orbit had drifted due to the rotation of the Moon, and a plane change burn was required to ensure that the CSM's orbit would be in the same plane as that of the LM once it took off from the Moon. Worden accomplished the 18-second burn with the SPS.
Return to Earth.
"Falcon" lifted off the Moon at 17:11:22 GMT on August 2 after 66 hours and 55 minutes on the lunar surface. Docking with the CSM took place just under two hours later. After the astronauts transferred samples and other items from the LM to the CSM, the LM was sealed off, jettisoned, and intentionally crashed into the lunar surface, an impact registered by the seismometers left by Apollo 12, 14 and 15. The jettison proved difficult because of problems getting airtight seals, requiring a delay in discarding the LM. After the jettison, Slayton came on the loop to recommend the astronauts take sleeping pills, or at least that Scott and Irwin do so. Scott as mission commander refused to allow it, feeling there was no need. During the EVAs, the doctors had noticed irregularities in both Scott's and Irwin's heartbeats, but the crew were not informed during the flight. Irwin had heart problems after retiring as an astronaut and died in 1991 of a heart attack; Scott felt that he as commander should have been informed of the biomedical readings. NASA doctors at the time attributed the heart readings to potassium deficiency, caused by the astronauts' hard work on the surface and inadequate replenishment through liquids.
The crew spent the next two days working on orbital science experiments, including more observations of the Moon from orbit and releasing the subsatellite. "Endeavour" departed lunar orbit with another burn of the SPS engine of 2 minutes 21 seconds at 21:22:45 GMT on August 4. The next day, during the return to Earth, Worden performed a 39-minute EVA to retrieve film cassettes from the service module's scientific instrument module (SIM) bay, with assistance from Irwin who remained at the command module's hatch. At approximately 171,000 nautical miles (197,000 mi; 317,000 km) from Earth, it was the first "deep space" EVA in history, performed at great distance from any planetary body. As of , it remains one of only three such EVAs, all performed during Apollo's J missions under similar circumstances. Later that day, the crew set a record for the longest Apollo flight to that point.
On approach to Earth on August 7, the service module was jettisoned, and the command module reentered the Earth's atmosphere. Although one of the three parachutes on the CM failed after deploying, likely due to damage as the spacecraft vented fuel, only two were required for a safe landing (the third provided redundancy). Upon landing in the North Pacific Ocean, the CM and crew were recovered and taken aboard the recovery ship, , after a mission lasting 12 days, 7 hours, 11 minutes and 53 seconds.
Assessment.
The mission objectives for Apollo 15 were to "perform selenological inspection, survey, and sampling of materials and surface features in a pre-selected area of the Hadley–Apennine region. Emplace and activate surface experiments. Evaluate the capability of the Apollo equipment to provide extended lunar surface stay time, increased extravehicular operations, and surface mobility. [and] Conduct inflight experiments and photographic tasks from lunar orbit." It achieved all those objectives. The mission also completed a long list of other tasks, including experiments. One of the photographic objectives, to obtain images of the gegenschein from lunar orbit, was not completed, as the camera was not pointed at the proper spot in the sky. According to the conclusions in the "Apollo 15 Mission Report", the journey "was the fourth lunar landing and resulted in the collection of a wealth of scientific information. The Apollo system, in addition to providing a means of transportation, excelled as an operational scientific facility."
Apollo 15 saw an increase in public interest in the Apollo program, in part due to fascination with the LRV, as well as the attractiveness of the Hadley Rille site and the increased television coverage.
According to David Woods in the "Apollo Lunar Flight Journal",
Controversies.
Despite the successful mission, the careers of the crew were tarnished by a deal they had made before the flight to carry postal covers to the Moon in exchange for about $7,000 each, which they planned to set aside for their children. Walter Eiermann, who had many professional and social contacts with NASA employees and the astronaut corps, served as intermediary between the astronauts and a West German stamp dealer, Hermann Sieger, and Scott carried about 400 covers onto the spacecraft; they were subsequently transferred into "Falcon" and remained inside the lander during the astronauts' activities on the surface of the Moon. After the return to Earth, 100 of the covers were given to Eiermann, who passed them on to Sieger, receiving a commission. No permission had been received from Slayton to carry the covers, as required.
The 100 covers were put on sale to Sieger's customers in late 1971 at a price of about $1,500 each. After receiving the agreed payments, the astronauts returned them, and accepted no compensation. In April 1972, Slayton learned that unauthorized covers had been carried, and removed the three as the backup crew for Apollo 17. The matter became public in June 1972 and the three astronauts were reprimanded for poor judgment; none ever flew in space again. During the investigation, the astronauts had surrendered those covers still in their possession; after Worden filed suit, they were returned in 1983, something "Slate" magazine deemed an exoneration.
Another controversy, surrounding the "Fallen Astronaut" statuette that Scott had left on the Moon, arose later. Before the mission, Scott had made a verbal agreement with Belgian artist Paul Van Hoeydonck to sculpt the statuette. Scott's intent, in keeping with NASA's strict policy against commercial exploitation of the US government's space program, was for a simple memorial with a minimum of publicity: the artist would remain anonymous, and no commercial replicas would be made except for a single copy for public exhibit at the National Air and Space Museum, commissioned after the sculpture's public disclosure during the post-flight press conference. Van Hoeydonck claims to have had a different understanding of the agreement, by which he would have received recognition as the creator of a tribute to human space exploration, with rights to sell replicas to the public. Under pressure from NASA, Van Hoeydonck canceled a plan to publicly sell 950 signed copies.
During the congressional hearings into the postal covers and Fallen Astronaut matters, two Bulova timepieces taken on the mission by Scott were also matters of controversy. Before the mission, Scott had been introduced to Bulova's representative, General James McCormack, by Apollo 8 commander Frank Borman. Bulova had been seeking to have its timepieces taken on Apollo missions, but after evaluation, NASA had selected Omega watches instead. Scott brought the Bulova timepieces on the mission without disclosing them to Slayton. During Scott's second EVA, the crystal on his NASA-issued Omega Speedmaster watch popped off, and, during the third EVA, he used a Bulova watch. The Bulova Chronograph Model #88510/01 that Scott wore on the lunar surface was a prototype, given to him by the Bulova Company, and it is the only privately owned watch to have been worn while walking on the lunar surface. There are images of him wearing this watch as he saluted the American flag on the Moon, with the Hadley Delta expanse in the background. In 2015, the watch sold for $1.625 million, making it one of the most expensive astronaut-owned artifacts ever sold at auction and one of the most expensive watches sold at auction.
Mission insignia.
The Apollo 15 mission patch carries Air Force motifs, a nod to the crew's service there, just as the Apollo 12 all-Navy crew's patch had featured a sailing ship. The circular patch features stylized red, white and blue birds flying over Hadley Rille. Immediately behind the birds, a line of craters forms the Roman numeral XV. The Roman numerals were hidden in emphasized outlines of some craters after NASA insisted that the mission number be displayed in Arabic numerals. The artwork is circled in red, with a white band giving the mission and crew names and a blue border. Scott contacted fashion designer Emilio Pucci to design the patch, who came up with the basic idea of the three-bird motif on a square patch.
The crew changed the shape to round and the colors from blues and greens to a patriotic red, white and blue. Worden stated that each bird also represented an astronaut, white being his own color (and as Command Module Pilot, uppermost), Scott being the blue bird and Irwin the red. The colors matched Chevrolet Corvettes leased by the astronauts at KSC; a Florida car dealer had, since the time of Project Mercury, been leasing Chevrolets to astronauts for $1 and later selling them to the public. The astronauts were photographed with the cars and the training LRV for the June 11, 1971, edition of "Life" magazine.
Visibility from space.
The halo area of the Apollo 15 landing site, created by the LM's exhaust plume, was observed by a camera aboard the Japanese lunar orbiter SELENE and confirmed by comparative analysis of photographs in May 2008. This corresponds well to photographs taken from the Apollo 15 command module showing a change in surface reflectivity due to the plume, and was the first visible trace of crewed landings on the Moon seen from space since the close of the Apollo program.
1970 | Apollo 16 | Apollo 16 (April 16–27, 1972) was the tenth crewed mission in the United States Apollo space program, administered by NASA, and the fifth and penultimate to land on the Moon. It was the second of Apollo's "J missions", with an extended stay on the lunar surface, a focus on science, and the use of the Lunar Roving Vehicle (LRV). The landing and exploration were in the Descartes Highlands, a site chosen because some scientists expected it to be an area formed by volcanic action, though this proved not to be the case.
The mission was crewed by Commander John Young, Lunar Module Pilot Charles Duke and Command Module Pilot Ken Mattingly. Launched from the Kennedy Space Center in Florida on April 16, 1972, Apollo 16 experienced a number of minor glitches en route to the Moon. These culminated in a problem with the spacecraft's main engine that resulted in a six-hour delay in the Moon landing as NASA managers contemplated having the astronauts abort the mission and return to Earth, before deciding the problem could be overcome. Although NASA permitted the lunar landing, it had the astronauts return from the mission one day earlier than planned.
After flying the lunar module to the Moon's surface on April 21, Young and Duke spent 71 hours—just under three days—on the lunar surface, during which they conducted three extravehicular activities or moonwalks, totaling 20 hours and 14 minutes. The pair drove the lunar rover, the second used on the Moon, for . On the surface, Young and Duke collected of lunar samples for return to Earth, including Big Muley, the largest Moon rock collected during the Apollo missions. During this time Mattingly orbited the Moon in the command and service module (CSM), taking photos and operating scientific instruments. Mattingly, in the command module, spent 126 hours and 64 revolutions in lunar orbit. After Young and Duke rejoined Mattingly in lunar orbit, the crew released a subsatellite from the service module (SM). During the return trip to Earth, Mattingly performed a one-hour spacewalk to retrieve several film cassettes from the exterior of the service module. Apollo 16 returned safely to Earth on April 27, 1972.
Crew and key Mission Control personnel.
John Young, the mission commander, was 41 years old and a captain in the Navy at the time of Apollo 16. Becoming an astronaut in 1962 as part of the second group to be selected by NASA, he flew in Gemini 3 with Gus Grissom in 1965, becoming the first American not of the Mercury Seven to fly in space. He thereafter flew in Gemini 10 (1966) with Michael Collins and as command module pilot of Apollo 10 (1969). With Apollo 16, he became the second American, after Jim Lovell, to fly in space four times.
Thomas Kenneth "Ken" Mattingly, the command module pilot, was 36 years old and a lieutenant commander in the Navy at the time of Apollo 16. Mattingly had been selected in NASA's fifth group of astronauts in 1966. He was a member of the support crew for Apollo 8 and Apollo 9. Mattingly then undertook parallel training with Apollo 11's backup CMP, William Anders, who had announced his resignation from NASA effective at the end of July 1969 and would thus be unavailable if the first lunar landing mission was postponed. Had Anders left NASA before Apollo 11 flew, Mattingly would have taken his place on the backup crew.
Mattingly had originally been assigned to the prime crew of Apollo 13, but was exposed to rubella through Charles Duke, at that time with Young on Apollo 13's backup crew; Duke had caught it from one of his children. Mattingly never contracted the illness, but three days before launch was removed from the crew and replaced by his backup, Jack Swigert. Duke, also a Group 5 astronaut and a space rookie, had served on the support crew of Apollo 10 and was a capsule communicator (CAPCOM) for Apollo 11. A lieutenant colonel in the Air Force, Duke was 36 years old at the time of Apollo 16, making him the youngest of the twelve astronauts who walked on the Moon during Apollo. All three men were announced as the prime crew of Apollo 16 on March 3, 1971.
Apollo 16's backup crew consisted of Fred W. Haise Jr. (commander, who had flown on Apollo 13), Stuart A. Roosa (CMP, who had flown on Apollo 14) and Edgar D. Mitchell (LMP, also Apollo 14). Although not officially announced, Director of Flight Crew Operations Deke Slayton, the astronauts' supervisor, had originally planned to have a backup crew of Haise as commander, William R. Pogue (CMP) and Gerald P. Carr (LMP), who were targeted for the prime crew assignment on Apollo 19. However, after the cancellations of Apollos 18 and 19 were announced in September 1970, it made more sense to use astronauts who had already flown lunar missions as backups, rather than training others on what would likely be a dead-end assignment. Subsequently, Roosa and Mitchell were assigned to the backup crew, while Pogue and Carr were reassigned to the Skylab program where they flew on Skylab 4.
For projects Mercury and Gemini, a prime and a backup crew had been designated, but for Apollo, a third group of astronauts, known as the support crew, was also designated. Slayton created the support crews early in the Apollo Program on the advice of Apollo crew commander James McDivitt, who would lead Apollo 9. McDivitt believed that, with preparation going on in facilities across the U.S., meetings that needed a member of the flight crew would be missed. Support crew members were to assist as directed by the mission commander. Usually low in seniority, they assembled the mission's rules, flight plan, and checklists, and kept them updated. For Apollo 16, they were: Anthony W. England, Karl G. Henize, Henry W. Hartsfield Jr., Robert F. Overmyer and Donald H. Peterson.
Flight directors were Pete Frank and Philip Shaffer, first shift, Gene Kranz and Donald R. Puddy, second shift, and Gerry Griffin, Neil B. Hutchinson and Charles R. Lewis, third shift. Flight directors during Apollo had a one-sentence job description: "The flight director may take any actions necessary for crew safety and mission success." CAPCOMs were Haise, Roosa, Mitchell, James B. Irwin, England, Peterson, Hartsfield, and C. Gordon Fullerton.
Mission insignia and call signs.
The insignia of Apollo 16 is dominated by a rendering of an American eagle and a red, white and blue shield, representing the people of the United States, over a gray background representing the lunar surface. Overlaying the shield is a gold NASA vector, orbiting the Moon. On its gold-outlined blue border, there are 16 stars, representing the mission number, and the names of the crew members: Young, Mattingly, Duke. The insignia was designed by Barbara Matelski of the graphics shop at the Manned Spacecraft Center in Houston, from ideas originally submitted by the crew of the mission.
Young and Duke chose "Orion" for the lunar module's call sign, while Mattingly chose "Casper" for the command and service module. According to Duke, he and Young chose "Orion" for the LM because they wanted something connected with the stars. Orion is one of the brightest constellations as seen from Earth, and one visible to the astronauts throughout their journey. Duke also stated, "it is a prominent constellation and easy to pronounce and transmit to Mission Control". Mattingly said he chose "Casper", evoking Casper the Friendly Ghost, because "there are enough serious things in this flight, so I picked a non-serious name."
Planning and training.
Landing site selection.
Apollo 16 was the second of Apollo's J missions, featuring the use of the Lunar Roving Vehicle, increased scientific capability, and three-day lunar surface stays. As Apollo 16 was the penultimate mission in the Apollo program and there was no major new hardware or procedures to test on the lunar surface, the last two missions (the other being Apollo 17) presented opportunities for astronauts to clear up some of the uncertainties in understanding the Moon's characteristics. Scientists sought information on the Moon's early history, which might be obtained from its ancient surface features, the lunar highlands. Previous Apollo expeditions, including Apollo 14 and Apollo 15, had obtained samples of pre-mare lunar material, likely thrown from the highlands by meteorite impacts. These were dated from before lava began to upwell from the Moon's interior and flood the low areas and basins. However, no Apollo mission had actually visited the lunar highlands.
Apollo 14 had visited and sampled a ridge of material ejected by the impact that created the Mare Imbrium impact basin. Likewise, Apollo 15 had also sampled material in the region of Imbrium, visiting the basin's edge. Because the Apollo 14 and Apollo 15 landing sites were closely associated with the Imbrium basin, there was still the chance that different geologic processes were prevalent in areas of the lunar highlands far from Mare Imbrium. Scientist Dan Milton, studying Lunar Orbiter photographs of the highlands, saw an area in the Descartes region of the Moon with unusually high albedo that he theorized might be due to volcanic rock; his theory quickly gained wide support. Several members of the scientific community noted that the central lunar highlands resembled regions on Earth that were created by volcanic processes and hypothesized the same might be true on the Moon. They hoped scientific output from the Apollo 16 mission would provide an answer. Some scientists advocated for a landing near the large crater Tycho, but its distance from the lunar equator and the fact that the lunar module would have to approach over very rough terrain ruled it out.
The Ad Hoc Apollo Site Evaluation Committee met in April and May 1971 to decide the Apollo 16 and 17 landing sites; it was chaired by Noel Hinners of Bellcomm. There was consensus the final landing sites should be in the lunar highlands, and among the sites considered for Apollo 16 were the Descartes Highlands region west of Mare Nectaris and the crater Alphonsus. The considerable distance between the Descartes site and previous Apollo landing sites would also be beneficial for the network of seismometers, deployed on each landing mission beginning with Apollo 12.
At Alphonsus, three scientific objectives were determined to be of primary interest: the possibility of old, pre-Imbrium impact material from within the crater's wall, the composition of the crater's interior, and the possibility of past volcanic activity on the floor of the crater at several smaller "dark halo" craters. Geologists feared, however, that samples obtained from the crater might have been contaminated by the Imbrium impact, thus preventing Apollo 16 from obtaining samples of pre-Imbrium material. There also remained the distinct possibility that this objective would have already been satisfied by the Apollo 14 and Apollo 15 missions, as the Apollo 14 samples had not yet been completely analyzed and samples from Apollo 15 had not yet been obtained.
On June 3, 1971, the site selection committee decided to target the Apollo 16 mission for the Descartes site. Following the decision, the Alphonsus site was considered the most likely candidate for Apollo 17, but was eventually rejected. With the assistance of orbital photography obtained on the Apollo 14 mission, the Descartes site was determined to be safe enough for a crewed landing. The specific landing site was between two young impact craters, North Ray and South Ray craters – in diameter, respectively – which provided "natural drill holes" which penetrated through the lunar regolith at the site, thus leaving exposed bedrock that could be sampled by the crew.
After the selection, mission planners made the Descartes and Cayley formations, two geologic units of the lunar highlands, the primary sampling interest of the mission. It was these formations that the scientific community widely suspected were formed by lunar volcanism, but this hypothesis was proven incorrect by the composition of lunar samples from the mission.
Training.
In addition to the usual Apollo spacecraft training, Young and Duke, along with backup commander Fred Haise, underwent an extensive geological training program that included several field trips to introduce them to concepts and techniques they would use in analyzing features and collecting samples on the lunar surface. During these trips, they visited and provided scientific descriptions of geologic features they were likely to encounter. The backup LMP, Mitchell, was unavailable during the early part of the training, occupied with tasks relating to Apollo 14, but by September 1971 had joined the geology field trips. Before that, Tony England (a member of the support crew and the lunar EVA CAPCOM) or one of the geologist trainers would train alongside Haise on geology field trips.
Since Descartes was believed to be volcanic, a good deal of this training was geared towards volcanic rocks and features, but field trips were made to sites featuring other sorts of rock. As Young later commented, the non-volcanic training proved more useful, given that Descartes did not prove to be volcanic. In July 1971, they visited Sudbury, Ontario, Canada, for geology training exercises, the first time U.S. astronauts trained in Canada. The Apollo 14 landing crew had visited a site in West Germany; geologist Don Wilhelms related that unspecified incidents there had caused Slayton to rule out further European training trips. Geologists chose Sudbury because of a wide crater created about 1.8 billion years ago by a large meteorite. The Sudbury Basin shows evidence of shatter cone geology, familiarizing the Apollo crew with geologic evidence of a meteorite impact. During the training exercises the astronauts did not wear space suits, but carried radio equipment to converse with each other and England, practicing procedures they would use on the lunar surface. By the end of the training, the field trips had become major exercises, involving up to eight astronauts and dozens of support personnel, attracting coverage from the media. For the exercise at the Nevada Test Site, where the massive craters left by nuclear explosions simulated the large craters to be found on the Moon, all participants had to have security clearance and a listed next-of-kin, and an overflight by CMP Mattingly required special permission.
In addition to the field geology training, Young and Duke also trained to use their EVA space suits, adapt to the reduced lunar gravity, collect samples, and drive the Lunar Roving Vehicle. The fact that they had been backups for Apollo 13, planned to be a landing mission, meant that they could spend about 40 percent of their time training for their surface operations. They also received survival training and prepared for technical aspects of the mission. The astronauts spent much time studying the lunar samples brought back by earlier missions, learning about the instruments to be carried on the mission, and hearing what the principal investigators in charge of those instruments expected to learn from Apollo 16. This training helped Young and Duke, while on the Moon, quickly realize that the expected volcanic rocks were not there, even though the geologists in Mission Control initially did not believe them. Much of the training—according to Young, 350 hours—was conducted with the crew wearing space suits, something that Young deemed vital, allowing the astronauts to know the limitations of the equipment in doing their assigned tasks. Mattingly also received training in recognizing geological features from orbit by flying over the field areas in an airplane, and trained to operate the Scientific Instrument Module from lunar orbit.
Equipment.
Launch vehicle.
The launch vehicle which took Apollo 16 to the Moon was a Saturn V, designated as AS-511. This was the eleventh Saturn V to be flown and the ninth used on crewed missions. Apollo 16's Saturn V was almost identical to Apollo 15's. One change that was made was the restoration of four retrorockets to the S-IC first stage, meaning there would be a total of eight, as on Apollo 14 and earlier. The retrorockets were used to minimize the risk of collision between the jettisoned first stage and the rest of the launch vehicle. These four retrorockets had been omitted from Apollo 15's Saturn V to save weight, but analysis of Apollo 15's flight showed that the S-IC came closer to the rest of the vehicle than expected after jettison, and it was feared that if there were only four rockets and one failed, there might be a collision.
ALSEP and other surface equipment.
As on all lunar landing missions after Apollo 11, an Apollo Lunar Surface Experiments Package (ALSEP) was flown on Apollo 16. This was a suite of nuclear-powered experiments designed to keep functioning after the astronauts who set them up returned to Earth. Apollo 16's ALSEP consisted of a Passive Seismic Experiment (PSE, a seismometer), an Active Seismic Experiment (ASE), a Lunar Heat Flow Experiment (HFE), and a Lunar Surface Magnetometer (LSM). The ALSEP was powered by a SNAP-27 radioisotope thermoelectric generator, developed by the Atomic Energy Commission.
The PSE added to the network of seismometers left by Apollo 12, 14 and 15. NASA intended to calibrate the Apollo 16 PSE by crashing the LM's ascent stage near it after the astronauts were done with it, an object of known mass and velocity impacting at a known location. However, NASA lost control of the ascent stage after jettison, and this did not occur. The ASE, designed to return data about the Moon's geologic structure, consisted of two groups of explosives: the first, a line of "thumpers" to be attached to three geophones and fired during the ALSEP deployment; the second, four mortars of different sizes, to be set off remotely once the astronauts had returned to Earth. Apollo 14 had also carried an ASE, though its mortars were never set off for fear of affecting other experiments.
The HFE involved the drilling of two holes into the lunar surface and emplacement of thermometers which would measure how much heat was flowing from the lunar interior. This was the third attempt to emplace an HFE: the first flew on Apollo 13 and never reached the lunar surface, while on Apollo 15, problems with the drill meant the probes did not go as deep as planned. The Apollo 16 attempt would fail after Duke had successfully emplaced the first probe; Young, unable to see his feet in the bulky spacesuit, pulled out and severed the cable after it wrapped around his leg. NASA managers vetoed a repair attempt due to the amount of time it would take. An HFE flew, and was successfully deployed, on Apollo 17.
The LSM was designed to measure the strength of the Moon's magnetic field, which is only a small fraction of Earth's. Additional data would be returned by the use of the Lunar Portable Magnetometer (LPM), to be carried on the lunar rover and activated at several geology stops. Scientists also planned to return an Apollo 12 sample, from which "soft" magnetism had been removed, briefly to the Moon aboard Apollo 16, to see whether its magnetism would be restored on the journey. Measurements after the mission found that "soft" magnetism had returned to the sample, although at a lower intensity than before.
A Far Ultraviolet Camera/Spectrograph (UVC) was flown, making the first astronomical observations from the lunar surface, seeking data on hydrogen sources in space without the masking effect of the Earth's corona. The instrument was placed in the LM's shadow and pointed at nebulae, other astronomical objects, the Earth itself, and any suspected volcanic vents seen on the lunar surface. The film was returned to Earth. When asked to summarize the results for a general audience, Dr. George Carruthers of the Naval Research Laboratory stated, "the most immediately obvious and spectacular results were really for the Earth observations, because this was the first time that the Earth had been photographed from a distance in ultraviolet (UV) light, so that you could see the full extent of the hydrogen atmosphere, the polar auroras and what we call the tropical airglow belt."
Four panels mounted on the LM's descent stage comprised the Cosmic Ray Detector, designed to record cosmic ray and solar wind particles. Three of the panels were left uncovered during the voyage to the Moon, with the fourth uncovered by the crew early in the EVA. The panels would be bagged for return to Earth. The free-standing Solar Wind Composition Experiment flew on Apollo 16, as it had on each of the lunar landings, for deployment on the lunar surface and return to Earth. Platinum foil was added to the aluminum of the previous experiments, to minimize contamination.
Particles and Fields Subsatellite PFS-2.
The Apollo 16 Particles and Fields Subsatellite (PFS-2) was a small satellite released into lunar orbit from the service module. Its principal objective was to measure charged particles and magnetic fields all around the Moon as the Moon orbited Earth, similar to its sister spacecraft, PFS-1, released eight months earlier by Apollo 15. The two probes were intended to have similar orbits, ranging from above the lunar surface.
Like the Apollo 15 subsatellite, PFS-2 was expected to have a lifetime of at least a year before its orbit decayed and it crashed onto the lunar surface. The decision to bring Apollo 16 home early after there were difficulties with the main engine meant that the spacecraft did not go to the orbit which had been planned for PFS-2. Instead, it was ejected into a lower-than-planned orbit and crashed into the Moon a month later on May 29, 1972, after circling the Moon 424 times. This short lifetime was because lunar mascons near its orbital ground track perturbed its orbit and pulled PFS-2 into the Moon.
Mission events.
Elements of the spacecraft and launch vehicle began arriving at Kennedy Space Center in July 1970, and all had arrived by September 1971. Apollo 16 was originally scheduled to launch on March 17, 1972. However, one of the bladders for the CM's reaction control system burst during testing. This issue, in combination with concerns that one of the explosive cords that would jettison the LM from the CSM after the astronauts returned from the lunar surface would not work properly, and a problem with Duke's spacesuit, made it desirable to slip the launch to the next launch window. Thus, Apollo 16 was postponed to April 16. The launch vehicle stack, which had been rolled out from the Vehicle Assembly Building on December 13, 1971, was returned to the building on January 27, 1972. It was rolled out again to Launch Complex 39A on February 9.
The official mission countdown began on Monday, April 10, 1972, at 8:30 am, six days before the launch. At this point the Saturn V rocket's three stages were powered up, and drinking water was pumped into the spacecraft. As the countdown began, the crew of Apollo 16 was participating in final training exercises in anticipation of a launch on April 16. The astronauts underwent their final preflight physical examination on April 11. The only holds in the countdown were the ones pre-planned in the schedule, and the weather was fair as the time for launch approached.
Launch and outward journey.
The Apollo 16 mission launched from the Kennedy Space Center in Florida at 12:54 pm EST on April 16, 1972. The launch was nominal; the crew experienced vibration similar to that on previous missions. The first and second stages of the Saturn V (the S-IC and S-II) performed nominally; the spacecraft entered orbit around Earth just under 12 minutes after lift-off.
After reaching orbit, the crew spent time adapting to the zero-gravity environment and preparing the spacecraft for trans-lunar injection (TLI), the burn of the third-stage rocket that would propel them to the Moon. In Earth orbit, the crew faced minor technical issues, including a potential problem with the environmental control system and the S-IVB third stage's attitude control system, but eventually resolved or compensated for them as they prepared to depart towards the Moon.
After two orbits, the rocket's third stage reignited for just over five minutes, propelling the craft towards the Moon at about . Six minutes after the burn of the S-IVB, the command and service modules (CSM), containing the crew, separated from the rocket and traveled away from it before turning around and retrieving the lunar module from inside the expended rocket stage. The maneuver, performed by Mattingly and known as transposition, docking, and extraction, went smoothly.
Following transposition and docking, the crew noticed the exterior surface of the lunar module was giving off particles from a spot where the LM's skin appeared torn or shredded; at one point, Duke estimated they were seeing about five to ten particles per second. Young and Duke entered the lunar module through the docking tunnel connecting it with the command module to inspect its systems, at which time they did not spot any major issues.
Once on course towards the Moon, the crew put the spacecraft into a rotisserie "barbecue" mode in which the craft rotated along its long axis three times per hour to ensure even heat distribution about the spacecraft from the Sun. After further preparing the craft for the voyage, the crew began the first sleep period of the mission just under 15 hours after launch.
By the time Mission Control issued the wake-up call to the crew for flight day two, the spacecraft was about away from the Earth, traveling at about . As it was not due to arrive in lunar orbit until flight day four, flight days two and three were largely preparatory, consisting of spacecraft maintenance and scientific research. On day two, the crew performed an electrophoresis experiment, also performed on Apollo 14, in which they attempted to demonstrate that electrophoretic separation in their near-weightless environment could be used to produce substances of greater purity than would be possible on Earth. Using two different sizes of polystyrene particles, one size colored red and one blue, separation of the two types via electrophoresis was achieved, though electro-osmosis in the experiment equipment prevented the clear separation of two particle bands.
The remainder of day two included a two-second mid-course correction burn performed by the CSM's service propulsion system (SPS) engine to tweak the spacecraft's trajectory. Later in the day, the astronauts entered the lunar module for the second time to further inspect the landing craft's systems. The crew reported they had observed additional paint peeling from a portion of the LM's outer aluminum skin. Despite this, the crew found that the spacecraft's systems were performing nominally. Following the LM inspection, the crew reviewed checklists and procedures for the following days in anticipation of their arrival and the Lunar Orbit Insertion (LOI) burn. Command Module Pilot Mattingly reported "gimbal lock", meaning that the system used to keep track of the craft's attitude was no longer accurate, and he had to realign the guidance system using the Sun and Moon. At the end of day two, Apollo 16 was about away from Earth.
When the astronauts were awakened for flight day three, the spacecraft was about away from the Earth. The velocity of the craft steadily decreased, as it had not yet reached the lunar sphere of gravitational influence. The early part of day three was largely housekeeping, spacecraft maintenance and exchanging status reports with Mission Control in Houston. The crew performed the Apollo light flash experiment, or ALFMED, to investigate "light flashes" that were seen by Apollo lunar astronauts when the spacecraft was dark, regardless of whether their eyes were open. This was thought to be caused by the penetration of the eye by cosmic ray particles. During the second half of the day, Young and Duke again entered the lunar module to power it up and check its systems, and perform housekeeping tasks in preparation for the lunar landing. The systems were found to be functioning as expected. Following this, the crew donned their space suits and rehearsed procedures that would be used on landing day. Just before the end of flight day three at 59 hours, 19 minutes, 45 seconds after liftoff, while from the Earth and from the Moon, the spacecraft's velocity began increasing as it accelerated towards the Moon after entering the lunar sphere of influence.
After waking up on flight day four, the crew began preparations for the LOI maneuver that would brake them into orbit. At an altitude of the scientific instrument module (SIM) bay cover was jettisoned. At just over 74 hours into the mission, the spacecraft passed behind the Moon, temporarily losing contact with Mission Control. While over the far side, the SPS burned for 6 minutes and 15 seconds, braking the spacecraft into an orbit with a low point (pericynthion) of 58.3 and a high point (apocynthion) of 170.4 nautical miles (108.0 and 315.6 km, respectively). After entering lunar orbit, the crew began preparations for the Descent Orbit Insertion (DOI) maneuver to further modify the spacecraft's orbital trajectory. The maneuver was successful, decreasing the craft's pericynthion to . The remainder of flight day four was spent making observations and preparing for activation of the lunar module, undocking, and landing the following day.
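As a quick arithmetic check of the orbit figures quoted above, the nautical-mile values convert to the stated kilometre values using the exact definition 1 nautical mile = 1.852 km. A minimal sketch (variable names are illustrative, not from any mission document):

```python
# Convert the quoted lunar-orbit altitudes from nautical miles to kilometres.
# 1 nautical mile is defined as exactly 1.852 km.
NMI_TO_KM = 1.852

pericynthion_nmi = 58.3   # low point of the orbit after LOI, per the text
apocynthion_nmi = 170.4   # high point of the orbit after LOI, per the text

pericynthion_km = pericynthion_nmi * NMI_TO_KM
apocynthion_km = apocynthion_nmi * NMI_TO_KM

print(f"{pericynthion_km:.1f} km")  # 108.0 km, matching the quoted figure
print(f"{apocynthion_km:.1f} km")   # 315.6 km, matching the quoted figure
```

Both results round to the kilometre figures given in the text, so the two unit systems quoted are consistent.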
Lunar surface.
The crew continued preparing for lunar module activation and undocking shortly after waking up to begin flight day five. The boom that extended the mass spectrometer in the SIM bay was stuck, semi-deployed. It was decided that Young and Duke would visually inspect the boom after undocking the LM from the CSM. They entered the LM for activation and checkout of the spacecraft's systems. Despite entering the LM 40 minutes ahead of schedule, they completed preparations only 10 minutes early due to numerous delays in the process. With the preparations finished, they undocked 96 hours, 13 minutes, 31 seconds into the mission.
For the rest of the two crafts' passes over the near side of the Moon, Mattingly prepared to shift "Casper" to a higher, near-circular orbit, while Young and Duke prepared "Orion" for the descent to the lunar surface. At this point, during tests of the CSM's steerable rocket engine in preparation for the burn to modify the craft's orbit, Mattingly detected oscillations in the SPS engine's backup gimbal system. According to mission rules, under such circumstances, "Orion" was to re-dock with "Casper", in case Mission Control decided to abort the landing and use the lunar module's engines for the return trip to Earth. Instead, the two craft kept station close to each other. After several hours of analysis, mission controllers determined that the malfunction could be worked around, and Young and Duke could proceed with the landing.
Powered descent to the lunar surface began about six hours behind schedule. Because of the delay, Young and Duke began their descent to the surface at an altitude higher than that of any previous mission, at . After descending to an altitude of about , Young was able to view the landing site in its entirety. Throttle-down of the LM's landing engine occurred on time, and the spacecraft tilted forward to its landing orientation at an altitude of . The LM landed north and west of the planned landing site at 104 hours, 29 minutes, and 35 seconds into the mission, at 2:23:35 UTC on April 21 (8:23:35 pm on April 20 in Houston). The availability of the Lunar Roving Vehicle rendered their distance from the targeted point trivial.
After landing, Young and Duke began powering down some of the LM's systems to conserve battery power. Upon completing their initial procedures, the pair configured "Orion" for their three-day stay on the lunar surface, removed their space suits and took initial geological observations of the immediate landing site. They then settled down for their first meal on the surface. After eating, they configured the cabin for sleep. The landing delay caused by the malfunction in the CSM's main engine necessitated significant modifications to the mission schedule. Apollo 16 would spend one less day in lunar orbit after surface exploration had been completed to afford the crew ample margins in the event of further problems. In order to improve Young's and Duke's sleep schedule, the third and final moonwalk of the mission was trimmed from seven hours to five.
First moonwalk.
After waking up on April 21, Young and Duke ate breakfast and began preparations for the first extravehicular activity (EVA), or moonwalk. After the pair donned and pressurized their space suits and depressurized the lunar module cabin, Young climbed out onto the "porch" of the LM, a small platform above the ladder. Duke handed Young a jettison bag full of trash to dispose of on the surface. Young then lowered the equipment transfer bag (ETB), containing equipment for use during the EVA, to the surface. Young descended the ladder and, upon setting foot on the lunar surface, became the ninth human to walk on the Moon. Upon stepping onto the surface, Young expressed his sentiments about being there: "There you are: Mysterious and unknown Descartes. Highland plains. Apollo 16 is gonna change your image. I'm sure glad they got ol' Brer Rabbit, here, back in the briar patch where he belongs." Duke soon descended the ladder and joined Young on the surface, becoming the tenth person to walk on the Moon. Duke was then aged 36; no younger human has ever walked on the lunar surface. Duke expressed his excitement, stating to CAPCOM Anthony England: "Fantastic! Oh, that first foot on the lunar surface is super, Tony!" The pair's first task of the moonwalk was to offload the Lunar Roving Vehicle, the Far Ultraviolet Camera/Spectrograph, and other equipment. This was done without problems. On first driving the lunar rover, Young discovered that the rear steering was not working. He alerted Mission Control to the problem before setting up the television camera, after which Duke erected the United States flag. During lunar surface operations, Commander Young always drove the rover, while Lunar Module Pilot Duke assisted with navigation; this was a division of responsibilities used consistently throughout Apollo's J missions.
The day's next task was to deploy the ALSEP; while they were parking the lunar rover, on which the TV camera was mounted, to observe the deployment, the rear steering began functioning. After ALSEP deployment, they collected samples in the vicinity. About four hours after the beginning of EVA-1, they mounted the lunar rover and drove to the first geologic stop, Plum Crater, a crater on the rim of Flag Crater, about across. There, at a distance of from the LM, they sampled material in the vicinity, which scientists believed had penetrated through the upper regolith layer to the underlying Cayley Formation. It was there that Duke retrieved, at the request of Mission Control, the largest rock returned by an Apollo mission, a breccia nicknamed Big Muley after mission geology principal investigator William R. Muehlberger. The next stop of the day was Buster Crater, a small crater located north of the larger Spook Crater, about from the LM. There, Duke took pictures of Stone Mountain and South Ray Crater, while Young deployed the LPM. By this point, scientists were beginning to reconsider their pre-mission hypothesis that Descartes had been the setting of ancient volcanic activity, as the two astronauts had yet to find any volcanic material. Following their stop at Buster, Young did a "Grand Prix" demonstration drive of the lunar rover, which Duke filmed with a 16 mm movie camera. This had been attempted on Apollo 15, but the camera had malfunctioned. After completing more tasks at the ALSEP, they returned to the LM to close out the moonwalk. They reentered the LM 7 hours, 6 minutes, and 56 seconds after the start of the EVA. Once inside, they pressurized the LM cabin, went through a half-hour debriefing with scientists in Mission Control, and configured the cabin for the sleep period.
Second moonwalk.
Waking up three and a half minutes earlier than planned, they discussed the day's timeline of events with Houston. The second lunar excursion's primary objective was to visit Stone Mountain to climb up the slope of about 20 degrees to reach a cluster of five craters known as "Cinco craters". They drove there in the LRV, traveling from the LM. At above the valley floor, the pair were at the highest elevation above the LM of any Apollo mission. They marveled at the view (including South Ray) from the side of Stone Mountain, which Duke described as "spectacular", then gathered samples in the vicinity. After spending 54 minutes on the slope, they climbed aboard the lunar rover en route to the day's second stop, dubbed Station 5, a crater across. There, they hoped to find Descartes material that had not been contaminated by ejecta from South Ray Crater, a large crater south of the landing site. The samples they collected there, though their origin is still uncertain, are, according to geologist Wilhelms, "a reasonable bet to be Descartes".
The next stop, Station 6, was a blocky crater, where the astronauts believed they could sample the Cayley Formation as evidenced by the firmer soil found there. Bypassing Station 7 to save time, they arrived at Station 8 on the lower flank of Stone Mountain, where they sampled material on a ray from South Ray crater for about an hour. There, they collected black and white breccias and smaller, crystalline rocks rich in plagioclase. At Station 9, an area known as the "Vacant Lot", which was believed to be free of ejecta from South Ray, they spent about 40 minutes gathering samples. Twenty-five minutes after departing the Vacant Lot, they arrived at the final stop of the day, halfway between the ALSEP site and the LM. There, they dug a double core and conducted several penetrometer tests along a line stretching east of the ALSEP. At the request of Young and Duke, the moonwalk was extended by ten minutes. After returning to the LM to wrap up the second lunar excursion, they climbed back inside the landing craft's cabin, sealing and pressurizing the interior after 7 hours, 23 minutes, and 26 seconds of EVA time, breaking a record that had been set on Apollo 15. After eating a meal and proceeding with a debriefing on the day's activities with Mission Control, they reconfigured the LM cabin and prepared for the sleep period.
Third moonwalk.
Flight day seven was their third and final day on the lunar surface; they would return to orbit to rejoin Mattingly in the CSM following the day's moonwalk. During the third and final lunar excursion, they were to explore North Ray crater, the largest crater visited by any Apollo expedition. After exiting "Orion", the pair drove to North Ray crater. The drive was smoother than that of the previous day, as the craters were shallower and boulders were less abundant north of the immediate landing site. After passing Palmetto crater, boulders gradually became larger and more abundant as they approached North Ray in the lunar rover. Upon arriving at the rim of North Ray crater, they were away from the LM. After their arrival, the duo took photographs of the wide and deep crater. They visited a large boulder, taller than a four-story building, which became known as "House Rock". Samples obtained from this boulder delivered the final blow to the pre-mission volcanic hypothesis, proving it incorrect. House Rock bore numerous bullet hole-like marks where micrometeoroids had struck it.
About 1 hour and 22 minutes after arriving at North Ray crater, they departed for Station 13, a large boulder field about from North Ray. On the way, they set a lunar speed record, traveling at an estimated downhill. They arrived at a high boulder, which they called "Shadow Rock". Here, they sampled permanently shadowed soil. During this time, Mattingly was preparing the CSM in anticipation of their return approximately six hours later. After three hours and six minutes, they returned to the LM, where they completed several experiments and unloaded the rover. A short distance from the LM, Duke placed a photograph of his family and an Air Force commemorative medallion on the surface. Young drove the rover to a point about east of the LM, known as the "VIP site", so its television camera, controlled remotely by Mission Control, could observe Apollo 16's liftoff from the Moon. They then reentered the LM after a 5-hour and 40-minute final excursion. After pressurizing the LM cabin, the crew began preparing to return to lunar orbit.
Solo activities.
After "Orion" was cleared for the landing attempt, "Casper" maneuvered away, and Mattingly performed a burn that took his spacecraft to an orbit of in preparation for his scientific work. The SM carried a suite of scientific instruments in its SIM bay, similar to those carried on Apollo 15. Mattingly had compiled a busy schedule operating the various SIM bay instruments, one that became even busier once Houston decided to bring Apollo 16 home a day early, as the flight directors sought to make up for lost time.
His work was hampered by various malfunctions: when the Panoramic Camera was turned on, it appeared to draw so much power from one of the CSM's electrical systems that it triggered the spacecraft's Master Alarm. It was immediately shut off, though later analysis indicated that the drain might have been from the spacecraft's heaters, which came on at the same time. The camera's work was also hampered by the delay in the beginning of "Casper"'s orbital scientific work and the early return to Earth, and by a malfunction resulting in the overexposure of many of the photographs. Nevertheless, it succeeded in taking a photograph of the Descartes area in which "Orion" is visible. The Mass Spectrometer boom did not fully retract following its initial extension, as had happened on Apollo 15, though it retracted far enough to allow the SPS engine to be fired safely when "Casper" maneuvered away from "Orion" before the LM began its Moon landing attempt. Although the Mass Spectrometer was able to operate effectively, the boom stuck near its fully deployed position prior to the burn that preceded rendezvous, and had to be jettisoned. Scientists had hoped to supplement the lunar data with more gathered on the trans-earth coast, but Apollo 15 data could be used instead. The Mapping Camera also did not function perfectly; later analysis found it to have problems with its glare shield. The changes to the flight plan meant that some areas of the lunar surface that were supposed to be photographed could not be; also, a number of images were overexposed. The Laser Altimeter, designed to accurately measure the spacecraft's altitude, slowly lost accuracy due to reduced power, and finally failed just before it was due to be used for the last time.
Return to Earth.
Eight minutes before the planned departure from the lunar surface, CAPCOM James Irwin notified Young and Duke from Mission Control that they were go for liftoff. Two minutes before launch, they activated the "Master Arm" switch and then the "Abort Stage" button, causing small explosive charges to sever the ascent stage from the descent stage, with cables connecting the two severed by a guillotine-like mechanism. At the pre-programmed moment, the ascent stage blasted away from the Moon, as the camera aboard the LRV followed the first moments of the flight. Six minutes after liftoff, at a speed of about , Young and Duke reached lunar orbit, where they successfully rendezvoused and re-docked with Mattingly in the CSM. To minimize the transfer of lunar dust from the LM cabin into the CSM, Young and Duke cleaned the cabin before opening the hatch separating the two spacecraft. After opening the hatch and reuniting with Mattingly, the crew moved the samples Young and Duke had collected on the surface into the CSM for return to Earth. Once the transfers were completed, the crew would sleep before jettisoning the empty lunar module ascent stage the next day, when it was to be crashed intentionally into the lunar surface in order to calibrate the seismometer Young and Duke had left on the surface.
The next day, after final checks were completed, the expended LM ascent stage was jettisoned. Likely because of a failure by the crew to activate a certain switch in the LM before sealing it off, it tumbled after separation. NASA could not control it, and it did not execute the rocket burn necessary for the craft's intentional de-orbit. The ascent stage eventually crashed into the lunar surface nearly a year after the mission. The crew's next task, after jettisoning the lunar module ascent stage, was to release a subsatellite into lunar orbit from the CSM's scientific instrument bay. The burn to alter the CSM's orbit to that desired for the subsatellite had been cancelled; as a result, the subsatellite lasted just over a month in orbit, far less than its anticipated one year. Just under five hours after the subsatellite release, on the CSM's 65th orbit around the Moon, its service propulsion system main engine was reignited to propel the craft on a trajectory that would return it to Earth. The SPS engine performed the burn flawlessly despite the malfunction that had delayed their landing several days previously.
During the return to Earth, Mattingly performed an 83-minute EVA to retrieve film cassettes from the cameras in the SIM bay, with assistance from Duke, who remained at the command module's hatch. At approximately from Earth, it was the second "deep space" EVA in history, performed at great distance from any planetary body. It remains one of only three such EVAs, all performed during Apollo's J-missions under similar circumstances. During the EVA, Mattingly set up a biological experiment, the Microbial Ecology Evaluation Device (MEED), an experiment unique to Apollo 16, to evaluate the response of microbes to the space environment. The crew carried out various housekeeping and maintenance tasks aboard the spacecraft and ate a meal before concluding the day.
The penultimate day of the flight was largely spent performing experiments, aside from a twenty-minute press conference during the second half of the day. During the press conference, the astronauts answered questions pertaining to several technical and non-technical aspects of the mission, prepared by journalists covering the flight and listed by priority at the Manned Spacecraft Center in Houston. In addition to numerous housekeeping tasks, the astronauts prepared the spacecraft for its atmospheric reentry the next day. At the end of the crew's final full day in space, the spacecraft was approximately from Earth and closing at a rate of about .
When the wake-up call was issued to the crew for their final day in space by CAPCOM England, the CSM was about from Earth, traveling just over . Just over three hours before splashdown in the Pacific Ocean, the crew performed a final course correction burn, using the spacecraft's thrusters to change their velocity by . Approximately ten minutes before reentry into Earth's atmosphere, the cone-shaped command module containing the three crewmembers separated from the service module, which would burn up during reentry. At 265 hours and 37 minutes into the mission, at a velocity of about , Apollo 16 began atmospheric reentry. At its maximum, the temperature of the heat shield was between . After successful parachute deployment and less than 14 minutes after reentry began, the command module splashed down in the Pacific Ocean southeast of the island of Kiritimati, 265 hours, 51 minutes, and 5 seconds after liftoff. The spacecraft and its crew were retrieved by the aircraft carrier USS "Ticonderoga". The astronauts were safely aboard the "Ticonderoga" 37 minutes after splashdown.
Scientific results and aftermath.
Scientific analysis of the rocks brought back to Earth confirmed that the Cayley Formation was not volcanic in nature. There was less certainty regarding the Descartes Formation, as it was not clear which, if any, of the rocks came from there. There was no evidence that showed that Stone Mountain was volcanic. One reason why Descartes had been selected was that it was visually different from previous Apollo landing sites, but rocks from there proved to be closely related to those from the Fra Mauro Formation, Apollo 14's landing site. Geologists realized that they had been so certain that Cayley was volcanic that they had not been open to dissenting views, and that they had been over-reliant on analogues from Earth, a flawed model because the Moon does not share much of the Earth's geologic history. They concluded that there are few if any volcanic mountains on the Moon. These conclusions were informed by observations from Mattingly, the first CMP to use binoculars in his observations, who had seen that from the perspective of lunar orbit, there was nothing distinctive about the Descartes Formation—it fit right in with the Mare Imbrium structure. Other results gained from Apollo 16 included the discovery of two new auroral belts around Earth.
After the mission, Young and Duke served as backups for Apollo 17, and Duke retired from NASA in December 1975. Young and Mattingly both flew the Space Shuttle: Young, who served as Chief Astronaut from 1974 to 1987, commanded the first Space Shuttle mission, STS-1 in 1981, as well as STS-9 in 1983, on the latter mission becoming the first person to journey into space six times. He retired from NASA in 2004. Mattingly also twice commanded Shuttle missions, STS-4 (1982) and STS-51-C (1985), before retiring from NASA in 1985.
Locations of spacecraft and other equipment.
The "Ticonderoga" delivered the Apollo 16 command module to the North Island Naval Air Station, near San Diego, California, on Friday, May 5, 1972. On Monday, May 8, ground service equipment being used to empty the residual toxic reaction control system fuel in the command module tanks exploded in a Naval Air Station hangar. Forty-six people were sent to the hospital for 24 to 48 hours' observation, most suffering from inhalation of toxic fumes. Most seriously injured was a technician who suffered a fractured kneecap when a cart overturned on him. A hole was blown in the hangar roof 250 feet above; about 40 windows in the hangar were shattered. The command module suffered a three-inch gash in one panel.
The Apollo 16 command module "Casper" is on display at the U.S. Space & Rocket Center in Huntsville, Alabama, following a transfer of ownership from NASA to the Smithsonian in November 1973. The lunar module ascent stage separated from the CSM on April 24, 1972, but NASA lost control of it. It orbited the Moon for about a year. Its impact site remains unknown, though research published in 2023 suggests an impact date of May 29, 1972 (the same as for the subsatellite) and an impact location of 9.99° N, 104.26° E.
The S-IVB was deliberately crashed into the Moon. However, due to a communication failure before impact, the exact location was unknown until January 2016, when it was discovered within Mare Insularum by the Lunar Reconnaissance Orbiter, approximately southwest of Copernicus Crater.
Duke left two items on the Moon, both of which he photographed while there. One is a plastic-encased photo portrait of his family. The reverse of the photo is signed by Duke's family and bears this message: "This is the family of Astronaut Duke from Planet Earth. Landed on the Moon, April 1972." The other item was a commemorative medal issued by the United States Air Force, which was celebrating its 25th anniversary in 1972. He took two medals, leaving one on the Moon and donating the other to the National Museum of the United States Air Force at Wright-Patterson Air Force Base in Ohio.
In 2006, shortly after Hurricane Ernesto affected Bath, North Carolina, eleven-year-old Kevin Schanze discovered a piece of metal debris on the ground near his beach home. Schanze and a friend discovered a "stamp" on the flat metal sheet, which upon further inspection turned out to be a faded copy of the Apollo 16 mission insignia. NASA later confirmed the object to be a piece of the first stage of the Saturn V that had launched Apollo 16 into space. In July 2011, after returning the piece of debris at NASA's request, 16-year-old Schanze was given an all-access tour of the Kennedy Space Center and VIP seating for the launch of STS-135, the final mission of the Space Shuttle program.
1971 | Apollo 17 | Apollo 17 (December 7–19, 1972) was the final mission of NASA's Apollo program, the most recent time humans have set foot on the Moon or traveled beyond low Earth orbit. Commander Gene Cernan and Lunar Module Pilot Harrison Schmitt walked on the Moon, while Command Module Pilot Ronald Evans orbited above. Schmitt was the only professional geologist to land on the Moon; he was selected in place of Joe Engle, as NASA had been under pressure to send a scientist to the Moon. The mission's heavy emphasis on science meant the inclusion of a number of new experiments, including a biological experiment containing five mice that was carried in the command module.
Mission planners had two primary goals in deciding on the landing site: to sample lunar highland material older than that at Mare Imbrium and to investigate the possibility of relatively recent volcanic activity. They therefore selected Taurus–Littrow, where formations that had been viewed and pictured from orbit were thought to be volcanic in nature. Since all three crew members had backed up previous Apollo lunar missions, they were familiar with the Apollo spacecraft and had more time for geology training.
Launched at 12:33 a.m. Eastern Standard Time (EST) on December 7, 1972, following the only launch-pad delay of the entire Apollo program to be caused by a hardware problem, Apollo 17 was a "J-type" mission that included three days on the lunar surface, expanded scientific capability, and the use of the third Lunar Roving Vehicle (LRV). Cernan and Schmitt landed in the Taurus–Littrow valley, completed three moonwalks, took lunar samples and deployed scientific instruments. Orange soil was discovered at Shorty crater; it proved to be volcanic in origin, although from early in the Moon's history. Evans remained in lunar orbit in the command and service module (CSM), taking scientific measurements and photographs. The spacecraft returned to Earth on December 19.
The mission broke several records for crewed spaceflight, including the longest crewed lunar landing mission (12 days, 14 hours), greatest distance from a spacecraft during an extravehicular activity of any type (7.6 kilometers or 4.7 miles), longest total duration of lunar-surface extravehicular activities (22 hours, 4 minutes), largest lunar-sample return (approximately 115 kg or 254 lb), longest time in lunar orbit (6 days, 4 hours), and greatest number of lunar orbits (75).
Crew and key Mission Control personnel.
In 1969, NASA announced that the backup crew of Apollo 14 would be Gene Cernan, Ronald Evans, and former X-15 pilot Joe Engle. This put them in line to be the prime crew of Apollo 17, because the Apollo program's crew rotation generally meant that a backup crew would fly as prime crew three missions later. Harrison Schmitt, who was a professional geologist as well as an astronaut, had served on the backup crew of Apollo 15, and thus, because of the rotation, would have been due to fly as lunar module pilot on Apollo 18.
In September 1970, the plan to launch Apollo 18 was cancelled. The scientific community pressed NASA to assign a geologist, rather than a pilot with non-professional geological training, to an Apollo landing. NASA subsequently assigned Schmitt to Apollo 17 as the lunar module pilot. After that, NASA's director of flight crew operations, Deke Slayton, was left with the question of who would fill the two other Apollo 17 slots: the rest of the Apollo 15 backup crew (Dick Gordon and Vance Brand), or Cernan and Evans from the Apollo 14 backup crew. Slayton ultimately chose Cernan and Evans. Support at NASA for assigning Cernan was not unanimous. Cernan had crashed a Bell 47G helicopter into the Indian River near Cape Kennedy during a training exercise in January 1971; the accident was later attributed to pilot error, as Cernan had misjudged his altitude before crashing into the water. Jim McDivitt, who was manager of the Apollo Spacecraft Program Office at the time, objected to Cernan's selection because of this accident, but Slayton dismissed the concern. After Cernan was offered command of the mission, he advocated for Engle to fly with him, but it was made clear to him that Schmitt would be assigned instead, with or without Cernan, so he acquiesced. The prime crew of Apollo 17 was publicly announced on August 13, 1971.
When assigned to Apollo 17, Cernan was a 38-year-old captain in the United States Navy; he had been selected in the third group of astronauts in 1963, and flown as pilot of Gemini 9A in 1966 and as lunar module pilot of Apollo 10 in 1969 before he served on Apollo 14's backup crew. Evans, 39 years old when assigned to Apollo 17, had been selected as part of the fifth group of astronauts in 1966, and had been a lieutenant commander in the United States Navy. Schmitt, a civilian, was 37 years old when assigned to Apollo 17; he had a doctorate in geology from Harvard University and had been selected in the fourth group of astronauts in 1965. Both Evans and Schmitt were making their first spaceflights.
For the backup crews of Apollo 16 and 17, the final Apollo lunar missions, NASA selected astronauts who had already flown Apollo lunar missions, to take advantage of their experience, and avoid investing time and money in training rookies who would be unlikely to ever fly an Apollo mission. The original backup crew for Apollo 17, announced at the same time as the prime crew, was the crew of Apollo 15: David Scott as commander, Alfred Worden as CMP and James Irwin as LMP, but in May 1972 they were removed from the backup crew because of their roles in an incident known as the Apollo 15 postal covers incident. They were replaced with the landing crew of Apollo 16: John W. Young as backup crew commander, Charles Duke as LMP, and Apollo 14's CMP, Stuart Roosa. Originally, Apollo 16's CMP, Ken Mattingly, was to be assigned along with his crewmates, but he declined so he could spend more time with his family, his son having just been born, and instead took an assignment to the Space Shuttle program. Roosa had also served as backup CMP for Apollo 16.
For the Apollo program, in addition to the prime and backup crews that had been used in the Mercury and Gemini programs, NASA assigned a third crew of astronauts, known as the support crew. Their role was to provide any assistance in preparing for the missions that the mission director assigned them. Preparations took place in meetings at facilities across the US and sometimes needed a member of the flight crew to attend them. Because McDivitt was concerned that problems could be created if a prime or backup crew member was unable to attend a meeting, Slayton created the support crews to ensure that someone would be able to attend in their stead. Usually low in seniority, they also assembled the mission's rules, flight plan and checklists, and kept them updated; for Apollo 17, they were Robert F. Overmyer, Robert A. Parker and C. Gordon Fullerton.
Flight directors were Gerry Griffin, first shift, Gene Kranz and Neil B. Hutchinson, second shift, and Pete Frank and Charles R. Lewis, third shift. According to Kranz, flight directors during the Apollo program had a one-sentence job description: "The flight director may take any actions necessary for crew safety and mission success." Capsule communicators (CAPCOMs) were Fullerton, Parker, Young, Duke, Mattingly, Roosa, Alan Shepard and Joseph P. Allen.
Mission insignia and call signs.
The insignia's most prominent feature is an image of the Greek sun god Apollo backdropped by a rendering of an American eagle, the red bars on the eagle mirroring those on the U.S. flag. Three white stars above the red bars represent the three crewmembers of the mission. The background includes the Moon, the planet Saturn, and a galaxy or nebula. The wing of the eagle partially overlays the Moon, suggesting humanity's established presence there.
The insignia includes, along with the colors of the U.S. flag (red, white, and blue), the color gold, representative of a "golden age" of spaceflight that was to begin with Apollo 17. The image of Apollo in the mission insignia is a rendering of the "Apollo Belvedere" sculpture in the Vatican Museums. It looks forward into the future, towards the celestial objects shown in the insignia beyond the Moon. These represent humanity's goals, and the image symbolizes human intelligence, wisdom and ambition. The insignia was designed by artist Robert McCall, based on ideas from the crew.
In deciding the call signs for the command module (CM) and lunar module (LM), the crew wished to pay tribute to the American public for their support of the Apollo program, and to the mission, and wanted names with a tradition within American history. The CM was given the call sign "America". According to Cernan, this evoked the 19th century sailing ships which were given that name, and was a thank-you to the people of the United States. The crew selected the name "Challenger" for the LM in lieu of an alternative, "Heritage". Cernan stated that the selected name "just seemed to describe more of what the future for America really held, and that was a challenge". After Schmitt stepped onto the Moon from "Challenger", he stated, "I think the next generation ought to accept this as a challenge. Let's see them leave footprints like these."
Planning and training.
Scheduling and landing site selection.
Prior to the cancellation of Apollo 18 through 20, Apollo 17 was slated to launch in September 1971 as part of NASA's tentative launch schedule set forth in 1969. The in-flight abort of Apollo 13 and the resulting modifications to the Apollo spacecraft delayed subsequent missions. Following the cancellation of Apollo 20 in early 1970, NASA decided there would be no more than two Apollo missions per year. Part of the reason Apollo 17 was scheduled for December 1972 was to make it fall after the presidential election in November, ensuring that if there was a disaster, it would have no effect on President Richard Nixon's re-election campaign. Nixon had been deeply concerned about the Apollo 13 astronauts, and, fearing another mission in crisis as he ran for re-election, initially decided to omit the funds for Apollo 17 from the budget; he was persuaded to accept a December 1972 date for the mission.
Like Apollo 15 and 16, Apollo 17 was slated to be a "J-mission", an Apollo mission type that featured lunar surface stays of three days, higher scientific capability, and the usage of the Lunar Roving Vehicle. Since Apollo 17 was to be the final lunar landing of the Apollo program, high-priority landing sites that had not been visited previously were given consideration for potential exploration. Some sites were rejected at earlier stages. For instance, a landing in the crater Copernicus was rejected because Apollo 12 had already obtained samples from that impact, and three other Apollo expeditions had already visited the vicinity of Mare Imbrium, near the rim of which Copernicus is located. The lunar highlands near the crater Tycho were rejected because of the rough terrain that the astronauts would encounter there. A site on the lunar far side in the crater Tsiolkovskiy was rejected due to technical considerations and the operational costs of maintaining communication with Earth during surface operations. Lastly, a landing in a region southwest of Mare Crisium was rejected on the grounds that a Soviet spacecraft could easily access the site and retrieve samples; Luna 20 ultimately did so shortly after the Apollo 17 site selection was made. Schmitt advocated for a landing on the far side of the Moon until told by Director of Flight Operations Christopher C. Kraft that it would not happen as NASA lacked the funds for the necessary communications satellites.
The three sites that made the final consideration for Apollo 17 were Alphonsus crater, Gassendi crater, and the Taurus–Littrow valley. In making the final landing site decision, mission planners considered the primary objectives for Apollo 17: obtaining old highlands material a substantial distance from Mare Imbrium, sampling material from young volcanic activity (i.e., less than three billion years), and having minimal ground overlap with the orbital ground tracks of Apollo 15 and Apollo 16 to maximize the amount of new data obtained. A significant reason for the selection of Taurus–Littrow was that Apollo 15's CMP, Al Worden, had overflown the site and observed features he described as likely volcanic in nature.
Gassendi was eliminated because NASA felt that its central peak would be difficult to reach due to the roughness of the local terrain, and, though Alphonsus might be easier operationally than Taurus–Littrow, it was of lesser scientific interest. At Taurus–Littrow, it was believed that the crew would be able to obtain samples of old highland material from the remnants of a landslide event that occurred on the south wall of the valley and the possibility of relatively young, explosive volcanic activity in the area. Although the valley is similar to the landing site of Apollo 15 in that it is on the border of a lunar mare, the advantages of Taurus–Littrow were believed to outweigh the drawbacks. The Apollo Site Selection Board, a committee of NASA personnel and scientists charged with setting out scientific objectives of the Apollo landing missions and selecting landing sites for them, unanimously recommended Taurus–Littrow at its final meeting in February 1972. Upon that recommendation, NASA selected Taurus–Littrow as the landing site for Apollo 17.
Training.
As with previous lunar landings, the Apollo 17 astronauts undertook an extensive training program that included learning to collect samples on the surface, usage of the spacesuits, navigation in the Lunar Roving Vehicle, field geology training, survival training, splashdown and recovery training, and equipment training. The geology field trips were conducted as much as possible as if the astronauts were on the Moon: they would be provided with aerial images and maps, and briefed on features of the site and a suggested routing. The following day, they would follow the route, and have tasks and observations to be done at each of the stops.
The geology field trips began with one to Big Bend National Park in Texas in October 1971. The early ones were not specifically tailored to prepare the astronauts for Taurus–Littrow, which was not selected until February 1972, but by June, the astronauts were going on field trips to sites specifically selected to prepare for Apollo 17's landing site. Both Cernan and Schmitt had served on backup crews for Apollo landing missions, and were familiar with many of the procedures. Their trainers, such as Gordon Swann, feared that Cernan would defer to Schmitt as a professional geologist on matters within his field. Cernan also had to adjust for the loss of Engle, with whom he had trained for Apollo 14. In spite of these issues, Cernan and Schmitt worked well together as a team, and Cernan became adept at describing what he was seeing on geology field trips, and working independently of Schmitt when necessary.
The landing crew aimed for a division of labor so that, when they arrived in a new area, Cernan would perform tasks such as adjusting the antenna on the Lunar Roving Vehicle so as to transmit to Earth while Schmitt gave a report on the geological aspects of the site. The scientists in the geology "backroom" relied on Schmitt's reports to adjust the tasks planned for that site, which would be transmitted to the CapCom and then to Cernan and Schmitt. According to William R. Muehlberger, one of the scientists who trained the astronauts, "In effect [Schmitt] was running the mission from the Moon. But we set it up this way. All of those within the geological world certainly knew it, and I had a sneaking hunch that the top brass knew it too, but this is a practical way out, and they didn't object."
Also participating in some of the geology field trips were the commander and lunar module pilot of the backup crew. The initial field trips took place before the Apollo 15 astronauts were assigned as the backup crew for Apollo 17 in February 1972. Either one or both of Scott and Irwin of Apollo 15 took part in four field trips, though both were present together for only two of them. After they were removed from the backup crew, the new backup commander and LMP, Young and Duke, took part in the final four field trips. On field trips, the backup crew would follow half an hour after the prime crew, performing identical tasks, and have their own simulated CapCom and Mission Control guiding them. The Apollo 17 astronauts had fourteen field trips—the Apollo 11 crew had only one.
Evans did not go on the geology field trips, having his own set of trainers—by this time, geology training for the CMP was well-established. He would fly with a NASA geologist/pilot, Dick Laidley, over geologic features, with part of the exercise conducted at , and part at to . The higher altitude was equivalent to what could be seen from the planned lunar orbit of about 60 nmi with binoculars. Evans would be briefed for several hours before each exercise, and given study guides; afterwards, there would be debriefing and evaluation. Evans was trained in lunar geology by Farouk El-Baz late in the training cycle; this continued until close to launch. The CMP was given information regarding the lunar features he would overfly in the CSM and which he was expected to photograph.
Mission hardware and experiments.
Spacecraft and launch vehicle.
The Apollo 17 spacecraft comprised CSM-114 (consisting of Command Module 114 (CM-114) and Service Module 114 (SM-114)); Lunar Module 12 (LM-12); a Spacecraft-Lunar Module Adapter (SLA) numbered SLA-21; and a Launch Escape System (LES). The LES contained a rocket motor that would propel the CM to safety in the event of an aborted mission in the moments after launch, while the SLA housed the LM during the launch and early part of the flight. The LES was jettisoned after the launch vehicle ascended to the point that it was not needed, while the SLA was left atop the S-IVB third stage of the rocket after the CSM and LM separated from it.
The launch vehicle, SA-512, was one of fifteen Saturn V rockets built, and was the twelfth to fly. With a weight at launch of ( of which was attributable to the spacecraft), Apollo 17's vehicle was slightly lighter than Apollo 16, but heavier than every other crewed Apollo mission.
Preparation and assembly.
The first piece of the launch vehicle to arrive at Kennedy Space Center was the S-II second stage, on October 27, 1970; it was followed by the S-IVB on December 21; the S-IC first stage did not arrive until May 11, 1972, followed by the Instrument Unit on June 7. By then, LM-12 had arrived, the ascent stage on June 16, 1971, and the descent stage the following day; they were not mated until May 18, 1972. CM-114, SM-114 and SLA-21 all arrived on March 24, 1972. The rover reached Kennedy Space Center on June 2, 1972.
The CM and the service module (SM) were mated on March 28, 1972, and the testing of the spacecraft began that month. The CSM was placed in a vacuum chamber at Kennedy Space Center, and the testing was conducted under those conditions. The LM was also placed in a vacuum chamber; both the prime and the backup crews participated in testing the CSM and LM. During the testing, it was discovered that the LM's rendezvous radar assembly had received too much voltage during earlier tests; it was replaced by the manufacturer, Grumman. The LM's landing radar also malfunctioned intermittently and was also replaced. The front and rear steering motors of the Lunar Roving Vehicle (LRV) also had to be replaced, and the rover required several other modifications. Following its removal from the vacuum chamber in July 1972, the LM's landing gear was installed, and the LM, CSM and SLA were mated to each other. The combined craft was moved into the Vehicle Assembly Building in August for further testing, after which it was mounted on the launch vehicle. After completing testing, including a simulated mission, the LRV was placed in the LM on August 13.
Erection of the stages of the launch vehicle began on May 15, 1972, in High Bay 3 of the Vehicle Assembly Building, and was completed on June 27. Since the launch vehicles for Skylab 1 and Skylab 2 were being processed in that building at the same time, this marked the first time NASA had three launch vehicles there since the height of the Apollo program in 1969. After the spacecraft was mounted on the launch vehicle on August 24, it was rolled out to Pad 39-A on August 28. Although this was not the final time a Saturn V would fly (another would lift Skylab to orbit), area residents reacted as though it was, and 5,000 of them watched the rollout, during which the prime crew joined the operating crew from Bendix atop the crawler.
At Pad 39-A, testing continued, and the CSM was electrically mated to the launch vehicle on October 11, 1972. Testing concluded with the countdown demonstration tests, accomplished on November 20 and 21. The countdown to launch began at 7:53 a.m. (12:53 UTC) on December 5, 1972.
Lunar surface science.
ALSEP.
The Apollo Lunar Surface Experiments Package was a suite of nuclear-powered experiments, flown on each landing mission after Apollo 11. This equipment was to be emplaced by the astronauts to continue functioning after the astronauts returned to Earth. For Apollo 17, the ALSEP experiments were a Heat Flow Experiment (HFE), to measure the rate of heat flow from the interior of the Moon, a Lunar Surface Gravimeter (LSG), to measure alterations in the lunar gravity field at the site, a Lunar Atmospheric Composition Experiment (LACE), to investigate the composition of the lunar atmosphere, a Lunar Seismic Profiling Experiment (LSPE), to detect nearby seismic activity, and a Lunar Ejecta and Meteorites Experiment (LEME), to measure the velocity and energy of dust particles. Of these, only the HFE had been flown before; the others were new.
The HFE had been flown on the aborted Apollo 13 mission, as well as on Apollo 15 and 16, but placed successfully only on Apollo 15, and unexpected results from that device made scientists anxious for a second successful emplacement. It was successfully deployed on Apollo 17. The lunar gravimeter was intended to detect gravitational waves, which would provide support for Albert Einstein's general theory of relativity; it ultimately failed to function as intended. The LACE was a surface-deployed module that used a mass spectrometer to analyze the Moon's atmosphere. On previous missions, the Cold Cathode Gauge experiment had measured the quantity of atmospheric particles, but the LACE determined which gases were present: principally neon, helium and hydrogen. The LSPE was a seismic-detecting device that used geophones, which would detect the detonation of explosive packages to be set off by ground command once the astronauts left the Moon. When operating, it could send useful data to Earth only at a high bit rate, meaning that no other ALSEP experiment could send data at the same time, which limited its operating time. It was turned on to detect the liftoff of the ascent stage, the detonation of the explosive packages, and the ascent stage's impact, and thereafter about once a week, as well as for some 100-hour periods. The LEME had a set of detectors to measure the characteristics of the dust particles it sought. It was hoped that the LEME would detect dust impacting the Moon from elsewhere, such as from comets or interstellar space, but analysis showed that it primarily detected dust moving at slow speeds across the lunar surface.
All powered ALSEP experiments that remained active were deactivated on September 30, 1977, principally because of budgetary constraints.
Other lunar-surface science.
Like Apollo 15 and 16, Apollo 17 carried a Lunar Roving Vehicle. In addition to being used by the astronauts for transport from station to station on the mission's three moonwalks, the LRV was used to transport the astronauts' tools, communications equipment, and the lunar samples they gathered. The Apollo 17 LRV was also used to carry some of the scientific instruments, such as the Traverse Gravimeter Experiment (TGE) and Surface Electrical Properties (SEP) experiment. The Apollo 17 LRV traveled a cumulative distance of approximately in a total drive time of about four hours and twenty-six minutes; the greatest distance Cernan and Schmitt traveled from the lunar module was about .
This was the only mission to carry the TGE, which was built by Draper Laboratory at the Massachusetts Institute of Technology. As gravimeters had been useful in studying the Earth's internal structure, the objective of this experiment was to do the same on the Moon. The gravimeter was used to obtain relative gravity measurements at the landing site in the immediate vicinity of the lunar module, as well as various locations on the mission's traverse routes. Scientists would then use this data to help determine the geological substructure of the landing site and the surrounding vicinity. Measurements were taken while the TGE was mounted on the LRV, and also while the device was placed on the lunar surface. A total of 26 measurements were taken with the TGE during the mission's three moonwalks, with productive results.
The SEP was also unique to Apollo 17, and included two major components: a transmitting antenna deployed near the lunar module and a receiver mounted on the LRV. At different stops during the mission's traverses, electrical signals traveled from the transmitting device, through the ground, and were received at the LRV. The electrical properties of the lunar regolith could be determined by comparison of the transmitted and received electrical signals. The results of this experiment, which are consistent with lunar rock composition, show that there is almost no water in the area of the Moon in which Apollo 17 landed, to a depth of .
A long, diameter device, the Lunar Neutron Probe was inserted into one of the holes drilled into the surface to collect core samples. It was designed to measure the quantity of neutrons which penetrated to the detectors it bore along its length. This was intended to measure the rate of the "gardening" process on the lunar surface, whereby the regolith on the surface is slowly mixed or buried due to micrometeorites and other events. Placed during the first EVA, it was retrieved during the third and final EVA. The astronauts brought it with them back to Earth, and the measurements from it were compared with the evidence of neutron flux in the core that had been removed from the hole it had been placed in. Results from the probe and from the cores were instrumental in current theories that the top centimeter of lunar regolith turns over every million years, whereas "gardening" to a depth of one meter takes about a billion years.
Orbital science.
Biological experiments.
Apollo 17's CM carried a biological cosmic ray experiment (BIOCORE), containing five mice that had been implanted with radiation monitors under their scalps to see whether they suffered damage from cosmic rays. These animals were placed in individual metal tubes inside a sealed container that had its own oxygen supply, and flown on the mission. All five were pocket mice ("Perognathus longimembris"); this species was chosen because it was well-documented, small, easy to maintain in an isolated state (not requiring drinking water during the mission and with highly concentrated waste), and for its ability to withstand environmental stress. Officially, the mice—four male and one female—were assigned the identification numbers A3326, A3400, A3305, A3356 and A3352. Unofficially, according to Cernan, the Apollo 17 crew dubbed them Fe, Fi, Fo, Fum, and Phooey.
Four of the five mice survived the flight, though only two of them appeared healthy and active; the cause of death of the fifth mouse was not determined. Of those that survived, the study found lesions in the scalp itself and, in one case, the liver. The scalp lesions and liver lesions appeared to be unrelated to one another; nothing was found that could be attributed to cosmic rays.
The Biostack experiment was similar to one carried on Apollo 16, and was designed to test the effects of the cosmic rays encountered in space travel on microorganisms that were included, on seeds, and on the eggs of simple animals (brine shrimp and beetles), which were carried in a sealed container. After the mission, the microorganisms and seeds showed little effect, but many of the eggs of all species failed to hatch, or to mature normally; many died or displayed abnormalities.
Scientific Instrument Module.
The Apollo 17 SM contained the scientific instrument module (SIM) bay. The SIM bay housed three new experiments for use in lunar orbit: a lunar sounder, an infrared scanning radiometer, and a far-ultraviolet spectrometer. A mapping camera, panoramic camera, and a laser altimeter, which had been carried previously, were also included in the SIM bay.
The lunar sounder was to beam electromagnetic impulses toward the lunar surface, with the objective of obtaining data to assist in developing a geological model of the interior of the Moon to an approximate depth of . The infrared scanning radiometer was designed to generate a temperature map of the lunar surface to aid in locating surface features such as rock fields, structural differences in the lunar crust, and volcanic activity. The far-ultraviolet spectrometer was to be used to obtain information on the composition, density, and constituency of the lunar atmosphere, and also to detect far-UV radiation emitted by the Sun that had been reflected off the lunar surface. The laser altimeter was designed to measure the altitude of the spacecraft above the lunar surface within approximately , providing altitude information to the panoramic and mapping cameras, which were also in the SIM bay.
Light-flash phenomenon and other experiments.
Beginning with Apollo 11, crew members observed light flashes that penetrated their closed eyelids. These flashes, described by the astronauts as "streaks" or "specks" of light, were usually observed while the spacecraft was darkened during a sleep period. These flashes, while not observed on the lunar surface, would average about two per minute and were observed by the crew members during the trip out to the Moon, back to Earth, and in lunar orbit.
The Apollo 17 crew repeated an experiment, also conducted on Apollo 16, with the objective of linking these light flashes with cosmic rays. Evans wore a device over his eyes that recorded the time, strength, and path of high-energy atomic particles that penetrated the device, while the other two wore blindfolds to keep out light. Investigators concluded that the available evidence supports the hypothesis that these flashes occur when charged particles travel through the retina in the eye.
Apollo 17 carried a sodium-iodide crystal identical to the ones in the gamma-ray spectrometer flown on Apollo 15 and 16. Data from this, once it was examined on Earth, was to be used to help form a baseline, allowing for subtraction of rays from the CM or from cosmic radiation to gain better data from the earlier results. In addition, the S-band transponders in the CSM and LM were pointed at the Moon to gain data on its gravitational field. Results from the Lunar Orbiter probes had revealed that lunar gravity varies slightly due to the presence of mass concentrations, or "mascons". Data from the missions, and from the lunar subsatellites left by Apollo 15 and 16, were used to map such variations in lunar gravity.
Mission events.
Launch and outbound trip.
Originally planned to launch on December 6, 1972, at 9:53 p.m. EST (2:53 a.m. on December 7 UTC), Apollo 17 was the final crewed Saturn V launch, and the only one to occur at night. The launch was delayed by two hours and forty minutes due to an automatic cutoff in the launch sequencer at the T-30 second mark in the countdown. The cause of the problem was quickly determined to be the launch sequencer's failure to automatically pressurize the liquid oxygen tank in the third stage of the rocket; although launch control noticed this and manually caused the tank to pressurize, the sequencer did not recognize the fix and therefore paused the countdown. The clock was reset and held at the T-22 minute mark while technicians worked around the malfunction in order to continue with the launch. This pause was the only launch delay in the Apollo program caused by a hardware problem. The countdown then resumed, and the liftoff occurred at 12:33 a.m. EST on December 7, 1972. The launch window, which had begun at the originally planned launch time of 9:53 p.m. on December 6, remained open until 1:31 a.m., the latest time at which a launch could have occurred during the December 6–7 window.
Approximately 500,000 people observed the launch in the immediate vicinity of Kennedy Space Center, despite the early-morning hour. The launch was visible as far away as , and observers in Miami, Florida, reported a "red streak" crossing the northern sky. Among those in attendance at the program's final launch were astronauts Neil Armstrong and Dick Gordon, as well as centenarian Charlie Smith, who alleged he was 130 years old at the time of Apollo 17.
The ascent resulted in an orbit with an altitude and velocity almost exactly that which had been planned. In the hours following the launch, Apollo 17 orbited the Earth while the crew spent time monitoring and checking the spacecraft to ensure its readiness to depart Earth orbit. At 3:46 a.m. EST, the S-IVB third stage was reignited for the 351-second trans-lunar injection burn to propel the spacecraft towards the Moon. Ground controllers chose a faster trajectory for Apollo 17 than originally planned to allow the vehicle to reach lunar orbit at the planned time, despite the launch delay. The Command and Service Module separated from the S-IVB approximately half an hour following the S-IVB trans-lunar injection burn, after which Evans turned the spacecraft to face the LM, still attached to the S-IVB. The CSM then docked with the LM and extracted it from the S-IVB. Following the LM extraction, Mission Control programmed the S-IVB, no longer needed to propel the spacecraft, to impact the Moon and trip the seismometers left by prior Apollo crews. It struck the Moon just under 87 hours into the mission, triggering the seismometers from Apollo 12, 14, 15 and 16. Approximately nine hours after launch, the crew concluded the mission's first day with a sleep period, until waking up to begin the second day.
Mission Control and the crew decided to shorten the mission's second day, the first full day in space, in order to adjust the crew's wake-up times for the subsequent days in preparation for an early morning (EST) wake-up time on the day of the lunar landing, then scheduled for early afternoon (EST). This was done since the first day of the mission had been extended because of the launch delay. Following the second rest period, and on the third day of the mission, the crew executed the first mid-course correction, a two-second burn of the CSM's service propulsion engine to adjust the spacecraft's Moon-bound trajectory. Following the burn, the crew opened the hatch separating the CSM and LM in order to check the LM's systems and concluded that they were nominal. So that events would take place at the time indicated in the flight plan, the mission clocks were moved ahead by 2 hours and 40 minutes, the amount of the launch delay, with one hour of it at 45:00:00 into the mission and the remainder at 65:00:00.
Among their other activities during the outbound trip, the crew photographed the Earth from the spacecraft as it travelled towards the Moon. One of these photographs is now known as "The Blue Marble". The crew found that one of the latches holding the CSM and LM together was unlatched. While Schmitt and Cernan were engaged in a second period of LM housekeeping beginning just before sixty hours into the mission, Evans worked on the balky latch. He was successful, and left it in the position it would need to be in for the CSM-LM docking that would occur upon return from the lunar surface.
Also during the outward journey, the crew performed a heat flow and convection demonstration, as well as the Apollo light-flash experiment. A few hours before entry into lunar orbit, the SIM door on the SM was jettisoned. At approximately 2:47 p.m. EST on December 10, the service propulsion system engine on the CSM ignited to slow down the CSM/LM stack into lunar orbit. Following orbit insertion and orbital stabilization, the crew began preparations for the landing at Taurus–Littrow.
Lunar landing.
The day of the landing began with a checkout of the Lunar Module's systems, which revealed no problems preventing continuation of the mission. Cernan, Evans, and Schmitt each donned their spacesuits, and Cernan and Schmitt entered the LM in preparation for separating from the CSM and landing. The LM undocked from the CSM, and the two spacecraft orbited close together for about an hour and a half while the astronauts made visual inspections and conducted their final pre-landing checks. After final separation from the CSM, the LM "Challenger" and its crew of two adjusted their orbit, such that its lowest point would pass about above the landing site, and began preparations for the descent to Taurus–Littrow. While Cernan and Schmitt prepared for landing, Evans remained in orbit to take observations, perform experiments and await the return of his crewmates a few days later.
Soon after completing their preparations for landing and just over two hours following the LM's undocking from the CSM, Cernan and Schmitt began their descent to the Taurus–Littrow valley on the lunar surface with the ignition of the Lunar Module's descent propulsion system (DPS) engine. Approximately ten minutes later, as planned, the LM pitched over, giving Cernan and Schmitt their first look at the landing site during the descent phase and allowing Cernan to guide the spacecraft to a desirable landing target while Schmitt provided data from the flight computer essential for landing. The LM touched down on the lunar surface at 2:55 p.m. EST on December 11, just over twelve minutes after DPS ignition. "Challenger" landed about east of the planned landing point. Shortly thereafter, the two astronauts began re-configuring the LM for their stay on the surface and began preparations for the first moonwalk of the mission, or EVA-1.
Lunar surface.
First EVA.
During their approximately 75-hour stay on the lunar surface, Cernan and Schmitt performed three moonwalks (EVAs). The astronauts deployed the LRV, then emplaced the ALSEP and the seismic explosive charges. They drove the rover to nine planned geological-survey stations to collect samples and make observations. Additionally, twelve short sampling stops were made at Schmitt's discretion while riding the rover, during which the astronauts used a long-handled scoop to retrieve samples without dismounting. During lunar-surface operations, Commander Cernan always drove the rover, while Lunar Module Pilot Schmitt was a passenger who assisted with navigation. This division of responsibilities between the two crew positions was used consistently throughout Apollo's J-missions.
The first lunar excursion began four hours after landing, at 6:54 p.m. EST on December 11. After exiting through the hatch of the LM and descending the ladder to the footpad, Cernan took the first step on the lunar surface of the mission. Just before doing so, Cernan remarked, "I'm on the footpad. And, Houston, as I step off at the surface at Taurus–Littrow, we'd like to dedicate the first step of Apollo 17 to all those who made it possible." After Cernan surveyed the exterior of the LM and commented on the immediate landing site, Schmitt joined Cernan on the surface. The first task was to offload the rover and other equipment from the LM. While working near the rover, Cernan caught his hammer under the right-rear fender extension, accidentally breaking it off. A similar incident occurred on Apollo 16 as John Young maneuvered around the rover. Although this was not a mission-critical issue, the loss of the part caused Cernan and Schmitt to be covered with dust stirred up when the rover was in motion. The crew made a short-lived fix using duct tape at the beginning of the second EVA, attaching a paper map to the damaged fender. Lunar dust stuck to the tape's surface, however, preventing it from adhering properly. After deploying the rover and testing its maneuverability, the crew set up the ALSEP just west of the landing site. The ALSEP deployment took longer than had been planned, with the drilling of core holes presenting some difficulty, meaning the geological portion of the first EVA would need to be shortened, cancelling a planned visit to Emory crater. Instead, following the deployment of the ALSEP, Cernan and Schmitt drove to Steno crater, to the south of the landing site. The objective at Steno was to sample the subsurface material excavated by the impact that formed the crater. The astronauts gathered of samples, took seven gravimeter measurements, and deployed two explosive packages.
The explosive packages were later detonated remotely; the resulting explosions were detected by geophones placed by the astronauts and also by seismometers left during previous missions. The first EVA ended after seven hours and twelve minutes, and the astronauts remained in the pressurized LM for the next 17 hours.
Second and third EVAs.
On December 12, awakened by a recording of "Ride of the Valkyries" played from Mission Control, Cernan and Schmitt began their second lunar excursion. The first order of business was to improvise a more durable fix for the rover's fender. Overnight, the flight controllers devised a procedure communicated by John Young: taping together four stiff paper maps to form a "replacement fender extension" and then clamping it onto the fender. The astronauts carried out the new fix, which did its job without failing until near the end of the third excursion. Cernan and Schmitt then departed for station 2—Nansen Crater, at the foot of the South Massif. When they arrived, their range from the "Challenger" was 7.6 kilometers (4.7 miles, 25,029 feet). This remains the furthest distance any spacefarers have ever traveled away from the safety of a pressurizable spacecraft while on a planetary body, and also during an EVA of any type. The astronauts were at the extremity of their "walkback limit", a safety constraint meant to ensure that they could walk back to the LM if the rover failed. They began a return trip, traveling northeast in the rover.
At station 3, Schmitt fell to the ground while working, looking so awkward that Parker jokingly told him that NASA's switchboard had lit up seeking Schmitt's services for Houston's ballet group, and the site of station 3 was in 2019 renamed Ballet Crater. Cernan took a sample at Station 3 that was to be maintained in vacuum until better analytical techniques became available, joking with the CAPCOM, Parker, about placing a note inside. The container remained unopened until 2022.
Stopping at station 4—Shorty crater—the astronauts discovered orange soil, which proved to be very small beads of volcanic glass formed over 3.5 billion years ago. This discovery caused great excitement among the scientists at Mission Control, who felt that the astronauts may have discovered a volcanic vent. However, post-mission sample analysis revealed that Shorty is not a volcanic vent, but rather an impact crater. Analysis also found the orange soil to be a remnant of a fire fountain. This fire fountain sprayed molten lava high into the lunar sky in the Moon's early days, some 3.5 billion years ago and long before Shorty's creation. The orange volcanic beads were droplets of molten lava from the fountain that solidified and were buried by lava deposits until exposed by the impact that formed Shorty, less than 20 million years ago.
The final stop before returning to the LM was Camelot crater; throughout the sojourn, the astronauts collected of samples, took another seven gravimeter measurements, and deployed three more explosive packages. Concluding the EVA at seven hours and thirty-seven minutes, Cernan and Schmitt had completed the longest-duration EVA in history to date, traveling further away from a spacecraft and covering more ground on a planetary body during a single EVA than any other spacefarers. The improvised fender had remained intact throughout, causing the president of the "Auto Body Association of America" to award them honorary lifetime membership.
The third moonwalk, the last of the Apollo program, began at 5:25 p.m. EST on December 13. Cernan and Schmitt rode the rover northeast of the landing site, exploring the base of the North Massif and the Sculptured Hills. Stopping at station 6, they examined a house-sized split boulder dubbed Tracy's Rock (or Split Rock), after Cernan's daughter. The ninth and final planned station was conducted at Van Serg crater. The crew collected of lunar samples and took another nine gravimeter measurements. Schmitt had seen a fine-grained rock, unusual for that vicinity, earlier in the mission and had stood it on its edge; before closing out the EVA, he went and retrieved it. Subsequently designated Sample 70215, it was, at , the largest rock brought back by Apollo 17. A small piece of it is on exhibit at the Smithsonian Institution, one of the few rocks from the Moon that the public may touch. Schmitt also collected a sample, designated as Sample 76535, at geology station 6 near the base of the North Massif; the sample, a troctolite, was later identified as the oldest known "unshocked" lunar rock, meaning it has not been damaged by high-impact geological events. Scientists have therefore used Sample 76535 in thermochronological studies to determine whether the Moon formed a metallic core or, as study results suggest, a core dynamo.
Before concluding the moonwalk, the crew collected a breccia rock, dedicating it to the nations of Earth, 70 of which were represented by students touring the U.S. and present in Mission Control Center in Houston, Texas, at the time. Portions of this sample, known as the Friendship Rock, were subsequently distributed to the nations represented by the students. A plaque located on the LM, commemorating the achievements made during the Apollo program, was then unveiled. Before reentering the LM for the final time, Cernan remarked,
Cernan then followed Schmitt into the LM; the final lunar excursion had a duration of seven hours and fifteen minutes. Following closing of the LM hatch and repressurization of the LM cabin, Cernan and Schmitt removed their spacesuits and reconfigured the cabin for a final rest period on the lunar surface. As they did following each of the previous two EVAs, Cernan and Schmitt discussed their geological observations from the day's excursion with mission control while preparing to rest.
Solo activities.
While Cernan and Schmitt were on the lunar surface, Evans remained alone in the CSM in lunar orbit and was assigned a number of observational and scientific tasks to perform while awaiting the return of his crewmates. In addition to the operation of the various orbital science equipment contained in the CSM's SIM bay, Evans conducted both visual and photographic observation of surface features from his aerial vantage point. The orbit of the CSM having been modified to an elliptical orbit in preparation for the LM's departure and eventual descent, one of Evans' solo tasks in the CSM was to circularize its orbit such that the CSM would remain at approximately the same distance above the surface throughout its orbit. Evans observed geological features visible to him and used handheld cameras to record certain visual targets. He also observed and sketched the solar corona at "sunrise", the period during which the CSM would pass from the darkened portion of the Moon to the illuminated portion while the Moon itself mostly obscured the Sun. To photograph portions of the surface that were not illuminated by the Sun as he passed over them, Evans relied on a combination of long exposure times and earthshine. Using this technique, he photographed such features as the craters Eratosthenes and Copernicus, as well as the vicinity of Mare Orientale. According to the Apollo 17 Mission Report, Evans was able to capture all scientific photographic targets, as well as some other targets of interest.
Similarly to the crew of Apollo 16, Evans (as well as Schmitt, while in lunar orbit) reported seeing light "flashes" apparently originating from the lunar surface, known as transient lunar phenomena (TLP); Evans reported seeing these "flashes" in the vicinity of Grimaldi crater and Mare Orientale. The causes of TLP are not well-understood and, though inconclusive as an explanation, both of the sites in which Evans reported seeing TLP are the general locations of outgassing from the Moon's interior. Meteorite impacts are another possible explanation.
The flight plan kept Evans busy, making him so tired he overslept one morning by an hour, despite the efforts of Mission Control to awaken him. Before the LM departed for the lunar surface, Evans had discovered that he had misplaced his pair of scissors, necessary to open food packets. Cernan and Schmitt lent him one of theirs. The instruments in the SIM bay functioned without significant hindrance during the orbital portion of the mission, though the lunar sounder and the mapping camera encountered minor problems. Evans spent approximately 148 total hours in lunar orbit, including solo time and time spent together with Cernan and Schmitt, which is more time than any other individual has spent orbiting the Moon.
Evans was also responsible for piloting the CSM during the orbital phase of the mission, maneuvering the spacecraft to alter and maintain its orbital trajectory. In addition to the initial orbital recircularization maneuver shortly after the LM's departure, one of the solo activities Evans performed in the CSM in preparation for the return of his crewmates from the lunar surface was the plane change maneuver. This maneuver was meant to align the CSM's trajectory to the eventual trajectory of the LM to facilitate rendezvous in orbit. Evans fired the CSM's SPS engine for about 20 seconds, successfully adjusting the CSM's orbital plane.
Return to Earth.
Cernan and Schmitt successfully lifted off from the lunar surface in the ascent stage of the LM on December 14, at 5:54 p.m. EST. The return to lunar orbit took just over seven minutes. The LM, piloted by Cernan, and the CSM, piloted by Evans, maneuvered and redocked about two hours after liftoff from the surface. Once the docking had taken place, the crew transferred equipment and lunar samples from the LM to the CSM for return to Earth. The crew sealed the hatches between the CSM and the LM ascent stage following completion of the transfer, and the LM was jettisoned at 11:51 p.m. EST on December 14. The unoccupied ascent stage was then remotely deorbited and crashed into the Moon; the impact was recorded by the seismometers left by Apollo 17 and previous missions. At 6:35 p.m. EST on December 16, the CSM's SPS engine was ignited once more to propel the spacecraft away from the Moon on a trajectory back towards Earth. The successful trans-Earth injection SPS burn lasted just over two minutes.
During the return to Earth, Evans performed a 65-minute EVA to retrieve film cassettes from the service module's SIM bay, with assistance from Schmitt, who remained at the command module's hatch. Performed at approximately 160,000 nautical miles (184,000 mi; 296,000 km) from Earth, it was the third "deep space" EVA in history, carried out at great distance from any planetary body. It remains one of only three such EVAs, all performed during Apollo's J-missions under similar circumstances. It was the last EVA of the Apollo program.
During the trip back to Earth, the crew operated the infrared radiometer in the SM, as well as the ultraviolet spectrometer. One midcourse correction was performed, lasting 9 seconds. On December 19, the crew jettisoned the no-longer-needed SM, leaving only the CM for return to Earth. The Apollo 17 spacecraft reentered Earth's atmosphere and splashed down safely in the Pacific Ocean at 2:25 p.m. EST, near the recovery ship. Cernan, Evans, and Schmitt were then retrieved by a recovery helicopter piloted by Commander Edward E. Dahill III and were safely aboard the recovery ship 52 minutes after splashdown. As the final Apollo mission concluded successfully, Mission Control in Houston was filled with many former flight controllers and astronauts, who applauded as "America" returned to Earth.
Aftermath and spacecraft locations.
Following their mission, the crew undertook both domestic and international tours, visiting 29 states and 11 countries. The tour kicked off at Super Bowl VII, with the crew leading the crowd in the Pledge of Allegiance; the CM "America" was also displayed during the pregame activities.
None of the Apollo 17 astronauts flew in space again. Cernan retired from NASA and the Navy in 1976. He died in 2017. Evans retired from the Navy in 1976 and from NASA in 1977, entering the private sector. He died in 1990. Schmitt resigned from NASA in 1975 prior to his successful run for a United States Senate seat from New Mexico in 1976. There, he served one six-year term.
The Command Module "America" is currently on display at Space Center Houston at the Lyndon B. Johnson Space Center in Houston, Texas. The ascent stage of Lunar Module "Challenger" impacted the Moon on December 15, 1972, at 06:50:20.8 UTC (1:50 a.m. EST). The descent stage remains on the Moon at the landing site. Eugene Cernan's flown Apollo 17 spacesuit is in the collection of the Smithsonian's National Air and Space Museum (NASM), where it was transferred in 1974, and Harrison Schmitt's is in storage at NASM's Paul E. Garber Facility. Amanda Young of NASM indicated in 2004 that Schmitt's suit is in the best condition of the flown Apollo lunar spacesuits, and therefore is not on public display. Ron Evans' spacesuit was also transferred from NASA to the NASM's collection in 1974; it remains in storage.
Since Apollo 17's return, there have been attempts to photograph the landing site, where the LM's descent stage, the LRV, and some other mission hardware remain. In 2009 and again in 2011, the Lunar Reconnaissance Orbiter photographed the landing site from increasingly low orbits. At least one group has indicated an intention to visit the site as well; in 2018, the German space company PTScientists said that it planned to land two lunar rovers nearby.
|
1973 | American Revolution | The American Revolution was an ideological and political revolution that occurred in British America between 1765 and 1783.
In the American Revolutionary War, fought between 1775 and 1783, the colonies secured their independence from the British Crown and established the United States as the first sovereign nation-state founded on Enlightenment principles of constitutionalism and liberal democracy.
American colonists objected to being taxed by the Parliament of Great Britain, a body in which they had no direct representation. Prior to the 1760s, Britain's American colonies enjoyed a high level of autonomy and were locally governed by colonial legislatures. During the 1760s, however, the British Parliament passed acts intended to bring the American colonies under more direct rule from the British metropole and increasingly intertwine the colonial economies with that of Britain. The passage of the Stamp Act 1765 imposed internal taxes on official documents, newspapers, and most things printed in the colonies, which led to colonial protest and the meeting of representatives from several colonies at the Stamp Act Congress. Tensions relaxed briefly with Britain's repeal of the Stamp Act, but flared again with the passage of the Townshend Acts in 1767.
The British government deployed troops to Boston in 1768 to quell unrest, leading to the Boston Massacre on March 5, 1770. While the British government repealed most of the Townshend duties in 1770, it retained its tax on tea in order to symbolically assert Parliament's right to tax the colonies. The burning of the "Gaspee" in Rhode Island in 1772, the passage of the Tea Act in 1773, and the resulting Boston Tea Party in Boston Harbor on December 16, 1773 vastly escalated tensions. The British responded by closing Boston Harbor and enacting a series of punitive laws, which effectively rescinded Massachusetts' governing autonomy. The colonies responded by rallying behind Massachusetts. In late 1774, twelve of the thirteen colonies sent delegates to Philadelphia, where they formed the First Continental Congress and began coordinating their resistance. Opponents of Britain were known as "Patriots" or "Whigs", while colonists who retained their allegiance to the Crown were known as "Loyalists" or "Tories." In early 1775, Massachusetts was declared to be in a state of rebellion, and the order was sent for "patriots" to be disarmed.
In April 1775, open warfare erupted, launching the American Revolutionary War, when British troops were sent to capture military supplies and were confronted by local Patriot militia at Lexington and Concord. Patriot militia, aided by the newly-formed Continental Army, then put British forces in Boston under siege, forcing them to withdraw by sea. Each colony formed a Provincial Congress, which assumed power from the former colonial governments, suppressed Loyalism, and contributed to the Continental Army, led by George Washington following his June 14, 1775 appointment as commander-in-chief by the Second Continental Congress in Philadelphia. The Patriots unsuccessfully attempted to invade northeastern Quebec to rally sympathetic colonists during the winter of 1775–76, but were more successful in the southwestern parts of the colony.
In Philadelphia, the Second Continental Congress declared King George III a tyrant who had trampled the colonists' rights as Englishmen. On July 2, 1776, the Congress passed the Lee Resolution, which declared that the colonies considered themselves "free and independent states". On July 4, 1776, the Congress unanimously adopted the Declaration of Independence, principally authored by Thomas Jefferson. The Declaration of Independence embodied the political philosophies of liberalism and republicanism, rejected monarchy and aristocracy, and proclaimed that "all men are created equal", though it was not until later centuries that constitutional amendments and federal laws incrementally granted equal rights to African Americans, Native Americans, poor white men, women, and LGBT+ people.
In the summer of 1776, in a major setback for American patriots, the British captured New York City and its strategic harbor. In October 1777, the Continental Army experienced a significant victory, capturing British troops at the Battle of Saratoga. Following the victory in the Saratoga campaign, France entered the war as an ally of the cause of American independence, expanding the Revolutionary War into a global conflict. The British Royal Navy blockaded ports and held New York City for the duration of the war, and other cities for brief periods, but failed to destroy Washington's forces. Britain's priorities shifted southward, attempting to hold the Southern states with the anticipated aid of Loyalists that never materialized. British general Charles Cornwallis captured Continental Army troops at Charleston, South Carolina, in early 1780, but failed to enlist enough volunteers from Loyalist civilians to take effective control of the territory. A combined American and French force captured Cornwallis' army at Yorktown in the fall of 1781, effectively securing an American victory and bringing the war to an end. The Treaty of Paris was signed on September 3, 1783, formally ending the conflict and confirming the new nation's complete separation from the British Empire. The United States took possession of nearly all territory east of the Mississippi River and south of the Great Lakes, with the British retaining control of Canada, and French ally Spain taking back Florida.
Among the significant results of the American victory were American independence and the end of British mercantilism in America, opening up worldwide trade for the United States—including resumption with Britain. Around 60,000 Loyalists migrated to other British territories, particularly Canada, but the majority remained in the United States. The Americans wrote the United States Constitution in 1787 and adopted it in 1789, replacing the weak wartime Confederation and establishing a comparatively strong national government structured as a federal republic, which included an elected executive, a national judiciary, and an elected bicameral Congress representing states in the Senate and the population in the House of Representatives. It is the world's first federal democratic republic founded on the consent of the governed. In 1791, a Bill of Rights was ratified as the first ten amendments, guaranteeing fundamental rights used as justification for the revolution.
Origin.
1651–1763: Early seeds.
From the start of English colonization of the Americas, the English government pursued a policy of mercantilism, consistent with the economic policies of other European colonial powers of the time. Under this system, they hoped to grow England's economic and political power by restricting imports, promoting exports, regulating commerce, gaining access to new natural resources, and accumulating new precious metals as monetary reserves. Mercantilist policies were a defining feature of several English American colonies from their inception. The original 1606 charter of the Virginia Company regulated trade in what would become the Colony of Virginia. In general, the export of raw materials to foreign lands was banned, imports of foreign goods were discouraged, and cabotage was restricted to English vessels. These regulations were enforced by the Royal Navy.
Following the parliamentarian victory in the English Civil War, the first mercantilist legislation was passed. In 1651, the Rump Parliament passed the first of the Navigation Acts, intended to both improve England's trade ties with its colonies and to address Dutch domination of the trans-Atlantic trade at the time. This led to the outbreak of war with the Netherlands the following year. After the Restoration, the 1651 Act was repealed, but the Cavalier Parliament passed a series of even more restrictive Navigation Acts. Colonial reactions to these policies were mixed. The Acts prohibited exports of tobacco and other raw materials to non-English territories, which prevented many planters from receiving higher prices for their goods. Additionally, merchants were restricted from importing certain goods and materials from other nations, harming profits. These factors led to smuggling among colonial merchants, especially following passage of the Molasses Act. On the other hand, certain merchants and local industries benefitted from the restrictions on foreign competition. The restrictions on foreign-built ships also greatly benefitted the colonial shipbuilding industry, particularly of the New England colonies. Some argue that the economic impact was minimal on the colonists, but the political friction which the acts triggered was more serious, as the merchants most directly affected were also the most politically active.
King Philip's War was fought from 1675 to 1678 between the New England colonies and a half-dozen Native American tribes. It was fought without military assistance from England, thereby contributing to the development of a unique American identity separate from that of the British people. The Restoration of King Charles II to the English throne also accelerated this development. New England had strong Puritan heritage and had supported the parliamentarian Commonwealth government that was responsible for the execution of his father, Charles I. Massachusetts did not recognize the legitimacy of Charles II's reign for more than a year after its onset. Charles II thus became determined to bring the New England colonies under a more centralized administration and direct English control in the 1680s. The New England colonists fiercely opposed his efforts, and the Crown nullified their colonial charters in response. Charles' successor James II finalized these efforts in 1686, establishing the consolidated Dominion of New England, which also included the formerly separate colonies of New York and New Jersey. Edmund Andros was appointed royal governor, and tasked with governing the new Dominion under his direct rule. Colonial assemblies and town meetings were restricted, new taxes were levied, and rights were abridged. Dominion rule triggered bitter resentment throughout New England; the enforcement of the unpopular Navigation Acts and the curtailing of local democracy greatly angered the colonists. New Englanders were encouraged, however, by a change of government in England which saw James II effectively abdicate, and a populist uprising in New England overthrew Dominion rule on April 18, 1689. Colonial governments reasserted their control after the revolt. The new monarchs, William and Mary, granted new charters to the individual New England colonies, and local democratic self-government was restored. Successive Crown governments made no attempts to restore the Dominion.
Subsequent British governments continued their efforts to tax certain goods, however, passing acts regulating the trade of wool, hats, and molasses. The Molasses Act of 1733 was particularly egregious to the colonists, as a significant part of colonial trade relied on molasses. The taxes severely damaged the New England economy and resulted in a surge of smuggling, bribery, and intimidation of customs officials. Colonial wars fought in America were also a source of considerable tension. For example, New England colonial forces captured the fortress of Louisbourg in Acadia during King George's War in 1745, but the British government then ceded it back to France in 1748 in exchange for Madras (present-day Chennai), which the British had lost in 1746. New England colonists resented their losses of lives, as well as the effort and expenditure involved in subduing the fortress, only to have it returned to their erstwhile enemy, who would remain a threat to them after the war.
Some writers begin their histories of the American Revolution with the British coalition victory in the Seven Years' War in 1763, viewing the French and Indian War as the American theater of that conflict. Lawrence Henry Gipson writes:
The Royal Proclamation of 1763 redrew boundaries of the lands west of newly-British Quebec and west of a line running along the crest of the Allegheny Mountains, making them indigenous territory and barred to colonial settlement for two years. The colonists protested, and the boundary line was adjusted in a series of treaties with indigenous tribes. In 1768, the Iroquois agreed to the Treaty of Fort Stanwix, and the Cherokee agreed to the Treaty of Hard Labour followed in 1770 by the Treaty of Lochaber. The treaties opened most of what is present-day Kentucky and West Virginia to colonial settlement. The new map was drawn up at the Treaty of Fort Stanwix, which moved the line much farther to the west, from the green line to the red line on the map at right.
1764–1766: Taxes imposed and withdrawn.
In 1764 Parliament passed the Sugar Act, decreasing the existing customs duties on sugar and molasses but providing stricter measures of enforcement and collection. That same year, Prime Minister George Grenville proposed direct taxes on the colonies to raise revenue, but he delayed action to see whether the colonies would propose some way to raise the revenue themselves.
Grenville asserted in 1762 that the whole revenue of the custom houses in America amounted to one or two thousand pounds sterling a year, and that the English exchequer was paying between seven and eight thousand pounds a year to collect. Adam Smith wrote in "The Wealth of Nations" that Parliament "has never hitherto demanded of [the American colonies] anything which even approached to a just proportion to what was paid by their fellow subjects at home." Benjamin Franklin would later testify in Parliament in 1766 to the contrary, reporting that Americans already contributed heavily to the defense of the Empire. He argued that local colonial governments had raised, outfitted, and paid 25,000 soldiers to fight France in just the French and Indian War alone—as many as Britain itself sent—and spent many millions from American treasuries doing so.
Parliament passed the Stamp Act in March 1765, which imposed direct taxes on the colonies for the first time. All official documents, newspapers, almanacs, and pamphlets were required to have the stamps—even decks of playing cards. The colonists did not object that the taxes were high; they were actually low. They objected to their lack of representation in the Parliament, which gave them no voice concerning legislation that affected them. The British were, however, reacting to an entirely different issue: at the conclusion of the recent war the Crown had to deal with approximately 1,500 politically well-connected British Army officers. The decision was made to keep them on active duty with full pay, but they—and their commands—also had to be stationed somewhere. Stationing a standing army in Great Britain during peacetime was politically unacceptable, so they determined to station them in America and have the Americans pay them through the new tax. The soldiers had no military mission, however; they were not there to defend the colonies, because there was no current threat to them.
Shortly following adoption of the Stamp Act, the Sons of Liberty formed, and began using public demonstrations, boycotts, and threats of violence to ensure that the British tax laws became unenforceable. In Boston, the Sons of Liberty burned the records of the vice admiralty court and looted the home of chief justice Thomas Hutchinson. Several legislatures called for united action, and nine colonies sent delegates to the Stamp Act Congress in New York City in October. Moderates led by John Dickinson drew up a Declaration of Rights and Grievances stating that taxes passed without representation violated their rights as Englishmen, and colonists emphasized their determination by boycotting imports of British merchandise.
The Parliament at Westminster saw itself as the supreme lawmaking authority throughout the Empire and thus entitled to levy any tax without colonial approval or even consultation. They argued that the colonies were legally British corporations subordinate to the British Parliament, and they pointed to numerous instances where Parliament had made laws in the past that were binding on the colonies. Parliament insisted that the colonists effectively enjoyed a "virtual representation", as most British people did, since only a small minority of the British population elected representatives to Parliament. However, Americans such as James Otis maintained that there was no one in Parliament responsible specifically for any colonial constituency, so they were not "virtually represented" by anyone in Parliament at all.
The Rockingham government came to power in July 1765, and Parliament debated whether to repeal the stamp tax or to send an army to enforce it. Benjamin Franklin appeared to make the case for repeal, explaining that the colonies had spent heavily in manpower, money, and blood defending the empire in a series of wars against the French and indigenous people, and that further taxes to pay for those wars were unjust and might bring about a rebellion. Parliament agreed and repealed the tax on February 21, 1766, but they insisted in the Declaratory Act of March 1766 that they retained full power to make laws for the colonies "in all cases whatsoever". The repeal nonetheless caused widespread celebrations in the colonies.
1767–1773: Townshend Acts and the Tea Act.
In 1767, the British Parliament passed the Townshend Acts, which placed duties on several staple goods, including paper, glass, and tea, and established a Board of Customs in Boston to more rigorously execute trade regulations. The new taxes were enacted on the belief that Americans only objected to internal taxes and not to external taxes such as custom duties. However, in his widely read pamphlet, "Letters from a Farmer in Pennsylvania", John Dickinson argued against the constitutionality of the acts because their purpose was to raise revenue and not to regulate trade. Colonists responded to the taxes by organizing new boycotts of British goods. These boycotts were less effective, however, as the goods taxed by the Townshend Acts were widely used.
In February 1768, the Assembly of Massachusetts Bay Colony issued a circular letter to the other colonies urging them to coordinate resistance. The governor dissolved the assembly when it refused to rescind the letter. Meanwhile, a riot broke out in Boston in June 1768 over the seizure of the sloop "Liberty", owned by John Hancock, for alleged smuggling. Customs officials were forced to flee, prompting the British to deploy troops to Boston. A Boston town meeting declared that no obedience was due to parliamentary laws and called for the convening of a convention. A convention assembled but only issued a mild protest before dissolving itself. In January 1769, Parliament responded to the unrest by reactivating the Treason Act 1543 which called for subjects outside the realm to face trials for treason in England. The governor of Massachusetts was instructed to collect evidence of said treason, and the threat caused widespread outrage, though it was not carried out.
On March 5, 1770, a large crowd gathered around a group of British soldiers on a Boston street. The crowd grew threatening, throwing snowballs, rocks, and debris at them. One soldier was clubbed and fell. There was no order to fire, but the soldiers panicked and fired into the crowd. They hit 11 people; three civilians died of wounds at the scene of the shooting, and two died shortly after the incident. The event quickly came to be called the Boston Massacre. The soldiers were tried and acquitted (defended by John Adams), but the widespread descriptions soon began to turn colonial sentiment against the British. This accelerated the downward spiral in the relationship between Britain and the Province of Massachusetts.
A new ministry under Lord North came to power in 1770, and Parliament withdrew all taxes except the tax on tea, giving up its efforts to raise revenue while maintaining the right to tax. This temporarily resolved the crisis, and the boycott of British goods largely ceased, with only the more radical patriots such as Samuel Adams continuing to agitate.
In June 1772, American patriots, including John Brown, burned a British warship that had been vigorously enforcing unpopular trade regulations, in what became known as the "Gaspee" Affair. The affair was investigated for possible treason, but no action was taken.
In 1772, it became known that the Crown intended to pay fixed salaries to the governors and judges in Massachusetts, which had been paid by local authorities. This would reduce the influence of colonial representatives over their government. Samuel Adams in Boston set about creating new Committees of Correspondence, which linked Patriots in all 13 colonies and eventually provided the framework for a rebel government. Virginia, the largest colony, set up its Committee of Correspondence in early 1773, on which Patrick Henry and Thomas Jefferson served.
A total of about 7,000 to 8,000 Patriots served on Committees of Correspondence at the colonial and local levels, comprising most of the leadership in their communities. Loyalists were excluded. The committees became the leaders of the American resistance to British actions, and later largely determined the war effort at the state and local level. When the First Continental Congress decided to boycott British products, the colonial and local Committees took charge, examining merchant records and publishing the names of merchants who attempted to defy the boycott by importing British goods.
In 1773, private letters were published in which Massachusetts Governor Thomas Hutchinson claimed that the colonists could not enjoy all English liberties, and in which Lieutenant Governor Andrew Oliver called for the direct payment of colonial officials. The letters' contents were used as evidence of a systematic plot against American rights, and discredited Hutchinson in the eyes of the people; the colonial Assembly petitioned for his recall. Benjamin Franklin, postmaster general for the colonies, acknowledged that he leaked the letters, which led to him being berated by British officials and removed from his position.
Meanwhile, Parliament passed the Tea Act lowering the price of taxed tea exported to the colonies, to help the British East India Company undersell smuggled untaxed Dutch tea. Special consignees were appointed to sell the tea to bypass colonial merchants. The act was opposed by those who resisted the taxes and also by smugglers who stood to lose business. In most instances, the consignees were forced by the Americans to resign and the tea was turned back, but Massachusetts governor Hutchinson refused to allow Boston merchants to give in to pressure. A town meeting in Boston determined that the tea would not be landed, and ignored a demand from the governor to disperse. On December 16, 1773, a group of men, led by Samuel Adams and dressed to evoke the appearance of indigenous people, boarded the ships of the East India Company and dumped £10,000 worth of tea from their holds (approximately £636,000 in 2008) into Boston Harbor. Decades later, this event became known as the Boston Tea Party and remains a significant part of American patriotic lore.
1774–1775: Intolerable Acts.
The British government responded by passing several measures that came to be known as the Intolerable Acts, further darkening colonial opinion towards England. They consisted of four laws enacted by the British Parliament. The first was the Massachusetts Government Act, which altered the Massachusetts charter and restricted town meetings. The second was the Administration of Justice Act, which ordered that British soldiers facing trial be arraigned in Britain, not in the colonies. The third was the Boston Port Act, which closed the port of Boston until the British had been compensated for the tea lost in the Boston Tea Party. The fourth was the Quartering Act of 1774, which allowed royal governors to house British troops in the homes of citizens without the owner's permission.
In response, Massachusetts patriots issued the Suffolk Resolves and formed an alternative shadow government known as the Provincial Congress, which began training militia outside British-occupied Boston. In September 1774, the First Continental Congress convened, consisting of representatives from each colony, to serve as a vehicle for deliberation and collective action. During secret debates, conservative Joseph Galloway proposed the creation of a colonial Parliament that would be able to approve or disapprove acts of the British Parliament, but his idea was tabled in a vote of 6 to 5 and was subsequently removed from the record. Congress called for a boycott beginning on December 1, 1774, of all British goods; it was enforced by new local committees authorized by the Congress.
Military hostilities begin.
Massachusetts was declared in a state of rebellion in February 1775 and the British garrison received orders to disarm the rebels and arrest their leaders, leading to the Battles of Lexington and Concord on April 19, 1775. The Patriots laid siege to Boston, expelled royal officials from all the colonies, and took control through the establishment of Provincial Congresses. The Battle of Bunker Hill followed on June 17, 1775. It was a British victory—but at a great cost: about 1,000 British casualties from a garrison of about 6,000, as compared to 500 American casualties from a much larger force. The Second Continental Congress was divided on the best course of action, but eventually produced the Olive Branch Petition, in which they attempted to come to an accord with King George. The king, however, issued a Proclamation of Rebellion which declared that the states were "in rebellion" and the members of Congress were traitors.
The war that arose was in some ways a classic insurgency. As Benjamin Franklin wrote to Joseph Priestley in October 1775:
In the winter of 1775, the Americans invaded northeastern Quebec under generals Benedict Arnold and Richard Montgomery, expecting to rally sympathetic colonists there. The attack was a failure; many Americans who weren't killed were either captured or died of smallpox.
In March 1776, the Continental Army forced the British to evacuate Boston, with George Washington as the commander of the new army. The revolutionaries now fully controlled all thirteen colonies and were ready to declare independence. There still were many Loyalists, but they were no longer in control anywhere by July 1776, and all of the Royal officials had fled.
Creating new state constitutions.
Following the Battle of Bunker Hill in June 1775, the Patriots had control of Massachusetts outside Boston's city limits, and the Loyalists suddenly found themselves on the defensive with no protection from the British army. In all 13 colonies, Patriots had overthrown their existing governments, closing courts and driving away British officials. They held elected conventions and "legislatures" that existed outside any legal framework; new constitutions were drawn up in each state to supersede royal charters. They proclaimed that they were now "states", no longer "colonies".
On January 5, 1776, New Hampshire ratified the first state constitution. In May 1776, Congress voted to suppress all forms of crown authority, to be replaced by locally created authority. New Jersey, South Carolina, and Virginia created their constitutions before July 4. Rhode Island and Connecticut simply took their existing royal charters and deleted all references to the crown. The new states were all committed to republicanism, with no inherited offices. They decided what form of government to create, and also how to select those who would craft the constitutions and how the resulting document would be ratified. On May 26, 1776, John Adams wrote James Sullivan from Philadelphia warning against extending the franchise too far:
The resulting constitutions in states including Delaware, Maryland, Massachusetts, New York, and Virginia featured:
In Pennsylvania, New Jersey, and New Hampshire, the resulting constitutions embodied:
The radical provisions of Pennsylvania's constitution, however, lasted only 14 years. In 1790, conservatives gained power in the state legislature, called a new constitutional convention, and rewrote the constitution. The new constitution substantially reduced universal male suffrage, gave the governor veto power and patronage appointment authority, and added an upper house with substantial wealth qualifications to the unicameral legislature. Thomas Paine called it a constitution unworthy of America.
Independence and Union.
In April 1776, the North Carolina Provincial Congress issued the Halifax Resolves explicitly authorizing its delegates to vote for independence. By June, nine Provincial Congresses were ready for independence; one by one, the last four fell into line: Pennsylvania, Delaware, Maryland, and New York. Richard Henry Lee was instructed by the Virginia legislature to propose independence, and he did so on June 7, 1776. On June 11, a committee was created by the Second Continental Congress to draft a document explaining the justifications for separation from Britain. After securing enough votes for passage, independence was voted for on July 2.
Gathered at Independence Hall in Philadelphia, the nation's 56 Founding Fathers, representing America's Thirteen Colonies, unanimously adopted and issued the Declaration of Independence, which was drafted largely by Thomas Jefferson and presented by the Committee of Five, which had been charged with its development. The Congress struck several provisions of Jefferson's draft, and then adopted it unanimously on July 4. With the issuance of the Declaration of Independence, each colony began operating as an independent and autonomous state. The next step was to form a union to facilitate international relations and alliances.
On November 5, 1777, the Second Continental Congress, gathered at Independence Hall in Philadelphia, approved the Articles of Confederation and Perpetual Union and sent them to each state for ratification. The Congress immediately began operating under the Articles' terms, providing a structure of shared sovereignty during prosecution of the Revolutionary War and facilitating international relations and alliances with France and Spain. The Articles were fully ratified on March 1, 1781. At that point, the Continental Congress was dissolved and a new government of the United States in Congress Assembled took its place the following day, on March 2, 1781, with Samuel Huntington leading the Congress as presiding officer.
Defending the Revolution.
British return: 1776–1777.
According to British historian Jeremy Black, the British had significant advantages, including a highly trained army, the world's largest navy, and an efficient system of public finance that could easily fund the war. However, they seriously misunderstood the depth of support for the American Patriot position and ignored the advice of General Gage, misinterpreting the situation as merely a large-scale riot. The British government believed that they could overawe the Americans by sending a large military and naval force, forcing them to be loyal again:
Washington forced the British out of Boston in the spring of 1776, and neither the British nor the Loyalists controlled any significant areas. The British, however, were massing forces at their naval base at Halifax, Nova Scotia. They returned in force in July 1776, landing in New York and defeating Washington's Continental Army in August at the Battle of Brooklyn. Following that victory, they requested a meeting with representatives from Congress to negotiate an end to hostilities.
A delegation including John Adams and Benjamin Franklin met British admiral Richard Howe on Staten Island in New York Harbor on September 11 in what became known as the Staten Island Peace Conference. Howe demanded that the Americans retract the Declaration of Independence, which they refused to do, and negotiations ended. The British then seized New York City and nearly captured Washington's army. They made the city and its strategic harbor their main political and military base of operations, holding it until November 1783. The city became the destination for Loyalist refugees and a focal point of Washington's intelligence network.
The British also took New Jersey, pushing the Continental Army into Pennsylvania. Washington crossed the Delaware River back into New Jersey in a surprise attack in late December 1776 and defeated the Hessian and British forces at Trenton and, in early January 1777, at Princeton, thereby regaining control of most of New Jersey. The victories gave an important boost to Patriots at a time when morale was flagging, and they have become iconic events of the war.
In 1777, the British sent Burgoyne's invasion force from Canada south to New York to seal off New England. Their aim was to isolate New England, which the British perceived as the primary source of agitation. Rather than move north to support Burgoyne, the British army in New York City went to Philadelphia in a major case of mis-coordination, capturing it from Washington. The invasion army under Burgoyne was much too slow and became trapped in northern New York state. It surrendered after the Battles of Saratoga in October 1777. From early October 1777 until November 15, a siege distracted British troops at Fort Mifflin, Philadelphia, Pennsylvania, and allowed Washington time to preserve the Continental Army by safely leading his troops to harsh winter quarters at Valley Forge.
Prisoners.
On August 23, 1775, George III declared Americans to be traitors to the Crown if they took up arms against royal authority. There were thousands of British and Hessian soldiers in American hands following their surrender at the Battles of Saratoga. Lord Germain took a hard line, but the British generals on American soil never held treason trials, and instead treated captured American soldiers as prisoners of war. The dilemma was that tens of thousands of Loyalists were under American control and American retaliation would have been easy. The British built much of their strategy around using these Loyalists. The British maltreated the prisoners whom they held, and more American prisoners of war died in captivity than from combat operations. At the end of the war, both sides released their surviving prisoners.
American alliances after 1778.
The capture of a British army at Saratoga encouraged the French to formally enter the war in support of Congress, and Benjamin Franklin negotiated a permanent military alliance in early 1778; France thus became the first foreign nation to officially recognize the Declaration of Independence. On February 6, 1778, the United States and France signed the Treaty of Amity and Commerce and the Treaty of Alliance. William Pitt spoke out in Parliament urging Britain to make peace in America and to unite with America against France, while British politicians who had sympathized with colonial grievances now turned against the Americans for allying with Britain's rival and enemy.
The Spanish and the Dutch became allies of the French in 1779 and 1780 respectively, forcing the British to fight a global war without major allies, and requiring it to slip through a combined blockade of the Atlantic. Britain began to view the American war for independence as merely one front in a wider war, and the British chose to withdraw troops from America to reinforce the British colonies in the Caribbean, which were under threat of Spanish or French invasion. British commander Sir Henry Clinton evacuated Philadelphia and returned to New York City. General Washington intercepted him in the Battle of Monmouth Court House, the last major battle fought in the north. After an inconclusive engagement, the British retreated to New York City. The northern war subsequently became a stalemate, as the focus of attention shifted to the smaller southern theater.
The British move South: 1778–1783.
The British strategy in America now concentrated on a campaign in the southern states. With fewer regular troops at their disposal, the British commanders saw the "southern strategy" as a more viable plan, as they perceived the south as strongly Loyalist with a large population of recent immigrants and large numbers of slaves who might be tempted to run away from their masters to join the British and gain their freedom.
Beginning in late December 1778, the British captured Savannah and controlled the Georgia coastline. In 1780, they launched a fresh invasion and took Charleston, as well. A significant victory at the Battle of Camden meant that royal forces soon controlled most of Georgia and South Carolina. The British set up a network of forts inland, hoping that the Loyalists would rally to the flag. Not enough Loyalists turned out, however, and the British had to fight their way north into North Carolina and Virginia with a severely weakened army. Behind them, much of the territory that they had already captured dissolved into a chaotic guerrilla war, fought predominantly between bands of Loyalists and American militia, and which negated many of the gains that the British had previously made.
Surrender at Yorktown (1781).
The British army under Cornwallis marched to Yorktown, Virginia, where they expected to be rescued by a British fleet. The fleet did arrive, but so did a larger French fleet. The French were victorious in the Battle of the Chesapeake, and the British fleet returned to New York for reinforcements, leaving Cornwallis trapped. In October 1781, the British surrendered their second invading army of the war under a siege by the combined French and Continental armies commanded by Washington.
The end of the war.
Washington did not know if or when the British might reopen hostilities after Yorktown. They still had 26,000 troops occupying New York City, Charleston, and Savannah, together with a powerful fleet. The French army and navy departed, so the Americans were on their own in 1782–83. The American treasury was empty, and the unpaid soldiers were growing restive, almost to the point of mutiny or possible coup d'etat. Washington dispelled the unrest among officers in the Newburgh Conspiracy in 1783, and Congress subsequently promised all officers a five-year bonus.
Historians continue to debate whether the odds were long or short for American victory. John E. Ferling says that the odds were so long that the American victory was "almost a miracle". On the other hand, Joseph Ellis says that the odds favored the Americans, and asks whether there ever was any realistic chance for the British to win. He argues that this opportunity came only once, in the summer of 1776, and the British failed that test. Admiral Howe and his brother General Howe "missed several opportunities to destroy the Continental Army ... Chance, luck, and even the vagaries of the weather played crucial roles." Ellis's point is that the strategic and tactical decisions of the Howes were fatally flawed because they underestimated the challenges posed by the Patriots. Ellis concludes that, once the Howe brothers failed, the opportunity "would never come again" for a British victory.
Support for the conflict had never been strong in Britain, where many sympathized with the Americans, but now it reached a new low. King George wanted to fight on, but his supporters lost control of Parliament and they launched no further offensives in America on the eastern seaboard. However, the British continued formal and informal assistance to Indian tribes making war on US citizens over the next three decades, which contributed to a "Second American Revolution" in the War of 1812. In that war against Britain, the US permanently established its territory and its citizenship independent of the British Empire.
Paris peace treaty.
During negotiations in Paris, the American delegation discovered that France supported American independence but no territorial gains, hoping to confine the new nation to the area east of the Appalachian Mountains. The Americans opened direct secret negotiations with London, cutting out the French. British Prime Minister Lord Shelburne was in charge of the British negotiations, and he saw a chance to make the United States a valuable economic partner. The US obtained all the land east of the Mississippi River and south of Canada, while Spain took control of Florida from the British. The United States also gained fishing rights off Canadian coasts, and agreed to allow British merchants and Loyalists to recover their property. Prime Minister Shelburne foresaw highly profitable two-way trade between Britain and the rapidly growing United States, which did come to pass. With the blockade lifted and British interference removed, American merchants were free to trade with any nation anywhere in the world.
The British largely abandoned their indigenous allies, who were not a party to this treaty and did not recognize it until they were defeated militarily by the United States. However, the British did sell them munitions and maintain forts in American territory until the Jay Treaty of 1795.
Losing the war and the Thirteen Colonies was a shock to Britain. The war revealed the limitations of Britain's fiscal-military state when they discovered that they suddenly faced powerful enemies with no allies, and they were dependent on extended and vulnerable transatlantic lines of communication. The defeat heightened dissension and escalated political antagonism to the King's ministers. The King went so far as to draft letters of abdication, although they were never delivered. Inside Parliament, the primary concern changed from fears of an over-mighty monarch to the issues of representation, parliamentary reform, and government retrenchment. Reformers sought to destroy what they saw as widespread institutional corruption, and the result was a crisis from 1776 to 1783. The crisis ended after 1784, when confidence in the British constitution was restored during the administration of Prime Minister William Pitt.
Finance.
Britain's war against the Americans, the French, and the Spanish cost about £100 million, and the Treasury borrowed 40 percent of the money that it needed. Meanwhile in Paris, heavy spending and a weak tax base brought France to the verge of bankruptcy and revolution. In London the British had relatively little difficulty financing their war, keeping their suppliers and soldiers paid, and hiring tens of thousands of German soldiers. Britain had a sophisticated financial system based on the wealth of thousands of landowners who supported the government, together with banks and financiers in London. The British tax system collected about 12 percent of the GDP in taxes during the 1770s.
In sharp contrast, Congress and the American states had no end of difficulty financing the war. In 1775, there was at most 12 million dollars in gold in the colonies, not nearly enough to cover current transactions, let alone finance a major war. The British made the situation much worse by imposing a tight blockade on every American port, which cut off almost all imports and exports. One partial solution was to rely on volunteer support from militiamen and donations from patriotic citizens. Another was to delay actual payments, pay soldiers and suppliers in depreciated currency, and promise that it would be made good after the war. Indeed, the soldiers and officers were given land grants in 1783 to cover the wages that they had earned but had not been paid during the war. The national government did not have a strong leader in financial matters until 1781, when Robert Morris was named Superintendent of Finance of the United States. Morris used a French loan in 1782 to set up the private Bank of North America to finance the war. He reduced the civil list, saved money by using competitive bidding for contracts, tightened accounting procedures, and demanded the national government's full share of money and supplies from the individual states.
Congress used four main methods to cover the cost of the war, which amounted to about 66 million dollars in specie (gold and silver). Congress made issues of paper money, known colloquially as "Continental Dollars", in 1775–1780 and in 1780–1781. The first issue amounted to 242 million dollars. This paper money was supposedly to be redeemed for state taxes, but the holders were eventually paid off in 1791 at the rate of one cent on the dollar. By 1780, the paper money was so devalued that the phrase "not worth a Continental" became synonymous with worthlessness. The skyrocketing inflation was a hardship on the few people who had fixed incomes, but 90 percent of the people were farmers and were not directly affected by it. Debtors benefited by paying off their debts with depreciated paper. The greatest burden was borne by the soldiers of the Continental Army, whose wages were usually paid late and declined in value every month, weakening their morale and adding to the hardships of their families.
Beginning in 1777, Congress repeatedly asked the states to provide money, but the states had no system of taxation and were of little help. By 1780, Congress was making requisitions for specific supplies of corn, beef, pork, and other necessities, an inefficient system which barely kept the army alive. Starting in 1776, the Congress sought to raise money by loans from wealthy individuals, promising to redeem the bonds after the war. The bonds were redeemed in 1791 at face value, but the scheme raised little money because Americans had little specie, and many of the rich merchants were supporters of the Crown. The French secretly supplied the Americans with money, gunpowder, and munitions to weaken Great Britain; the subsidies continued when France entered the war in 1778, and the French government and Paris bankers lent large sums to the American war effort. The Americans struggled to pay off the loans; they ceased making interest payments to France in 1785 and defaulted on installments due in 1787. In 1790, however, they resumed regular payments on their debts to the French, and settled their accounts with the French government in 1795 when James Swan, an American banker, assumed responsibility for the balance of the debt in exchange for the right to refinance it at a profit.
Concluding the Revolution.
Creating a "more perfect union" and guaranteeing rights.
The war ended in 1783 and was followed by a period of prosperity. The national government was still operating under the Articles of Confederation and settled the issue of the western territories, which the states ceded to Congress. American settlers moved rapidly into those areas, with Vermont, Kentucky, and Tennessee becoming states in the 1790s.
However, the national government had no money either to pay the war debts owed to European nations and the private banks, or to pay Americans who had been given millions of dollars of promissory notes for supplies during the war. Nationalists led by Washington, Alexander Hamilton, and other veterans feared that the new nation was too fragile to withstand an international war, or even the repetition of internal revolts such as the Shays' Rebellion of 1786 in Massachusetts. They convinced Congress to call the Philadelphia Convention in 1787. The Convention adopted a new Constitution which provided for a republic with a much stronger national government in a federal framework, including an effective executive in a check-and-balance system with the judiciary and legislature. The Constitution was ratified in 1788, after a fierce debate in the states over the proposed new government. The new administration under President George Washington took office in New York in March 1789. James Madison spearheaded Congressional legislation proposing amendments to the Constitution as assurances to those cautious about federal power, guaranteeing many of the inalienable rights that formed a foundation for the revolution. Rhode Island was the final state to ratify the Constitution in 1790; the first ten amendments were ratified in 1791 and became known as the United States Bill of Rights.
National debt.
The national debt fell into three categories after the American Revolution. The first was the $12 million owed to foreigners, mostly money borrowed from France. There was general agreement to pay the foreign debts at full value. The national government owed $40 million and state governments owed $25 million to Americans who had sold food, horses, and supplies to the Patriot forces. There were also other debts which consisted of promissory notes issued during the war to soldiers, merchants, and farmers who accepted these payments on the premise that the new Constitution would create a government that would pay these debts eventually.
The war expenses of the individual states added up to $114 million, compared to $37 million by the central government. In 1790, Congress combined the remaining state debts with the foreign and domestic debts into one national debt totaling $80 million at the recommendation of first Secretary of the Treasury Alexander Hamilton. Everyone received face value for wartime certificates, so that the national honor would be sustained and the national credit established.
Ideology and factions.
The population of the Thirteen States was not homogeneous in political views and attitudes. Loyalties and allegiances varied widely within regions and communities and even within families, and sometimes shifted during the Revolution.
Ideology behind the Revolution.
The American Enlightenment was a critical precursor of the American Revolution. Chief among the ideas of the American Enlightenment were the concepts of natural law, natural rights, consent of the governed, individualism, property rights, self-ownership, self-determination, liberalism, republicanism, and defense against corruption. A growing number of American colonists embraced these views and fostered an intellectual environment which led to a new sense of political and social identity.
Liberalism.
John Locke (1632–1704) is often referred to as "the philosopher of the American Revolution" due to his work in the Social Contract and Natural Rights theories that underpinned the Revolution's political ideology. Locke's "Two Treatises of Government" published in 1689 was especially influential. He argued that all humans were created equally free, and governments therefore needed the "consent of the governed". In late eighteenth-century America, belief was still widespread in "equality by creation" and "rights by creation". Locke's ideas on liberty influenced the political thinking of English writers such as John Trenchard, Thomas Gordon, and Benjamin Hoadly, whose political ideas in turn also had a strong influence on the American Patriots.
The theory of the social contract influenced the belief among many of the Founders that the right of the people to overthrow their leaders, should those leaders betray the historic rights of Englishmen, was one of the "natural rights" of man. The Americans heavily relied on Montesquieu's analysis of the wisdom of the "balanced" British Constitution (mixed government) in writing the state and national constitutions.
Republicanism.
The most basic features of republicanism anywhere are a representational government in which citizens elect leaders from among themselves for a predefined term, as opposed to a permanent ruling class or aristocracy, and laws are passed by these leaders for the benefit of the entire republic. In addition, unlike a direct or "pure" democracy in which the majority vote rules, a republic codifies in a charter or constitution a certain set of basic civil rights that is guaranteed to every citizen and cannot be overridden by majority rule.
The American interpretation of "republicanism" was inspired by the Whig party in Great Britain which openly criticized the corruption within the British government. Americans were increasingly embracing republican values, seeing Britain as corrupt and hostile to American interests. The colonists associated political corruption with ostentatious luxury and inherited aristocracy, which they condemned.
The Founding Fathers were strong advocates of republican values, which required men to put civic duty ahead of their personal desires; particularly prominent among them were Samuel Adams, Patrick Henry, John Adams, Benjamin Franklin, Thomas Jefferson, Thomas Paine, George Washington, James Madison, and Alexander Hamilton. Men were honor bound by civic obligation to be prepared and willing to fight for the rights and liberties of their countrymen. John Adams wrote to Mercy Otis Warren in 1776, agreeing with some classical Greek and Roman thinkers: "Public Virtue cannot exist without private, and public Virtue is the only Foundation of Republics." He continued:
"Republican motherhood" became the ideal for American women, exemplified by Abigail Adams and Mercy Otis Warren; the first duty of the republican woman was to instill republican values in her children and to avoid luxury and ostentation.
Protestant Dissenters and the Great Awakening.
Protestant churches that had separated from the Church of England, called "dissenters", were the "school of democracy", in the words of historian Patricia Bonomi. Before the Revolution, the Southern Colonies and three of the New England Colonies had officially established churches: Congregational in Massachusetts Bay, Connecticut, and New Hampshire, and the Church of England in Maryland, Virginia, North Carolina, South Carolina, and Georgia. New York, New Jersey, Pennsylvania, Delaware, and the Colony of Rhode Island and Providence Plantations had no officially established churches. Church membership statistics from the period are unreliable and scarce, but what little data exists indicates that the Church of England was not in the majority, not even in the colonies where it was the established church, and its adherents probably did not comprise even 30 percent of the population in most localities (with the possible exception of Virginia).
John Witherspoon, president of the College of New Jersey (now Princeton University), who was considered a "new light" Presbyterian, wrote widely circulated sermons linking the American Revolution to the teachings of the Bible. Throughout the colonies, dissenting Protestant ministers from the Congregational, Baptist, and Presbyterian churches preached Revolutionary themes in their sermons while most Church of England clergymen preached loyalty to the king, the titular head of the English state church. Religious motivation for fighting tyranny transcended socioeconomic lines to encompass rich and poor, men and women, frontierspeople and townspeople, farmers and merchants. The Declaration of Independence also referred to the "Laws of Nature and of Nature's God" as justification for the Americans' separation from the British monarchy. Most eighteenth-century Americans believed that the entire universe ("nature") was God's creation and he was "Nature's God". Everything was part of the "universal order of things" which began with God and was directed by his providence. Accordingly, the signers of the Declaration professed their "firm reliance on the Protection of divine Providence", and they appealed to "the Supreme Judge for the rectitude of our intentions". George Washington was firmly convinced that he was an instrument of providence, to the benefit of the American people and of all humanity.
Historian Bernard Bailyn argues that the evangelicalism of the era challenged traditional notions of natural hierarchy by preaching that the Bible teaches that all men are equal, so that the true value of a man lies in his moral behavior, not in his class. Historian Thomas Kidd argues that religious disestablishment, belief in God as the source of human rights, and shared convictions about sin, virtue, and divine providence worked together to unite rationalists and evangelicals and thus encouraged a large proportion of Americans to fight for independence from the Empire. Bailyn, on the other hand, denies that religion played such a critical role. Alan Heimert argues that New Light anti-authoritarianism was essential to furthering democracy in colonial American society, and set the stage for a confrontation with British monarchical and aristocratic rule.
Class and psychology of the factions.
John Adams concluded in 1818:
In the mid-20th century, historian Leonard Woods Labaree identified eight characteristics of the Loyalists that made them essentially conservative, opposite to the characteristics of the Patriots. Loyalists tended to feel that resistance to the Crown was morally wrong, while the Patriots thought that morality was on their side. Loyalists were alienated when the Patriots resorted to violence, such as burning houses and tarring and feathering. Loyalists wanted to take a centrist position and resisted the Patriots' demand to declare their opposition to the Crown. Many Loyalists had maintained strong and long-standing relations with Britain, especially merchants in port cities such as New York and Boston. Many Loyalists felt that independence was bound to come eventually, but they were fearful that revolution might lead to anarchy, tyranny, or mob rule. In contrast, the prevailing attitude among Patriots was a desire to seize the initiative. Labaree also wrote that Loyalists were pessimists who lacked the confidence in the future displayed by the Patriots.
Historians in the early 20th century such as J. Franklin Jameson examined the class composition of the Patriot cause, looking for evidence of a class war inside the revolution. More recent historians have largely abandoned that interpretation, emphasizing instead the high level of ideological unity. Both Loyalists and Patriots were a "mixed lot", but ideological demands always came first. The Patriots viewed independence as a means to gain freedom from British oppression and to reassert their basic rights. Most yeomen farmers, craftsmen, and small merchants joined the Patriot cause to demand more political equality. They were especially successful in Pennsylvania but less so in New England, where John Adams attacked Thomas Paine's "Common Sense" for the "absurd democratical notions" that it proposed.
King George III.
The revolution became a personal issue for the king, fueled by his growing belief that British leniency would be taken as weakness by the Americans. He also sincerely believed that he was defending Britain's constitution against usurpers, rather than opposing patriots fighting for their natural rights.
Although Prime Minister Lord North was not an ideal war leader, George III managed to give Parliament a sense of purpose to fight, and Lord North was able to keep his cabinet together. Two of Lord North's cabinet ministers, however, the Earl of Sandwich, First Lord of the Admiralty, and Lord George Germain, Secretary of State for the Colonies, proved to lack the leadership skills suited for their positions, which, in turn, aided the American revolutionaries.
King George III is often accused of obstinately trying to keep Great Britain at war with the revolutionaries in America, despite the opinions of his own ministers. In the words of the British historian George Otto Trevelyan, the King was determined "never to acknowledge the independence of the Americans, and to punish their contumacy by the indefinite prolongation of a war which promised to be eternal." The king wanted to "keep the rebels harassed, anxious, and poor, until the day when, by a natural and inevitable process, discontent and disappointment were converted into penitence and remorse". Later historians defend George by saying in the context of the times no king would willingly surrender such a large territory, and his conduct was far less ruthless than contemporary monarchs in Europe. After the surrender of a British army at Saratoga, both Parliament and the British people were largely in favor of the war; recruitment ran at high levels and although political opponents were vocal, they remained a small minority.
With the setbacks in America, Lord North asked to transfer power to Lord Chatham, whom he thought more capable, but George refused to do so; he suggested instead that Chatham serve as a subordinate minister in North's administration, but Chatham refused. He died later in the same year. Lord North was allied to the "King's Friends" in Parliament and believed George III had the right to exercise powers. In early 1778, Britain's chief rival France signed a treaty of alliance with the United States, and the confrontation soon escalated from a "rebellion" to something that has been characterized as "world war". The French fleet was able to outrun the British naval blockade of the Mediterranean and sailed to North America. The conflict now affected North America, Europe and India. The United States and France were joined by Spain in 1779 and the Dutch Republic, while Britain had no major allies of its own, except for the Loyalist minority in America and German auxiliaries (i.e. "Hessians"). Lord Gower and Lord Weymouth both resigned from the government. Lord North again requested that he also be allowed to resign, but he stayed in office at George III's insistence. Opposition to the costly war was increasing, and in June 1780 contributed to disturbances in London known as the Gordon riots.
As late as the Siege of Charleston in 1780, Loyalists could still believe in their eventual victory, as British troops inflicted defeats on the Continental forces at the Battle of Camden and the Battle of Guilford Court House. In late 1781, the news of Cornwallis's surrender at the siege of Yorktown reached London; Lord North's parliamentary support ebbed away and he resigned the following year. The King drafted an abdication notice, which was never delivered; he finally accepted the defeat in North America and authorized peace negotiations. The Treaties of Paris, by which Britain recognized the independence of the United States and returned Florida to Spain, were signed in 1782 and 1783 respectively. In early 1783, George III privately conceded "America is lost!" He reflected that the Northern colonies had developed into Britain's "successful rivals" in commercial trade and fishing.
When John Adams was appointed American Minister to London in 1785, George had become resigned to the new relationship between his country and the former colonies. He told Adams, "I was the last to consent to the separation; but the separation having been made and having become inevitable, I have always said, as I say now, that I would be the first to meet the friendship of the United States as an independent power."
Patriots.
Those who fought for independence were called "Revolutionaries", "Continentals", "Rebels", "Patriots", "Whigs", "Congress-men", or "Americans" during and after the war. They included a full range of social and economic classes but were unanimous regarding the need to defend the rights of Americans and uphold the principles of republicanism in rejecting monarchy and aristocracy, while emphasizing civic virtue by citizens. The signers of the Declaration of Independence were mostly—with definite exceptions—well-educated, of British stock, and of the Protestant faith. Newspapers were strongholds of patriotism (although there were a few Loyalist papers) and printed many pamphlets, announcements, patriotic letters, and pronouncements.
According to historian Robert Calhoon, 40 to 45 percent of the white population in the Thirteen Colonies supported the Patriots' cause, 15 to 20 percent supported the Loyalists, and the remainder were neutral or kept a low profile. Mark Lender analyzes why ordinary people became insurgents against the British, even if they were unfamiliar with the ideological reasons behind the war. He concludes that such people held a sense of rights which the British were violating, rights that stressed local autonomy, fair dealing, and government by consent. They were highly sensitive to the issue of tyranny, which they saw manifested in the British response to the Boston Tea Party. The arrival in Boston of the British Army heightened their sense of violated rights, leading to rage and demands for revenge. They had faith that God was on their side.
Thomas Paine published his pamphlet "Common Sense" in January 1776, after the Revolution had started. It was widely distributed and often read aloud in taverns, contributing significantly to spreading the ideas of republicanism and liberalism, bolstering enthusiasm for separation from Great Britain, and encouraging recruitment for the Continental Army. Paine presented the Revolution as the solution for Americans alarmed by the threat of tyranny.
Loyalists.
The consensus of scholars is that about 15 to 20 percent of the white population remained loyal to the British Crown. Those who actively supported the king were known at the time as "Loyalists", "Tories", or "King's men". The Loyalists never controlled territory unless the British Army occupied it. They were typically older, less willing to break with old loyalties, and often connected to the Church of England; they included many established merchants with strong business connections throughout the Empire, as well as royal officials such as Thomas Hutchinson of Boston.
There were 500 to 1,000 Black Loyalists, enslaved African Americans who escaped to British lines and supported Britain's cause via several means. Many of them died from various diseases, but the survivors were evacuated by the British to their remaining colonies in North America.
The revolution could divide families. William Franklin, son of Benjamin Franklin and royal governor of the Province of New Jersey, remained loyal to the Crown throughout the war; he and his father never spoke again. Recent immigrants who had not been fully Americanized were also inclined to support the King, such as Flora MacDonald, a Scottish settler in the backcountry.
After the war, the great majority of the half-million Loyalists remained in America and resumed normal lives. Some became prominent American leaders, such as Samuel Seabury. Approximately 46,000 Loyalists relocated to Canada; others moved to Britain (7,000), Florida, or the West Indies (9,000). The exiles represented approximately two percent of the total population of the colonies. Nearly all Black Loyalists left for Nova Scotia, Florida, or England, where they could remain free. Loyalists who left the South in 1783 took thousands of their slaves with them as they fled to the British West Indies.
Neutrals.
A minority of uncertain size tried to stay neutral in the war. Most kept a low profile, but the Quakers were the most important group to speak out for neutrality, especially in Pennsylvania. The Quakers continued to do business with the British even after the war began, and they were accused of supporting British rule as "contrivers and authors of seditious publications" critical of the revolutionary cause. Most Quakers remained neutral, although a sizeable number nevertheless participated to some degree.
Role of women.
Women contributed to the American Revolution in many ways and were involved on both sides. Formal politics did not include women, but ordinary domestic behaviors became charged with political significance as Patriot women confronted a war which permeated all aspects of political, civil, and domestic life. They participated by boycotting British goods, spying on the British, following armies as they marched, washing, cooking, and mending for soldiers, delivering secret messages, and even fighting disguised as men in a few cases, such as Deborah Samson. Mercy Otis Warren held meetings in her house and cleverly attacked Loyalists with her creative plays and histories. Many women also acted as nurses and helpers, tending to the soldiers' wounds and buying and selling goods for them, and some of these camp followers even participated in combat. Above all, women continued the agricultural work at home to feed their families and the armies. They maintained their families during their husbands' absences and sometimes after their deaths.
American women were integral to the success of the boycott of British goods, as the boycotted items were largely household articles such as tea and cloth. Women had to return to knitting goods and to spinning and weaving their own cloth—skills that had fallen into disuse. In 1769, the women of Boston produced 40,000 skeins of yarn, and 180 women in Middletown, Massachusetts wove cloth. Many women gathered food, money, clothes, and other supplies during the war to help the soldiers. A woman's loyalty to her husband could become an open political act, especially for women in America committed to men who remained loyal to the King. Legal divorce, though rare, was granted to Patriot women whose husbands supported the King.
Other participants.
France and Spain.
In early 1776, France set up a major program of aid to the Americans, and the Spanish secretly added funds. Each country spent one million "livres tournois" to buy munitions. A dummy corporation run by Pierre Beaumarchais concealed their activities. American Patriots obtained some munitions from the Dutch Republic as well, through the French and Spanish ports in the West Indies. Heavy expenditures and a weak taxation system pushed France toward bankruptcy.
In 1777, Charles François Adrien le Paulmier, Chevalier d'Annemours, acting as a secret agent for France, made sure General George Washington was privy to his mission. He followed Congress around for the next two years, reporting what he observed back to France. The Treaty of Alliance between the French and the Americans followed in 1778, which led to more French money, matériel and troops being sent to the United States.
Spain did not officially recognize the United States, but it was a French ally and it separately declared war on Britain on June 21, 1779. Bernardo de Gálvez, general of the Spanish forces in New Spain, also served as governor of Louisiana. He led an expedition of colonial troops to capture Florida from the British and to keep open a vital conduit for supplies.
Germans.
Ethnic Germans served on both sides of the American Revolutionary War. As George III was also the Elector of Hanover, many supported the Loyalist cause and served as allies of the Kingdom of Great Britain, which most notably rented auxiliary troops from German states such as the Landgraviate of Hessen-Kassel.
American Patriots tended to represent such troops as mercenaries in propaganda against the British Crown. Even American historians followed suit, in spite of Colonial-era jurists drawing a distinction between auxiliaries and mercenaries, with auxiliaries serving their prince when sent to the aid of another prince, and mercenaries serving a foreign prince as individuals. By this distinction the troops which served in the American Revolution were auxiliaries.
Other German individuals came to assist the American revolutionaries, most notably Friedrich Wilhelm von Steuben, who served as a general in the Continental Army and is credited with professionalizing that force, but most Germans who served were already colonists. Von Steuben's native Prussia joined the League of Armed Neutrality, and King Frederick II of Prussia was well appreciated in the United States for his support early in the war. He expressed interest in opening trade with the United States and bypassing English ports, and allowed an American agent to buy arms in Prussia. Frederick predicted American success, and promised to recognize the United States and American diplomats once France did the same. Prussia also interfered in the recruiting efforts of Russia and neighboring German states when they raised armies to send to the Americas, and Frederick II forbade enlistment for the American war within Prussia. All Prussian roads were denied to troops from Anhalt-Zerbst, which delayed reinforcements that Howe had hoped to receive during the winter of 1777–1778.
However, when the War of the Bavarian Succession erupted, Frederick II became much more cautious with Prussian/British relations. U.S. ships were denied access to Prussian ports, and Frederick refused to officially recognize the United States until they had signed the Treaty of Paris. Even after the war, Frederick II predicted that the United States was too large to operate as a republic, and that it would soon rejoin the British Empire with representatives in Parliament.
Native Americans.
Most indigenous people rejected pleas that they remain neutral and instead supported the British Crown. The great majority of the 200,000 indigenous people east of the Mississippi distrusted the colonists and supported the British cause, hoping to forestall continued expansion of settlement into their territories. Those tribes closely involved in trade tended to side with the Patriots, although political factors were important as well. Some indigenous people tried to remain neutral, seeing little value in joining what they perceived to be a "white man's war", and fearing reprisals from whichever side they opposed.
The great majority of indigenous people did not participate directly in the war, with the notable exceptions of warriors and bands associated with four of the Iroquois tribes in New York and Pennsylvania which allied with the British, and the Oneida and Tuscarora tribes among the Iroquois of central and western New York who supported the American cause. The British did have other allies, particularly in the regions of southwest Quebec on the Patriots' frontier. The British provided arms to indigenous people who were led by Loyalists in war parties to raid frontier settlements from the Carolinas to New York. These war parties managed to kill many settlers on the frontier, especially in Pennsylvania and New York's Mohawk Valley.
In 1776, Cherokee war parties attacked American colonists all along the southern frontier of the uplands throughout the Washington District, North Carolina (now Tennessee) and the Kentucky wilderness area. The Chickamauga Cherokee under Dragging Canoe allied themselves closely with the British, and fought on for an additional decade after the Treaty of Paris was signed. They would launch raids with roughly 200 warriors, as seen in the Cherokee–American wars; they could not mobilize enough forces to invade settler areas without the help of allies, most often the Creek.
Joseph Brant ("also" Thayendanegea) of the powerful Mohawk tribe in New York was the most prominent indigenous leader against the Patriot forces. In 1778 and 1780, he led 300 Iroquois warriors and 100 white Loyalists in multiple attacks on small frontier settlements in New York and Pennsylvania, killing many settlers and destroying villages, crops, and stores.
In 1779, the Continental Army forced the hostile indigenous people out of upstate New York when Washington sent an army under John Sullivan which destroyed 40 evacuated Iroquois villages in central and western New York. Sullivan systematically burned the empty villages and destroyed about 160,000 bushels of corn that composed the winter food supply. The Battle of Newtown proved decisive, as the Patriots had an advantage of three-to-one, and it ended significant resistance; there was little combat otherwise. Facing starvation and homeless for the winter, the Iroquois fled to Canada. The British resettled them in Ontario, providing land grants as compensation for some of their losses.
At the peace conference following the war, the British ceded lands which they did not really control, and which they had not consulted their indigenous allies about during the treaty negotiations. They transferred control to the United States of all the land south of the Great Lakes, east of the Mississippi, and north of Florida.
The British did not give up their forts until 1796 in the Ohio country and Illinois country; they kept alive the dream of forming an allied indigenous nation there, which they referred to as an "Indian barrier state". That goal was one of the causes of the War of 1812.
Black Americans.
Free blacks in the New England Colonies and Middle Colonies in the North as well as Southern Colonies fought on both sides of the War, but the majority fought for the Patriots. Gary Nash reports that there were about 9,000 black veteran Patriots, counting the Continental Army and Navy, state militia units, privateers, wagoneers in the Army, servants to officers, and spies. Ray Raphael notes that thousands did join the Loyalist cause, but "a far larger number, free as well as slave, tried to further their interests by siding with the patriots." Crispus Attucks was one of the five people killed in the Boston Massacre in 1770 and is considered the first American casualty for the cause of independence.
The effects of the war were more dramatic in the South. Tens of thousands of slaves escaped to British lines throughout the South, causing dramatic losses to slaveholders and disrupting cultivation and harvesting of crops. For instance, South Carolina was estimated to have lost about 25,000 slaves to flight, migration, or death which amounted to a third of its slave population. From 1770 to 1790, the black proportion of the population (mostly slaves) in South Carolina dropped from 60.5 percent to 43.8 percent, and from 45.2 percent to 36.1 percent in Georgia.
During the war, the British commanders attempted to weaken the Patriots by issuing proclamations of freedom to their slaves. In the November 1775 document known as Dunmore's Proclamation, Virginia royal governor Lord Dunmore recruited black men into the British forces with the promise of freedom, protection for their families, and land grants. Some men responded and briefly formed the British Ethiopian Regiment. Historian David Brion Davis explains the difficulties with a policy of wholesale arming of the slaves.
Davis underscores the British dilemma: "Britain, when confronted by the rebellious American colonists, hoped to exploit their fear of slave revolts while also reassuring the large number of slave-holding Loyalists and wealthy Caribbean planters and merchants that their slave property would be secure". The Americans, however, accused the British of encouraging slave revolts, with the issue becoming one of the 27 colonial grievances.
The existence of slavery in the American colonies had attracted criticism from both sides of the Atlantic, as many could not reconcile the existence of the institution with the egalitarian ideals espoused by leaders of the Revolution. British writer Samuel Johnson wrote "how is it we hear the loudest yelps for liberty among the drivers of the Negroes?" in a text opposing the grievances of the colonists, and English abolitionist Thomas Day remarked on the same contradiction in a 1776 letter. African American writer Lemuel Haynes expressed similar viewpoints in his essay "Liberty Further Extended", where he wrote that "Liberty is Equally as pre[c]ious to a Black man, as it is to a white one". Thomas Jefferson unsuccessfully attempted to include a section in the Declaration of Independence which asserted that King George III had "forced" the slave trade onto the colonies. Despite the turmoil of the period, African-Americans contributed to the foundation of an American national identity during the Revolution. Phillis Wheatley, an African-American poet, popularized the image of Columbia to represent America. She came to public attention when her "Poems on Various Subjects, Religious and Moral" appeared in 1773, and received praise from George Washington.
The 1779 Philipsburg Proclamation expanded the promise of freedom for black men who enlisted in the British military to all the colonies in rebellion. British forces gave transportation to 10,000 slaves when they evacuated Savannah and Charleston, carrying through on their promise. They evacuated and resettled more than 3,000 Black Loyalists from New York to Nova Scotia, Upper Canada, and Lower Canada. Others sailed with the British to England or were resettled as freedmen in the West Indies of the Caribbean. But slaves carried to the Caribbean under control of Loyalist masters generally remained slaves until British abolition of slavery in its colonies in 1833–1838. More than 1,200 of the Black Loyalists of Nova Scotia later resettled in the British colony of Sierra Leone, where they became leaders of the Krio ethnic group of Freetown and the later national government. Many of their descendants still live in Sierra Leone, as well as other African countries.
Effects of the Revolution.
After the Revolution, genuinely democratic politics became possible in the former American colonies. The rights of the people were incorporated into state constitutions. Concepts of liberty, individual rights, equality among men and hostility toward corruption became incorporated as core values of liberal republicanism. The greatest challenge to the old order in Europe was the challenge to inherited political power and the democratic idea that government rests on the consent of the governed. The example of the first successful revolution against a European empire, and the first successful establishment of a republican form of democratically elected government, provided a model for many other colonial peoples who realized that they too could break away and become self-governing nations with directly elected representative government.
Interpretations.
Interpretations vary concerning the effect of the Revolution. Historians such as Bernard Bailyn, Gordon Wood, and Edmund Morgan view it as a unique and radical event which produced deep changes and had a profound effect on world affairs, such as an increasing belief in the principles of the Enlightenment. These were demonstrated by a leadership and government that espoused protection of natural rights, and a system of laws chosen by the people. John Murrin, by contrast, argues that the definition of "the people" at that time was mostly restricted to free men who passed a property qualification. This view argues that any significant gain of the revolution was irrelevant in the short term to women, black Americans and slaves, poor white men, youth, and Native Americans.
Inspiring other independence movements and revolutions.
The first shot of the American Revolution at the Battle of Lexington and Concord is referred to as the "shot heard 'round the world" due to its historical and global significance. The Revolutionary War victory not only established the United States as the first modern constitutional republic, but marked the transition from an age of monarchy to a new age of freedom by inspiring similar movements worldwide. The American Revolution was the first of the "Atlantic Revolutions": followed most notably by the French Revolution, the Haitian Revolution, and the Latin American wars of independence. Aftershocks contributed to rebellions in Ireland, the Polish–Lithuanian Commonwealth, and the Netherlands.
The U.S. Constitution, drafted shortly after independence, remains the world's oldest written constitution, and has been emulated by other countries, in some cases verbatim. Some historians and scholars argue that the subsequent wave of independence and revolutionary movements has contributed to the continued expansion of democratic government; 144 countries, representing two-thirds of the world's population, are full or partial democracies of some form.
The Dutch Republic, also at war with Britain, was the next country after France to sign a treaty with the United States, on October 8, 1782. On April 3, 1783, Ambassador Extraordinary Gustaf Philip Creutz, representing King Gustav III of Sweden, and Benjamin Franklin, signed a Treaty of Amity and Commerce with the U.S.
The Revolution had a strong, immediate influence in Great Britain, Ireland, the Netherlands, and France. Many British and Irish Whigs in Parliament spoke glowingly in favor of the American cause. In Ireland, the Protestant minority who controlled the island demanded self-rule. Under the leadership of Henry Grattan, the Irish Patriot Party forced the reversal of mercantilist prohibitions against trade with other British colonies. The King and his cabinet in London could not risk another rebellion on the American model, and so made a series of concessions to the Patriot faction in Dublin. Armed volunteer units of the "Protestant Ascendancy" were set up ostensibly to protect against an invasion from France. As in colonial America, the King no longer held a monopoly of lethal force in Ireland.
For many Europeans, such as the Marquis de Lafayette, who later were active during the era of the French Revolution, the American case, along with the Dutch Revolt (end of the 16th century) and the 17th-century English Civil War, was among the examples of overthrowing an old regime. The American Declaration of Independence influenced the French Declaration of the Rights of Man and of the Citizen of 1789. The spirit of the Declaration of Independence led to laws ending slavery in all the Northern states and the Northwest Territory, with New Jersey the last in 1804. States such as New Jersey and New York adopted gradual emancipation, which kept some people as slaves for more than two decades longer.
Status of African Americans.
During the revolution, the contradiction between the Patriots' professed ideals of liberty and the institution of slavery generated increased scrutiny of the latter. As early as 1764, the Boston Patriot leader James Otis, Jr. declared that all men, "white or black", were "by the law of nature" born free. Anti-slavery calls became more common in the early 1770s. In 1773, Benjamin Rush, the future signer of the Declaration of Independence, called on "advocates for American liberty" to oppose slavery, writing, "The plant of liberty is of so tender a nature that it cannot thrive long in the neighborhood of slavery." The contradiction between calls for liberty and the continued existence of slavery also opened up the Patriots to charges of hypocrisy. In 1775, the English Tory writer Samuel Johnson asked, "How is it that we hear the loudest yelps for liberty among the drivers of negroes?"
In the late 1760s and early 1770s, several colonies, including Massachusetts and Virginia, attempted to restrict the slave trade, but were prevented from doing so by royally appointed governors. In 1774, as part of a broader non-importation movement aimed at Britain, the Continental Congress called on all the colonies to ban the importation of slaves, and the colonies passed acts doing so. In 1775, the Quakers founded the first antislavery society in the Western world, the Pennsylvania Abolition Society.
In the first two decades after the American Revolution, state legislatures and individuals took actions to free slaves, in part based on revolutionary ideals. Northern states passed new constitutions that contained language about equal rights or specifically abolished slavery; some states, such as New York and New Jersey, where slavery was more widespread, passed laws by the end of the 18th century to abolish slavery by a gradual method. By 1804, all the northern states had passed laws outlawing slavery, either immediately or over time. In New York, the last slaves were freed in 1827. Indentured servitude (temporary slavery), which had been widespread in the colonies (half the population of Philadelphia had once been bonded servants) dropped dramatically, and disappeared by 1800.
No southern state abolished slavery, but for a period individual owners could free their slaves by personal decision, often providing for manumission in wills but sometimes filing deeds or court papers to free individuals. Numerous slaveholders who freed their slaves cited revolutionary ideals in their documents; others freed slaves as a reward for service. Records also suggest that some slaveholders were freeing their own mixed-race children, born into slavery to slave mothers. The number of free blacks as a proportion of the black population in the upper South increased from less than 1 percent to nearly 10 percent between 1790 and 1810 as a result of these actions. Nevertheless, slavery continued in the South, where it became a "peculiar institution", setting the stage for future sectional conflict between North and South over the issue.
Thousands of free Blacks in the northern states fought in the state militias and Continental Army. In the south, both sides offered freedom to slaves who would perform military service. Roughly 20,000 slaves fought in the American Revolution.
Status of American women.
The democratic ideals of the Revolution inspired changes in the roles of women.
The concept of republican motherhood was inspired by this period and reflects the importance of revolutionary republicanism as the dominant American ideology. It assumed that a successful republic rested upon the virtue of its citizens. Women were considered to have the essential role of instilling their children with values conducive to a healthy republic. During this period, the wife's relationship with her husband also became more liberal, as love and affection instead of obedience and subservience began to characterize the ideal marital relationship. In addition, many women contributed to the war effort through fundraising and running family businesses without their husbands.
The traditional constraints gave way to more liberal conditions for women. Young people had more freedom to choose their spouses and more often used birth control to regulate the size of their families. Society emphasized the role of mothers in child rearing, especially the patriotic goal of raising republican children rather than those locked into aristocratic value systems. There was more permissiveness in child-rearing. Patriot women married to Loyalists who left the state could get a divorce and obtain control of the ex-husband's property.
Whatever gains they had made, however, women still found themselves subordinated, legally and socially, to their husbands, disfranchised and usually with only the role of mother open to them. But, some women earned livelihoods as midwives and in other roles in the community not originally recognized as significant by men.
Abigail Adams expressed to her husband, John Adams, the desire of women to have a place in the new republic, famously urging him to "remember the ladies".
The Revolution sparked a discussion on the rights of woman and an environment favorable to women's participation in politics. Briefly the possibilities for women's rights were highly favorable, but a backlash led to a greater rigidity that excluded women from politics.
For more than thirty years, however, the 1776 New Jersey State Constitution gave the vote to "all inhabitants" who had a certain level of wealth, including unmarried women and blacks (not married women, because they could not own property separately from their husbands), until 1807, when the state legislature passed a bill interpreting the constitution to mean universal "white male" suffrage, excluding paupers.
Loyalist expatriation.
Tens of thousands of Loyalists left the United States following the war, and Maya Jasanoff estimates as many as 70,000. Some migrated to Britain, but the great majority received land and subsidies for resettlement in British colonies in North America, especially Quebec (concentrating in the Eastern Townships), Prince Edward Island, and Nova Scotia. Britain created the colonies of Upper Canada (Ontario) and New Brunswick expressly for their benefit, and the Crown awarded land to Loyalists as compensation for losses in the United States. Nevertheless, approximately eighty-five percent of the Loyalists stayed in the United States as American citizens, and some of the exiles later returned to the U.S. Patrick Henry spoke of the issue of allowing Loyalists to return: "Shall we, who have laid the proud British lion at our feet, be frightened of its whelps?" His actions helped secure the return of the Loyalists to American soil.
Commemorations.
The American Revolution has a central place in the American memory as the story of the nation's founding. It is covered in the schools, memorialized by two national holidays, Washington's Birthday in February and Independence Day in July, and commemorated in innumerable monuments. George Washington's estate at Mount Vernon became one of the first national pilgrimage sites for tourists, attracting 10,000 visitors a year by the 1850s.
The Revolution became a matter of contention in the 1850s in the debates leading to the American Civil War (1861–1865), as spokesmen of both the Northern United States and the Southern United States claimed that their region was the true custodian of the legacy of 1776. The United States Bicentennial in 1976 came a year after the American withdrawal from the Vietnam War, and speakers stressed the themes of renewal and rebirth based on a restoration of traditional values.
Today, more than 100 Revolutionary War battlefields and historic sites are protected and maintained by the government. The National Park Service alone owns and maintains more than 50 battlefield parks and many other sites such as Independence Hall that are related to the Revolution, as well as the residences, workplaces and meeting places of many Founders and other important figures. The private American Battlefield Trust uses government grants and other funds to preserve almost 700 acres of battlefield land in six states, and the ambitious private recreation, restoration, and interpretation of over 300 acres of pre-1790 Colonial Williamsburg was created in the first half of the 20th century for public visitation.
Alan Ayckbourn.
Sir Alan Ayckbourn (born 12 April 1939) is a prolific British playwright and director. As of 2023, he has written and produced 89 full-length plays in Scarborough and London, and between 1972 and 2009 he was artistic director of the Stephen Joseph Theatre in Scarborough, where all but four of his plays have received their first performance. More than 40 have subsequently been produced in the West End, at the Royal National Theatre or by the Royal Shakespeare Company since his first hit, "Relatively Speaking", opened at the Duke of York's Theatre in 1967.
Major successes include "Absurd Person Singular" (1975), "The Norman Conquests" trilogy (1973), "Bedroom Farce" (1975), "Just Between Ourselves" (1976), "A Chorus of Disapproval" (1984), "Woman in Mind" (1985), "A Small Family Business" (1987), "Man of the Moment" (1988), "House" & "Garden" (1999) and "Private Fears in Public Places" (2004). His plays have won numerous awards, including seven London "Evening Standard" Awards. They have been translated into over 35 languages and are performed on stage and television throughout the world. Ten of his plays have been staged on Broadway, attracting two Tony nominations, and one Tony award.
Life.
Childhood.
Ayckbourn was born in Hampstead, London. His mother, Irene Worley ("Lolly") (1906–1998), was a writer of short stories who published under the name "Mary James". His father, Horace Ayckbourn (1904–1965), was an orchestral violinist who became lead violinist of the London Symphony Orchestra. His parents, who separated shortly after World War II, never married, and Ayckbourn's mother divorced her first husband to marry again in 1948.
Ayckbourn wrote his first play at Wisborough Lodge (a preparatory school in the village of Wisborough Green) when he was about 10. While he was at prep school as a boarder, his mother wrote to tell him she was marrying Cecil Pye, a bank manager. His new family consisted of his mother, his stepfather and Christopher, his stepfather's son by an earlier marriage. This relationship, too, reportedly ran into difficulties early on.
Ayckbourn attended Haileybury and Imperial Service College, in the village of Hertford Heath and, while there, he toured Europe and America with the school's Shakespeare company.
Adult life.
After leaving school at 17, Ayckbourn took several temporary jobs in various places before starting a position at the Scarborough Library Theatre, where he was introduced to the artistic director, Stephen Joseph. Joseph is said to have become both a mentor and a father figure to Ayckbourn until Joseph's untimely death in 1967, and Ayckbourn has consistently spoken highly of him.
Ayckbourn's career was briefly interrupted when he was called up for National Service. He was swiftly discharged, officially on medical grounds, but it is suggested that a doctor who noticed his reluctance to join the Armed Forces deliberately failed the medical as a favour. Although Ayckbourn continued to move wherever his career took him, he settled in Scarborough, eventually buying Longwestgate House, which had previously been owned by his mentor, Joseph.
In 1957, Ayckbourn married Christine Roland, another member of the Library Theatre company. Ayckbourn's first two plays were, in fact, written jointly with her under the pseudonym of "Roland Allen". They had two sons, Steven and Philip. However, the marriage had difficulties, which eventually led to their separation in 1971. Ayckbourn said that his relationship with Roland became easy once they agreed their marriage was over. About this time, he shared a home with Heather Stoney, an actress he had first met ten years earlier. Like his mother, neither he nor Roland sought an immediate divorce and it was not until thirty years later, in 1997, that they were formally divorced and Ayckbourn married Stoney. One side effect of the timing is that, when Ayckbourn was awarded a knighthood a few months before the divorce, both his first and second wives were entitled to take the title of Lady Ayckbourn.
In February 2006, he suffered a stroke in Scarborough, and stated: "I hope to be back on my feet, or should I say my left leg, as soon as possible, but I know it is going to take some time. In the meantime I am in excellent hands and so is the Stephen Joseph Theatre." He left hospital after eight weeks and returned to directing after six months. The following year, Ayckbourn announced he would step down as artistic director of the Stephen Joseph Theatre. He continues, however, to write and direct his own work at the theatre.
Influence on plays.
Since the time Ayckbourn's plays became established in the West End, interviewers have raised the question of whether his work is autobiographical. There is no clear answer to this question. There has been only one biography, written by Paul Allen, which primarily covers his career in the theatre. Ayckbourn has frequently said he sees aspects of himself in all of his characters. In "Bedroom Farce" (1975), for example, he admitted to being, in some respects, all four of the men in the play. It has been suggested that, after Ayckbourn himself, the person who is used most often in his plays is his mother, particularly as Susan in "Woman in Mind" (1985).
What is less clear is the extent to which events in Ayckbourn's life have influenced his writing. It is true that the theme of marriages in difficulty was heavily present throughout his plays in the early seventies, at about the time his own marriage was coming to an end. However, by that time, he had also witnessed the failure of his parents' relationships and those of some of his friends. Which relationships, if any, he drew on for his plays, is unclear. In Paul Allen's biography, Ayckbourn is briefly compared with Dafydd and Guy in "A Chorus of Disapproval" (1984). Both characters feel themselves to be in trouble and there was speculation that Ayckbourn himself might have felt the same way. At the time, he had reportedly become seriously involved with another actress, which threatened his relationship with Stoney. It is unclear whether this had any effect on the writing; Paul Allen's view is that Ayckbourn did not use his personal experiences to write his plays.
It is possible that Ayckbourn wrote plays with himself and his own situation in mind but, as Ayckbourn is portrayed as a guarded and private man, it is hard to imagine him exposing his own life in his plays to any great degree. In the biography, Paul Allen writes, with regard to a suggestion in "Cosmopolitan" that Ayckbourn’s plays were becoming autobiographical: "If we take that to mean that his plays tell his own life story, he still hasn't started."
Career.
Early career and acting.
On leaving school, Ayckbourn began his theatrical career immediately when his French master introduced him to Sir Donald Wolfit. Ayckbourn joined Wolfit on tour to the Edinburgh Festival Fringe for three weeks as an acting assistant stage manager (a role that involved both acting and stage management). His first experiences on the professional stage were in various roles in "The Strong are Lonely" by Fritz Hochwälder. In the following year, Ayckbourn appeared in six other plays at the Connaught Theatre, Worthing, and the Thorndike Theatre, Leatherhead.
In 1957, Ayckbourn was employed by the director Stephen Joseph at the Library Theatre, Scarborough, the predecessor to the modern Stephen Joseph Theatre. Again, his role was initially as acting stage manager. This employment led to Ayckbourn's first professional script commission, in 1958. When he complained about the quality of a script he was performing, Joseph challenged him to write a better one. The result was "The Square Cat", written under the pseudonym Roland Allen and first performed in 1959. In this play, Ayckbourn himself played the character of Jerry Watiss.
In 1962, after thirty-four appearances in plays at the Library Theatre, including four of his own, Ayckbourn moved to Stoke-on-Trent to help set up the Victoria Theatre (now the New Vic), where he appeared in a further eighteen plays. His final appearance in one of his own plays was as the Crimson Gollywog in the disastrous children's play "Christmas v Mastermind". He left the Stoke company in 1964, officially to commit his time to the London production of "Mr. Whatnot", but reportedly because he was having trouble working with the artistic director, Peter Cheeseman. By now, his career as a writer was coming to fruition and his acting career was sidelined.
His final role on stage was as Jerry in "Two for the Seesaw" by William Gibson, at the Civic Theatre in Rotherham. He was left stranded on stage because Heather Stoney (his future wife) was unable to re-appear due to her props not being ready for use. This led to his conclusion that acting was more trouble than it was worth. The assistant stage manager on the production, Bill Kenwright, would go on to become one of the UK's most successful producers.
Writing.
Ayckbourn's earliest plays were written and produced at a time when the Scarborough Library Theatre, like most regional theatres, regularly commissioned work from its own actors to keep costs down. Another actor whose work was being commissioned was David Campton. Ayckbourn's first play, "The Square Cat", was sufficiently popular locally to secure further commissions, although neither this nor the following three plays had much impact beyond Scarborough. After his transfer to the Victoria Theatre in Stoke-on-Trent, "Christmas v Mastermind" flopped; the play is now widely regarded as Ayckbourn's greatest disaster.
Ayckbourn's fortunes revived in 1963 with "Mr. Whatnot", which also premiered at the Victoria Theatre. This was the first play that Ayckbourn was sufficiently happy with to allow performances today, and the first to receive a West End production. However, the West End production flopped, in part due to misguided casting. After this, Ayckbourn experimented with writing for comedians: first a monologue for Tommy Cooper, and later, with Ronnie Barker, who played Lord Slingsby-Craddock in the London production of "Mr Whatnot" in 1964, the scripts for LWT's "Hark at Barker". Ayckbourn used the pseudonym Peter Caulfield because he was under exclusive contract to the BBC at the time.
In 1965, back at the Scarborough Library Theatre, "Meet my Father" was produced, and later retitled "Relatively Speaking". This time, the play was a massive success, both in Scarborough and in the West End, earning Ayckbourn a congratulatory telegram from Noël Coward. This was not quite the end of Ayckbourn's hit-and-miss record. His next play, "The Sparrow" ran for only three weeks at Scarborough but the following play, "How the Other Half Loves", secured his runaway success as a playwright.
The height of Ayckbourn's commercial success came with plays such as "Absurd Person Singular" (1975), "The Norman Conquests" trilogy (1973), "Bedroom Farce" (1975) and "Just Between Ourselves" (1976). These plays focused heavily on marriage in the British middle classes. The only failure during this period was a 1975 musical with Andrew Lloyd Webber, "Jeeves"; even this did little to dent Ayckbourn's career.
From the 1980s, Ayckbourn moved away from the recurring theme of marriage to explore other contemporary issues. One example was "Woman in Mind", a play performed entirely from the perspective of a woman going through a nervous breakdown. He also experimented with unconventional ways of writing plays: "Intimate Exchanges", for example, has one beginning and sixteen possible endings, and in "House & Garden", two plays take place simultaneously on two separate stages. He also diversified into children's theatre, such as "Mr A's Amazing Maze Plays" and musical plays, such as "By Jeeves" (a more successful rewrite of the original "Jeeves").
With a résumé of over seventy plays, of which more than forty have played at the National Theatre or in the West End, Alan Ayckbourn is one of England's most successful living playwrights. Despite his success, honours and awards (which include a prestigious Laurence Olivier Award), Alan Ayckbourn remains a relatively anonymous figure, dedicated to regional theatre. Throughout his writing career, all but four of his plays premiered at the Stephen Joseph Theatre in Scarborough in its three different locations.
Ayckbourn received the CBE in 1987 and was knighted in the 1997 New Year Honours. It is frequently claimed (but not proved) that Alan Ayckbourn is the most performed living English playwright, and the second most performed of all time, after Shakespeare.
Although Ayckbourn's plays no longer dominate the theatrical scene on the scale of his earlier works, he continues to write. Among his major successes has been "Private Fears in Public Places", which had a hugely successful Off-Broadway run at 59E59 Theaters and, in 2006, was made into a film, "Cœurs", directed by Alain Resnais. After Ayckbourn suffered a stroke, there was uncertainty as to whether he could continue to write. The play that premiered immediately after his stroke, "If I Were You", had been written before his illness; the first play written afterwards, "Life and Beth", premiered in the summer of 2008. Ayckbourn continues to write for the Stephen Joseph Theatre at the invitation of his successor as artistic director, Chris Monks. The first new play under this arrangement, "My Wonderful Day", was performed in October 2009.
Ayckbourn continues to experiment with theatrical form. The play "Roundelay" opened in September 2014; before each performance, members of the audience are invited to extract five coloured ping pong balls from a bag, leaving the order in which each of the five acts is played left to chance, and allowing 120 possible permutations. In "Arrivals and Departures" (2013), the first half of the play is told from the point of view of one character, only for the second half to dramatise the same events from the point of view of another.
Many of Ayckbourn's plays, including "Private Fears in Public Places", "Intimate Exchanges", "My Wonderful Day" and "Neighbourhood Watch", have had their New York premiere at 59E59 Theaters as part of the annual Brits Off Broadway Festival.
In 2019, Ayckbourn published his first novel, "The Divide", which had previously been showcased during a reading at the Stephen Joseph Theatre.
As a consequence of the Covid lockdown, Ayckbourn's 2020 play, "Anno Domino," was recorded as a radio production, with Ayckbourn and his wife Heather playing all the roles. Similarly, Ayckbourn's Covid-period 2021 play, "The Girl Next Door", was streamed online and made available behind a paywall on the Stephen Joseph Theatre's website.
In 2022, the first Ayckbourn play in around 60 years premiered in a venue other than Scarborough: "All Lies" at the Old Laundry in Bowness-on-Windermere.
Directing.
Although Ayckbourn is best known as a writer, it is said that he only spends 10% of his time writing plays. Most of the remaining time is spent directing.
Ayckbourn began directing at the Scarborough Library Theatre in 1961, with a production of "Gaslight" by Patrick Hamilton. During that year and the next, he directed five other plays in Scarborough and, after transferring to the Victoria Theatre, in 1963 directed a further six plays. Between 1964 and 1967, much of his time was taken up by various productions of his early successes, "Mr. Whatnot" and "Relatively Speaking" and he directed only one play, "The Sparrow", which he wrote and which was later withdrawn. In 1968, he resumed directing plays regularly, mostly at Scarborough. At this time he also worked as a radio drama producer for the BBC, based in Leeds.
At first, his directing career was kept separate from his writing career. It was not until 1963 that Ayckbourn directed a play of his own (a revival of "Standing Room Only"), and not until 1967 that he directed one of his own premieres ("The Sparrow"). The London premieres remained in the hands of other directors for longer; the first of his own plays that he directed in London was "Bedroom Farce", in 1977.
After the death of Stephen Joseph in 1967, the Director of Productions was appointed on an annual basis. Ayckbourn was offered the position in 1969 and 1970, succeeding Rodney Wood, but he handed the position over to Caroline Smith in 1971, having spent most of that year in the US with "How the Other Half Loves". He became Director of Productions again in 1972 and, on 12 November of that year, he was made the permanent artistic director of the theatre.
In mid-1986, Ayckbourn accepted an invitation to work for two years as a visiting director at the National Theatre in London, forming his own company and staging a play in each of the three auditoria, provided at least one was a new play of his own. He used a stock company that included performers such as Michael Gambon, Polly Adams and Simon Cadell. The three plays became four: "Tons of Money" by Will Evans and Valentine, with adaptations by Ayckbourn (Lyttelton); Arthur Miller's "A View From the Bridge" (Cottesloe); his own play "A Small Family Business" (Olivier) and John Ford's "'Tis Pity She's a Whore" (Olivier again). During this time, Ayckbourn shared his role of artistic director of the Stephen Joseph Theatre with Robin Herford and returned in 1987 to direct the premiere of "Henceforward...".
He announced in 1999 that he would step back from directing the work of other playwrights, to concentrate on his own plays, the last one being Rob Shearman's "Knights in Plastic Armour" in 1999; he made one exception in 2002, when he directed the world premiere of Tim Firth's "The Safari Party".
In 2002, following a dispute over the Duchess Theatre's handling of "Damsels in Distress", Ayckbourn sharply criticised the West End's treatment of theatre in general and, in particular, its casting of celebrities. Although he did not explicitly say he would boycott the West End, he did not return to direct there until 2009, with a revival of "Woman in Mind". He did, however, allow other West End producers to revive "Absurd Person Singular" in 2007 and "The Norman Conquests" in 2008.
Ayckbourn suffered a stroke in February 2006 and returned to work in September; the premiere of his 70th play "If I Were You" at the Stephen Joseph Theatre came the following month.
He announced in June 2007 that he would retire as artistic director of the Stephen Joseph Theatre after the 2008 season. His successor, Chris Monks, took over at the start of the 2009–2010 season but Ayckbourn remained to direct premieres and revivals of his work at the theatre, beginning with "How the Other Half Loves" in June 2009.
In March 2010, he directed an in-the-round revival of his play "Taking Steps" at the Orange Tree Theatre, winning universal press acclaim.
In July 2014, Ayckbourn directed a musical adaptation of "The Boy Who Fell into A Book", with musical adaptation and lyrics by Paul James and music by Eric Angus and Cathy Shostak. The show ran in The Stephen Joseph Theatre and received critical acclaim.
Honours and awards.
Ayckbourn also sits on the Council of the Society of Authors.
Works.
One-act plays.
Alan Ayckbourn has written eight one-act plays. Five of them ("Mother Figure", "Drinking Companion", "Between Mouthfuls", "Gosforth's Fete" and "Widows Might") were written for "Confusions", first performed in 1974.
The other three one-act plays are:
Film adaptations of Ayckbourn plays.
Plays adapted as films include:
Alpha Centauri.
Alpha Centauri (α Centauri, Alpha Cen, or α Cen) is a triple star system in the southern constellation of Centaurus. It consists of three stars: Rigil Kentaurus (Alpha Centauri A), Toliman (B) and Proxima Centauri (C). Proxima Centauri is also the closest star to the Sun, at 4.2465 light-years (1.3020 pc).
Alpha Centauri A and B are Sun-like stars (Class G and K, respectively), and together they form the binary star system Alpha Centauri AB. To the naked eye, the two main components appear to be a single star with an apparent magnitude of −0.27. It is the brightest star in the constellation and the third-brightest in the night sky, outshone only by Sirius and Canopus.
Alpha Centauri A has 1.1 times the mass and 1.5 times the luminosity of the Sun, while Alpha Centauri B is smaller and cooler, at 0.9 times the Sun's mass and less than 0.5 times its luminosity. The pair orbit around a common centre with an orbital period of 79 years. Their elliptical orbit is eccentric, so that the distance between A and B varies from 35.6 astronomical units (AU), or about the distance between Pluto and the Sun, to 11.2 AU, or about the distance between Saturn and the Sun.
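These figures are mutually consistent: for a binary, Kepler's third law states that the semi-major axis cubed (in AU) equals the total mass (in solar masses) times the orbital period (in years) squared. A quick sketch in Python, using only the numbers quoted above:

```python
# Kepler's third law for a binary: a^3 = (M1 + M2) * P^2,
# with a in AU, total mass in solar masses, and P in years.
M_total = 1.1 + 0.9   # masses of Alpha Centauri A and B (from the text)
P = 79.0              # orbital period in years (from the text)

a = (M_total * P ** 2) ** (1 / 3)   # semi-major axis in AU
print(round(a, 1))                  # ~23.2 AU

# The quoted separation extremes imply a comparable semi-major axis:
print(round((35.6 + 11.2) / 2, 1))  # 23.4 AU, mean of the two extremes
```

The small difference between the two values reflects the rounded masses and period quoted in the text.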
Alpha Centauri C, or Proxima Centauri, is a small faint red dwarf (Class M). Though not visible to the naked eye, Proxima Centauri is the closest star to the Sun at a distance of , slightly closer than Alpha Centauri AB. Currently, the distance between Proxima Centauri and Alpha Centauri AB is about , equivalent to about 430 times the radius of Neptune's orbit.
Proxima Centauri has two confirmed planets: Proxima b, an Earth-sized planet in the habitable zone discovered in 2016, and Proxima d, a candidate sub-Earth which orbits very closely to the star, announced in 2022. The existence of Proxima c, a mini-Neptune 1.5 AU away discovered in 2019, is controversial. Alpha Centauri A may have a candidate Neptune-sized planet in the habitable zone, though it is not yet known to be planetary in nature and could be an artifact of the discovery mechanism. Alpha Centauri B has no known planets: planet Bb, purportedly discovered in 2012, was later disproven, and no other planet has yet been confirmed.
Etymology and nomenclature.
"α Centauri" (Latinised to "Alpha Centauri") is the system's designation given by Johann Bayer in 1603. It bears the traditional name "Rigil Kentaurus", which is a Latinisation of the Arabic name "Rijl al-Qinṭūrus," meaning 'the Foot of the Centaur'. The name is frequently abbreviated to "Rigil Kent" or even "Rigil", though the latter name is better known for Rigel (Beta Orionis).
An alternative name found in European sources, "Toliman", is an approximation of the Arabic "aẓ-Ẓalīmān" (in older transcription, "aṭ-Ṭhalīmān"), meaning 'the (two male) Ostriches', an appellation Zakariya al-Qazwini had applied to Lambda and Mu Sagittarii, also in the southern hemisphere.
A third name that has been used is "Bungula". Its origin is not known, but it may have been coined from the Greek letter beta (β) and Latin "ungula", 'hoof'.
Alpha Centauri C was discovered in 1915 by Robert T. A. Innes, who suggested that it be named "Proxima Centaurus". The name "Proxima Centauri" later became more widely used and is now listed by the International Astronomical Union (IAU) as the approved proper name.
In 2016, the Working Group on Star Names of the IAU, having decided to attribute proper names to individual component stars rather than to multiple systems, approved the name "Rigil Kentaurus" as being restricted to Alpha Centauri A and the name "Proxima Centauri" for Alpha Centauri C. On 10 August 2018, the IAU approved the name "Toliman" for Alpha Centauri B.
Observation.
To the naked eye, Alpha Centauri AB appears to be a single star, the brightest in the southern constellation of Centaurus. Their apparent angular separation varies over about 80 years between 2 and 22 arcsec (the naked eye has a resolution of 60 arcsec), but through much of the orbit, both are easily resolved in binoculars or small telescopes. At −0.27 apparent magnitude (combined for A and B magnitudes), Alpha Centauri is a first-magnitude star and is fainter only than Sirius and Canopus. It is the outer star of "The Pointers" or "The Southern Pointers", so called because the line through Beta Centauri (Hadar/Agena), some 4.5° west, points to the constellation Crux — the Southern Cross. The Pointers easily distinguish the true Southern Cross from the fainter asterism known as the False Cross.
South of about 29° S latitude, Alpha Centauri is circumpolar and never sets below the horizon. North of about 29° N latitude, Alpha Centauri never rises. It lies close to the southern horizon when viewed from latitudes between 29° N and the equator (close to Hermosillo and Chihuahua City in Mexico; Galveston, Texas; Ocala, Florida; and Lanzarote in the Canary Islands of Spain), but only for a short time around its culmination. The star culminates each year at local midnight on 24 April and at local 9 p.m. on 8 June.
As seen from Earth, Proxima Centauri lies 2.2° southwest of Alpha Centauri AB, about four times the angular diameter of the Moon. Proxima Centauri appears as a deep-red star of a typical apparent magnitude of 11.1 in a sparsely populated star field, requiring moderately sized telescopes to be seen. Listed as V645 Cen in the "General Catalogue of Variable Stars Version 4.2", this UV Ceti-type flare star can unexpectedly brighten rapidly by as much as 0.6 magnitude at visual wavelengths, then fade after only a few minutes. Some amateur and professional astronomers regularly monitor for outbursts using either optical or radio telescopes. In August 2015, the largest recorded flare of the star occurred, with the star becoming 8.3 times brighter than normal on 13 August in the B band (blue light region).
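The two brightness figures above are related by the standard magnitude-flux relation, Δm = 2.5·log10(flux ratio). A short sketch, using only the numbers quoted in the text:

```python
import math

# Magnitude difference corresponding to a brightness (flux) ratio:
# delta_m = 2.5 * log10(flux_ratio)
def delta_mag(flux_ratio):
    return 2.5 * math.log10(flux_ratio)

print(round(delta_mag(8.3), 1))     # the 2015 flare: ~2.3 magnitudes brighter

# Conversely, a typical 0.6-magnitude flare is about a 1.7x brightening:
print(round(10 ** (0.6 / 2.5), 2))  # ~1.74
```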
Alpha Centauri may be inside the G-cloud of the Local Bubble, and its nearest known system is the binary brown dwarf system Luhman 16 at .
Observational history.
Alpha Centauri is listed in the 2nd-century star catalog of Ptolemy. He gave its ecliptic coordinates, but texts differ as to whether the ecliptic latitude reads or . (Presently the ecliptic latitude is , but it has decreased by a fraction of a degree since Ptolemy's time due to proper motion.) In Ptolemy's time, Alpha Centauri was visible from Alexandria, Egypt, at but, due to precession, its declination is now , and it can no longer be seen at that latitude. English explorer Robert Hues brought Alpha Centauri to the attention of European observers in his 1592 work "Tractatus de Globis", along with Canopus and Achernar, noting:
The binary nature of Alpha Centauri AB was recognized in December 1689 by Jean Richaud, while observing a passing comet from his station in Puducherry. Alpha Centauri was only the second binary star to be discovered, preceded by Acrux.
The large proper motion of Alpha Centauri AB was discovered by Manuel John Johnson, observing from Saint Helena, who informed Thomas Henderson at the Royal Observatory, Cape of Good Hope of it. The parallax of Alpha Centauri was subsequently determined by Henderson from many exacting positional observations of the AB system between April 1832 and May 1833. He withheld his results, however, because he suspected they were too large to be true, but eventually published them in 1839 after Friedrich Wilhelm Bessel released his own accurately determined parallax for 61 Cygni in 1838. For this reason, Alpha Centauri is sometimes considered as the second star to have its distance measured because Henderson's work was not fully acknowledged at first. (The distance of Alpha Centauri from the Earth is now reckoned at .)
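Henderson's measurement rests on the basic parallax relation: the distance in parsecs is the reciprocal of the annual parallax in arcseconds. A minimal sketch, assuming a parallax value for Proxima chosen to match the 1.3020 pc quoted earlier in the article:

```python
# Distance from annual parallax: d [pc] = 1 / p [arcsec].
LY_PER_PC = 3.26156  # light-years per parsec

def distance_ly(parallax_arcsec):
    return (1.0 / parallax_arcsec) * LY_PER_PC

# Proxima Centauri's parallax is roughly 0.7685 arcsec (an assumed value,
# consistent with the 1.3020 pc quoted in the text):
print(round(distance_ly(0.7685), 2))   # ~4.24 light-years
```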
Later, John Herschel made the first micrometrical observations in 1834. Since the early 20th century, measures have been made with photographic plates.
By 1926, William Stephen Finsen calculated the approximate orbit elements close to those now accepted for this system. All future positions are now sufficiently accurate for visual observers to determine the relative places of the stars from a binary star ephemeris. Others, like D. Pourbaix (2002), have regularly refined the precision of new published orbital elements.
Robert T. A. Innes discovered Proxima Centauri in 1915 by blinking photographic plates taken at different times during a proper motion survey. These showed large proper motion and parallax similar in both size and direction to those of Alpha Centauri AB, which suggested that Proxima Centauri is part of the Alpha Centauri system and slightly closer to Earth than Alpha Centauri AB. As such, Innes concluded that Proxima Centauri was the closest star to Earth yet discovered.
Kinematics.
All components of Alpha Centauri display significant proper motion against the background sky. Over centuries, this causes their apparent positions to slowly change. Proper motion was unknown to ancient astronomers. Most assumed that the stars were permanently fixed on the celestial sphere, as stated in the works of the philosopher Aristotle. In 1718, Edmond Halley found that some stars had significantly moved from their ancient astrometric positions.
In the 1830s, Thomas Henderson discovered the true distance to Alpha Centauri by analysing his many astrometric mural circle observations. He then realised this system also likely had a high proper motion. In this case, the apparent stellar motion was found using Nicolas Louis de Lacaille's astrometric observations of 1751–1752, by the observed differences between the two measured positions in different epochs.
Calculated proper motion of the centre of mass for Alpha Centauri AB is about 3620 mas/y (milliarcseconds per year) toward the west and 694 mas/y toward the north, giving an overall motion of 3686 mas/y in a direction 11° north of west. The motion of the centre of mass is about 6.1 arcmin each century, or 1.02° each millennium. The speed in the western direction is and in the northerly direction . Using spectroscopy the mean radial velocity has been determined to be around towards the Solar System. This gives a speed with respect to the Sun of , very close to the peak in the distribution of speeds of nearby stars.
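The combined figures follow directly from the two components: the total motion is their quadrature sum, and the direction is the arctangent of the northward over the westward component. A sketch using the values quoted above:

```python
import math

pm_west = 3620.0    # proper motion component toward the west, mas/yr (from the text)
pm_north = 694.0    # proper motion component toward the north, mas/yr (from the text)

total = math.hypot(pm_west, pm_north)                 # combined motion, mas/yr
angle = math.degrees(math.atan2(pm_north, pm_west))   # degrees north of west

print(round(total))   # ~3686 mas/yr
print(round(angle))   # ~11 degrees north of west

# Accumulated over a century: mas/yr * 100 yr, converted to arcminutes
print(round(total * 100 / 1000 / 60, 1))   # ~6.1 arcmin per century
```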
Since Alpha Centauri AB is almost exactly in the plane of the Milky Way as viewed from Earth, many stars appear behind it. In early May 2028, Alpha Centauri A will pass between the Earth and a distant red star, when there is a 45% probability that an Einstein ring will be observed. Other conjunctions will also occur in the coming decades, allowing accurate measurement of proper motions and possibly giving information on planets.
Predicted future changes.
Based on the system's common proper motion and radial velocities, Alpha Centauri will continue to change its position in the sky significantly and will gradually brighten. For example, in about 6,200 AD, α Centauri's true motion will cause an extremely rare first-magnitude stellar conjunction with Beta Centauri, forming a brilliant optical double star in the southern sky. It will then pass just north of the Southern Cross or Crux, before moving northwest and up towards the present celestial equator and away from the galactic plane. By about 26,700 AD, in the present-day constellation of Hydra, Alpha Centauri will reach perihelion at away, though later calculations suggest that this will occur in 27,000 AD. At nearest approach, Alpha Centauri will attain a maximum apparent magnitude of −0.86, comparable to the present-day magnitude of Canopus, but it will still not surpass that of Sirius, which will brighten incrementally over the next 60,000 years, and will continue to be the brightest star as seen from Earth (other than the Sun) for the next 210,000 years.
Stellar system.
Alpha Centauri is a triple star system, with its two main stars, Alpha Centauri A and Alpha Centauri B, together comprising a binary component. The "AB" designation, or older "A×B", denotes the mass centre of a main binary system relative to companion star(s) in a multiple star system. "AB-C" refers to the component of Proxima Centauri in relation to the central binary, being the distance between the centre of mass and the outlying companion. Because the distance between Proxima (C) and either of Alpha Centauri A or B is similar, the AB binary system is sometimes treated as a single gravitational object.
Orbital properties.
The A and B components of Alpha Centauri have an orbital period of 79.762 years. Their orbit is moderately eccentric, with an eccentricity of almost 0.52; their closest approach, or periastron, is , about the distance between the Sun and Saturn, and their furthest separation, or apastron, is , about the distance between the Sun and Pluto. The most recent periastron was in August 1955 and the next will occur in May 2035; the most recent apastron was in May 1995 and the next will occur in 2075.
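The stated eccentricity alone fixes the ratio of the two extreme separations, since for a Keplerian orbit the periastron is a(1 − e) and the apastron is a(1 + e); the semi-major axis cancels out of the ratio. A quick check:

```python
# Eccentricity of the Alpha Centauri A-B orbit, from the text.
e = 0.52

# periastron = a * (1 - e) and apastron = a * (1 + e), so their ratio
# depends only on e: the widest separation is about 3.2 times the closest.
apastron_over_periastron = (1 + e) / (1 - e)   # ~3.17
```

This is consistent with the periastron being comparable to the Sun–Saturn distance while the apastron is comparable to the Sun–Pluto distance.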
Viewed from Earth, the "apparent orbit" of A and B means that their separation and position angle (PA) are in continuous change throughout their projected orbit. Observed stellar positions in 2019 are separated by 4.92 arcsec through the PA of 337.1°, increasing to 5.49 arcsec through 345.3° in 2020. The closest recent approach was in February 2016, at 4.0 arcsec through the PA of 300°. The observed maximum separation of these stars is about 22 arcsec, while the minimum distance is 1.7 arcsec. The widest separation occurred during February 1976, and the next will be in January 2056.
Alpha Centauri C is about from Alpha Centauri AB, equivalent to about 5% of the distance between Alpha Centauri AB and the Sun. Until 2017, measurements of its small speed and its trajectory were of too little accuracy and duration in years to determine whether it is bound to Alpha Centauri AB or unrelated.
Radial velocity measurements made in 2017 were precise enough to show that Proxima Centauri and Alpha Centauri AB are gravitationally bound. The orbital period of Proxima Centauri is approximately years, with an eccentricity of 0.5, much more eccentric than Mercury's. Proxima Centauri comes within of AB at periastron, and its apastron occurs at .
Physical properties.
Asteroseismic studies, chromospheric activity, and stellar rotation (gyrochronology) are all consistent with the Alpha Centauri system being similar in age to, or slightly older than, the Sun. Asteroseismic analyses that incorporate tight observational constraints on the stellar parameters for the Alpha Centauri stars have yielded age estimates of Gyr, Gyr, 5.2 ± 1.9 Gyr, 6.4 Gyr, and Gyr. Age estimates for the stars based on chromospheric activity (Calcium H & K emission) yield 4.4 ± 2.1 Gyr, whereas gyrochronology yields Gyr. Stellar evolution theory implies both stars are slightly older than the Sun at 5 to 6 billion years, as derived by their mass and spectral characteristics.
From the orbital elements, the total mass of Alpha Centauri AB is about – or twice that of the Sun. The average individual stellar masses are about and , respectively, though slightly different masses have also been quoted in recent years, such as and , totalling . Alpha Centauri A and B have absolute magnitudes of +4.38 and +5.71, respectively.
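The total mass follows from Kepler's third law, which in solar units reads M_total = a³/P², with a in AU and P in years. The period is given above; the semi-major axis used below (about 23.3 AU) is an assumption taken from published orbital solutions and is not stated in this passage:

```python
# Kepler's third law in solar units: M_total [M_sun] = a**3 / P**2.
P = 79.762   # orbital period in years (from the text)
a = 23.3     # semi-major axis in AU -- assumed value, not from the text

M_total = a**3 / P**2   # ~2.0 solar masses, matching "twice that of the Sun"
```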
Alpha Centauri AB system.
Alpha Centauri A.
Alpha Centauri A, also known as Rigil Kentaurus, is the principal member, or primary, of the binary system. It is a solar-like main-sequence star with a similar yellowish colour, whose stellar classification is spectral type G2-V; it is about 10% more massive than the Sun, with a radius about 22% larger. When considered among the individual brightest stars in the night sky, it is the fourth-brightest at an apparent magnitude of +0.01, being slightly fainter than Arcturus at an apparent magnitude of −0.05.
The type of magnetic activity on Alpha Centauri A is comparable to that of the Sun, showing coronal variability due to star spots, as modulated by the rotation of the star. However, since 2005 the activity level has fallen into a deep minimum that might be similar to the Sun's historical Maunder Minimum. Alternatively, it may have a very long stellar activity cycle and is slowly recovering from a minimum phase.
Alpha Centauri B.
Alpha Centauri B, also known as Toliman, is the secondary star of the binary system. It is a main-sequence star of spectral type K1-V, giving it a more orange colour than Alpha Centauri A; it has around 90% of the mass of the Sun and a 14% smaller diameter. Although it has a lower luminosity than A, Alpha Centauri B emits more energy in the X-ray band. Its light curve varies on a short time scale, and at least one flare has been observed. It is more magnetically active than Alpha Centauri A, showing a cycle of compared to 11 years for the Sun, and has about half the minimum-to-peak variation in coronal luminosity of the Sun. Alpha Centauri B has an apparent magnitude of +1.35, slightly dimmer than Mimosa.
Alpha Centauri C (Proxima Centauri).
Alpha Centauri C, better known as Proxima Centauri, is a small main-sequence red dwarf of spectral class M6-Ve. It has an absolute magnitude of +15.60, over 20,000 times fainter than the Sun. Its mass is calculated to be . It is the closest star to the Sun but is too faint to be visible to the naked eye.
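The "over 20,000 times fainter" figure follows from the magnitude scale, where a difference of Δm magnitudes corresponds to a brightness ratio of 10^(Δm/2.5). The Sun's absolute magnitude (about +4.83) is a standard value not given in this passage:

```python
# Luminosity ratio from an absolute-magnitude difference.
M_proxima = 15.60  # absolute magnitude of Proxima Centauri (from the text)
M_sun = 4.83       # absolute magnitude of the Sun (standard value, assumed)

# A difference of ~10.77 magnitudes is a factor of ~20,000 in brightness.
ratio = 10 ** ((M_proxima - M_sun) / 2.5)
```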
Planetary system.
The Alpha Centauri system as a whole has two confirmed planets, both of them around Proxima Centauri. While other planets have been claimed to exist around all of the stars, none of the discoveries have been confirmed.
Planets of Proxima Centauri.
Proxima Centauri b is a terrestrial planet discovered in 2016 by astronomers at the European Southern Observatory (ESO). It has an estimated minimum mass of 1.17 Earth masses and orbits approximately 0.049 AU from Proxima Centauri, placing it in the star's habitable zone.
Proxima Centauri c is a planet that was formally published in 2020 and could be a super-Earth or mini-Neptune. It has a mass of roughly 7 and orbits about 1.49 AU from Proxima Centauri with a period of . In June 2020, a possible direct imaging detection of the planet hinted at the potential presence of a large ring system. However, a 2022 study disputed the existence of this planet.
A 2020 paper refining Proxima b's mass excludes the presence of extra companions with masses above at periods shorter than 50 days, but the authors detected a radial-velocity curve with a periodicity of 5.15 days, suggesting the presence of a planet with a mass of about . This planet, Proxima Centauri d, was confirmed in 2022.
Planets of Alpha Centauri A.
In 2021, a candidate planet named Candidate 1 (abbreviated as C1) was detected around Alpha Centauri A, thought to orbit at approximately 1.1 AU with a period of about one year, and to have a mass between that of Neptune and one-half that of Saturn, though it may be a dust disk or an artifact. The possibility of C1 being a background star has been ruled out. If this candidate is confirmed, the temporary name C1 will most likely be replaced with the scientific designation Alpha Centauri Ab in accordance with current naming conventions.
GO Cycle 1 observations with the James Webb Space Telescope (JWST) are planned to search for planets around Alpha Centauri A, scheduled for July–August 2023. Pre-launch estimates predicted that JWST would be able to find planets with a radius of 5 at 1–3 au. Multiple observations every 3–6 months could push the limit down to 3 , and post-processing techniques could push it down to 0.5 to 0.7 . Post-launch estimates based on observations of HIP 65426 b indicate that JWST will be able to detect planets even closer to Alpha Centauri A, and could find a 5 planet at 0.5 to 2.5 au. Candidate 1 has an estimated radius between 3.3 and 11 and orbits at 1.1 au, and is therefore likely within the reach of JWST observations.
Planets of Alpha Centauri B.
In 2012, a planet around Alpha Centauri B was reported, Alpha Centauri Bb, but a new analysis in 2015 concluded that the reported signal was an artifact of the data analysis.
A possible transit-like event was observed in 2013, which could be associated with a separate planet. The transit event could correspond to a planetary body with a radius around . This planet would most likely orbit Alpha Centauri B with an orbital period of 20.4 days or less, with only a 5% chance of it having a longer orbit; the median of the likely orbits is 12.4 days. Its orbit would likely have an eccentricity of 0.24 or less. It could have lakes of molten lava and would be far too close to Alpha Centauri B to harbour life. If confirmed, this planet might be called Alpha Centauri Bc. However, the name has not been used in the literature, as it is not a claimed discovery. It appears that no further transit-like events have been observed.
Hypothetical planets.
Additional planets may exist in the Alpha Centauri system, either orbiting Alpha Centauri A or Alpha Centauri B individually, or in large orbits around Alpha Centauri AB. Because both stars are fairly similar to the Sun (for example, in age and metallicity), astronomers have been especially interested in making detailed searches for planets in the Alpha Centauri system. Several established planet-hunting teams have used various radial velocity or star transit methods in their searches around these two bright stars. All the observational studies have so far failed to find evidence for brown dwarfs or gas giants.
In 2009, computer simulations showed that a planet might have been able to form near the inner edge of Alpha Centauri B's habitable zone, which extends from 0.5 to 0.9 AU from the star. Certain special assumptions, such as considering that the Alpha Centauri pair may have initially formed with a wider separation and later moved closer to each other (as might be possible if they formed in a dense star cluster), would permit an accretion-friendly environment farther from the star. Bodies around Alpha Centauri A would be able to orbit at slightly farther distances due to its stronger gravity. In addition, the lack of any brown dwarfs or gas giants in close orbits around Alpha Centauri makes the likelihood of terrestrial planets greater than otherwise. A theoretical study indicates that a radial velocity analysis might detect a hypothetical planet of in Alpha Centauri B's habitable zone.
Radial velocity measurements of Alpha Centauri B made with the High Accuracy Radial Velocity Planet Searcher spectrograph were sufficiently sensitive to detect a planet within the habitable zone of the star (i.e. with an orbital period P = 200 days), but no planets were detected.
Current estimates place the probability of finding an Earth-like planet around Alpha Centauri at roughly 75%. The observational thresholds for planet detection in the habitable zones by the radial velocity method are currently (2017) estimated to be about for Alpha Centauri A, for Alpha Centauri B, and for Proxima Centauri.
Early computer-generated models of planetary formation predicted the existence of terrestrial planets around both Alpha Centauri A and B, but most recent numerical investigations have shown that the gravitational pull of the companion star renders the accretion of planets difficult. Despite these difficulties, given the similarities to the Sun in spectral types, star type, age and probable stability of the orbits, it has been suggested that this stellar system could hold one of the best possibilities for harbouring extraterrestrial life on a potential planet.
In the Solar System, it was once thought that Jupiter and Saturn were probably crucial in perturbing comets into the inner Solar System, providing the inner planets with a source of water and various other ices. However, since isotope measurements of the deuterium to hydrogen (D/H) ratio in comets Halley, Hyakutake, Hale–Bopp, C/2002 T7, and Tuttle yield values approximately twice that of Earth's oceanic water, more recent models and research predict that less than 10% of Earth's water was supplied from comets. In the Alpha Centauri system, Proxima Centauri may have influenced the planetary disk as the Alpha Centauri system was forming, enriching the area around Alpha Centauri with volatile materials. This would be discounted if, for example, Alpha Centauri B happened to have gas giants orbiting Alpha Centauri A (or vice versa), or if Alpha Centauri A and B themselves were able to perturb comets into each other's inner systems as Jupiter and Saturn presumably have done in the Solar System. Such icy bodies probably also reside in the Oort clouds of other planetary systems. When they are influenced gravitationally by either the gas giants or disruptions by passing nearby stars, many of these icy bodies then travel starwards. Such ideas also apply to the close approach of Alpha Centauri or other stars to the Solar System, when, in the distant future, the Oort Cloud might be disrupted enough to increase the number of active comets.
To be in the habitable zone, a planet around Alpha Centauri A would have an orbital radius of between about 1.2 and so as to have similar planetary temperatures and conditions for liquid water to exist. For the slightly less luminous and cooler Alpha Centauri B, the habitable zone is between about 0.7 and .
With the goal of finding evidence of such planets, both Proxima Centauri and Alpha Centauri AB were among the listed "Tier-1" target stars for NASA's Space Interferometry Mission (SIM). Detecting planets as small as three Earth masses or smaller within two AU of a "Tier-1" target would have been possible with this new instrument. The SIM mission, however, was cancelled due to financial issues in 2010.
Circumstellar discs.
Based on observations between 2007 and 2012, a study found a slight excess of emissions in the 24-µm (mid/far-infrared) band surrounding , which may be interpreted as evidence for a sparse circumstellar disc or dense interplanetary dust. The total mass was estimated to be between to the mass of the Moon, or 10–100 times the mass of the Solar System's zodiacal cloud. If such a disc existed around both stars, disc would likely be stable to 2.8 AU, and would likely be stable to 2.5 AU. This would put A's disc entirely within the frost line, and a small part of B's outer disc just outside.
View from this system.
The sky from Alpha Centauri AB would appear much as it does from the Earth, except that Centaurus would be missing its brightest star. The Sun would appear as a white star of apparent magnitude +0.5, roughly the same as the average brightness of Betelgeuse from Earth. It would be at the antipodal point of Alpha Centauri AB's current right ascension and declination, at (2000), in eastern Cassiopeia, easily outshining all the rest of the stars in the constellation. With the placement of the Sun east of the magnitude 3.4 star Epsilon Cassiopeiae, nearly in front of the Heart Nebula, the "W" line of stars of Cassiopeia would have a "/W" shape.
The Winter Triangle would not look equilateral, but very thin and long, with Procyon outshining Pollux in the middle of Gemini, and Sirius lying less than a degree from Betelgeuse in Orion. With a magnitude of −1.2, Sirius would be a little fainter than from Earth but still the brightest star in the night sky. Both Vega and Altair would be shifted northwestward relative to Deneb, giving the Summer Triangle a more equilateral appearance.
A planet around either α Centauri A or B would see the other star as a very bright secondary. For example, an Earth-like planet at 1.25 AU from α Cen A (with a revolution period of 1.34 years) would get Sun-like illumination from its primary, and α Cen B would appear 5.7 to 8.6 magnitudes dimmer (−21.0 to −18.2), 190 to 2,700 times dimmer than α Cen A but still 150 to 2,100 times brighter than the full Moon. Conversely, an Earth-like planet at 0.71 AU from α Cen B (with a revolution period of 0.63 years) would get nearly Sun-like illumination from its primary, and α Cen A would appear 4.6 to 7.3 magnitudes dimmer (−22.1 to −19.4), 70 to 840 times dimmer than α Cen B but still 470 to 5,700 times brighter than the full Moon.
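The brightness ratios quoted in this paragraph are direct conversions of the magnitude differences, using the same 10^(Δm/2.5) relation as elsewhere; a small sketch:

```python
def brightness_ratio(delta_mag):
    """Brightness ratio corresponding to a magnitude difference delta_mag."""
    return 10 ** (delta_mag / 2.5)

# Alpha Cen B as seen from a planet orbiting Alpha Cen A (values from the text):
# 5.7 to 8.6 magnitudes dimmer than the primary.
dim_end = brightness_ratio(5.7)     # ~190 times dimmer
bright_end = brightness_ratio(8.6)  # ~2,750, i.e. the "2,700 times" figure
```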
Proxima Centauri would appear dim as one of many stars.
Other names.
In modern literature, colloquial alternative names of Alpha Centauri include "Rigil Kent" (also "Rigel Kent" and variants; ) and "Toliman" (the latter of which became the proper name of Alpha Centauri B on 10 August 2018 by approval of the International Astronomical Union).
"Rigil Kent" is short for "Rigil Kentaurus", which is sometimes further abbreviated to "Rigil" or "Rigel", though that is ambiguous with Beta Orionis, which is also called Rigel.
The name "Toliman" originates with Jacobus Golius' 1669 edition of Al-Farghani's "Compendium". "Tolimân" is Golius' latinisation of the Arabic name "the ostriches", the name of an asterism of which Alpha Centauri formed the main star.
During the 19th century, the northern amateur popularist Elijah H. Burritt used the now-obscure name "Bungula", possibly coined from "β" and the Latin "ungula" ("hoof").
Together, Alpha and Beta Centauri form the "Southern Pointers" or "The Pointers", as they point towards the Southern Cross, the asterism of the constellation of Crux.
In Chinese astronomy, "Nán Mén", meaning "Southern Gate", refers to an asterism consisting of Alpha Centauri and Epsilon Centauri. Consequently, the Chinese name for Alpha Centauri itself is "Nán Mén Èr", the Second Star of the Southern Gate.
To the Australian aboriginal Boorong people of northwestern Victoria, Alpha Centauri and Beta Centauri are "Bermbermgle", two brothers noted for their courage and destructiveness, who speared and killed "Tchingal" "The Emu" (the Coalsack Nebula). The form in Wotjobaluk is "Bram-bram-bult".
Future exploration.
Alpha Centauri is a first target for crewed or robotic interstellar exploration. Using current spacecraft technologies, crossing the distance between the Sun and Alpha Centauri would take several millennia, though the possibility of nuclear pulse propulsion or laser light sail technology, as considered in the Breakthrough Starshot program, could make the journey to Alpha Centauri in 20 years. An objective of such a mission would be to make a fly-by of, and possibly photograph, planets that might exist in the system. The existence of Proxima Centauri b, announced by the European Southern Observatory (ESO) in August 2016, would be a target for the Starshot program.
NASA announced in 2017 that it plans to send a spacecraft to Alpha Centauri in 2069, timed to coincide with the 100th anniversary of Apollo 11, the first crewed lunar landing, in 1969. Even at 10% of the speed of light (67 million mph), which NASA experts say may be possible, the spacecraft would take about 44 years to reach the system, arriving around 2113, and its signals would take another four years to reach Earth, around 2117.
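The 44-year and 4-year figures are simple consequences of the distance; the sketch below assumes the standard value of about 4.37 light-years for the distance to Alpha Centauri, which is not stated in this passage:

```python
# Back-of-envelope travel and signal times for the proposed mission.
distance_ly = 4.37        # distance to Alpha Centauri in light-years (assumed)
cruise_speed_c = 0.10     # cruise speed as a fraction of the speed of light

travel_years = distance_ly / cruise_speed_c  # ~44 years in transit
signal_years = distance_ly                   # ~4 years for a return signal
```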
Amiga.
Amiga is a family of personal computers introduced by Commodore in 1985. The original model is one of a number of mid-1980s computers with 16- or 16/32-bit processors, 256 KB or more of RAM, mouse-based GUIs, and significantly improved graphics and audio compared to previous 8-bit systems. These systems include the Atari ST—released earlier the same year—as well as the Macintosh and Acorn Archimedes. Based on the Motorola 68000 microprocessor, the Amiga differs from its contemporaries through the inclusion of custom hardware to accelerate graphics and sound, including sprites and a blitter, and a pre-emptive multitasking operating system called AmigaOS.
The Amiga 1000 was released in July 1985, but production problems kept it from becoming widely available until early 1986. The best-selling model, the Amiga 500, was introduced in 1987 along with the more expandable Amiga 2000. The Amiga 3000 was introduced in 1990, followed by the Amiga 500 Plus and the Amiga 600 in March 1992. Finally, the Amiga 1200 and Amiga 4000 were released in late 1992. The Amiga line sold an estimated 4.85 million units.
Although early advertisements cast the computer as an all-purpose business machine, especially when outfitted with the Sidecar IBM PC compatibility add-on, the Amiga was most commercially successful as a home computer, with a wide range of games and creative software. The Video Toaster hardware and software suite helped Amiga find a prominent role in desktop video and video production. The Amiga's audio hardware made it a popular platform for music tracker software. The processor and memory capacity enabled 3D rendering packages, including LightWave 3D, Imagine, and Traces, a predecessor to Blender.
Poor marketing and the failure of later models to repeat the technological advances of the first systems meant that Commodore quickly lost market share to IBM PC compatibles, whose prices were dropping rapidly and which gained 256-color graphics in 1987, and to the fourth generation of video game consoles.
Commodore ultimately went bankrupt in April 1994 after a version of the Amiga packaged as a game console, the Amiga CD32, failed in the marketplace. Since the demise of Commodore, various groups have marketed successors to the original Amiga line, including Genesi, Eyetech, ACube Systems Srl and A-EON Technology. AmigaOS has influenced replacements, clones, and compatible systems such as MorphOS and AROS. Currently Belgian company Hyperion Entertainment maintains and develops AmigaOS 4, which is an official and direct descendant of AmigaOS 3.1 – the last system made by Commodore for the original Amiga Computers.
History.
Concept and early development.
Jay Miner joined Atari, Inc. in the 1970s to develop custom integrated circuits, and led development of the Atari Video Computer System's TIA. When complete, the team began developing a much more sophisticated set of chips, CTIA, ANTIC and POKEY, that formed the basis of the Atari 8-bit family.
With the 8-bit line's launch in 1979, the team once again started looking at a next generation chipset. Nolan Bushnell had sold the company to Warner Communications in 1978, and the new management was much more interested in the existing lines than development of new products that might cut into their sales. Miner wanted to start work with the new Motorola 68000, but management was only interested in another 6502 based system. Miner left the company, and, for a time, the industry.
In 1979, Larry Kaplan left Atari and founded Activision. In 1982, Kaplan was approached by a number of investors who wanted to develop a new game platform. Kaplan hired Miner to run the hardware side of the newly formed company, "Hi-Toro". The system was code-named "Lorraine" in keeping with Miner's policy of giving systems female names, in this case the company president's wife, Lorraine Morse. When Kaplan left the company late in 1982, Miner was promoted to head engineer and the company relaunched as Amiga Corporation.
The Amiga hardware was designed by Miner, RJ Mical, and Dale Luck. A breadboard prototype for testing and development was largely completed by late 1983, and shown at the January 1984 Consumer Electronics Show (CES). At the time, the operating system was not ready, so the machine was demonstrated with the "Boing Ball" demo, a real-time animation showing a red-and-white spinning ball bouncing and casting a shadow; this bouncing ball became the official logo of the Amiga company. CES attendees had trouble believing the computer being demonstrated had the power to display such a demo and searched in vain for the "real" computer behind it.
A further developed version of the system was demonstrated at the June 1984 CES and shown to many companies in hopes of garnering further funding, but found little interest in a market that was in the final stages of the video game crash of 1983.
In March, Atari expressed a tepid interest in Lorraine for its potential use in a games console or home computer tentatively known as the . The talks were progressing slowly, and Amiga was running out of money. A temporary arrangement in June led to a $500,000 loan from Atari to Amiga to keep the company going. The terms required the loan to be repaid at the end of the month, otherwise Amiga would forfeit the Lorraine design to Atari.
Commodore.
During 1983, Atari lost over a week, due to the combined effects of the crash and the ongoing price war in the home computer market. By the end of the year, Warner was desperate to sell the company. In January 1984, Jack Tramiel resigned from Commodore due to internal battles over the future direction of the company. A number of Commodore employees followed him to his new company, Tramel Technology. This included a number of the senior technical staff, where they began development of a 68000-based machine of their own. In June, Tramiel arranged a no-cash deal to take over Atari, reforming Tramel Technology as Atari Corporation.
As many Commodore technical staff had moved to Atari, Commodore was left with no workable path to design their own next-generation computer. The company approached Amiga offering to fund development as a home computer system. They quickly arranged to repay the Atari loan, ending that threat. The two companies were initially arranging a license agreement before Commodore offered to purchase Amiga outright.
By late 1984, the prototype breadboard chipset had successfully been turned into integrated circuits, and the system hardware was being readied for production. At this time the operating system (OS) was not as ready, and led to a deal to port an OS known as TRIPOS to the platform. TRIPOS was a multitasking system that had been written in BCPL during the 1970s for the PDP-11 minicomputer, but later experimentally ported to the 68000. This early version was known as AmigaDOS and the GUI as Workbench. The BCPL parts were later rewritten in the C language, and the entire system became AmigaOS.
The system was enclosed in a pizza box form factor case; a late change was the introduction of vertical supports on either side of the case to provide a "garage" under the main section of the system where the keyboard could be stored.
Launch.
The first model was announced in 1985 as simply "The Amiga from Commodore", later to be retroactively dubbed the Amiga 1000. It was first offered for sale in August, but by October only 50 had been built, all of which were used by Commodore. Machines only began to arrive in quantity in mid-November, meaning they missed the Christmas buying rush. By the end of the year, Commodore had sold 35,000 machines, and severe cashflow problems made the company pull out of the January 1986 CES. Poor or entirely absent marketing, the forced relocation of the development team to the east coast, notorious stability problems, and other blunders limited sales in early 1986 to between 10,000 and 15,000 units a month.
Later models.
In late 1985, Thomas Rattigan was promoted to COO of Commodore, and then to CEO in February 1986. He immediately implemented an ambitious plan that covered almost all of the company's operations. Among these was the long-overdue cancellation of the now outdated PET and VIC-20 lines, as well as a variety of poorly selling Commodore 64 offshoots and the Commodore 900 workstation effort.
Another one of the changes was to split the Amiga into two products, a new high-end version of the Amiga aimed at the creative market, and a cost-reduced version that would take over for the Commodore 64 in the low-end market. These new designs were released in 1987 as the Amiga 2000 and Amiga 500, the latter of which went on to widespread success and became their best selling model.
Similar high-end/low-end models would make up the Amiga line for the rest of its history; follow-on designs included the Amiga 3000/Amiga 500 Plus/Amiga 600, and the Amiga 4000/Amiga 1200. These models incorporated a series of technical upgrades known as the ECS and AGA, which added higher resolution displays among many other improvements and simplifications.
The Amiga line sold an estimated 4,850,000 machines over its lifetime. The machines were most popular in the UK and Germany, with about 1.5 million sold in each country, and sales in the high hundreds of thousands in other European nations. The machine was less popular in North America, where an estimated 700,000 were sold. In the United States, the Amiga found a niche with enthusiasts and in vertical markets for video processing and editing. In Europe, it was more broadly popular as a home computer and often used for video games. Beginning in 1988 it overlapped with the 16-bit Mega Drive, then the Super Nintendo Entertainment System in the early 1990s. Commodore UK's Kelly Sumner did not see Sega or Nintendo as competitors, but instead credited their marketing campaigns which spent over or for promoting video games as a whole and thus helping to boost Amiga sales.
Bankruptcy.
In spite of his successes in making the company profitable and bringing the Amiga line to market, Rattigan was soon forced out in a power struggle with majority shareholder, Irving Gould. This is widely regarded as the turning point, as further improvements to the Amiga were eroded by rapid improvements in other platforms.
Commodore shut down the Amiga division on April 26, 1994, and filed for bankruptcy three days later. Commodore's assets were purchased by Escom, a German PC manufacturer, who created the subsidiary company Amiga Technologies. They re-released the A1200 and A4000T, and introduced a new 68060 version of the A4000T. Amiga Technologies researched and developed the Amiga Walker prototype, presenting the machine publicly at CeBIT. In 1996, it was reported that Escom had sold the Amiga intellectual property to VIScorp for $40m, but this deal fell through, and Escom, in turn, went bankrupt in 1997.
A U.S. Wintel PC manufacturer, Gateway 2000, then purchased the Amiga branch and technology. In 2000, Gateway sold the Amiga brand to Amiga, Inc., without having released any products. Amiga, Inc. licensed the rights to sell hardware using the AmigaOne brand to Eyetech Group and Hyperion Entertainment. In 2019, Amiga, Inc. sold its intellectual property to Amiga Corporation.
Hardware.
The Amiga has a custom chipset consisting of several coprocessors which handle audio, video, and direct memory access independently of the Central Processing Unit (CPU). This architecture gave the Amiga a performance edge over its competitors, particularly for graphics-intensive applications and games.
The architecture uses two distinct bus subsystems: the chipset bus and the CPU bus. The chipset bus allows the coprocessors and CPU to address "Chip RAM". The CPU bus provides addressing to conventional RAM, ROM and the Zorro II or Zorro III expansion subsystems. This enables independent operation of the subsystems. The CPU bus can be much faster than the chipset bus. CPU expansion boards may provide additional custom buses. Additionally, "busboards" or "bridgeboards" may provide ISA or PCI buses.
Central processing unit.
The most popular models from Commodore, including the Amiga 1000, Amiga 500, and Amiga 2000, use the Motorola 68000 as the CPU. From a developer's point of view, the 68000 provides a full suite of 32-bit operations, but the chip can address only 16 MB of physical memory and is implemented with a 16-bit arithmetic logic unit and a 16-bit external data bus, so 32-bit computations are transparently handled as multiple 16-bit values at a performance cost. The later Amiga 2500 and Amiga 3000 models use fully 32-bit, 68000-compatible processors from Motorola, with improved performance and larger addressing capability.
CPU upgrades were offered by both Commodore and third-party manufacturers. Most Amiga models can be upgraded either by direct CPU replacement or through expansion boards. Such boards often included faster and higher capacity memory interfaces and hard disk controllers.
Towards the end of Commodore's time in charge of Amiga development, there were suggestions that Commodore intended to move away from the 68000 series to higher performance RISC processors, such as the PA-RISC. Those ideas were never developed before Commodore filed for bankruptcy. Despite this, third-party manufacturers designed upgrades featuring a combination of 68000 series and PowerPC processors along with a PowerPC native microkernel and software. Later Amiga clones featured PowerPC processors only.
Custom chipset.
The custom chipset at the core of the Amiga design appeared in three distinct generations, with a large degree of backward-compatibility. The Original Chip Set (OCS) appeared with the launch of the A1000 in 1985. OCS was eventually followed by the modestly improved Enhanced Chip Set (ECS) in 1990 and finally by the partly 32-bit Advanced Graphics Architecture (AGA) in 1992. Each chipset consists of several coprocessors that handle graphics acceleration, digital audio, direct memory access and communication between various peripherals (e.g., CPU, memory and floppy disks). In addition, some models featured auxiliary custom chips that performed tasks such as SCSI control and display de-interlacing.
Graphics.
All Amiga systems can display full-screen animated planar graphics with 2, 4, 8, 16, 32, 64 (EHB Mode), or 4096 colors (HAM Mode). Models with the AGA chipset (A1200 and A4000) also have non-EHB 64, 128, 256, and 262144 (HAM8 Mode) color modes and a palette expanded from 4096 to 16.8 million colors.
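A minimal sketch of how these color counts arise, using only the figures given above (the function and its mode names are illustrative, not an Amiga API):

```python
def colors(bitplanes, mode="normal"):
    """Simultaneously displayable colors per planar mode (sketch).
    The figures are those quoted in the text; the function itself
    is a hypothetical illustration."""
    if mode == "normal":
        return 2 ** bitplanes   # 1-5 planes on OCS/ECS, up to 8 on AGA
    if mode == "ehb":           # Extra Half-Brite: 32 colors + 32 half-bright copies
        return 64
    if mode == "ham6":          # Hold-And-Modify on OCS/ECS
        return 4096
    if mode == "ham8":          # Hold-And-Modify on AGA
        return 262144
    raise ValueError(mode)

# OCS/ECS planar modes from the text: 2, 4, 8, 16, 32 colors
assert [colors(n) for n in range(1, 6)] == [2, 4, 8, 16, 32]
```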
The Amiga chipset can "genlock", that is, adjust its own screen refresh timing to match an incoming NTSC or PAL video signal. Combined with transparency, this allows an Amiga to overlay graphics on an external video source. This capability made the Amiga popular for many applications, since it allowed character generation and CGI effects to be done far more cheaply than on earlier systems. It was frequently used by wedding videographers, TV stations and their weather-forecasting divisions (for weather graphics and radar), advertising channels, music video production, and desktop videographers. The NewTek Video Toaster was made possible by the Amiga's genlock ability.
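The overlay principle can be sketched with a simple color-key model: wherever the Amiga frame holds the transparent palette index (color 0 by convention), the external video pixel passes through; everywhere else the Amiga graphics win. The frame representation and names below are hypothetical:

```python
def genlock_overlay(amiga_frame, video_frame, transparent_index=0):
    """Sketch of the genlock overlay principle: pixels holding the
    transparent palette index show the external video source instead.
    Frames are modeled as flat lists of pixel values (illustrative)."""
    return [v if a == transparent_index else a
            for a, v in zip(amiga_frame, video_frame)]

# color 0 lets the video source show through; other indices overlay it
assert genlock_overlay([0, 3, 0, 7], ["v0", "v1", "v2", "v3"]) == ["v0", 3, "v2", 7]
```

In real hardware this keying happens in the analog domain, synchronized by the genlock timing; the sketch only shows the per-pixel selection logic.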
In 1988, the release of the Amiga A2024 fixed-frequency monochrome monitor with built-in framebuffer and flicker fixer hardware provided the Amiga with a choice of high-resolution graphic modes (1024×800 for NTSC and 1024×1024 for PAL).
ReTargetable Graphics.
ReTargetable Graphics (RTG) is an API for device drivers, mainly used by third-party graphics hardware to interface with AmigaOS via a set of libraries. The libraries may include tools to adjust resolution, screen colors, pointers, and screen modes. The standard Intuition interface is limited to display depths of 8 bits, while RTG makes it possible to handle greater depths, such as 24 bits.
Sound.
The sound chip, named Paula, supports four PCM sound channels (two for the left speaker and two for the right) with 8-bit resolution and a 6-bit volume control per channel. The analog output is connected to a low-pass filter, which removes high-frequency aliasing when the Amiga is using a lower sampling rate (see Nyquist frequency). The brightness of the Amiga's power LED indicates the status of the low-pass filter: the filter is active when the LED is at normal brightness and deactivated when the LED is dimmed (or off on older A500 Amigas). On the Amiga 1000 (and the first Amiga 500 and Amiga 2000 models), the power LED had no relation to the filter's status, and a wire had to be manually soldered between pins on the sound chip to disable the filter. Paula can read arbitrary waveforms at arbitrary rates and amplitudes directly from the system's RAM using direct memory access (DMA), making sound playback possible without CPU intervention.
Although the hardware is limited to four separate sound channels, software such as "OctaMED" uses software mixing to provide eight or more virtual channels. It was also possible to combine two hardware channels into a single 14-bit-resolution channel by setting their volumes so that one channel contributes the most significant bits and the other the least.
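The two-channel 14-bit trick can be sketched numerically: one channel carries the top 8 bits at volume 64, the other the bottom 6 bits at volume 1, and the analog sum reconstructs the 14-bit value. Signed-sample handling and other hardware details are simplified here, and the function names are illustrative:

```python
def split_14bit(sample):
    """Split a 14-bit sample (0..16383) into the two 8-bit channel
    values used by the trick described above (simplified sketch)."""
    high = (sample >> 6) & 0xFF   # 8-bit sample, played at volume 64
    low = sample & 0x3F           # 6-bit remainder, played at volume 1
    return high, low

def mix(high, low, vol_high=64, vol_low=1):
    # the analog mix is roughly sample * volume, summed across channels
    return high * vol_high + low * vol_low

s = 12345
h, l = split_14bit(s)
assert mix(h, l) == s   # the summed output reconstructs the original sample
```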
The quality of the Amiga's sound output, and the fact that the hardware is ubiquitous and easily addressed by software, were standout features of Amiga hardware unavailable on PC platforms for years. Third-party sound cards exist that provide DSP functions, multi-track direct-to-disk recording, multiple hardware sound channels and 16-bit and beyond resolutions. A retargetable sound API called AHI was developed allowing these cards to be used transparently by the OS and software.
Kickstart firmware.
Kickstart is the firmware upon which AmigaOS is bootstrapped. Its purpose is to initialize the Amiga hardware and core components of AmigaOS and then attempt to boot from a bootable volume, such as a floppy disk or hard disk drive. Most models (excluding the Amiga 1000) come equipped with Kickstart on an embedded ROM chip.
Keyboard and mouse.
The keyboard on Amiga computers is similar to that found on a mid-80s IBM PC: ten function keys, a numeric keypad, and four separate directional arrow keys. Caps Lock and Control share the space to the left of the A key. Absent are Home, End, Page Up, and Page Down keys; these functions are performed on Amigas by pressing Shift and the appropriate arrow key. The Amiga keyboard adds a Help key, a role usually filled by a function key (typically F1) on PCs. In addition to the Control and Alt modifier keys, the Amiga has two "Amiga" keys, rendered as "Open Amiga" and "Closed Amiga", similar to the Open/Closed Apple logo keys on Apple II keyboards. The left one is used to manipulate the operating system (moving screens and the like), while the right delivers commands to the application. The absence of a Num Lock key frees space for more mathematical symbols around the numeric pad.
Like IBM-compatible computers, the Amiga mouse has two buttons, but in AmigaOS, pressing and holding the right button replaces the status line at the top of the screen with a Mac-like menu bar. As with Apple's Mac OS prior to Mac OS 8, menu options are selected by releasing the button over that option, not by left-clicking. Menu items with a boolean toggle state can be left-clicked while the menu is kept open with the right button, which allows the user, for example, to set selected text to bold, underline, and italics in one visit to the menu.
The mouse plugs into one of two Atari-style joystick ports that are also used for joysticks, game paddles, and graphics tablets. Although compatible with analog joysticks, Atari-style digital joysticks became standard. Unusually, two independent mice can be connected to the joystick ports; some games, such as "Lemmings", were designed to take advantage of this.
Other peripherals and expansions.
The Amiga was one of the first computers for which inexpensive sound sampling and video digitization accessories were available. As a result of this and the Amiga's audio and video capabilities, the Amiga became a popular system for editing and producing both music and video.
Many expansion boards were produced for Amiga computers to improve the performance and capability of the hardware, such as memory expansions, SCSI controllers, CPU boards, and graphics boards. Other upgrades include genlocks, Ethernet network cards, modems, sound cards and samplers, video digitizers, extra serial ports, and IDE controllers. USB cards appeared after Commodore's demise. The most popular upgrades were memory, SCSI controllers and CPU accelerator cards, which were sometimes combined into one device.
Early CPU accelerator cards used the full 32-bit CPUs of the 68000 family such as the Motorola 68020 and Motorola 68030, almost always with 32-bit memory and usually with FPUs and MMUs or the facility to add them. Later designs feature the Motorola 68040 or Motorola 68060. Both CPUs feature integrated FPUs and MMUs. Many CPU accelerator cards also had integrated SCSI controllers.
Phase5 designed the PowerUP boards (Blizzard PPC and CyberStorm PPC) featuring both a 68k (a 68040 or 68060) and a PowerPC (603 or 604) CPU, which are able to run the two CPUs at the same time and share the system memory. The PowerPC CPU on PowerUP boards is usually used as a coprocessor for heavy computations; a powerful CPU is needed to run MAME for example, but even decoding JPEG pictures and MP3 audio was considered heavy computation at the time. It is also possible to ignore the 68k CPU and run Linux on the PPC via project Linux APUS, but a PowerPC-native AmigaOS promised by Amiga Technologies GmbH was not available when the PowerUP boards first appeared.
24-bit graphics cards and video cards were also available. Graphics cards were designed primarily for 2D artwork production, workstation use, and later, gaming. Video cards are designed for inputting and outputting video signals, and processing and manipulating video.
In the North American market, the "NewTek Video Toaster" was a video effects board that turned the Amiga into an affordable video processing computer that found its way into many professional video environments. One well-known use was to create the special effects in early series of "Babylon 5". Due to its NTSC-only design, it did not find a market in countries that used the PAL standard, such as in Europe. In those countries, the "OpalVision" card was popular, although less featured and supported than the Video Toaster. Low-cost time base correctors (TBC) specifically designed to work with the Toaster quickly came to market, most of which were designed as standard Amiga bus cards.
Various manufacturers started producing PCI busboards for the A1200, A3000 and A4000, allowing standard Amiga computers to use PCI cards such as graphics cards, Sound Blaster sound cards, 10/100 Ethernet cards, USB cards, and television tuner cards. Other manufacturers produced hybrid boards that contained an Intel x86 series chip, allowing the Amiga to emulate a PC.
PowerPC upgrades with Wide SCSI controllers, PCI busboards with Ethernet, sound and 3D graphics cards, and tower cases allowed the A1200 and A4000 to survive well into the late nineties.
Expansion boards were made by Richmond Sound Design that allow their show control and sound design software to communicate with their custom hardware frames either by ribbon cable or fiber optic cable for long distances, allowing the Amiga to control up to eight million digitally controlled external audio, lighting, automation, relay and voltage control channels spread around a large theme park, for example. See Amiga software for more information on these applications.
Other devices included the following:
Serial ports.
The Commodore A2232 board provides seven RS-232C serial ports in addition to the Amiga's built-in serial port. Each port can be driven independently at speeds of 50 to . There is, however, a driver available on Aminet that allows two of the serial ports to be driven at . The serial card used the 65CE02 CPU clocked at . This CPU was also part of the CSG 4510 CPU core that was used in the Commodore 65 computer.
Networking.
Amiga has three networking interface APIs:
Different network media were used:
Models and variants.
The original Amiga models were produced from 1985 to 1996. They are, in order of production: 1000, 2000, 500, 1500, 2500, 3000, 3000UX, 3000T, CDTV, 500+, 600, 4000, 1200, CD32, and 4000T. The PowerPC-based AmigaOne computers were later marketed beginning in 2002. Several companies and private persons have also released Amiga clones and still do so today.
Commodore Amiga.
The first Amiga model, the Amiga 1000, was launched in 1985. In 2006, PC World rated the Amiga 1000 as the seventh greatest PC of all time, stating "Years ahead of its time, the Amiga was the world's first multimedia, multitasking personal computer".
Commodore updated the desktop line of Amiga computers with the Amiga 2000 in 1987, the Amiga 3000 in 1990, and the Amiga 4000 in 1992, each offering improved capabilities and expansion options. The best-selling models were the budget models, however, particularly the highly successful Amiga 500 (1987) and the Amiga 1200 (1992). The Amiga 500+ (1991) was the shortest-lived model, replacing the Amiga 500 and lasting only six months until it was phased out and replaced with the Amiga 600 (1992), which in turn was also quickly replaced by the Amiga 1200.
The CDTV, launched in 1991, was a CD-ROM-based game console and multimedia appliance several years before CD-ROM drives were common. The system never achieved any real success.
Commodore's last Amiga offering before filing for bankruptcy was the Amiga CD32 (1993), a 32-bit CD-ROM games console. Although discontinued after Commodore's demise, it met with moderate commercial success in Europe. The CD32 was a next-generation CDTV, and it was designed to save Commodore by entering the growing video game console market.
Following purchase of Commodore's assets by Escom in 1995, the A1200 and A4000T continued to be sold in small quantities until 1996, though the ground lost since the initial launch and the prohibitive expense of these units meant that the Amiga line never regained any real popularity.
Several Amiga models contained references to songs by the rock band The B-52's. Early A500 units had the words "B52/ROCK LOBSTER" silk-screen printed onto their printed circuit board, a reference to the song "Rock Lobster". The Amiga 600 referenced "JUNE BUG" (after the song "Junebug"), the Amiga 1200 had "CHANNEL Z" (after "Channel Z"), and the CD32 had "Spellbound".
AmigaOS 4 systems.
AmigaOS 4 is designed for PowerPC Amiga systems. It is mainly based on AmigaOS 3.1 source code, with some parts of version 3.9. It currently runs on Amigas equipped with CyberstormPPC or BlizzardPPC accelerator boards, on the Teron-series-based AmigaOne computers built by Eyetech under license from Amiga, Inc., on the Pegasos II from Genesi/bPlan GmbH, on the ACube Systems Srl Sam440ep, Sam460ex and AmigaOne 500 systems, and on the A-EON AmigaOne X1000.
AmigaOS 4.0 was available only in developer pre-releases for several years until it was officially released in December 2006. Due to provisions of the contract between Amiga Inc. and Hyperion Entertainment (the Belgian company developing the OS), the commercial AmigaOS 4 had been available only to licensed buyers of AmigaOne motherboards.
AmigaOS 4.0 for Amigas equipped with PowerUP accelerator boards was released in November 2007. Version 4.1 was released in August 2008 for AmigaOne systems, and in May 2011 for Amigas equipped with PowerUP accelerator boards. The most recent release of AmigaOS for all supported platforms is 4.1 update 5. Starting with release 4.1 update 4 there is an Emulation drawer containing official AmigaOS 3.x ROMs (all classic Amiga models including CD32) and relative Workbench files.
Acube Systems entered an agreement with Hyperion under which it has ported AmigaOS 4 to its Sam440ep and Sam460ex line of PowerPC-based motherboards. In 2009 a version for Pegasos II was released in co-operation with Acube Systems. In 2012, A-EON Technology Ltd manufactured and released the AmigaOne X1000 to consumers through their partner, Amiga Kit who provided end-user support, assembly and worldwide distribution of the new system.
Amiga hardware clones.
Long-time Amiga developer MacroSystem entered the Amiga-clone market with their DraCo non-linear video editing system. It appears in two versions, initially a tower model and later a cube. DraCo expanded upon and combined a number of earlier expansion cards developed for Amiga (VLabMotion, Toccata, WarpEngine, RetinaIII) into a true Amiga-clone powered by the Motorola 68060 processor. The DraCo can run AmigaOS 3.1 up through AmigaOS 3.9. It is the only Amiga-based system to support FireWire for video I/O. DraCo also offers an Amiga-compatible Zorro-II expansion bus and introduced a faster custom DraCoBus, capable of transfer rates (faster than Commodore's Zorro-III). The technology was later used in the Casablanca system, a set-top-box also designed for non-linear video editing.
In 1998, Index Information released the Access, an Amiga clone similar to the Amiga 1200 but built on a motherboard that could fit into a standard -inch drive bay. It features either a 68020 or 68030 CPU with the AGA chipset, and runs AmigaOS 3.1.
In 1998, former Amiga employees (John Smith, Peter Kittel, Dave Haynie and Andy Finkel, to mention a few) formed a new company called PIOS. Their hardware platform, PIOS One, was aimed at Amiga, Atari and Macintosh users. The company was renamed Met@box in 1999 and later folded.
The NatAmi (short for "Native Amiga") hardware project began in 2005 with the aim of designing and building an Amiga-clone motherboard enhanced with modern features. The NatAmi motherboard is a standard Mini-ITX-compatible form factor computer motherboard, powered by a Motorola/Freescale 68060 and its chipset. It is compatible with the original Amiga chipset, which is implemented in a programmable Altera FPGA on the board. The NatAmi is the second Amiga clone project after the Minimig motherboard, and its history is very similar to that of the C-One mainboard developed by Jeri Ellsworth and Jens Schönfeld. From a commercial point of view, NatAmi's circuitry and design are currently closed source. One goal of the NatAmi project is to design an Amiga-compatible motherboard that includes up-to-date features but does not rely on emulation (as in WinUAE), modern PC Intel components, or a modern PowerPC mainboard. As such, NatAmi is not intended to become another evolutionary heir to the classic Amigas, as the AmigaOne or Pegasos computers are. This "purist" philosophy limits the resulting processor speed but puts the focus on bandwidth and low latencies. The developers also recreated the entire Amiga chipset, freeing it from legacy limitations such as the AGA chipset's two megabytes of audio and video graphics RAM, and implemented this new chipset in a modern Altera Cyclone IV FPGA. Later, the developers decided to create from scratch a new soft-core processor, codenamed "N68050", that resides in the physical Altera FPGA.
In 2006, two new Amiga clones were announced, both using FPGA-based hardware synthesis to replace the Amiga OCS custom chipset. The first, the Minimig, is a personal project of Dutch engineer Dennis van Weeren. Referred to as "new Amiga hardware", the original model was built on a Xilinx Spartan-3 development board, but a dedicated board was soon developed. The Minimig uses the FPGA to reproduce the custom Denise, Agnus, Paula and Gary chips as well as both 8520 CIAs, and implements a simple version of Amber. The remaining parts are an actual 68000 CPU, RAM chips, and a PIC microcontroller for BIOS control. The design for the Minimig was released as open source on July 25, 2007. In February 2008, the Italian company Acube Systems began selling Minimig boards. A third-party upgrade replaces the PIC microcontroller with a more powerful ARM processor, providing additional functionality such as write access and support for hard disk images. The Minimig core has been ported to the FPGArcade "Replay" board, which uses an FPGA with about three times the capacity, supports the AGA chipset, and provides a 68020 soft core with 68030 capabilities. The Replay board is designed to implement many older computers and classic arcade machines.
The second is the Clone-A system announced by Individual Computers. As of mid-2007, it had been shown in development form, with FPGA-based boards replacing the Amiga chipset, mounted on an Amiga 500 motherboard.
Operating systems.
AmigaOS.
AmigaOS is a single-user multitasking operating system. It was one of the first commercially available consumer operating systems for personal computers to implement preemptive multitasking. It was developed first by Commodore International and initially introduced in 1985 with the Amiga 1000. John C. Dvorak wrote in "PC Magazine" in 1996:
AmigaOS combines a command-line interface with a graphical user interface. AmigaDOS is the disk operating system and command-line portion of the OS, and Workbench is the native graphical environment for file management and launching applications. AmigaDOS allows long filenames (up to 107 characters) with whitespace and does not require filename extensions. The windowing system and user-interface engine that handles all input events is called Intuition.
The multi-tasking kernel is called Exec. It acts as a scheduler for tasks running on the system, providing pre-emptive multitasking with prioritised round-robin scheduling. It enabled true pre-emptive multitasking in as little as 256 KB of free memory.
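The scheduling policy described above can be sketched as follows: only tasks at the highest ready priority run, each in turn for one time slice. The names and data structures below are illustrative, not the real Exec internals:

```python
from collections import deque

def schedule(tasks, slices):
    """Sketch of prioritised round-robin scheduling as described for
    Exec: the highest-priority ready tasks share the CPU in turn,
    one time slice each (hypothetical model, not Exec's real code)."""
    by_prio = {}
    for name, prio in tasks:
        by_prio.setdefault(prio, deque()).append(name)
    top = by_prio[max(by_prio)]        # ready queue at the highest priority
    order = []
    for _ in range(slices):            # round-robin within that level
        task = top.popleft()
        order.append(task)
        top.append(task)               # back to the end of the queue
    return order

# a low-priority idle task never runs while equal-priority tasks alternate
assert schedule([("idle", -10), ("shell", 0), ("player", 0)], 4) == \
       ["shell", "player", "shell", "player"]
```

Pre-emption in the real kernel comes from a timer interrupt ending each slice; lower-priority tasks only run once every higher-priority task is waiting.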
AmigaOS does not implement memory protection; the 68000 CPU does not include a memory management unit. This speeds and eases inter-process communication, because programs can communicate simply by passing pointers back and forth, but the lack of memory protection made AmigaOS more vulnerable to crashes from misbehaving programs than multitasking systems that implemented it, and AmigaOS is fundamentally incapable of enforcing any form of security model, since any program has full access to the system. A cooperative memory-protection feature was implemented in AmigaOS 4 and could be retrofitted to older AmigaOS systems using the Enforcer or CyberGuard tools.
The problem was somewhat exacerbated by Commodore's initial decision to release documentation relating not only to the OS's underlying software routines but also to the hardware itself, enabling intrepid programmers who had developed their skills on the Commodore 64 to POKE the hardware directly, as was done on the older platform. While the decision to release the documentation was popular and allowed the creation of fast, sophisticated sound and graphics routines in games and demos, it also contributed to system instability, as some programmers lacked the expertise to program at this level. For this reason, when the new AGA chipset was released, Commodore declined to release low-level documentation in an attempt to force developers to use the approved software routines.
Influence on other operating systems.
AmigaOS directly or indirectly inspired the development of various operating systems. MorphOS and AROS clearly inherit heavily from the structure of AmigaOS. AmigaOS also influenced BeOS, which featured a centralized datatypes system similar to that of AmigaOS. Likewise, DragonFly BSD was inspired by AmigaOS, as stated by DragonFly developer Matthew Dillon, a former Amiga developer. WindowLab and amiwm are among several window managers for the X Window System that seek to mimic the Workbench interface. IBM licensed the Amiga GUI from Commodore in exchange for a license to the REXX language; this allowed OS/2 2.0, a 32-bit operating system, to have the Workplace Shell (WPS) GUI.
Unix and Unix-like systems.
Commodore-Amiga produced Amiga Unix, informally known as Amix, based on AT&T SVR4. It supports the Amiga 2500 and Amiga 3000 and is included with the Amiga 3000UX. Among other unusual features of Amix is a hardware-accelerated windowing system that can scroll windows without copying data. Amix is not supported on the later Amiga systems based on 68040 or 68060 processors.
Other, still maintained, operating systems are available for the classic Amiga platform, including Linux and NetBSD. Both require a CPU with MMU such as the 68020 with 68851 or full versions of the 68030, 68040 or 68060. There is also a version of Linux for Amigas with PowerPC accelerator cards. Debian and Yellow Dog Linux can run on the AmigaOne.
There is an official, older port of OpenBSD; the last release for the Amiga is 3.2. MINIX 1.5.10 also runs on the Amiga.
Emulating other systems.
The Amiga Sidecar is a complete IBM PC XT compatible computer contained in an expansion card. It was released by Commodore in 1986 and promoted as a way to run business software on the Amiga 1000.
Amiga software.
In the late 1980s and early 1990s the platform became particularly popular for gaming, demoscene activities and creative software uses. During this time commercial developers marketed a wide range of games and creative software, often developing titles simultaneously for the Atari ST due to the similar hardware architecture. Popular creative software included 3D rendering (ray-tracing) packages, bitmap graphics editors, desktop video software, software development packages and "tracker" music editors.
Until the late 1990s the Amiga remained a popular platform for non-commercial software, often developed by enthusiasts and much of it freely redistributable. An on-line archive, Aminet, was created in 1991 and until the late 1990s was the largest public archive of software, art and documents for any platform.
Marketing.
The name "Amiga" was chosen by the developers from the Spanish word for a female friend, because they knew Spanish and because it occurred alphabetically before Apple and Atari. It also conveyed, as a play on words, the message that the Amiga computer line was "user friendly".
The first official Amiga logo was a rainbow-colored double check mark. In later marketing material Commodore largely dropped the checkmark and used logos styled with various typefaces. Although it was never adopted as a trademark by Commodore, the "Boing Ball" has been synonymous with Amiga since its launch. It became an unofficial and enduring theme after a visually impressive animated demonstration at the Winter Consumer Electronics Show in January 1984, showing a checkered ball bouncing and rotating. Following Escom's purchase of Commodore in 1996, the Boing Ball theme was incorporated into a new logo.
Early Commodore advertisements attempted to cast the computer as an all-purpose business machine, though the Amiga was most commercially successful as a home computer. Throughout the 1980s and early 1990s Commodore primarily placed advertising in computer magazines and occasionally in national newspapers and on television.
Legacy.
Since the demise of Commodore, various groups have marketed successors to the original Amiga line:
AmigaOS and MorphOS are commercial proprietary operating systems. AmigaOS 4, based on AmigaOS 3.1 source code with some parts of version 3.9, is developed by Hyperion Entertainment and runs on PowerPC based hardware. MorphOS, based on some parts of AROS source code, is developed by MorphOS Team and is continued on Apple and other PowerPC based hardware.
There is also AROS, a free and open source operating system (re-implementation of the AmigaOS 3.1 APIs), for Amiga 68k, x86 and ARM hardware (one version runs Linux-hosted on the Raspberry Pi). In particular, AROS for Amiga 68k hardware aims to create an open source Kickstart ROM replacement for emulation purpose and/or for use on real "classic" hardware.
Magazines.
"Amiga Format" continued publication until 2000. "Amiga Active" was launched in 1999 and was published until 2001.
Several magazines are in publication today: "Amiga Future", which is available in both English and German; "Bitplane.it", a bimonthly magazine in Italian; and "AmigaPower", a long-running French magazine. Print magazine "Amiga Addict" started publication in 2020.
Trade shows.
The Amiga remains popular enough for fans to support conferences such as Amiga37, which had over 50 vendors.
Uses.
The Amiga series of computers found a place in early computer graphic design and television presentation. Season 1 and part of season 2 of the television series "Babylon 5" were rendered in LightWave 3D on Amigas. Other television series using Amigas for special effects included "SeaQuest DSV" and "Max Headroom".
In addition, many celebrities and notable individuals have made use of the Amiga:
Actinophryid.
The actinophryids are an order of heliozoa, a polyphyletic array of stramenopiles, having a close relationship with the pedinellids and "Ciliophrys". They are common in fresh water and occasionally found in marine and soil habitats. Actinophryids are unicellular and roughly spherical in shape, with many axopodia that radiate outward from the cell body. Axopodia are a type of pseudopodia supported by hundreds of microtubules arranged in interlocking spirals, forming a needle-like internal structure or axoneme. Small granules (extrusomes) that lie beneath the membrane of the body and axopodia capture flagellates, ciliates and small metazoa that make contact with the arms.
Description.
Actinophryids are largely aquatic protozoa with a spherical cell body and many needle-like axopodia. They resemble the shape of a sun due to this structure, which is the inspiration for their common name: heliozoa, or "sun-animalcules". Their bodies, without arms, range in size from a few tens of micrometers to slightly under a millimeter across.
The outer region of the cell body is often vacuolated. The endoplasm of actinophryids is less vacuolated than the outer layer, and a sharp boundary layer may be seen by light microscopy. The organisms can be either mononucleate, with a single, well defined nucleus in the center of the cell body, or multinucleate, with 10 or more nuclei located under the outer vacuolated layer of cytoplasm. The cytoplasm of actinophryids is often granular, similar to that of "Amoeba".
Actinophryid cells may fuse when feeding, creating larger aggregated organisms. Fine granules that occur just under the cell membrane are used up when food vacuoles form to enclose prey. Actinophryids may also form cysts when food is not readily available. A layer of siliceous plates is deposited under the cell membrane during the encystment process.
Contractile vacuoles are common in these organisms, which are presumed to use them to maintain body volume by expelling fluids to compensate for the entry of water by osmosis. Contractile vacuoles are visible as clear bulges from the surface of the cell body that slowly fill then rapidly deflate, expelling their contents into the environment.
Axopodia.
The most distinctive characteristic of the actinophryids is their axopodia. These consist of a central, rigid rod coated in a thin layer of ectoplasm. In "Actinophrys" the axonemes end on the surface of the central nucleus, while in the multinucleate "Actinosphaerium" they end at or near the nuclei. The axonemes are composed of microtubules arranged in a double spiral pattern characteristic of the order. Due to their long, parallel construction, these microtubules demonstrate strong birefringence.
These axopodia are used for prey capture, movement, cell fusion and perhaps division. They are stiff but may flex, especially near their tips, and are highly dynamic, undergoing frequent construction and destruction. When used to collect prey, two methods of capture have been noted, termed axopodial flow and rapid axopodial contraction. Axopodial flow involves the slow movement of a prey item along the surface of the axopod as the ectoplasm itself moves, while rapid axopodial contraction involves the collapse of the axoneme's microtubule structure. This behavior has been documented in many species, including "Actinosphaerium nucleofilum", "Actinophrys sol", and "Raphidiophrys contractilis". The rapid axopodial contraction occurs at high speed, often in excess of 5 mm/s, or tens of body lengths per second.
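The quoted speed can be checked against the body sizes given elsewhere in this article ("Actinosphaerium": 200 to 1000 μm across): a 5 mm/s contraction corresponds to roughly 5 to 25 body lengths per second, consistent with "tens of body lengths" for the smaller cells. A quick arithmetic sketch (the size labels are illustrative):

```python
# Check the "tens of body lengths per second" claim using the
# body sizes quoted in the text (labels are illustrative).
speed_um_per_s = 5_000   # 5 mm/s contraction speed
body_sizes_um = {"small": 200, "mid": 500, "large": 1000}

lengths_per_s = {k: speed_um_per_s / v for k, v in body_sizes_um.items()}
assert lengths_per_s == {"small": 25.0, "mid": 10.0, "large": 5.0}
```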
The axopodial contractions have been shown to be highly sensitive to environmental factors such as temperature and pressure as well as chemical signals like Ca2+ and colchicine.
Reproduction.
Reproduction in actinophryids generally takes place via fission, where one parent cell divides into two or more daughter cells. For multinucleate heliozoa, this process is plasmotomic, as the nuclei are not duplicated prior to division. Reproduction appears to be a response to food availability: divisions become more frequent after food is withdrawn, while larger organisms form during times of food excess.
Actinophryids also undergo autogamy during times of food scarcity. This is better described as genetic reorganization than reproduction, as the number of individuals produced is the same as the initial number. Nonetheless, it serves as a way to increase genetic diversity within an individual which may improve the likelihood of expressing favorable genetic traits.
Plastogamy has also been extensively documented in actinophryids, especially in multinucleate ones. "Actinosphaerium" have been observed to combine freely without the combination of nuclei, and this process sometimes resulted in more or fewer individuals than originally combined. This process is not caused merely by contact between two individuals but can also be triggered by damage to the cell body.
Cyst function and formation.
Under unfavourable conditions, some species will form a cyst. This is often the product of autogamy, in which case the cysts produced are zygotes. Cells undergoing this process withdraw their axopodia, adhere to the substrate, and take on an opaque and grayish appearance. This cyst then divides until only uninucleate cells remain. The cyst wall consists of 7–8 layers, including gelatinous layers, layers of silica plates, and iron.
Taxonomy.
Originally placed in Heliozoa (Sarcodina), the actinophryids are now understood to be part of the stramenopiles. They are unrelated to centrohelid and desmothoracid heliozoa with which they had been previously classified.
There are several genera included within this classification. "Actinophrys" are smaller and have a single, central nucleus. Most have a cell body 40–50 μm in diameter with axopods around 100 μm in length, though this varies significantly. "Actinosphaerium" are several times larger, from 200 to 1000 μm in diameter, with many nuclei, and are found exclusively in fresh water. A third genus, "Camptonema", has a debated status. It has been observed once and was treated as a junior subjective synonym of "Actinosphaerium" by Mikrjukov & Patterson in 2001, but as a valid genus by Cavalier-Smith & Scoble (2013). "Heliorapha" is another debated taxon, a genus erected to accommodate the species "azurina", which was initially assigned to the genus "Ciliophrys".
Abel Tasman.
Abel Janszoon Tasman (1603 – 10 October 1659) was a Dutch seafarer and explorer, best known for his voyages of 1642 and 1644 in the service of the Dutch East India Company (VOC).
Born in 1603 in Lutjegast, Netherlands, Tasman started his career as a merchant seaman and became a skilled navigator. In 1633, he joined the VOC and sailed to Batavia, now Jakarta, Indonesia. He participated in several voyages, including one to Japan. In 1642, Tasman was appointed by the VOC to lead an expedition to explore the uncharted regions of the Southern Pacific Ocean. His mission was to discover new trade routes and to establish trade relations with the native inhabitants. After leaving Batavia, Tasman sailed eastward and reached the coast of Tasmania, which he named Van Diemen's Land after his patron. He then sailed north and sighted the west coast of New Zealand, which he named "Staten Landt"; it was later renamed "Nieuw Zeeland" after the Dutch province of Zeeland.
Despite his achievements, Tasman's expedition was not entirely successful. The encounter with the Māori people on the South Island of New Zealand resulted in a violent confrontation, which left four of Tasman's men dead. He returned to Batavia without having made any significant contact with the native inhabitants or establishing any trade relations. Nonetheless, Tasman's expedition paved the way for further exploration and colonization of Australia and New Zealand by the Europeans. Tasman continued to serve the Dutch East India Company until his death in 1659, leaving behind a legacy as one of the greatest explorers of his time.
Origins and early life.
Abel Tasman was born around 1603 in Lutjegast, a small village in the province of Groningen, in the north of the Netherlands. The oldest available source mentioning him is dated 27 December 1631 when, as a seafarer living in Amsterdam, the 28-year-old became engaged to marry 21-year-old Jannetje Tjaers, of Palmstraat in the Jordaan district of the city.
Relocation to the Dutch East Indies.
Employed by the Dutch East India Company (VOC), Tasman sailed from Texel (Netherlands) to Batavia, now Jakarta, in 1633, taking the southern Brouwer Route. While based in Batavia, Tasman took part in a voyage to Seram Island (in what is now the Maluku Province of Indonesia) because the locals had sold spices to European nationalities other than the Dutch. He had a narrow escape from death when, during an incautious landing, several of his companions were killed by the inhabitants of the island.
By August 1637, Tasman was back in Amsterdam, and the following year he signed on for another ten years and took his wife with him to Batavia. On 25 March 1638 he tried to sell his property in the Jordaan, but the purchase was cancelled.
He was second-in-command of a 1639 expedition of exploration into the north Pacific under Matthijs Quast. The fleet included the ships "Engel" and "Gracht" and reached Fort Zeelandia (Dutch Formosa) and Deshima (an artificial island off Nagasaki, Japan).
First major voyage.
In August 1642, the Council of the Indies, consisting of Antonie van Diemen, Cornelis van der Lijn, Joan Maetsuycker, Justus Schouten, Salomon Sweers, Cornelis Witsen, and Pieter Boreel in Batavia dispatched Tasman and Franchoijs Jacobszoon Visscher on a voyage of exploration to little-charted areas east of the Cape of Good Hope, west of Staten Land (near the Cape Horn of South America) and south of the Solomon Islands.
One of the objectives was to obtain knowledge of "all the totally unknown" Provinces of Beach. This was a purported yet non-existent landmass said to have plentiful gold, which had appeared on European maps since the 15th century, as a result of an error in some editions of Marco Polo's works.
The expedition was to use two small ships, "Heemskerck" and "Zeehaen".
Mauritius.
In accordance with Visscher's directions, Tasman sailed from Batavia on 14 August 1642 and arrived at Mauritius on 5 September 1642, according to the captain's journal. The island was chosen as a stopover because the crew could be fed well there, and it had plenty of fresh water and timber for repairing the ships. Tasman had the assistance of the governor, Adriaan van der Stel.
Because of the prevailing winds, Mauritius was chosen as a turning point. After a four-week stay on the island, both ships left on 8 October using the Roaring Forties to sail east as fast as possible. (No one had gone as far as Pieter Nuyts in 1626/27.) On 7 November, snow and hail influenced the ship's council to alter course to a more north-easterly direction, with the intention of having the Solomon Islands as their destination.
Tasmania.
On 24 November 1642, Tasman sighted the west coast of Tasmania, north of Macquarie Harbour. He named his discovery Van Diemen's Land, after Antonie van Diemen, Governor-General of the Dutch East Indies.
Proceeding south, Tasman skirted the southern end of Tasmania and turned north-east. He then tried to work his two ships into Adventure Bay on the east coast of South Bruny Island, but he was blown out to sea by a storm. This area he named Storm Bay. Two days later, on 1 December, Tasman anchored to the north of Cape Frederick Hendrick just north of the Forestier Peninsula. On 2 December, two ship's boats under the command of the pilot-major, Visscher, rowed through the Marion Narrows into Blackman Bay, and then west to the outflow of Boomer Creek, where they gathered some edible "greens". Tasman named the bay Frederick Hendrik Bay, which included the present North Bay, Marion Bay and what is now Blackman Bay. (Tasman's original naming, Frederick Henrick Bay, was mistakenly transferred to its present location by Marion Dufresne in 1772.) The next day, an attempt was made to land in North Bay. However, because the sea was too rough, a ship's carpenter swam through the surf and planted the Dutch flag. Tasman then claimed formal possession of the land on 3 December 1642.
For two more days, he continued to follow the east coast northward to see how far it went. When the land veered to the north-west at Eddystone Point, he tried to follow the coast line but his ships were suddenly hit by the Roaring Forties howling through Bass Strait. Tasman was on a mission to find the Southern Continent, not more islands, so he abruptly turned away to the east and continued his continent-hunting.
New Zealand.
Tasman had intended to proceed in a northerly direction but as the wind was unfavourable he steered east. The expedition endured a rough voyage and in one of his diary entries Tasman claimed that his compass was the only thing that had kept him alive.
On 13 December 1642 they sighted land on the north-west coast of the South Island of New Zealand, becoming the first Europeans to sight New Zealand. Tasman named it "Staten Landt" "in honour of the States General" (Dutch parliament). He wrote, "it is possible that this land joins to the Staten Landt but it is uncertain", referring to Isla de los Estados, a landmass of the same name at the southern tip of South America, encountered by the Dutch navigator Jacob Le Maire in 1616. However, in 1643 Brouwer's expedition to Valdivia found that Staten Landt was separated by sea from the hypothetical Southern Land. Tasman continued: "We believe that this is the mainland coast of the unknown Southland." Tasman thought he had found the western side of the long-imagined Terra Australis that stretched across the Pacific to near the southern tip of South America.
After sailing north then east for five days, the expedition anchored about from the coast off what is now Golden Bay. A group of Māori paddled out in a waka (canoe) and attacked some sailors who were rowing between the two Dutch vessels. Four sailors were clubbed to death with patu. As Tasman sailed out of the bay he observed 22 waka near the shore, of which "eleven swarming with people came off towards us". The waka approached the "Zeehaen", which fired on and hit a man in the largest waka, who was holding a small white flag. Canister shot also hit the side of a waka. Archaeologist Ian Barber suggests that local Māori were trying to secure a cultivation field under ritual protection (tapu) where they believed the Dutch were attempting to land; December, the month of this contact, was the mid-point of the locally important sweet potato/kūmara ("Ipomoea batatas") growing season.
Angula.
Angula may refer to:
ASP.
ASP may refer to:
Algebraic geometry.
Algebraic geometry is a branch of mathematics which classically studies zeros of multivariate polynomials. Modern algebraic geometry is based on the use of abstract algebraic techniques, mainly from commutative algebra, for solving geometrical problems about these sets of zeros.
The fundamental objects of study in algebraic geometry are algebraic varieties, which are geometric manifestations of solutions of systems of polynomial equations. Examples of the most studied classes of algebraic varieties are lines, circles, parabolas, ellipses, hyperbolas, cubic curves like elliptic curves, and quartic curves like lemniscates and Cassini ovals. These are plane algebraic curves. A point of the plane lies on an algebraic curve if its coordinates satisfy a given polynomial equation. Basic questions involve the study of points of special interest like singular points, inflection points and points at infinity. More advanced questions involve the topology of the curve and the relationship between curves defined by different equations.
Algebraic geometry occupies a central place in modern mathematics and has multiple conceptual connections with such diverse fields as complex analysis, topology and number theory. As a study of systems of polynomial equations in several variables, the subject of algebraic geometry begins with finding specific solutions via equation solving, and then proceeds to understand the intrinsic properties of the totality of solutions of a system of equations. This understanding requires both conceptual theory and computational technique.
In the 20th century, algebraic geometry split into several subareas.
Much of the development of the mainstream of algebraic geometry in the 20th century occurred within an abstract algebraic framework, with increasing emphasis being placed on "intrinsic" properties of algebraic varieties not dependent on any particular way of embedding the variety in an ambient coordinate space; this parallels developments in topology, differential and complex geometry. One key achievement of this abstract algebraic geometry is Grothendieck's scheme theory which allows one to use sheaf theory to study algebraic varieties in a way which is very similar to its use in the study of differential and analytic manifolds. This is obtained by extending the notion of point: In classical algebraic geometry, a point of an affine variety may be identified, through Hilbert's Nullstellensatz, with a maximal ideal of the coordinate ring, while the points of the corresponding affine scheme are all prime ideals of this ring. This means that a point of such a scheme may be either a usual point or a subvariety. This approach also enables a unification of the language and the tools of classical algebraic geometry, mainly concerned with complex points, and of algebraic number theory. Wiles' proof of the longstanding conjecture called Fermat's Last Theorem is an example of the power of this approach.
Basic notions.
Zeros of simultaneous polynomials.
In classical algebraic geometry, the main objects of interest are the vanishing sets of collections of polynomials, meaning the set of all points that simultaneously satisfy one or more polynomial equations. For instance, the two-dimensional sphere of radius 1 in three-dimensional Euclidean space R3 could be defined as the set of all points ("x","y","z") with
"x"2 + "y"2 + "z"2 − 1 = 0.
A "slanted" circle in R3 can be defined as the set of all points ("x","y","z") which satisfy the two polynomial equations
"x"2 + "y"2 + "z"2 − 1 = 0 and "x" + "y" + "z" = 0.
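To make the definitions concrete, here is a small Python sketch (the function names and the numeric tolerance are illustrative assumptions) that tests whether a point lies in each of these two vanishing sets:

```python
import math

def on_sphere(x, y, z, tol=1e-12):
    # Unit sphere: x^2 + y^2 + z^2 - 1 = 0
    return abs(x*x + y*y + z*z - 1) < tol

def on_slanted_circle(x, y, z, tol=1e-12):
    # The "slanted" circle: both polynomial equations must vanish,
    # here taken to be the sphere together with the plane x + y + z = 0.
    return on_sphere(x, y, z, tol) and abs(x + y + z) < tol

p = (1/math.sqrt(2), -1/math.sqrt(2), 0.0)
print(on_sphere(*p))                      # True
print(on_slanted_circle(*p))              # True
print(on_slanted_circle(1.0, 0.0, 0.0))   # False: on the sphere, not on the plane
```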
Affine varieties.
First we start with a field "k". In classical algebraic geometry, this field was always the complex numbers C, but many of the same results are true if we assume only that "k" is algebraically closed. We consider the affine space of dimension "n" over "k", denoted An("k") (or more simply A"n", when "k" is clear from the context). When one fixes a coordinate system, one may identify An("k") with "k""n". The purpose of not working with "k""n" is to emphasize that one "forgets" the vector space structure that "k""n" carries.
A function "f" : A"n" → A1 is said to be "polynomial" (or "regular") if it can be written as a polynomial, that is, if there is a polynomial "p" in "k"["x"1...,"x""n"] such that "f"("M") = "p"("t"1...,"t""n") for every point "M" with coordinates ("t"1...,"t""n") in A"n". The property of a function to be polynomial (or regular) does not depend on the choice of a coordinate system in A"n".
When a coordinate system is chosen, the regular functions on the affine "n"-space may be identified with the ring of polynomial functions in "n" variables over "k". Therefore, the set of the regular functions on A"n" is a ring, which is denoted "k"[A"n"].
We say that a polynomial "vanishes" at a point if evaluating it at that point gives zero. Let "S" be a set of polynomials in "k"[An]. The "vanishing set of S" (or "vanishing locus" or "zero set") is the set "V"("S") of all points in A"n" where every polynomial in "S" vanishes. Symbolically,
"V"("S") = {("t"1...,"t""n") | "p"("t"1...,"t""n") = 0 for all "p" in "S"}.
A subset of A"n" which is "V"("S"), for some "S", is called an "algebraic set". The "V" stands for "variety" (a specific type of algebraic set to be defined below).
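When the algebraic set is finite, "V"("S") can be enumerated directly; a sketch using the sympy library (the particular system is an illustrative choice):

```python
import sympy as sp

x, y = sp.symbols('x y')
S = [x**2 + y**2 - 1, x - y]   # unit circle intersected with the line y = x
V = sp.solve(S, [x, y])        # the algebraic set V(S): here just two points
print(V)
```

For an infinite algebraic set no such enumeration exists, and one works with the defining ideal instead.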
Given a subset "U" of A"n", can one recover the set of polynomials which generate it? If "U" is "any" subset of A"n", define "I"("U") to be the set of all polynomials whose vanishing set contains "U". The "I" stands for ideal: if two polynomials "f" and "g" both vanish on "U", then "f"+"g" vanishes on "U", and if "h" is any polynomial, then "hf" vanishes on "U", so "I"("U") is always an ideal of the polynomial ring "k"[A"n"].
Two natural questions to ask are:
The answer to the first question is provided by introducing the Zariski topology, a topology on A"n" whose closed sets are the algebraic sets, and which directly reflects the algebraic structure of "k"[A"n"]. Then "U" = "V"("I"("U")) if and only if "U" is an algebraic set or equivalently a Zariski-closed set. The answer to the second question is given by Hilbert's Nullstellensatz. In one of its forms, it says that "I"("V"("S")) is the radical of the ideal generated by "S". In more abstract language, there is a Galois connection, giving rise to two closure operators; they can be identified, and naturally play a basic role in the theory; the example is elaborated at Galois connection.
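The gap between an ideal and its radical can be seen in a one-variable sympy sketch: with "S" = {"x"2}, the polynomial "x" vanishes everywhere on "V"("S") = {0} yet does not belong to the ideal generated by "S".

```python
import sympy as sp

x = sp.symbols('x')
# Divide x by the generator x**2: the nonzero remainder shows x is not in <x**2>.
quotients, remainder = sp.reduced(x, [x**2], x)
print(remainder)          # x
# Yet x vanishes at the single point of V(x**2) = {0}, so x lies in I(V(S)),
# which by the Nullstellensatz is the radical of <x**2>.
print(sp.solve(x**2, x))  # [0]
```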
For various reasons we may not always want to work with the entire ideal corresponding to an algebraic set "U". Hilbert's basis theorem implies that ideals in "k"[A"n"] are always finitely generated.
An algebraic set is called "irreducible" if it cannot be written as the union of two smaller algebraic sets. Any algebraic set is a finite union of irreducible algebraic sets and this decomposition is unique. Thus its elements are called the "irreducible components" of the algebraic set. An irreducible algebraic set is also called a "variety". It turns out that an algebraic set is a variety if and only if it may be defined as the vanishing set of a prime ideal of the polynomial ring.
Some authors do not make a clear distinction between algebraic sets and varieties and use "irreducible variety" to make the distinction when needed.
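For a hypersurface "V"("f"), the irreducible components correspond to the irreducible factors of "f"; a sympy sketch (the polynomial is an illustrative choice):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 - y**2
# factor_list returns (constant, [(irreducible factor, multiplicity), ...]);
# here the two linear factors mean V(f) is the union of the lines y = x and y = -x.
constant, factors = sp.factor_list(f)
print(factors)
```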
Regular functions.
Just as continuous functions are the natural maps on topological spaces and smooth functions are the natural maps on differentiable manifolds, there is a natural class of functions on an algebraic set, called "regular functions" or "polynomial functions". A regular function on an algebraic set "V" contained in A"n" is the restriction to "V" of a regular function on A"n". For an algebraic set defined on the field of the complex numbers, the regular functions are smooth and even analytic.
It may seem unnaturally restrictive to require that a regular function always extend to the ambient space, but it is very similar to the situation in a normal topological space, where the Tietze extension theorem guarantees that a continuous function on a closed subset always extends to the ambient topological space.
Just as with the regular functions on affine space, the regular functions on "V" form a ring, which we denote by "k"["V"]. This ring is called the "coordinate ring of V".
Since regular functions on V come from regular functions on A"n", there is a relationship between the coordinate rings. Specifically, if a regular function on "V" is the restriction of two functions "f" and "g" in "k"[A"n"], then "f" − "g" is a polynomial function which is null on "V" and thus belongs to "I"("V"). Thus "k"["V"] may be identified with "k"[A"n"]/"I"("V").
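A computation in the coordinate ring amounts to reduction modulo "I"("V"); a sympy sketch on the unit circle:

```python
import sympy as sp

x, y = sp.symbols('x y')
# On V(x**2 + y**2 - 1), the polynomial x**2 + y**2 restricts to the constant 1:
# its remainder modulo the defining equation represents its class in k[V].
quotients, remainder = sp.reduced(x**2 + y**2, [x**2 + y**2 - 1], x, y)
print(remainder)   # 1
```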
Morphism of affine varieties.
Using regular functions from an affine variety to A1, we can define regular maps from one affine variety to another. First we will define a regular map from a variety into affine space: Let "V" be a variety contained in A"n". Choose "m" regular functions on "V", and call them "f"1, ..., "f""m". We define a "regular map" "f" from "V" to A"m" by letting "f"("M") = ("f"1("M"), ..., "f""m"("M")). In other words, each "f""i" determines one coordinate of the range of "f".
If "V"′ is a variety contained in A"m", we say that "f" is a "regular map" from "V" to "V"′ if the range of "f" is contained in "V"′.
The definition of the regular maps applies also to algebraic sets.
The regular maps are also called "morphisms", as they make the collection of all affine algebraic sets into a category, where the objects are the affine algebraic sets and the morphisms are the regular maps. The affine varieties form a subcategory of the category of the algebraic sets.
Given a regular map "g" from "V" to "V"′ and a regular function "f" of "k"["V"′], then "f" ∘ "g" ∈ "k"["V"]. The map "f" ↦ "f" ∘ "g" is a ring homomorphism from "k"["V"′] to "k"["V"]. Conversely, every ring homomorphism from "k"["V"′] to "k"["V"] defines a regular map from "V" to "V"′. This defines an equivalence of categories between the category of algebraic sets and the opposite category of the finitely generated reduced "k"-algebras. This equivalence is one of the starting points of scheme theory.
Rational function and birational equivalence.
In contrast to the preceding sections, this section concerns only varieties and not algebraic sets. On the other hand, the definitions extend naturally to projective varieties (next section), as an affine variety and its projective completion have the same field of functions.
If "V" is an affine variety, its coordinate ring is an integral domain and has thus a field of fractions which is denoted "k"("V") and called the "field of the rational functions" on "V" or, shortly, the "function field" of "V". Its elements are the restrictions to "V" of the rational functions over the affine space containing "V". The domain of a rational function "f" is not "V" but the complement of the subvariety (a hypersurface) where the denominator of "f" vanishes.
As with regular maps, one may define a "rational map" from a variety "V" to a variety "V"'. As with the regular maps, the rational maps from "V" to "V"' may be identified with the field homomorphisms from "k"("V"') to "k"("V").
Two affine varieties are "birationally equivalent" if there are two rational functions between them which are inverse one to the other in the regions where both are defined. Equivalently, they are birationally equivalent if their function fields are isomorphic.
An affine variety is a "rational variety" if it is birationally equivalent to an affine space. This means that the variety admits a "rational parameterization", that is, a parametrization with rational functions. For example, the circle of equation "x"2 + "y"2 − 1 = 0 is a rational curve, as it has the parametric equation
"x" = (1 − "t"2)/(1 + "t"2), "y" = 2"t"/(1 + "t"2),
which may also be viewed as a rational map from the line to the circle.
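One can verify symbolically that the parametrization lands on the circle; a sympy sketch, using the standard rational parametrization of "x"2 + "y"2 = 1 (stated explicitly here since the formula image is not reproduced above):

```python
import sympy as sp

t = sp.symbols('t')
xt = (1 - t**2) / (1 + t**2)   # x(t)
yt = 2*t / (1 + t**2)          # y(t)
# The defining polynomial of the circle vanishes identically in t:
print(sp.simplify(xt**2 + yt**2 - 1))   # 0
```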
The problem of resolution of singularities is to know if every algebraic variety is birationally equivalent to a variety whose projective completion is nonsingular (see also smooth completion). It was solved in the affirmative in characteristic 0 by Heisuke Hironaka in 1964 and is yet unsolved in finite characteristic.
Projective variety.
Just as the formulas for the roots of second, third, and fourth degree polynomials suggest extending real numbers to the more algebraically complete setting of the complex numbers, many properties of algebraic varieties suggest extending affine space to a more geometrically complete projective space. Whereas the complex numbers are obtained by adding the number "i", a root of the polynomial "x"2 + 1, projective space is obtained by adding in appropriate points "at infinity", points where parallel lines may meet.
To see how this might come about, consider the variety "V"("y" − "x"2). If we draw it, we get a parabola. As "x" goes to positive infinity, the slope of the line from the origin to the point ("x", "x"2) also goes to positive infinity. As "x" goes to negative infinity, the slope of the same line goes to negative infinity.
Compare this to the variety "V"("y" − "x"3). This is a cubic curve. As "x" goes to positive infinity, the slope of the line from the origin to the point ("x", "x"3) goes to positive infinity just as before. But unlike before, as "x" goes to negative infinity, the slope of the same line goes to positive infinity as well; the exact opposite of the parabola. So the behavior "at infinity" of "V"("y" − "x"3) is different from the behavior "at infinity" of "V"("y" − "x"2).
The consideration of the "projective completion" of the two curves, which is their prolongation "at infinity" in the projective plane, allows us to quantify this difference: the point at infinity of the parabola is a regular point, whose tangent is the line at infinity, while the point at infinity of the cubic curve is a cusp. Also, both curves are rational, as they are parameterized by "x", and the Riemann-Roch theorem implies that the cubic curve must have a singularity, which must be at infinity, as all its points in the affine space are regular.
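The difference at infinity can be checked directly: homogenize each curve with a new variable "z", restrict to the line at infinity "z" = 0, and test smoothness via the gradient. A sympy sketch:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
parabola = y*z - x**2      # homogenization of y - x**2
cubic    = y*z**2 - x**3   # homogenization of y - x**3
pt = {x: 0, y: 1, z: 0}    # both curves meet z = 0 only at (0 : 1 : 0)

for name, F in [("parabola", parabola), ("cubic", cubic)]:
    grad = [sp.diff(F, v).subs(pt) for v in (x, y, z)]
    print(name, grad)
# parabola [0, 0, 1] -> gradient nonzero: a regular point at infinity
# cubic    [0, 0, 0] -> gradient vanishes: a singular point (the cusp)
```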
Thus many of the properties of algebraic varieties, including birational equivalence and all the topological properties, depend on the behavior "at infinity" and so it is natural to study the varieties in projective space. Furthermore, the introduction of projective techniques made many theorems in algebraic geometry simpler and sharper: For example, Bézout's theorem on the number of intersection points between two varieties can be stated in its sharpest form only in projective space. For these reasons, projective space plays a fundamental role in algebraic geometry.
Nowadays, the "projective space" P"n" of dimension "n" is usually defined as the set of the lines passing through a point, considered as the origin, in the affine space of dimension "n" + 1, or equivalently to the set of the vector lines in a vector space of dimension "n" + 1. When a coordinate system has been chosen in the space of dimension "n" + 1, all the points of a line have the same set of coordinates, up to the multiplication by an element of "k". This defines the homogeneous coordinates of a point of P"n" as a sequence of "n" + 1 elements of the base field "k", defined up to the multiplication by a nonzero element of "k" (the same for the whole sequence).
A polynomial in "n" + 1 variables vanishes at all points of a line passing through the origin if and only if it is homogeneous. In this case, one says that the polynomial "vanishes" at the corresponding point of P"n". This allows us to define a "projective algebraic set" in P"n" as the set "V"("f"1, ..., "f""k"), where a finite set of homogeneous polynomials {"f"1, ..., "f""k"} vanishes. Like for affine algebraic sets, there is a bijection between the projective algebraic sets and the reduced homogeneous ideals which define them. The "projective varieties" are the projective algebraic sets whose defining ideal is prime. In other words, a projective variety is a projective algebraic set, whose homogeneous coordinate ring is an integral domain, the "projective coordinates ring" being defined as the quotient of the graded ring of the polynomials in "n" + 1 variables by the homogeneous (reduced) ideal defining the variety. Every projective algebraic set may be uniquely decomposed into a finite union of projective varieties.
The only regular functions which may be defined properly on a projective variety are the constant functions. Thus this notion is not used in projective situations. On the other hand, the "field of the rational functions" or "function field" is a useful notion, which, similarly to the affine case, is defined as the set of the quotients of two homogeneous elements of the same degree in the homogeneous coordinate ring.
Real algebraic geometry.
Real algebraic geometry is the study of real algebraic varieties.
The fact that the field of the real numbers is an ordered field cannot be ignored in such a study. For example, the curve of equation "x"2 + "y"2 − "a" = 0 is a circle if "a" > 0, but has no real points if "a" < 0. Real algebraic geometry also investigates, more broadly, "semi-algebraic sets", which are the solutions of systems of polynomial inequalities. For example, neither branch of the hyperbola of equation "xy" − 1 = 0 is a real algebraic variety. However, the branch in the first quadrant is a semi-algebraic set defined by "xy" − 1 = 0 and "x" > 0.
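A semi-algebraic set is specified by equations together with inequalities; here is a small numeric membership test for the first-quadrant branch of the hyperbola "xy" = 1 (function name and tolerance are illustrative):

```python
def in_branch(x, y, tol=1e-9):
    # The branch is cut out by the equation x*y - 1 = 0
    # together with the inequality x > 0.
    return abs(x*y - 1) < tol and x > 0

print(in_branch(2.0, 0.5))     # True: first-quadrant branch
print(in_branch(-2.0, -0.5))   # False: other branch, x > 0 fails
print(in_branch(2.0, 0.6))     # False: not on the hyperbola at all
```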
One open problem in real algebraic geometry is the following part of Hilbert's sixteenth problem: Decide which respective positions are possible for the ovals of a nonsingular plane curve of degree 8.
Computational algebraic geometry.
One may date the origin of computational algebraic geometry to the meeting EUROSAM'79 (International Symposium on Symbolic and Algebraic Manipulation) held at Marseille, France, in June 1979. At this meeting,
Since then, most results in this area are related to one or several of these items either by using or improving one of these algorithms, or by finding algorithms whose complexity is simply exponential in the number of the variables.
A body of mathematical theory complementary to symbolic methods called numerical algebraic geometry has been developed over the last several decades. The main computational method is homotopy continuation. This supports, for example, a model of floating point computation for solving problems of algebraic geometry.
Gröbner basis.
A Gröbner basis is a system of generators of a polynomial ideal whose computation allows the deduction of many properties of the affine algebraic variety defined by the ideal.
Given an ideal "I" defining an algebraic set "V":
Gröbner basis computations do not allow one to compute directly the primary decomposition of "I" nor the prime ideals defining the irreducible components of "V", but most algorithms for this involve Gröbner basis computation. The algorithms which are not based on Gröbner bases use regular chains but may need Gröbner bases in some exceptional situations.
Gröbner bases are deemed to be difficult to compute. In fact they may contain, in the worst case, polynomials whose degree is doubly exponential in the number of variables, and a number of polynomials which is also doubly exponential. However, this is only a worst-case complexity, and the complexity bound of Lazard's algorithm of 1979 may frequently apply. Faugère's F5 algorithm realizes this complexity, as it may be viewed as an improvement of Lazard's 1979 algorithm. It follows that the best implementations allow one to compute almost routinely with algebraic sets of degree more than 100. This means that, presently, the difficulty of computing a Gröbner basis is strongly related to the intrinsic difficulty of the problem.
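A minimal sympy sketch of a Gröbner basis computation (the system is an illustrative choice): with the lexicographic order, the basis is triangular, containing a generator in "y" alone from which the solutions can be read off.

```python
import sympy as sp

x, y = sp.symbols('x y')
G = sp.groebner([x**2 + y**2 - 1, x - y], x, y, order='lex')
print(list(G))
# One basis element involves only y; solving it and back-substituting
# through the others enumerates the (finitely many) points of V.
print(any(not g.has(x) for g in G))   # True
```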
Cylindrical algebraic decomposition (CAD).
CAD is an algorithm which was introduced in 1973 by G. Collins to implement with an acceptable complexity the Tarski–Seidenberg theorem on quantifier elimination over the real numbers.
This theorem concerns the formulas of the first-order logic whose atomic formulas are polynomial equalities or inequalities between polynomials with real coefficients. These formulas are thus the formulas which may be constructed from the atomic formulas by the logical operators "and" (∧), "or" (∨), "not" (¬), "for all" (∀) and "exists" (∃). Tarski's theorem asserts that, from such a formula, one may compute an equivalent formula without quantifier (∀, ∃).
The complexity of CAD is doubly exponential in the number of variables. This means that CAD allows one, in theory, to solve every problem of real algebraic geometry which may be expressed by such a formula, that is, almost every problem concerning explicitly given varieties and semi-algebraic sets.
While Gröbner basis computation has doubly exponential complexity only in rare cases, CAD has almost always this high complexity. This implies that, unless most polynomials appearing in the input are linear, it may not solve problems with more than four variables.
Since 1973, most of the research on this subject is devoted either to improving CAD or finding alternative algorithms in special cases of general interest.
As an example of the state of the art, there are efficient algorithms to find at least a point in every connected component of a semi-algebraic set, and thus to test if a semi-algebraic set is empty. On the other hand, CAD remains, in practice, the best algorithm to count the number of connected components.
Asymptotic complexity vs. practical efficiency.
The basic general algorithms of computational algebraic geometry have a doubly exponential worst-case complexity. More precisely, if "d" is the maximal degree of the input polynomials and "n" the number of variables, their complexity is at most "d"^(2^("cn")) for some constant "c", and, for some inputs, the complexity is at least "d"^(2^("c"′"n")) for another constant "c"′.
During the last 20 years of the 20th century, various algorithms were introduced to solve specific subproblems with a better complexity. Most of these algorithms have a complexity of "d"^("O"("n"²)).
Among these algorithms, which solve a subproblem of the problems solved by Gröbner bases, one may cite "testing if an affine variety is empty" and "solving nonhomogeneous polynomial systems which have a finite number of solutions". Such algorithms are rarely implemented because, on most inputs, Faugère's F4 and F5 algorithms have a better practical efficiency and probably a similar or better complexity ("probably" because the evaluation of the complexity of Gröbner basis algorithms on a particular class of inputs is a difficult task which has been done only in a few special cases).
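The emptiness test mentioned above can also be carried out with a Gröbner basis engine: over an algebraically closed field, Hilbert's Nullstellensatz implies that a polynomial system has no solution exactly when its Gröbner basis reduces to {1}. A sketch with SymPy, on illustrative example systems of our choosing:

```python
# Test whether an affine variety (over the complex numbers) is empty:
# by the Nullstellensatz, the system is unsolvable iff its Groebner
# basis is {1}.
from sympy import groebner, symbols

x, y = symbols('x y')

def is_empty(polys, *gens):
    G = groebner(polys, *gens, order='grevlex')
    return list(G.exprs) == [1]

# x = 0 and x = 1 simultaneously: no solution, basis is {1}.
print(is_empty([x, x - 1], x))               # True
# The unit circle intersected with the line y = 0: solutions exist.
print(is_empty([x**2 + y**2 - 1, y], x, y))  # False
```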
The main algorithms of real algebraic geometry which solve a problem solved by CAD are related to the topology of semi-algebraic sets. One may cite "counting the number of connected components", "testing if two points are in the same component" and "computing a Whitney stratification of a real algebraic set". They have a complexity of "d"^("O"("n"²)), but the constant involved in the "O" notation is so high that using them to solve any nontrivial problem effectively solved by CAD is impossible, even if one could use all the existing computing power in the world. Therefore, these algorithms have never been implemented, and finding algorithms that have both a good asymptotic complexity and a good practical efficiency remains an active research area.
Abstract modern viewpoint.
The modern approaches to algebraic geometry redefine and effectively extend the range of basic objects in various levels of generality to schemes, formal schemes, ind-schemes, algebraic spaces, algebraic stacks and so on. The need for this arises already from useful ideas within the theory of varieties, e.g. the formal functions of Zariski can be accommodated by introducing nilpotent elements in structure rings; considering spaces of loops and arcs, constructing quotients by group actions, and developing formal grounds for natural intersection theory and deformation theory lead to some of the further extensions.
Most remarkably, in the late 1950s, algebraic varieties were subsumed into Alexander Grothendieck's concept of a scheme. Their local objects are affine schemes or prime spectra, which are locally ringed spaces forming a category antiequivalent to the category of commutative unital rings, extending the duality between the category of affine algebraic varieties over a field "k" and the category of finitely generated reduced "k"-algebras. The gluing is along the Zariski topology; one can glue within the category of locally ringed spaces, but also, using the Yoneda embedding, within the more abstract category of presheaves of sets over the category of affine schemes. The Zariski topology in the set-theoretic sense is then replaced by a Grothendieck topology. Grothendieck introduced Grothendieck topologies having in mind more exotic but geometrically finer and more sensitive examples than the crude Zariski topology, namely the étale topology and the two flat Grothendieck topologies, fppf and fpqc; nowadays some other examples have become prominent, including the Nisnevich topology. Sheaves can furthermore be generalized to stacks in the sense of Grothendieck, usually with some additional representability conditions, leading to Artin stacks and, even finer, Deligne–Mumford stacks, both often called algebraic stacks.
Sometimes other algebraic sites replace the category of affine schemes. For example, Nikolai Durov has introduced commutative algebraic monads as a generalization of local objects in a generalized algebraic geometry. Versions of a tropical geometry, of an absolute geometry over a field of one element and an algebraic analogue of Arakelov's geometry were realized in this setup.
Another formal generalization is possible to universal algebraic geometry in which every variety of algebras has its own algebraic geometry. The term "variety of algebras" should not be confused with "algebraic variety".
The language of schemes, stacks and generalizations has proved to be a valuable way of dealing with geometric concepts and became cornerstones of modern algebraic geometry.
Algebraic stacks can be further generalized, and for many practical questions like deformation theory and intersection theory, this is often the most natural approach. One can extend the Grothendieck site of affine schemes to a higher categorical site of derived affine schemes by replacing the commutative rings with an infinity category of differential graded commutative algebras, or of simplicial commutative rings, or a similar category with an appropriate variant of a Grothendieck topology. One can also replace presheaves of sets by presheaves of simplicial sets (or of infinity groupoids). Then, in the presence of an appropriate homotopic machinery, one can develop a notion of derived stack as such a presheaf on the infinity category of derived affine schemes satisfying a certain infinity-categorical version of a sheaf axiom (and, to be algebraic, inductively a sequence of representability conditions). Quillen model categories, Segal categories and quasicategories are some of the tools most often used to formalize this, yielding "derived algebraic geometry", introduced by the school of Carlos Simpson, including André Hirschowitz, Bertrand Toën, Gabriele Vezzosi, Michel Vaquié and others, and developed further by Jacob Lurie, Bertrand Toën, and Gabriele Vezzosi. Another (noncommutative) version of derived algebraic geometry, using A-infinity categories, has been developed from the early 1990s by Maxim Kontsevich and followers.
History.
Before the 16th century.
Some of the roots of algebraic geometry date back to the work of the Hellenistic Greeks from the 5th century BC. The Delian problem, for instance, was to construct a length "x" so that the cube of side "x" contained the same volume as the rectangular box "a"²"b" for given sides "a" and "b". Menaechmus considered the problem geometrically by intersecting the pair of plane conics "ay" = "x"² and "xy" = "ab". In the 3rd century BC, Archimedes and Apollonius systematically studied additional problems on conic sections using coordinates. Apollonius in the "Conics" further developed a method so similar to analytic geometry that his work is sometimes thought to have anticipated the work of Descartes by some 1800 years. His application of reference lines, a diameter and a tangent is essentially no different from our modern use of a coordinate frame, where the distances measured along the diameter from the point of tangency are the abscissas, and the segments parallel to the tangent and intercepted between the axis and the curve are the ordinates. He further developed relations between the abscissas and the corresponding ordinates using geometric methods, such as parabolas and other curves. Medieval mathematicians, including Omar Khayyam, Leonardo of Pisa, Gersonides and Nicole Oresme, solved certain cubic and quadratic equations by purely algebraic means and then interpreted the results geometrically. The Persian mathematician Omar Khayyám (born 1048 AD) believed that there was a relationship between arithmetic, algebra and geometry. This claim was criticized by Jeffrey Oaks, who argues that the study of curves by means of equations originated with Descartes in the seventeenth century.
Renaissance.
Such techniques of applying geometrical constructions to algebraic problems were also adopted by a number of Renaissance mathematicians such as Gerolamo Cardano and Niccolò Fontana "Tartaglia" on their studies of the cubic equation. The geometrical approach to construction problems, rather than the algebraic one, was favored by most 16th and 17th century mathematicians, notably Blaise Pascal who argued against the use of algebraic and analytical methods in geometry. The French mathematicians Franciscus Vieta and later René Descartes and Pierre de Fermat revolutionized the conventional way of thinking about construction problems through the introduction of coordinate geometry. They were interested primarily in the properties of "algebraic curves", such as those defined by Diophantine equations (in the case of Fermat), and the algebraic reformulation of the classical Greek works on conics and cubics (in the case of Descartes).
During the same period, Blaise Pascal and Gérard Desargues approached geometry from a different perspective, developing the synthetic notions of projective geometry. Pascal and Desargues also studied curves, but from the purely geometrical point of view: the analog of the Greek "ruler and compass construction". Ultimately, the analytic geometry of Descartes and Fermat won out, for it supplied the 18th century mathematicians with concrete quantitative tools needed to study physical problems using the new calculus of Newton and Leibniz. However, by the end of the 18th century, most of the algebraic character of coordinate geometry was subsumed by the "calculus of infinitesimals" of Lagrange and Euler.
19th and early 20th century.
It took the simultaneous 19th century developments of non-Euclidean geometry and Abelian integrals in order to bring the old algebraic ideas back into the geometrical fold. The first of these new developments was seized upon by Edmond Laguerre and Arthur Cayley, who attempted to ascertain the generalized metric properties of projective space. Cayley introduced the idea of "homogeneous polynomial forms", and more specifically quadratic forms, on projective space. Subsequently, Felix Klein studied projective geometry (along with other types of geometry) from the viewpoint that the geometry on a space is encoded in a certain class of transformations on the space. By the end of the 19th century, projective geometers were studying more general kinds of transformations on figures in projective space. Rather than the projective linear transformations which were normally regarded as giving the fundamental Kleinian geometry on projective space, they concerned themselves also with the higher degree birational transformations. This weaker notion of congruence would later lead members of the 20th century Italian school of algebraic geometry to classify algebraic surfaces up to birational isomorphism.
The second early 19th century development, that of Abelian integrals, would lead Bernhard Riemann to the development of Riemann surfaces.
In the same period began the algebraization of the algebraic geometry through commutative algebra. The prominent results in this direction are Hilbert's basis theorem and Hilbert's Nullstellensatz, which are the basis of the connection between algebraic geometry and commutative algebra, and Macaulay's multivariate resultant, which is the basis of elimination theory. Probably because of the size of the computation which is implied by multivariate resultants, elimination theory was forgotten during the middle of the 20th century until it was renewed by singularity theory and computational algebraic geometry.
20th century.
B. L. van der Waerden, Oscar Zariski and André Weil developed a foundation for algebraic geometry based on contemporary commutative algebra, including valuation theory and the theory of ideals. One of the goals was to give a rigorous framework for proving the results of the Italian school of algebraic geometry. In particular, this school used systematically the notion of generic point without any precise definition, which was first given by these authors during the 1930s.
In the 1950s and 1960s, Jean-Pierre Serre and Alexander Grothendieck recast the foundations making use of sheaf theory. Later, from about 1960, and largely led by Grothendieck, the idea of schemes was worked out, in conjunction with a very refined apparatus of homological techniques. After a decade of rapid development the field stabilized in the 1970s, and new applications were made, both to number theory and to more classical geometric questions on algebraic varieties, singularities, moduli, and formal moduli.
An important class of varieties, not easily understood directly from their defining equations, are the abelian varieties, which are the projective varieties whose points form an abelian group. The prototypical examples are the elliptic curves, which have a rich theory. They were instrumental in the proof of Fermat's Last Theorem and are also used in elliptic-curve cryptography.
In parallel with the abstract trend of the algebraic geometry, which is concerned with general statements about varieties, methods for effective computation with concretely-given varieties have also been developed, which lead to the new area of computational algebraic geometry. One of the founding methods of this area is the theory of Gröbner bases, introduced by Bruno Buchberger in 1965. Another founding method, more specially devoted to real algebraic geometry, is the cylindrical algebraic decomposition, introduced by George E. Collins in 1973.
See also: derived algebraic geometry.
Analytic geometry.
An analytic variety is defined locally as the set of common solutions of several equations involving analytic functions. It is analogous to the concept of a real or complex algebraic variety. Any complex manifold is an analytic variety. Since analytic varieties may have singular points, not all analytic varieties are manifolds.
Modern analytic geometry is essentially equivalent to real and complex algebraic geometry, as has been shown by Jean-Pierre Serre in his paper "GAGA", the name of which is French for "Algebraic geometry and analytic geometry". Nevertheless, the two fields remain distinct, as the methods of proof are quite different and algebraic geometry includes also geometry in finite characteristic.
Applications.
Algebraic geometry now finds applications in statistics, control theory, robotics, error-correcting codes, phylogenetics and geometric modelling. There are also connections to string theory, game theory, graph matchings, solitons and integer programming.
Austin, Texas

Austin is the capital city of the U.S. state of Texas, as well as the seat and most populous city of Travis County, with portions extending into Hays and Williamson counties. Incorporated on December 27, 1839, it is the 10th-most-populous city in the United States, the fourth-most-populous city in Texas, and the second-most-populous state capital city. It has been one of the fastest growing large cities in the United States since 2010. Downtown Austin and Downtown San Antonio are approximately apart, and both fall along the Interstate 35 corridor. Austin is the southernmost state capital in the contiguous United States and is considered a Beta-level global city as categorized by the Globalization and World Cities Research Network.
As of 2021, Austin had an estimated population of 964,177, up from 961,855 at the 2020 census. The city is the cultural and economic center of the metropolitan statistical area, which had an estimated population of 2,421,115 as of July 1, 2022. Located within the greater Texas Hill Country, it is home to numerous lakes, rivers, and waterways, including Lady Bird Lake and Lake Travis on the Colorado River, Barton Springs, McKinney Falls, and Lake Walter E. Long.
Residents of Austin are known as Austinites. They include a diverse mix of government employees, college students, musicians, high-tech workers, and blue-collar workers. The city's official slogan promotes Austin as "The Live Music Capital of the World", a reference to the city's many musicians and live music venues, as well as the long-running PBS TV concert series "Austin City Limits". The city also adopted "Silicon Hills" as a nickname in the 1990s due to a rapid influx of technology and development companies. In recent years, some Austinites have adopted the unofficial slogan "Keep Austin Weird", which refers to the desire to protect small, unique, and local businesses from being overrun by large corporations. Since the late 19th century, Austin has also been known as the "City of the Violet Crown", because of the colorful glow of light across the hills just after sunset.
In 1987, Austin originated and remains the site for South by Southwest (stylized as SXSW and colloquially referred to as "South By"), an annual conglomeration of parallel film, interactive media, and music festivals and conferences that take place in mid-March.
Emerging from a strong economic focus on government and education, since the 1990s, Austin has become a center for technology and business. The technology roots in Austin can be traced back to the 1960s, when Tracor (now BAE Systems), a major defense electronics contractor, began operation in the city in 1962. IBM followed in 1967, opening a facility to produce its Selectric typewriters. Texas Instruments set up in Austin two years later, and Motorola (now NXP Semiconductors) started semiconductor chip manufacturing in 1974. BAE Systems, IBM, and NXP Semiconductors still have campuses and manufacturing operations in Austin as of 2022. A number of Fortune 500 companies have headquarters or regional offices in Austin, including 3M, Advanced Micro Devices (AMD), Amazon, Apple, Facebook (Meta), Google, IBM, Intel, NXP Semiconductors, Oracle, Tesla, Texas Instruments, and Whole Foods Market. Dell's worldwide headquarters is located in the nearby suburb of Round Rock. With regard to education, Austin is the home of the University of Texas at Austin, which is one of the largest universities in the U.S., with over 50,000 students. In 2021, Austin became home to Austin FC, the first (and currently only) major professional sports team in the city.
History.
Austin, Travis County and Williamson County have been the site of human habitation since at least 9200 BC. The area's earliest known inhabitants lived during the late Pleistocene (Ice Age) and are linked to the Clovis culture around 9200 BC (over 11,200 years ago), based on evidence found throughout the area and documented at the much-studied Gault Site, midway between Georgetown and Fort Cavazos.
When settlers arrived from Europe, the Tonkawa tribe inhabited the area. The Comanches and Lipan Apaches were also known to travel through the area. Spanish colonists, including the Espinosa-Olivares-Aguirre expedition, traveled through the area, though few permanent settlements were created for some time. In 1730, three Catholic missions from East Texas were combined and reestablished as one mission on the south side of the Colorado River, in what is now Zilker Park, in Austin. The mission was in this area for only about seven months, and then was moved to San Antonio de Béxar and split into three missions.
During the 1830s, pioneers began to settle the area in central Austin along the Colorado River. Spanish forts were established in what are now Bastrop and San Marcos. Following Mexico's independence, new settlements were established in Central Texas, but growth in the region was stagnant because of conflicts with the regional Native Americans.
In 1835–1836, Texans fought and won independence from Mexico. Texas thus became an independent country with its own president, congress, and monetary system. In 1839, the Texas Congress formed a commission to seek a site for a new capital of the Republic of Texas to replace Houston. When he was Vice President of Texas, Mirabeau B. Lamar had visited the area during a buffalo-hunting expedition between 1837 and 1838. He advised the commissioners to consider the area on the north bank of the Colorado River (near the present-day Congress Avenue Bridge), noting the area's hills, waterways, and pleasant surroundings. It was seen as a convenient crossroads for trade routes between Santa Fe and Galveston Bay, as well as routes between northern Mexico and the Red River. In 1839, the site was chosen, and was briefly incorporated under the name "Waterloo". Shortly afterward, the name was changed to Austin in honor of Stephen F. Austin, the "Father of Texas" and the republic's first secretary of state.
The city grew throughout the 19th century and became a center for government and education with the construction of the Texas State Capitol and the University of Texas at Austin.
Edwin Waller was picked by Lamar to survey the village and draft a plan laying out the new capital. The original site was narrowed to that fronted the Colorado River between two creeks, Shoal Creek and Waller Creek, which was later named in his honor. Waller and a team of surveyors developed Austin's first city plan, commonly known as the Waller Plan, dividing the site into a 14-block grid plan bisected by a broad north–south thoroughfare, Congress Avenue, running up from the river to Capital Square, where the new Texas State Capitol was to be constructed. A temporary one-story capitol was erected on the corner of Colorado and 8th Streets. On August 1, 1839, the first auction of 217 out of 306 lots total was held. The Waller Plan designed and surveyed now forms the basis of downtown Austin.
In 1840, a series of conflicts between the Texas Rangers and the Comanches, known as the Council House Fight and the Battle of Plum Creek, pushed the Comanches westward, mostly ending conflicts in Central Texas. Settlement in the area began to expand quickly. Travis County was established in 1840, and the surrounding counties were mostly established within the next two decades.
Initially, the new capital thrived, but Lamar's political enemy, Sam Houston, used two Mexican army incursions to San Antonio as an excuse to move the government. Sam Houston fought bitterly against Lamar's decision to establish the capital in such a remote wilderness. The men and women who traveled mainly from Houston to conduct government business were intensely disappointed as well. By 1840, the population had risen to 856, nearly half of whom fled Austin when Congress recessed. The resident African American population listed in January of this same year was 176. Austin's proximity to the Indians and to Mexico, which still considered Texas part of its territory, gave Sam Houston, the first and third President of the Republic of Texas, an immense motive to relocate the capital once again in 1841. Upon threats of Mexican troops in Texas, Houston raided the Land Office to transfer all official documents to Houston for safekeeping in what was later known as the Archive War, but the people of Austin would not allow this unaccompanied decision to be executed. The documents stayed, but the capital would temporarily move from Austin to Houston to Washington-on-the-Brazos. Without the governmental body, Austin's population declined to a low of only a few hundred people throughout the early 1840s. In 1845, the fourth President of the Republic, Anson Jones, and Congress, which had reconvened in Austin, voted to keep Austin the seat of government and to annex the Republic of Texas into the United States.
In 1860, 38% of Travis County residents were slaves. In 1861, with the outbreak of the American Civil War, voters in Austin and other Central Texas communities voted against secession. However, as the war progressed and fears of attack by Union forces increased, Austin contributed hundreds of men to the Confederate forces. The African American population of Austin swelled dramatically after the enforcement of the Emancipation Proclamation in Texas by Union General Gordon Granger at Galveston, in an event commemorated as Juneteenth. Black communities such as Wheatville, Pleasant Hill, and Clarksville were established, with Clarksville being the oldest surviving freedomtown ‒ the original post-Civil War settlements founded by former African-American slaves ‒ west of the Mississippi River. In 1870, blacks made up 36.5% of Austin's population.
The postwar period saw dramatic population and economic growth. The opening of the Houston and Texas Central Railway (H&TC) in 1871 turned Austin into the major trading center for the region, with the ability to transport both cotton and cattle. The Missouri, Kansas & Texas (MKT) line followed close behind. Austin was also the terminus of the southernmost leg of the Chisholm Trail, and "drovers" pushed cattle north to the railroad. Cotton was one of the few crops produced locally for export, and a cotton gin engine was located downtown near the trains for "ginning" cotton of its seeds and turning the product into bales for shipment. However, as other new railroads were built through the region in the 1870s, Austin began to lose its primacy in trade to the surrounding communities. In addition, the areas east of Austin took over cattle and cotton production from Austin, especially in towns like Hutto and Taylor that sit over the blackland prairie, with its deep, rich soils for producing cotton and hay.
In September 1881, Austin public schools held their first classes. The same year, Tillotson Collegiate and Normal Institute (now part of Huston–Tillotson University) opened its doors. The University of Texas held its first classes in 1883, although classes had been held in the original wooden state capitol for four years before.
During the 1880s, Austin gained new prominence as the state capitol building was completed in 1888 and claimed as the seventh largest building in the world. In the late 19th century, Austin expanded its city limits to more than three times its former area, and the first granite dam was built on the Colorado River to power a new street car line and the new "moon towers". The first dam washed away in a flood on April 7, 1900.
In the late 1920s and 1930s, Austin implemented the 1928 Austin city plan through a series of civic development and beautification projects that created much of the city's infrastructure and many of its parks. In addition, the state legislature established the Lower Colorado River Authority (LCRA) that, along with the city of Austin, created the system of dams along the Colorado River to form the Highland Lakes. These projects were enabled in large part because the Public Works Administration provided Austin with greater funding for municipal construction projects than other Texas cities.
During the early twentieth century, a three-way system of social segregation emerged in Austin, with Anglos, African Americans and Mexicans being separated by custom or law in most aspects of life, including housing, health care, and education. Many of the municipal improvement programs initiated during this period—such as the construction of new roads, schools, and hospitals—were deliberately designed to institutionalize this system of segregation. Deed restrictions also played an important role in residential segregation. After 1935 most housing deeds prohibited African Americans (and sometimes other nonwhite groups) from using land. Combined with the system of segregated public services, racial segregation increased in Austin during the first half of the twentieth century, with African Americans and Mexicans experiencing high levels of discrimination and social marginalization.
In 1940, the destroyed granite dam on the Colorado River was finally replaced by a hollow concrete dam that formed Lake McDonald (now called Lake Austin) and which has withstood all floods since. In addition, the much larger Mansfield Dam was built by the LCRA upstream of Austin to form Lake Travis, a flood-control reservoir.
In the early 20th century, the Texas Oil Boom took hold, creating tremendous economic opportunities in Southeast Texas and North Texas. The growth generated by this boom largely passed by Austin at first, with the city slipping from fourth largest to tenth largest in Texas between 1880 and 1920.
After a severe lull in economic growth from the Great Depression, Austin resumed its steady development. Following the mid-20th century, Austin became established as one of Texas' major metropolitan centers. In 1970, the U.S. Census Bureau reported Austin's population as 14.5% Hispanic, 11.9% black, and 73.4% non-Hispanic white. In the late 20th century, Austin emerged as an important high tech center for semiconductors and software. The University of Texas at Austin emerged as a major university.
The 1970s saw Austin's emergence in the national music scene, with local artists such as Willie Nelson, Asleep at the Wheel, and Stevie Ray Vaughan and iconic music venues such as the Armadillo World Headquarters. Over time, the long-running television program "Austin City Limits", its namesake Austin City Limits Festival, and the South by Southwest music festival solidified the city's place in the music industry.
Geography.
Austin, the southernmost state capital of the contiguous 48 states, is located in Central Texas on the Colorado River. Austin is northwest of Houston, south of Dallas and northeast of San Antonio.
Austin occupies a total area of . Approximately of this area is water. Austin is situated at the foot of the Balcones Escarpment, on the Colorado River, with three artificial lakes within the city limits: Lady Bird Lake (formerly known as Town Lake), Lake Austin (both created by dams along the Colorado River), and Lake Walter E. Long that is partly used for cooling water for the Decker Power Plant. Mansfield Dam and the foot of Lake Travis are located within the city's limits. Lady Bird Lake, Lake Austin, and Lake Travis are each on the Colorado River.
The elevation of Austin varies from to approximately above sea level. Because it straddles the Balcones Fault, much of the eastern part of the city is flat, with heavy clay and loam soils, whereas the western part and western suburbs consist of rolling hills on the edge of the Texas Hill Country. Because the hills to the west are primarily limestone rock with a thin covering of topsoil, portions of the city are frequently subjected to flash floods from the runoff caused by thunderstorms. To help control this runoff and to generate hydroelectric power, the Lower Colorado River Authority operates a series of dams that form the Texas Highland Lakes. The lakes also provide venues for boating, swimming, and other forms of recreation within several parks on the lake shores.
Austin is located at the intersection of four major ecological regions, and is consequently a temperate-to-hot green oasis with a highly variable climate having some characteristics of the desert, the tropics, and a wetter climate. The area is very diverse ecologically and biologically, and is home to a variety of animals and plants. Notably, the area is home to many types of wildflowers that blossom throughout the year but especially in the spring. This includes the popular bluebonnets, some planted by "Lady Bird" Johnson, wife of former President Lyndon B. Johnson.
The soils of Austin range from shallow, gravelly clay loams over limestone in the western outskirts to deep, fine sandy loams, silty clay loams, silty clays or clays in the city's eastern part. Some of the clays have pronounced shrink-swell properties and are difficult to work under most moisture conditions. Many of Austin's soils, especially the clay-rich types, are slightly to moderately alkaline and have free calcium carbonate.
Cityscape.
Austin's skyline historically was modest, dominated by the Texas State Capitol and the University of Texas Main Building. However, since the 2000s, many new high-rise towers have been constructed. Austin is currently undergoing a skyscraper boom, which includes recent construction on new office, hotel and residential buildings. Downtown's buildings are somewhat spread out, partly due to a set of zoning restrictions that preserve the view of the Texas State Capitol from various locations around Austin, known as the Capitol View Corridors.
At night, parts of Austin are lit by "artificial moonlight" from moonlight towers built to illuminate the central part of the city. The moonlight towers were built in the late 19th century and are now recognized as historic landmarks. Only 15 of the original 31 towers remain standing in Austin, but none survive in any of the other cities where they were installed. The towers are featured in the 1993 film "Dazed and Confused".
Downtown.
The central business district of Austin is home to the tallest condo towers in the state, with The Independent (58 stories and tall) and The Austonian (topping out at 56 floors and tall). The Independent became the tallest all-residential building in the U.S. west of Chicago when it topped out in 2018. In 2005, then-Mayor Will Wynn set a goal of having 25,000 people living downtown by 2015. Although downtown's growth did not meet this goal, its residential population surged from an estimated 5,000 in 2005 to 12,000 in 2015. The skyline has changed drastically in recent years, and the residential real estate market has remained relatively strong. , there were 31 high-rise projects either under construction, approved, or planned for completion in Austin's downtown core between 2017 and 2020. Sixteen of those were set to rise above tall, including four above 600', and eight above 500'. An additional 15 towers were slated to stand between 300' and 399' tall.
Climate.
Austin is located within the middle of a unique, narrow transitional zone between the dry deserts of the American Southwest and the lush, green, more humid regions of the American Southeast. Its climate, topography, and vegetation share characteristics of both. Officially, Austin has a humid subtropical climate ("Cfa") under the Köppen climate classification. This climate is typified by long, very hot summers, short, mild winters, and warm to hot spring and fall seasons in-between. Austin averages of annual rainfall distributed mostly evenly throughout the year, though spring and fall are the wettest seasons. Sunshine is common during all seasons, with 2,650 hours, or 60.3% of the possible total, of bright sunshine per year.
Summers in Austin are very hot, with average July and August highs frequently reaching the high 90s °F (34–36 °C) or above. Highs reach on 123 days per year, of which 29 days reach ; all years in the 1991–2020 period recorded at least 1 day of the latter. The average daytime high is or warmer between March 1 and November 21, rising to or warmer between April 14 and October 24, and reaching or warmer between May 30 and September 18. The highest temperature ever recorded occurred on September 5, 2000, and again on August 28, 2011. An uncommon characteristic of Austin's climate is its highly variable humidity, which fluctuates frequently depending on the shifting patterns of air flow and wind direction. It is common for a lengthy series of warm, dry, low-humidity days to be occasionally interrupted by very warm and humid days, and vice versa. Humidity rises with winds from the east or southeast, when the air drifts inland from the Gulf of Mexico, but decreases significantly with winds from the west or southwest, bringing air flowing from Chihuahuan Desert areas of West Texas or northern Mexico.
Winters in Austin are mild, although occasional short-lived bursts of cold weather known as "Blue Northers" can occur. January is the coolest month with an average daytime high of . The overnight low drops to or below freezing 12 times per year, and sinks below during 76 evenings per year, mostly between mid-December and mid-February. The average first and last dates for a freeze are December 1 and February 15, giving Austin an average growing season of 288 days, and the coldest temperature of the year is normally about under the 1991–2020 climate normals, putting Austin in USDA zone 9a.
Conversely, winter months also produce warm days on a regular basis. On average, 10 days in January reach or exceed and 1 day reaches ; during the 1991–2020 period, all Januarys had at least 1 day with a high of or more, and most (60%) had at least 1 day with a high of or more. The lowest temperature ever recorded in the city occurred on January 31, 1949. Roughly every two years Austin experiences an ice storm that freezes roads over and cripples travel in the city for 24 to 48 hours. When Austin received of ice on January 24, 2014, there were 278 vehicular collisions. Similarly, snowfall is rare in Austin. A snow event of on February 4, 2011, caused more than 300 car crashes. The most recent major snow event occurred February 14–15, 2021, when of snow fell at Austin's Camp Mabry, the largest two-day snowfall since records began in 1948.
Typical of Central Texas, severe weather is a threat that can strike Austin during any season, though it is most common in the spring. According to most classifications, Austin lies within the extreme southern periphery of Tornado Alley, although many sources place it outside of Tornado Alley altogether. Consequently, tornadoes strike Austin less frequently than areas farther to the north. However, severe weather and supercell thunderstorms can occur multiple times per year, bringing damaging winds, lightning, heavy rain, and occasional flash flooding to the city. The deadliest storm ever to strike within the city limits was the twin tornado event of May 4, 1922, while the deadliest tornado outbreak ever to strike the metro area was the Central Texas tornado outbreak of May 27, 1997.
Natural disasters.
2011 drought.
From October 2010 through September 2011, both major reporting stations in Austin, Camp Mabry and Bergstrom Int'l, had the least rainfall of a water year on record, receiving less than a third of normal precipitation. This was a result of La Niña conditions in the eastern Pacific Ocean where water was significantly cooler than normal. David Brown, a regional official with the National Oceanic and Atmospheric Administration, explained that "these kinds of droughts will have effects that are even more extreme in the future, given a warming and drying regional climate." The drought, coupled with exceedingly high temperatures throughout the summer of 2011, caused many wildfires throughout Texas, including notably the Bastrop County Complex Fire in neighboring Bastrop, Texas.
2018 flooding and water crisis.
In the fall of 2018, Austin and surrounding areas received heavy rainfall and flash flooding following Hurricane Sergio. The Lower Colorado River Authority opened four floodgates of the Mansfield Dam after Lake Travis was recorded at 146% full at . From October 22 to October 29, 2018, the City of Austin issued a mandatory citywide boil-water advisory after the Highland Lakes, home to the city's main water supply, became overwhelmed by unprecedented amounts of silt, dirt, and debris that had washed in from the Llano River. Austin Water, the city's water utility, has the capacity to process up to 300 million gallons of water per day; however, the elevated level of turbidity reduced output to only 105 million gallons per day. Since Austin residents consumed an average of 120 million gallons of water per day, the infrastructure was not able to keep up with demand.
2021 winter storm.
In February 2021, Winter Storm Uri dropped prolific amounts of snow across Texas and Oklahoma, including Austin. The Austin area received a total of of snowfall between February 14 and 15, with snow cover persisting until February 20.
This marked the longest time the area had had more than of snow, with the previous longest time being three days in January 1985.
Lack of winterization in natural gas power plants, which supply a large amount of power to the Texas grid, and increased energy demand caused ERCOT and Austin Energy to enact rolling blackouts between February 15 and February 18 in order to avoid total grid collapse. Initial rolling blackouts were intended to last a maximum of 40 minutes, but insufficient energy production caused many to last much longer; at the peak of the blackouts, an estimated 40% of Austin Energy homes were without power.
Starting on February 15, Austin Water received reports of pipe breaks, and hourly water demand increased from 150 million gallons per day (MGD) on February 15 to a peak of 260 MGD on February 16. On the morning of February 17, demand increased to 330 MGD; the resulting drop in water pressure caused the Austin area to enter a boil-water advisory, which lasted until water pressure was restored on February 23.
2023 winter storm.
Beginning January 30, 2023, the City of Austin experienced a winter freeze that left 170,000 Austin Energy customers without electricity or heat for several days. The slow pace of repairs and the lack of public information from city officials frustrated many residents. A week after the freeze, as Austin City Council members were proposing to evaluate his employment, City Manager Spencer Cronk apologized. On February 16, 2023, Cronk was fired by the Austin City Council over the city's response to the winter storm, and former City Manager Jesus Garcia was named interim city manager.
Demographics.
In 2020, there were 961,855 people, up from the 2000 United States census tabulation where there were people, households, and families residing in the city. In 2000, the population density was . There were dwelling units at an average density of . There were households, out of which 26.8% had children under the age of 18 living with them, 38.1% were married couples living together, 10.8% had a female householder with no husband present, and 46.7% were non-families. 32.8% of all households were made up of individuals, and 4.6% had someone living alone who was 65 years of age or older. The average household size was 2.40 and the average family size was 3.14.
In the city, 22.5% of the population were under the age of 18, 16.6% were from 18 to 24, 37.1% from 25 to 44, 17.1% from 45 to 64, and 6.7% were 65 years of age or older. The median age was 30 years. For every 100 females, there were 105.8 males.
The median income for a household in the city was , and the median income for a family was $. Males had a median income of $ compared to $ for females. The per capita income for the city was $. About 9.1% of families and 14.4% of the population were below the poverty line, including 16.5% of those under age 18 and 8.7% of those age 65 or over. The median house price was $ in 2009, and it has increased every year since 2004. The median value of a house which the owner occupies was $318,400 in 2019—higher than the average American home value of $240,500.
Race and ethnicity.
According to the 2010 United States census, the racial composition of Austin was 68.3% White (48.7% non-Hispanic whites), 35.1% Hispanic or Latino (29.1% Mexican, 0.5% Puerto Rican, 0.4% Cuban, 5.1% Other), 8.1% African American, 6.3% Asian (1.9% Indian, 1.5% Chinese, 1.0% Vietnamese, 0.7% Korean, 0.3% Filipino, 0.2% Japanese, 0.8% Other), 0.9% American Indian, 0.1% Native Hawaiian and Other Pacific Islander, and 3.4% two or more races.
According to the 2020 United States census, the racial composition of Austin was 72.6% White (48.3% non-Hispanic whites), 33.9% Hispanic or Latino, 7.8% African American, 7.6% Asian, 0.7% American Indian, 0.1% Native Hawaiian and Other Pacific Islander, and 3.4% two or more races.
A 2014 University of Texas study stated that Austin was the only fast-growing U.S. city between 2000 and 2010 to experience a net loss of African Americans. , Austin's African American and non-Hispanic white shares of the total population were declining despite the actual numbers of both groups increasing, as the rapid growth of the Hispanic or Latino and Asian populations has outpaced all other ethnic groups in the city. Austin's non-Hispanic white population first dropped below 50% in 2005.
Sexual orientation and gender identity.
According to a survey completed in 2014 by Gallup, it is estimated that 5.3% of residents in the Austin metropolitan area identify as lesbian, gay, bisexual, or transgender. The Austin metropolitan area had the third-highest rate in the nation.
Religion.
According to Sperling's BestPlaces, 52.4% of Austin's population are religious. The majority of Austinites identified themselves as Christians, about 25.2% of whom claimed affiliation with the Catholic Church. The city's Catholic population is served by the Roman Catholic Diocese of Austin, headquartered at the Cathedral of Saint Mary. Nationwide, 23% of Americans identified as Catholic in 2016. Other significant Christian groups in Austin include Baptists (8.7%), followed by Methodists (4.3%), Latter-day Saints (1.5%), Episcopalians or Anglicans (1.0%), Lutherans (0.8%), Presbyterians (0.6%), Pentecostals (0.3%), and other Christians such as the Disciples of Christ and Eastern Orthodox Church (7.1%). The second largest religion Austinites identify with is Islam (1.7%); roughly 0.8% of Americans nationwide claimed affiliation with the Islamic faith. The dominant branch of Islam is Sunni Islam. Established in 1977, the largest mosque in Austin is the Islamic Center of Greater Austin. The community is affiliated with the Islamic Society of North America. The same study says that eastern faiths including Buddhism, Sikhism, and Hinduism made up 0.9% of the city's religious population. Several Hindu temples exist in the Austin Metropolitan area with the most notable one being Radha Madhav Dham. Judaism forms less than 0.1% of the religious demographic in Austin. Orthodox, Reform, and Conservative congregations are present in the community. In addition to those religious groups, Austin is also home to an active secular humanist community, hosting nationwide television shows and charity work.
Homelessness.
As of 2019, there were 2,255 individuals experiencing homelessness in Travis County. Of those, 1,169 were sheltered and 1,086 were unsheltered. In September 2019, the Austin City Council approved $62.7 million for programs aimed at homelessness, which includes housing displacement prevention, crisis mitigation, and affordable housing; the city council also earmarked $500,000 for crisis services and encampment cleanups.
In June 2019, following a federal court ruling on homeless people sleeping in public, the Austin City Council lifted a 25-year-old ban on camping, sitting, or lying down in public unless doing so causes an obstruction. The resolution also included the approval of a new housing-focused shelter in South Austin. In early October 2019, Texas Governor Greg Abbott sent a letter to Mayor Steve Adler threatening to deploy state resources to combat the camping ban repeal. On October 17, 2019, the City Council revised the camping ordinance, which imposed increased restrictions on sidewalk camping. In November 2019, the State of Texas opened a temporary homeless encampment on a former vehicle storage yard owned by the Texas Department of Transportation.
In May 2021, the camping ban was reinstated after a ballot proposition was approved by 57% of voters. The ban introduces penalties for camping, sitting, or lying down on a public sidewalk or sleeping outdoors in or near Downtown Austin or the area around the University of Texas campus. The ordinance also prohibits solicitation at certain locations.
Economy.
The Greater Austin metropolitan statistical area had a gross domestic product (GDP) of $86 billion in 2010. Austin is considered to be a major center for high tech. Thousands of graduates each year from the engineering and computer science programs at the University of Texas at Austin provide a steady source of employees that help to fuel Austin's technology and defense industry sectors. The region's rapid growth has led "Forbes" to rank the Austin metropolitan area number one among all big cities for jobs for 2012 in their annual survey and WSJ Marketwatch to rank the area number one for growing businesses. As a result of the high concentration of high-tech companies in the region, Austin was strongly affected by the dot-com boom in the late 1990s and subsequent bust. Austin's largest employers include the Austin Independent School District, the City of Austin, Dell, the U.S. Federal Government, NXP Semiconductors, IBM, St. David's Healthcare Partnership, Seton Family of Hospitals, the State of Texas, Texas State University, and the University of Texas at Austin.
Other high-tech companies with operations in Austin include 3M, Apple, Amazon, AMD, Apartment Ratings, Applied Materials, Arm Holdings, Bigcommerce, BioWare, Blizzard Entertainment, Buffalo Technology, Cirrus Logic, Cisco Systems, Dropbox, eBay, Electronic Arts, Flextronics, Facebook, Google, Hewlett-Packard, Hoover's, HomeAway, HostGator, Intel Corporation, National Instruments, Nintendo, Nvidia, Oracle, PayPal, Polycom, Qualcomm, Rackspace, RetailMeNot, Rooster Teeth, Samsung Group, Silicon Labs, Spansion, Tesla, United Devices, VMware, Xerox, and Zoho Corporation. In 2010, Facebook accepted a grant to build a downtown office that could bring as many as 200 jobs to the city. The proliferation of technology companies has led to the region's nickname, "Silicon Hills", and spurred development that greatly expanded the city.
Austin is also emerging as a hub for pharmaceutical and biotechnology companies; the city is home to about 85 of them. In 2004, the city was ranked by the Milken Institute as the No. 12 biotech and life science center in the United States and in 2018, CBRE Group ranked Austin as #3 emerging life sciences cluster. Companies such as Hospira, Pharmaceutical Product Development, and ArthroCare Corporation are located there.
Whole Foods Market, an international grocery store chain specializing in fresh and packaged food products, was founded and is headquartered in Austin.
Other companies based in Austin include NXP Semiconductors, GoodPop, Temple-Inland, Sweet Leaf Tea Company, Keller Williams Realty, National Western Life, GSD&M, Dimensional Fund Advisors, Golfsmith, Forestar Group, EZCorp, Outdoor Voices, Tito's Vodka, Indeed, Speak Social, and YETI.
In 2018, Austin metro-area companies saw a total of $1.33 billion invested. In 2018, Austin's venture capital investments accounted for more than 60 percent of Texas' total investments.
Culture and contemporary life.
"Keep Austin Weird" has been a local motto for years, featured on bumper stickers and T-shirts. This motto has not only been used in promoting Austin's eccentricity and diversity, but is also meant to bolster support of local independent businesses. According to the 2010 book "Weird City" the phrase was begun by a local Austin Community College librarian, Red Wassenich, and his wife, Karen Pavelka, who were concerned about Austin's "rapid descent into commercialism and overdevelopment." The slogan has been interpreted many ways since its inception, but remains an important symbol for many Austinites who wish to voice concerns over rapid growth and development. Austin has a long history of vocal citizen resistance to development projects perceived to degrade the environment, or to threaten the natural and cultural landscapes.
According to the Nielsen Company, adults in Austin read and contribute to blogs more than those in any other U.S. metropolitan area. Austin residents have the highest Internet usage in all of Texas. In 2013, Austin was the most active city on Reddit, having the largest number of views per capita. Austin was selected as the No. 2 Best Big City in "Best Places to Live" by "Money" magazine in 2006, and No. 3 in 2009, and also the "Greenest City in America" by MSN.
South Congress is a shopping district stretching down South Congress Avenue from Downtown. This area is home to coffee shops, eccentric stores, restaurants, food trucks, trailers, and festivals. It prides itself on "Keeping Austin Weird," especially amid development in the surrounding areas. Many Austinites attribute its enduring popularity to the unobstructed view of the Texas State Capitol.
The Rainey Street Historic District is a neighborhood in Downtown Austin formerly consisting of bungalow style homes built in the early 20th century. Since the early 2010s, the former working class residential street has turned into a popular nightlife district. Many of the historic homes have been renovated into hotels, condominiums, bars, and restaurants, many of which feature large porches and outdoor yards for patrons. The Rainey Street district is also home to the Emma S. Barrientos Mexican American Cultural Center.
Austin has been part of the UNESCO Creative Cities Network under the Media Arts category.
Old Austin.
"Old Austin" is an adage often used by nostalgic natives. The term "Old Austin" refers to a time when the city was smaller and more bohemian with a considerably lower cost of living and better known for its lack of traffic, hipsters, and urban sprawl. It is often employed by longtime residents expressing displeasure at the rapidly changing culture, or when referencing nostalgia of Austin culture.
The growth and popularity of Austin can be seen by the expansive development taking place in its downtown landscape. This growth can have a negative impact on longtime small businesses that cannot keep up with the expenses associated with gentrification and the rising cost of real estate. A former Austin musician, Dale Watson, described his move away from Austin, "I just really feel the city has sold itself. Just because you're going to get $45 million for a company to come to town – if it's not in the best interest of the town, I don't think they should do it. This city was never about money. It was about quality of life." Though much is changing rapidly in Austin, businesses such as Thundercloud Subs are thought by many to maintain classic Austin business cultural sentiments unique to the history of the city; as Diana Burgess stated, "I definitely appreciate that they haven't raised their prices a ton or made things super fancy. I think it speaks to that original Old Austin vibe. A lot of us that grew up here really appreciate that."
Annual cultural events.
The O. Henry House Museum hosts the annual O. Henry Pun-Off, a pun contest where the successful contestants exhibit wit akin to that of the author William Sydney Porter.
Other annual events include Eeyore's Birthday Party, Spamarama, Austin Pride Festival & Parade in August, the Austin Reggae Festival in April, Kite Festival, Texas Craft Brewers Festival in September, Art City Austin in April, East Austin Studio Tour in November, and Carnaval Brasileiro in February. Sixth Street features annual festivals such as the Pecan Street Festival and Halloween night. The three-day Austin City Limits Music Festival has been held in Zilker Park every year since 2002. Every year around the end of March and the beginning of April, Austin is home to "Texas Relay Weekend."
Austin's Zilker Park Tree is a Christmas display made of lights strung from the top of a moonlight tower in Zilker Park. The Zilker Tree is lit in December along with the "Trail of Lights," an Austin Christmas tradition. The Trail of Lights was canceled four times: in 2001 and 2002 following the September 11 attacks, and again in 2010 and 2011 due to budget shortfalls, but the trail returned for the 2012 holiday season.
Cuisine and breweries.
Austin is perhaps best known for its Texas barbecue and Tex-Mex cuisine. Franklin Barbecue is arguably Austin's most famous barbecue restaurant; the restaurant has sold out of brisket every day since its establishment. Breakfast tacos and queso are popular food items in the city; Austin is sometimes called the "home of the breakfast taco." Kolaches are a common pastry in Austin bakeries due to the large Czech and German immigrant population in Texas. The Oasis Restaurant is the largest outdoor restaurant in Texas, which promotes itself as the "Sunset Capital of Texas" with its terraced views looking west over Lake Travis. P. Terry's, an Austin-based fast food burger chain, has a loyal following among Austinites. Some other Austin-based chain restaurants include Amy's Ice Creams, Bush's Chicken, Chuy's, DoubleDave's Pizzaworks, and Schlotzsky's.
Austin is also home to a large number of food trucks, with 1,256 food trucks operating in 2016. The city of Austin has the second-largest number of food trucks per capita in the United States. Austin's first food hall, "Fareground," features a number of Austin-based food vendors and a bar in the ground level and courtyard of One Congress Plaza.
Austin has a large craft beer scene, with over 50 microbreweries in the metro area. Drinks publication VinePair named Austin as the "top beer destination in the world" in 2019. Notable Austin-area breweries include Jester King Brewery, Live Oak Brewing Company, and Real Ale Brewing Company.
Music.
Austin's official slogan is "The Live Music Capital of the World"; the city has a vibrant live music scene, with more music venues per capita than any other U.S. city. Austin's music revolves around the many nightclubs on 6th Street and an annual film/music/interactive festival known as South by Southwest (SXSW). The concentration of restaurants, bars, and music venues in the city's downtown core is a major contributor to Austin's live music scene, as the ZIP Code encompassing the downtown entertainment district hosts the most bar or alcohol-serving establishments in the U.S.
The longest-running concert music program on American television, "Austin City Limits", is recorded at ACL Live at The Moody Theater, located in the bottom floor of the W Hotels in Austin. "Austin City Limits" and C3 Presents produce the Austin City Limits Music Festival, an annual music and art festival held at Zilker Park in Austin. Other music events include the Urban Music Festival, Fun Fun Fun Fest, Chaos In Tejas and Old Settler's Music Festival. Austin Lyric Opera performs multiple operas each year (including the 2007 opening of Philip Glass's "Waiting for the Barbarians", written by University of Texas at Austin alumnus J. M. Coetzee). The Austin Symphony Orchestra performs a range of classical, pop and family performances and is led by music director and conductor Peter Bay. The Austin Baroque Orchestra and La Follia Austin Baroque ensembles both give historically informed performances of Baroque music. The Texas Early Music Project regularly performs music from the Medieval and Renaissance eras, as well as the Baroque.
Film.
Austin hosts several film festivals, including the SXSW (South by Southwest) Film Festival and the Austin Film Festival, which hosts international films. A movie theater chain by the name of Alamo Drafthouse Cinema was founded in Austin in 1997; the South Lamar location of which is home to the annual week-long Fantastic Fest film festival. In 2004, "MovieMaker Magazine" ranked the city first on its annual list of the top ten cities to live in and make movies.
Austin has been the location for a number of motion pictures, partly due to the influence of The University of Texas at Austin Department of Radio-Television-Film. Films produced in Austin include "The Texas Chain Saw Massacre" (1974), "Songwriter" (1984), "Man of the House", "Secondhand Lions", "Texas Chainsaw Massacre 2", "Nadine", "Waking Life", "Spy Kids", "The Faculty", "Dazed and Confused", "The Guards Themselves", "Wild Texas Wind", "Office Space", "The Life of David Gale", "Miss Congeniality", "Doubting Thomas", "Slacker", "Idiocracy", "Death Proof", "The New Guy", "Hope Floats", "The Alamo", "Blank Check", "The Wendall Baker Story", "School of Rock", "A Slipping-Down Life", "A Scanner Darkly", "Saturday Morning Massacre", and most recently, the Coen brothers' "True Grit", "Grindhouse", "Machete", "How to Eat Fried Worms", "Bandslam" and "Lazer Team". In order to draw future film projects to the area, the Austin Film Society has converted several airplane hangars from the former Mueller Airport into filmmaking center Austin Studios. Projects that have used facilities at Austin Studios include music videos by The Flaming Lips and feature films such as "25th Hour" and "Sin City".
Austin also hosted the MTV series, "" in 2005. Season 4 of the AMC show "Fear the Walking Dead" was filmed in various locations around Austin in 2018. The film review websites Spill.com and Ain't It Cool News are based in Austin. Rooster Teeth Productions, creator of popular web series such as "Red vs. Blue" and "RWBY", is also located in Austin.
Theater.
Austin has a strong theater culture, with dozens of itinerant and resident companies producing a variety of work. The Church of the Friendly Ghost is a volunteer-run arts organization supporting creative expression and counter-culture community. The city also has live performance theater venues such as the Zachary Scott Theatre Center, Vortex Repertory Company, Salvage Vanguard Theater, Rude Mechanicals' the Off Center, Austin Playhouse, Scottish Rite Children's Theater, Hyde Park Theatre, the Blue Theater, The Hideout Theatre, and Esther's Follies. The Victory Grill was a renowned venue on the Chitlin' Circuit. Public art and performances in the parks and on bridges are popular. Austin hosts the Fuse Box Festival each April featuring theater artists.
The Paramount Theatre, opened in downtown Austin in 1915, contributes to Austin's theater and film culture, showing classic films throughout the summer and hosting regional premieres for films such as "Miss Congeniality". The Zilker Park Summer Musical is a long-running outdoor musical.
The Long Center for the Performing Arts is a 2,300-seat theater built partly with materials reused from the old Lester E. Palmer Auditorium.
Ballet Austin is among the fifteen largest ballet academies in the country. Each year Ballet Austin's 20-member professional company performs ballets from a wide variety of choreographers, including their international award-winning artistic director, Stephen Mills. The city is also home to the Ballet East Dance Company, a modern dance ensemble, and the Tapestry Dance Company which performs a variety of dance genres.
The Austin improvisational theatre scene has several theaters: ColdTowne Theater, The Hideout Theater, The Fallout Theater, and The Institution Theater. Austin also hosts the Out of Bounds Comedy Festival, which draws comedic artists in all disciplines to Austin.
Libraries.
The Austin Public Library is operated by the City of Austin and consists of the Central Library on César Chávez Street, the Austin History Center, 20 branches and the Recycled Reads bookstore and upcycling facility. The APL library system also has mobile libraries – bookmobile buses and a human-powered trike and trailer called "unbound: sin fronteras."
The Central Library, which is an anchor to the redevelopment of the former Seaholm Power Plant site and the Shoal Creek Walk, opened on October 28, 2017. The six-story Central Library contains a living rooftop garden, reading porches, an indoor reading room, bicycle parking station, large indoor and outdoor event spaces, a gift shop, an art gallery, café, and a "technology petting zoo" where visitors can play with next-generation gadgets like 3D printers. In 2018, Time magazine named the Austin Central Library on its list of "World's Greatest Places."
Museums and other points of interest.
Museums in Austin include the Texas Memorial Museum, the George Washington Carver Museum and Cultural Center, Thinkery, the Blanton Museum of Art (reopened in 2006), the Bob Bullock Texas State History Museum across the street (which opened in 2000), The Contemporary Austin, the Elisabet Ney Museum and the galleries at the Harry Ransom Center. The Texas State Capitol itself is also a major tourist attraction.
The Driskill Hotel, built in 1886, once owned by George W. Littlefield, and located at 6th and Brazos streets, was finished just before the construction of the Capitol building. Sixth Street is a musical hub for the city. The Enchanted Forest, a multi-acre outdoor music, art, and performance art space in South Austin, hosts events such as fire-dancing and circus-like acts. Austin is also home to the Lyndon Baines Johnson Library and Museum, which houses documents and artifacts related to the Johnson administration, including LBJ's limousine and a re-creation of the Oval Office.
Locally produced art is featured at the South Austin Museum of Popular Culture. The Mexic-Arte Museum is a Mexican and Mexican-American art museum founded in 1983. Austin is also home to the O. Henry House Museum, which served as the residence of O. Henry from 1893 to 1895. Farmers' markets are popular attractions, providing a variety of locally grown and often organic foods.
Austin also has many odd statues and landmarks, such as the "Stevie Ray Vaughan Memorial", the "Willie Nelson" statue, the Mangia dinosaur, the Loca Maria lady at Taco Xpress, the Hyde Park Gym's giant flexed arm, and Daniel Johnston's "Hi, How are You?" Jeremiah the Innocent frog mural.
The Ann W. Richards Congress Avenue Bridge houses the world's largest urban population of Mexican free-tailed bats. Starting in March, up to 1.5 million bats take up residence inside the bridge's expansion and contraction zones as well as in long horizontal grooves running the length of the bridge's underside, an environment ideally suited for raising their young. Every evening around sunset, the bats emerge in search of insects, an exit visible on weather radar. Watching the bat emergence is an event that is popular with locals and tourists, with more than 100,000 viewers per year. The bats migrate to Mexico each winter.
The Austin Zoo, located in unincorporated western Travis County, is a rescue zoo that provides sanctuary to displaced animals from a variety of situations, including those involving neglect.
The HOPE Outdoor Gallery was a public, three-story outdoor street art project located on Baylor Street in the Clarksville neighborhood. The gallery, which consisted of the foundations of a failed multifamily development, was a constantly evolving canvas of graffiti and murals. Also known as "Castle Hill" or simply "Graffiti Park," the site on Baylor Street was closed to the public in early January 2019 but remained intact, behind a fence and with an armed guard, in mid-March 2019. The gallery's operators plan to build a new art park at Carson Creek Ranch in Southeast Austin.
Sports.
Many Austinites support the athletic programs of the University of Texas at Austin, known as the Texas Longhorns. During the 2005–2006 academic year, the Longhorns football team was named the NCAA Division I FBS national football champion, and the Longhorns baseball team won the College World Series. The Texas Longhorns play home games in the state's second-largest sports stadium, Darrell K Royal–Texas Memorial Stadium, which seats over 101,000 fans. Baseball games are played at UFCU Disch–Falk Field.
Austin was the most populous city in the United States without a major-league professional sports team, which changed in 2021 with Austin FC's entry to MLS. Minor-league professional sports came to Austin in 1996, when the Austin Ice Bats began playing at the Travis County Expo Center; they were later replaced by the AHL Texas Stars. Austin has hosted a number of other professional teams, including the Austin Spurs of the NBA G League, the Austin Aztex of the United Soccer League, the Austin Outlaws in WFA football, and the Austin Aces in WTT tennis.
Natural features like the bicycle-friendly Texas Hill Country and a generally mild climate make Austin the home of several endurance and multi-sport races and communities. The Capitol 10,000 is the largest race in Texas, and approximately the fifth largest in the United States. The Austin Marathon has been run in the city every year since 1992. The city is also home to the largest five-mile race in Texas, the Turkey Trot, run annually on Thanksgiving. Started in 1991 by Thundercloud Subs, a local sandwich chain that still sponsors the event, it has grown to host over 20,000 runners; all proceeds are donated to Caritas of Austin, a local charity.
The Austin-founded American Swimming Association hosts several swim races around town. Austin is also the hometown of several cycling groups and the disgraced cyclist Lance Armstrong. Combining these three disciplines is a growing crop of triathlons, including the Capital of Texas Triathlon held every Memorial Day on and around Lady Bird Lake, Auditorium Shores, and Downtown Austin.
Austin is home to the Circuit of the Americas (COTA), a Grade 1 Fédération Internationale de l'Automobile specification motor racing facility which hosts the Formula One United States Grand Prix. The State of Texas has pledged $25 million in public funds annually for 10 years to pay the sanctioning fees for the race. Built at an estimated cost of $250–300 million, the circuit opened in 2012 and is located just east of Austin–Bergstrom International Airport. The circuit also hosts the EchoPark Automotive Grand Prix NASCAR race in late March each year.
The summer of 2014 marked the inaugural season for World TeamTennis team Austin Aces, formerly Orange County Breakers of the southern California region. The Austin Aces played their matches at the Cedar Park Center northwest of Austin, and featured former professionals Andy Roddick and Marion Bartoli, as well as current WTA tour player Vera Zvonareva. The team left after the 2015 season.
In 2017, Precourt Sports Ventures announced a plan to move the Columbus Crew SC soccer franchise from Columbus, Ohio to Austin. Precourt negotiated an agreement with the City of Austin to build a $200 million privately funded stadium on public land at 10414 McKalla Place, following initial interest in Butler Shores Metropolitan Park and Roy G. Guerrero Colorado River Park. As part of an arrangement with the league, operational rights of Columbus Crew SC were sold in late 2018, and Austin FC was announced as Major League Soccer's 27th franchise on January 15, 2019, with the expansion team starting play in 2021.
Parks and recreation.
The Austin Parks and Recreation Department received the Excellence in Aquatics award in 1999 and the Gold Medal Awards in 2004 from the National Recreation and Park Association.
To strengthen the region's parks system, the Austin Parks Foundation (APF) was established in 1992 to develop and improve parks in and around Austin. APF works to fill the city's park funding gap by leveraging volunteers, philanthropists, park advocates, and strategic collaborations to develop, maintain, and enhance Austin's parks, trails, and green spaces.
Lady Bird Lake.
Lady Bird Lake (formerly Town Lake) is a river-like reservoir on the Colorado River. The lake is a popular recreational area for paddleboards, kayaks, canoes, dragon boats, and rowing shells. Thanks to Austin's warm climate and the lake's calm waters and long, straight course, it is especially popular with crew teams and clubs. Other recreational attractions along the shores of the lake include swimming in Deep Eddy Pool, the oldest swimming pool in Texas, and Red Bud Isle, a small island formed by the 1900 collapse of the McDonald Dam that serves as a recreation area with a dog park and access to the lake for canoeing and fishing. The Ann and Roy Butler Hike and Bike Trail forms a complete circuit around the lake. A local nonprofit, The Trail Foundation, is the trail's private steward; it has built amenities and infrastructure including trailheads, lakefront gathering areas, restrooms, and exercise equipment, and performs ongoing trail-wide ecological restoration work. The Butler Trail loop was completed in 2014 with the public-private partnership one-mile Boardwalk project.
Along the shores of Lady Bird Lake is Zilker Park, which contains large open lawns, sports fields, cross country courses, historical markers, concession stands, and picnic areas. Zilker Park is also home to numerous attractions, including the Zilker Botanical Garden, the Umlauf Sculpture Garden, the Zilker Hillside Theater, the Austin Nature & Science Center, and the Zilker Zephyr, a miniature railway that carries passengers on a tour around the park. Auditorium Shores, an urban park along the lake, is home to the Palmer Auditorium, the Long Center for the Performing Arts, and an off-leash dog park on the water. Both Zilker Park and Auditorium Shores have a direct view of the Downtown skyline.
Barton Creek Greenbelt.
The Barton Creek Greenbelt is a public green belt managed by the City of Austin's Park and Recreation Department. The Greenbelt, which begins at Zilker Park and stretches South/Southwest to the Woods of Westlake subdivision, is characterized by large limestone cliffs, dense foliage, and shallow bodies of water. Popular activities include rock climbing, mountain biking, and hiking. Some well known naturally forming swimming holes along Austin's greenbelt include Twin Falls, Sculpture Falls, Gus Fruh Pool, and Campbell's Hole. During years of heavy rainfall, the water level of the creek rises high enough to allow swimming, cliff diving, kayaking, paddle boarding, and tubing.
Swimming holes.
Austin is home to more than 50 public pools and swimming holes. These include Deep Eddy Pool, Texas' oldest human-made swimming pool, and Barton Springs Pool, the nation's largest natural swimming pool in an urban area. Barton Springs Pool is spring-fed, while Deep Eddy is well-fed; both maintain relatively stable temperatures year-round, cool in summer and comparatively warm in winter. Hippie Hollow Park, a county park situated along Lake Travis, is the only officially sanctioned clothing-optional public park in Texas. Hamilton Pool Preserve is a natural pool that was created when the dome of an underground river collapsed due to massive erosion thousands of years ago. The pool, located west of Austin, is a popular summer swimming spot for visitors and residents. The preserve consists of protected natural habitat featuring a jade-green pool into which a waterfall flows.
Other parks and recreation.
In May 2021, voters in the City of Austin reinstated a public camping ban. That includes downtown green spaces as well as trails and greenbelts such as along Barton Creek.
McKinney Falls State Park is a state park administered by the Texas Parks and Wildlife Department, located at the confluence of Onion Creek and Williamson Creek. The park includes several designated hiking trails and campsites with water and electric hookups. The namesake features of the park are the scenic upper and lower falls along Onion Creek. The Emma Long Metropolitan Park is a municipal park along the shores of Lake Austin, originally constructed by the Civilian Conservation Corps. The Lady Bird Johnson Wildflower Center is a botanical garden and arboretum that features more than 800 species of native Texas plants in both garden and natural settings; the Wildflower Center is located southwest of Downtown in Circle C Ranch. Roy G. Guerrero Park is located along the Colorado River in East Riverside and contains miles of wooded trails, a sandy beach along the river, and a disc golf course.
Covert Park, located on the top of Mount Bonnell, is a popular tourist destination overlooking Lake Austin and the Colorado River. The mount provides a vista for viewing the city of Austin, Lake Austin, and the surrounding hills. It was designated a Recorded Texas Historic Landmark in 1969, bearing Marker number 6473, and was listed on the National Register of Historic Places in 2015.
The Austin Country Club is a private golf club located along the shores of the Colorado River, right next to the Pennybacker Bridge. Founded in 1899, the club moved to its third and present site in 1984, which features a challenging layout designed by noted course architect Pete Dye.
Government.
City government.
Austin is administered by an 11-member city council (10 council members elected by geographic district plus a mayor elected at large). The council is accompanied by a hired city manager under the council-manager system of municipal governance. Council and mayoral elections are non-partisan, with a runoff in case there is no majority winner. A referendum approved by voters on November 6, 2012, changed the council composition from six council members plus a mayor, all elected at large, to the current "10+1" district system. Supporters maintained that geographic districts would increase participation from all areas of the city, especially those that had lacked representation on the City Council.
November 2014 marked the first election under the new system. The federal government had forced San Antonio and Dallas to abandon at-large systems before 1987; in a 1984 lawsuit, however, a court found no pattern of racial discrimination in Austin and upheld the city's at-large system. In five elections between 1973 and 1994, Austin voters rejected single-member districts.
Austin formerly operated its city hall at 128 West 8th Street. Antoine Predock and Cotera Kolar Negrete & Reed Architects designed a new city hall building, intended to reflect what "The Dallas Morning News" called a "crazy-quilt vitality, that embraces everything from country music to environmental protests and high-tech swagger." The new city hall, built from recycled materials, has solar panels on its garage. The city hall, at 301 West Second Street, opened in November 2004. Kirk Watson is the current mayor of Austin, having assumed the office for a second non-consecutive term on January 6, 2023.
In the 2012 elections, City Council elections were moved from May to November, and council members were given staggered term limits. In 2022, Proposition D moved the mayoral term to coincide with presidential election years, so Kirk Watson will serve only two years, unlike his predecessor Steve Adler.
Law enforcement in Austin is provided by the Austin Police Department, except for state government buildings, which are patrolled by the Texas Department of Public Safety. The University of Texas campus is patrolled by its own force, the University of Texas Police Department.
Fire protection within the city limits is provided by the Austin Fire Department, while the surrounding county is divided into twelve geographical areas known as emergency services districts, which are covered by separate regional fire departments. Emergency medical services are provided for the whole county by Austin-Travis County Emergency Medical Services.
In 2003, the city adopted a resolution against the USA PATRIOT Act that reaffirmed constitutionally guaranteed rights.
As of 2018, all six of Austin's state legislative districts are held by Democrats.
Crime.
As of 2019, Austin is one of the safest large cities in the United States. In 2019, the FBI named Austin the 11th safest city on a list of 22 American cities with a population above 400,000.
FBI statistics show that overall violent and property crimes dropped in Austin in 2015, but increased in suburban areas of the city. One such southeastern suburb, Del Valle, reported eight homicides within two months in 2016. According to 2016 APD crime statistics, the 78723 census tract had the most violent crime, with 6 murders, 25 rapes, and 81 robberies. The city had 39 homicides in 2016, the most since 1997.
Notable incidents.
One of the first American mass school shootings took place in Austin on August 1, 1966, when Charles Whitman shot 43 people, killing 13, from the top of the University of Texas tower. The University of Texas tower shooting led to the formation of the Austin Police Department's SWAT team.
In 1991, four teenage girls were murdered in a yogurt shop by one or more unknown assailants. A police officer responded to reports of a fire at the I Can't Believe It's Yogurt! store on Anderson Lane and discovered the girls' bodies in a back room. The murders remain unsolved.
In 2010, Andrew Joseph Stack III deliberately crashed his Piper PA-28 Cherokee into Echelon 1, an office building housing 190 Internal Revenue Service employees. The resulting explosion killed one IRS employee and injured 13 others, completely destroyed the building, and cost the IRS a total of $38.6 million. "(see 2010 Austin suicide attack)"
A series of bombings occurred in Austin in March 2018. Over the course of 20 days, five package bombs exploded, killing two people and injuring another five. The suspect, 23-year-old Mark Anthony Conditt of Pflugerville, Texas, blew himself up inside his vehicle after he was pulled over by police on March 21, also injuring a police officer.
In 2020, Austin was the victim of a cyberattack by the Russian group Berserk Bear, possibly related to the U.S. federal government data breach earlier that year.
On April 18, 2021, a shooting occurred at the Arboretum Oaks Apartments near The Arboretum shopping center, in which a former Travis County Sheriff's Office detective killed his ex-wife, his adoptive daughter, and his daughter's boyfriend. The suspect, who was previously charged with child sexual assault, was arrested in Manor after a 20-hour manhunt.
A mass shooting took place in the early morning of June 12, 2021, on Sixth Street, which resulted in 14 people injured and one dead. The man killed was believed to be an innocent bystander who was struck as he was standing outside a bar. A 19-year-old suspect was formally charged and arrested in Killeen nearly two weeks after the shooting.
Other levels of government.
Austin is the county seat of Travis County and hosts the Heman Marion Sweatt Travis County Courthouse downtown, as well as other county government offices. The Texas Department of Transportation operates the Austin District Office in Austin. The Texas Department of Criminal Justice (TDCJ) operates the Austin I and Austin II district parole offices in Austin. The United States Postal Service operates several post offices in Austin.
Politics.
Former Governor Rick Perry once referred to Austin as a "blueberry in the tomato soup," meaning that it was a Democratic city in a Republican state. Today, however, most of Texas's major cities also vote Democratic and elect Democratic mayors.
After the most recent redistricting, Austin is currently divided between the 10th, 35th, and 37th Congressional districts.
Issues.
A controversial turning point in the political history of the Austin area was the 2003 Texas redistricting. Before then, Austin had been entirely or almost entirely within the borders of a single congressional district–what was then the 10th District–for over a century. Opponents characterized the resulting district layout as excessively partisan gerrymandering, and the plan was challenged in court by Democratic and minority activists. The Supreme Court of the United States has never struck down a redistricting plan for being excessively partisan. The plan was subsequently upheld by a three-judge federal panel in late 2003, and on June 28, 2006, the matter was largely settled when the Supreme Court, in a 7–2 decision, upheld the entire congressional redistricting plan with the exception of a Hispanic-majority district in southwest Texas. This affected Austin's districting, as U.S. Rep. Lloyd Doggett's district (U.S. Congressional District 25) was found to be insufficiently compact to compensate for the reduced minority influence in the southwest district; it was redrawn so that it took in most of southeastern Travis County and several counties to its south and east.
Environmental movement.
The distinguishing political movement of Austin politics has been the environmental movement, which spawned the parallel neighborhood movement, then the more recent conservationist movement (as typified by the Hill Country Conservancy), and eventually the ongoing debate about preserving the Austin quality of life. Much of the environmental movement has matured into a debate on issues related to saving and creating an Austin "sense of place." In 2012, Austin became one of only a few cities in Texas to ban the sale and use of plastic bags. The ban ended in 2018, however, after a court ruling found that all bag bans in the state contravened the Texas Solid Waste Disposal Act. In 2016, Austin became the first Gold designee of the SolSmart program, a national U.S. Department of Energy program that recognizes local governments for enacting solar-friendly measures.
Education.
According to the 2015–2019 Census estimates, 51.7% of Austin residents ages 25 and over have earned at least a bachelor's degree, compared to the national figure of 32.1%. 19.4% hold a graduate or professional degree, compared to the national figure of 12.4%.
Higher education.
Austin is home to the University of Texas at Austin, the flagship institution of the University of Texas System with over 40,000 undergraduate students and 11,000 graduate students.
Other institutions of higher learning in Austin include St. Edward's University, Huston–Tillotson University, Austin Community College, Concordia University, the Seminary of the Southwest, Texas Health and Science University, University of St. Augustine for Health Sciences, Austin Graduate School of Theology, Austin Presbyterian Theological Seminary, Virginia College's Austin Campus, The Art Institute of Austin, Southern Careers Institute of Austin, Austin Conservatory and a branch of Park University.
The University of Texas System and Texas State University System are headquartered in downtown Austin.
Public primary and secondary education.
Approximately half of the city by area is served by the Austin Independent School District. This district includes notable schools such as the magnet Liberal Arts and Science Academy (LASA), which by test scores has consistently ranked among the top thirty high schools in the nation, as well as the Ann Richards School for Young Women Leaders. The remaining portion of Austin is served by adjoining school districts, including Round Rock ISD, Pflugerville ISD, Leander ISD, Manor ISD, Del Valle ISD, Lake Travis ISD, Hays CISD, and Eanes ISD.
Four of the metro's major public school systems, representing 54% of area enrollment, are included in "Expansion Management" magazine's latest annual education quality ratings of nearly 2,800 school districts nationwide. Two districts—Eanes and Round Rock—are rated "gold medal," the highest of the magazine's cost-performance categories.
Private and alternative education.
The Austin metropolitan area is also served by 27 charter school districts and over 100 private schools. Austin has a large network of private and alternative education institutions for children in PreK–12th grade, and is also home to several child development institutions.
Media.
Austin's main daily newspaper is the "Austin American-Statesman". "The Austin Chronicle" is Austin's alternative weekly, while "The Daily Texan" is the student newspaper of the University of Texas at Austin. Austin's business newspaper is the weekly "Austin Business Journal". "The Austin Monitor" is an online outlet that specializes in insider reporting on City Hall, Travis County Commissioners Court, AISD, and other related local civics beats; the "Monitor" is backed by the nonprofit Capital of Texas Media Foundation. Austin also has numerous smaller special interest or sub-regional newspapers, such as the "Oak Hill Gazette", "Westlake Picayune", "Hill Country News", "Round Rock Leader", "NOKOA", and "The Villager", among others. "Texas Monthly", a major regional magazine, is also headquartered in Austin. The "Texas Observer", a muckraking biweekly political magazine, has been based in Austin for over five decades. The weekly "Community Impact Newspaper", published by John Garrett, former publisher of the "Austin Business Journal", has five regional editions and is delivered to every house and business within certain ZIP codes, with all of its news specific to those ZIP codes. Another statewide publication based in Austin is "The Texas Tribune", an online publication focused on Texas politics. The "Tribune" is "user-supported" through donations, a business model similar to public radio. Its editor is Evan Smith, former editor of "Texas Monthly", who co-founded the nonprofit, non-partisan public media organization with Austin venture capitalist John Thornton and veteran journalist Ross Ramsey.
Commercial radio stations include KASE-FM (country), KVET (sports), KVET-FM (country), KKMJ-FM (adult contemporary), KLBJ (talk), KLBJ-FM (classic rock), KJFK (variety hits), KFMK (contemporary Christian), KOKE-FM (progressive country) and KPEZ (rhythmic contemporary). KUT-FM is the leading public radio station in Texas and produces the majority of its content locally. KOOP (FM) is a volunteer-run radio station with more than 60 locally produced programs. KVRX is the student-run college radio station of the University of Texas at Austin with a focus on local and non-mainstream music and community programming. Other listener-supported stations include KAZI (urban contemporary), and KMFA (classical).
Network television stations (affiliations in parentheses) include KTBC (Fox O&O), KVUE (ABC), KXAN (NBC), KEYE-TV (CBS), KLRU (PBS), KNVA (The CW), KBVO (MyNetworkTV), and KAKW (Univision O&O). KLRU produces several award-winning locally produced programs such as "Austin City Limits". Despite Austin's explosive growth, it is only a medium-sized market (currently 38th) because the suburban and rural areas are not much larger than the city proper. Additionally, the proximity of San Antonio truncates the potential market area.
Alex Jones, journalist, radio show host and filmmaker, produces his talk show "The Alex Jones Show" in Austin which broadcasts nationally on more than 60 AM and FM radio stations in the United States, WWCR Radio shortwave and XM Radio: Channel 166.
Transportation.
In 2009, 72.7% of Austin (city) commuters drove alone, with other mode shares being: 10.4% carpooling, 6% working remotely, 5% using transit, 2.3% walking, and 1% bicycling. In 2016, the American Community Survey estimated modal shares for Austin (city) commuters of 73.5% for driving alone, 9.6% for carpooling, 3.6% for riding transit, 2% for walking, and 1.5% for cycling. The city of Austin has a lower-than-average percentage of households without a car: 6.9 percent of Austin households lacked a car in 2015, decreasing slightly to 6 percent in 2016, against a national average of 8.7 percent in 2016. Austin averaged 1.65 cars per household in 2016, compared to a national average of 1.8.
In mid-2019, TomTom ranked Austin as having the worst traffic congestion in Texas, as well as 19th nationally and 179th globally.
Highways.
Central Austin lies between two major north–south freeways: Interstate 35 to the east and the Mopac Expressway (Loop 1) to the west. U.S. Highway 183 runs from northwest to southeast, and State Highway 71 crosses the southern part of the city from east to west, completing a rough "box" around central and north-central Austin. Austin is the largest city in the United States to be served by only one Interstate Highway.
U.S. Highway 290 enters Austin from the east and merges into Interstate 35. Its highway designation continues south on I-35 and then becomes part of Highway 71, continuing to the west. Highway 290 splits from Highway 71 in southwest Austin, in an interchange known as "The Y." Highway 71 continues to Brady, Texas, and Highway 290 continues west to intersect Interstate 10 near Junction. Interstate 35 continues south through San Antonio to Laredo on the Texas-Mexico border. Interstate 35 is the highway link to the Dallas-Fort Worth metroplex in northern Texas. There are two links to Houston, Texas (Highway 290 and State Highway 71/Interstate 10). Highway 183 leads northwest of Austin toward Lampasas.
In the mid-1980s, construction was completed on Loop 360, a scenic highway that curves through the hill country from near the 71/Mopac interchange in the south to near the 183/Mopac interchange in the north. The iconic Pennybacker Bridge, also known as the "360 Bridge," crosses Lake Austin to connect the northern and southern portions of Loop 360.
Tollways.
State Highway 130 is a bypass route designed to relieve traffic congestion. It starts from Interstate 35 just north of Georgetown and runs along a parallel route to the east, bypassing Round Rock, Austin, San Marcos, and New Braunfels before ending at Interstate 10 east of Seguin, where drivers can continue west to rejoin Interstate 35 in San Antonio. The first segment, located east of Austin–Bergstrom International Airport at Austin's southeast corner on State Highway 71, opened in November 2006. Highway 130 runs concurrently with Highway 45 from Pflugerville in the north until it reaches US 183 well south of Austin, at which point SH 45 continues west. The entire route of State Highway 130 is now complete; the final leg opened on November 1, 2012. The highway is noted for its high speed limits, and the section of the toll road between Mustang Ridge and Seguin carries the highest posted speed limit in the United States.
State Highway 45 runs east–west from just south of Highway 183 in Cedar Park to SH 130 inside Pflugerville (just east of Round Rock). A tolled extension of State Highway Loop 1 was also created. A new southeast leg of Highway 45 has recently been completed, running from US 183 and the south end of Segment 5 of SH 130 south of Austin due west to I-35 at the FM 1327/Creedmoor exit, between the south end of Austin and Buda. The 183A Toll Road opened in March 2007, providing a tolled alternative to U.S. 183 through the cities of Leander and Cedar Park. An upgrade of East US 290 from US 183 to the town of Manor is currently under construction; officially, the tollway will be designated Toll 290, with "Manor Expressway" as its nickname.
Despite the overwhelming initial opposition to the toll road concept when it was first announced, all three toll roads have exceeded revenue projections.
Airports.
Austin's primary airport is Austin–Bergstrom International Airport (ABIA) (AUS), located southeast of the city. The airport is on the site of the former Bergstrom Air Force Base, which was closed in 1993 as part of the Base Realignment and Closure process. Robert Mueller Municipal Airport was Austin's main airport until 1999, when ABIA took over that role and the old airport was shut down. Austin Executive Airport, along with several smaller airports outside the city center, serves general aviation traffic.
Intercity transit.
Amtrak's Austin station is located in west downtown and is served by the "Texas Eagle" which runs daily between Chicago and San Antonio, continuing on to Los Angeles several times a week.
Railway segments between Austin and San Antonio have been evaluated for a proposed regional passenger rail project called "Lone Star Rail". However, failure to come to an agreement with the track's current owner, Union Pacific Railroad, ended the project in 2016.
Greyhound Lines operates the current Austin bus station at the Eastside Bus Plaza. Grupo Senda's Turimex Internacional service operates buses from Austin to Nuevo Laredo and on to many destinations in Mexico from its station in East Austin. Megabus offers daily service to San Antonio, Dallas/Fort Worth, and Houston.
Public transportation.
The Capital Metropolitan Transportation Authority (Capital Metro) provides public transportation to the city, primarily with its MetroBus local bus service, the MetroExpress express bus system, as well as a bus rapid transit service, MetroRapid. Capital Metro opened a commuter rail system, Capital MetroRail, in 2010. The system consists of a single line serving downtown Austin, the neighborhoods of East Austin, North Central Austin, and Northwest Austin plus the suburb of Leander.
Since it began operations in 1985, Capital Metro has proposed adding light rail services to its network. Despite support from the City Council, voters rejected light rail proposals in 2000 and 2014. In 2020, however, voters approved Capital Metro's transit expansion plan, Project Connect, by a comfortable margin. The plan proposes two new light rail lines, an additional bus rapid transit line (which could be converted to light rail in the future), a second commuter rail line, several new MetroRapid lines, more MetroExpress routes, and a number of other infrastructure, technology, and service expansion projects.
Capital Area Rural Transportation System connects Austin with outlying suburbs and surrounding rural areas.
Ride sharing.
Austin is served by several ride-sharing companies including Uber and Lyft. On May 9, 2016, Uber and Lyft voluntarily ceased operations in Austin in response to a city ordinance that required ride sharing company drivers to get fingerprint checks, have their vehicles labeled, and not pick up or drop off in certain city lanes. Uber and Lyft resumed service in the summer of 2017. The city was previously served by Fasten until they ceased all operations in the city in March 2018.
Austin is also served by Electric Cab of North America's six-passenger electric cabs, which operate on a flexible route from the Kramer MetroRail Station to Domain Northside, and from the Downtown MetroRail station and MetroRapid stops to locations between the Austin Convention Center and the Whole Foods store near Sixth and Bowie streets.
Carsharing service Zipcar operates in Austin and, until 2019, the city was also served by Car2Go which kept its North American headquarters in the city even after pulling out.
Cycling and walking.
The city's bike advocacy organization is Bike Austin. BikeTexas, a state-level advocacy organization, also has its main office in Austin.
Bicycles are a popular transportation choice among students, faculty, and staff at the University of Texas. According to a survey done at the University of Texas, 57% of commuters bike to campus.
The City of Austin and Capital Metro jointly own a bike-sharing service, Capital MetroBike, which is available in and around downtown. The service is a franchise of BCycle, a national bike sharing network owned by Trek Bicycle, and is operated by local nonprofit organization Bike Share of Austin. Until 2020 the service was known as Austin BCycle. In 2018, Lime began offering dockless bikes, which do not need to be docked at a designated station.
In 2018, scooter-sharing companies Lime and Bird debuted rentable electric scooters in Austin. The city briefly banned the scooters — which began operations before the city could implement a permitting system — until the city completed development of their "dockless mobility" permitting process on May 1, 2018. Dockless electric scooters and bikes are banned from Austin city parks and the Ann and Roy Butler Trail and Boardwalk. For the 2018 Austin City Limits Music Festival, the city of Austin offered a designated parking area for dockless bikes and scooters.
International relations.
Austin has two types of relationships with other cities: sister cities and friendship cities.
Sister cities.
Austin's sister cities are:
The cities of Belo Horizonte, Brazil, and Elche, Spain, were formerly sister cities, but their status was deactivated by a vote of the Austin City Council in 1991.
Friendship cities.
Covenants between two city leaders:
Argument from morality.

The argument from morality is an argument for the existence of God. Arguments from morality tend to be based on moral normativity or moral order. Arguments from moral normativity observe some aspect of morality and argue that God is the best or only explanation for this, concluding that God must exist. Arguments from moral order are based on the asserted need for moral order to exist in the universe. They claim that, for this moral order to exist, God must exist to support it. The argument from morality is noteworthy in that one cannot evaluate the soundness of the argument without attending to almost every important philosophical issue in meta-ethics.
German philosopher Immanuel Kant devised an argument from morality based on practical reason. Kant argued that the goal of humanity is to achieve perfect happiness and virtue (the "summum bonum") and believed that an afterlife must be assumed to exist in order for this to be possible, and that God must be assumed to exist to provide this. Rather than aiming to prove the existence of God, however, Kant was simply attempting to demonstrate that all moral thought requires the assumption that God exists, and therefore that we are entitled to make such an assumption only as a regulative principle rather than a constitutive principle (meaning that such a principle can guide our actions, but it does not provide knowledge). In his book "Mere Christianity", C. S. Lewis argued that "conscience reveals to us a moral law whose source cannot be found in the natural world, thus pointing to a supernatural Lawgiver." Lewis argued that accepting the validity of human reason as a given must include accepting the validity of practical reason, which could not be valid without reference to a higher cosmic moral order which could not exist without a God to create and/or establish it. A related argument is from conscience; John Henry Newman argued that the conscience supports the claim that objective moral truths exist because it drives people to act morally even when it is not in their own interest. Newman argued that, because the conscience suggests the existence of objective moral truths, God must exist to give authority to these truths.
Contemporary defenders of the argument from morality are Graham Ward, Alister McGrath and William Lane Craig.
General form.
All variations of the argument from morality begin with an observation about moral thought or experiences and conclude with the existence of God. Some of these arguments propose moral facts which they claim are evident through human experience, arguing that God is the best explanation for these. Other versions describe some end which humans should strive to attain that is only possible if God exists.
Many arguments from morality are based on moral normativity, which suggests that objective moral truths exist and require God's existence to give them authority. Often, they consider that morality seems to be binding – obligations are seen to convey more than just a preference; they imply that the obligation will stand, regardless of other factors or interests. For morality to be binding, God must exist. In its most general form, the argument from moral normativity is:

1. A human experience of morality is observed.
2. God is the best or only explanation for this moral experience.
3. Therefore, God exists.
Some arguments from moral order suggest that morality is based on rationality and that this can only be the case if there is a moral order in the universe. The arguments propose that only the existence of God as orthodoxly conceived could support the existence of moral order in the universe, so God must exist. Alternative arguments from moral order have proposed that we have an obligation to attain the perfect good of both happiness and moral virtue. They attest that whatever we are obliged to do must be possible, and achieving the perfect good of both happiness and moral virtue is only possible if a natural moral order exists. A natural moral order requires the existence of God as orthodoxly conceived, so God must exist.
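Viewed purely formally, both the normativity and moral-order versions share a simple deductive skeleton. As an illustration only – not an endorsement of the premises – the normativity form can be sketched in Lean, with placeholder propositions standing in for the substantive claims:

```lean
-- Formal skeleton of the argument from moral normativity.
-- `MoralityIsBinding` and `GodExists` are placeholder propositions;
-- the theorem shows only that the argument is *valid* (the conclusion
-- follows from the premises), not that the premises are true.
variable (MoralityIsBinding GodExists : Prop)

theorem argument_from_normativity
    (premise1 : MoralityIsBinding)               -- morality is binding
    (premise2 : MoralityIsBinding → GodExists)   -- bindingness requires God
    : GodExists :=
  premise2 premise1
```

The proof is a single application of modus ponens, which is why, as noted above, the philosophical work lies entirely in assessing the premises – that is, in the meta-ethical questions the argument raises – rather than in the inference itself.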
Variations.
Practical reason.
In his "Critique of Pure Reason", German philosopher Immanuel Kant stated that no successful argument for God's existence arises from reason alone. In his "Critique of Practical Reason" he went on to argue that, despite the failure of these arguments, morality requires that God's existence is assumed, owing to practical reason. Rather than proving the existence of God, Kant was attempting to demonstrate that all moral thought requires the assumption that God exists. Kant argued that humans are obliged to bring about the "summum bonum": the two central aims of moral virtue and happiness, where happiness arises out of virtue. As ought implies can, Kant argued, it must be possible for the "summum bonum" to be achieved. He accepted that it is not within the power of humans to bring the "summum bonum" about, because we cannot ensure that virtue always leads to happiness, so there must be a higher power who has the power to create an afterlife where virtue can be rewarded by happiness.
Philosopher G. H. R. Parkinson notes a common objection to Kant's argument: that what ought to be done does not necessarily entail that it is possible. He also argues that alternative conceptions of morality exist which do not rely on the assumptions that Kant makes – he cites utilitarianism as an example which does not require the "summum bonum". Nicholas Everitt argues that much moral guidance is unattainable, such as the Biblical command to be Christ-like. He proposes that Kant's first two premises only entail that we must try to achieve the perfect good, not that it is actually attainable.
Argument from objective moral truths.
Both theists and non-theists have accepted that the existence of objective moral truths might entail the existence of God. Atheist philosopher J. L. Mackie accepted that, if objective moral truths existed, they would warrant a supernatural explanation. Scottish philosopher W. R. Sorley presented the following argument:

1. If morality is objective and absolute, God must exist.
2. Morality is objective and absolute.
3. Therefore, God must exist.
Many critics have challenged the second premise of this argument, by offering a biological and sociological account of the development of human morality which suggests that it is neither objective nor absolute. This account, supported by biologist E. O. Wilson and philosopher Michael Ruse, proposes that the human experience of morality is a by-product of natural selection, a theory philosopher Mark D. Linville calls evolutionary naturalism. According to the theory, the human experience of moral obligations was the result of evolutionary pressures, which attached a sense of morality to human psychology because it was useful for moral development; this entails that moral values do not exist independently of the human mind. Morality might be better understood as an evolutionary imperative in order to propagate genes and ultimately reproduce. No human society today advocates immorality, such as theft or murder, because it would undoubtedly lead to the end of that particular society and any chance for future survival of offspring. Scottish empiricist David Hume made a similar argument, that belief in objective moral truths is unwarranted and to discuss them is meaningless.
Because evolutionary naturalism proposes an empirical account of morality, it does not require morality to exist objectively; Linville considers the view that this will lead to moral scepticism or antirealism. C. S. Lewis argued that, if evolutionary naturalism is accepted, human morality cannot be described as absolute and objective because moral statements cannot be right or wrong. Despite this, Lewis argued, those who accept evolutionary naturalism still act as if objective moral truths exist, leading Lewis to reject naturalism as incoherent. As an alternative ethical theory, Lewis offered a form of divine command theory which equated God with goodness and treated goodness as an essential part of reality, thus asserting God's existence.
J. C. A. Gaskin challenges the first premise of the argument from moral objectivity, arguing that it must be shown why absolute and objective morality entails that morality is commanded by God, rather than simply a human invention. It could be the consent of humanity that gives it moral force, for example. American philosopher Michael Martin argues that it is not necessarily true that objective moral truths must entail the existence of God, suggesting that there could be alternative explanations: he argues that naturalism may be an acceptable explanation and, even if a supernatural explanation is necessary, it does not have to be God (polytheism is a viable alternative). Martin also argues that a non-objective account of ethics might be acceptable and challenges the view that a subjective account of morality would lead to moral anarchy.
William Lane Craig has argued for this form of the moral argument.
Argument from conscience.
Related to the argument from morality is the argument from conscience, associated with eighteenth-century bishop Joseph Butler and nineteenth-century cardinal John Henry Newman. Newman proposed that the conscience, as well as giving moral guidance, provides evidence of objective moral truths which must be supported by the divine. He argued that emotivism is an inadequate explanation of the human experience of morality because people avoid acting immorally, even when it might be in their interests. Newman proposed that, to explain the conscience, God must exist.
British philosopher John Locke argued that moral rules cannot be established from conscience because the differences in people's consciences would lead to contradictions. Locke also noted that the conscience is influenced by "education, company, and customs of the country", a criticism mounted by J. L. Mackie, who argued that the conscience should be seen as an "introjection" of other people into an agent's mind. Michael Martin challenges the argument from conscience with a naturalistic account of conscience, arguing that naturalism provides an adequate explanation for the conscience without the need for God's existence. He uses the example of the internalization by humans of social pressures, which leads to the fear of going against these norms. Even if a supernatural cause is required, he argues, it could be something other than God; this would mean that the phenomenon of the conscience is no more supportive of monotheism than polytheism.
C. S. Lewis argues for the existence of God in a similar way in his book "Mere Christianity", but he does not directly refer to it as the argument from morality.
ASL (disambiguation).

ASL is a common initialism for American Sign Language, the sign language of the United States and Canada (not to be confused with Auslan, the Australian sign language, or the Asilulu language, which has the ISO code "asl"), and may also refer to:
Auschwitz concentration camp.

Auschwitz concentration camp was a complex of over 40 concentration and extermination camps operated by Nazi Germany in occupied Poland (in a portion annexed into Germany in 1939) during World War II and the Holocaust. It consisted of Auschwitz I, the main camp ("Stammlager") in Oświęcim; Auschwitz II-Birkenau, a concentration and extermination camp with gas chambers; Auschwitz III-Monowitz, a labor camp for the chemical conglomerate IG Farben; and dozens of subcamps. The camps became a major site of the Nazis' final solution to the Jewish question.
After Germany sparked World War II by invading Poland in September 1939, the "Schutzstaffel" (SS) converted Auschwitz I, an army barracks, into a prisoner-of-war camp. The initial transport of political detainees to Auschwitz consisted almost solely of Poles for whom the camp was initially established. The bulk of inmates were Polish for the first two years. In May 1940, German criminals brought to the camp as functionaries established the camp's reputation for sadism. Prisoners were beaten, tortured, and executed for the most trivial reasons. The first gassings—of Soviet and Polish prisoners—took place in block 11 of Auschwitz I around August 1941.
Construction of Auschwitz II began the following month, and from 1942 until late 1944 freight trains delivered Jews from all over German-occupied Europe to its gas chambers. Of the 1.3 million people sent to Auschwitz, 1.1 million were murdered. The number of victims includes 960,000 Jews (865,000 of whom were gassed on arrival), 74,000 non-Jewish Poles, 21,000 Romani, 15,000 Soviet prisoners of war, and up to 15,000 others. Those not gassed were murdered via starvation, exhaustion, disease, individual executions, or beatings. Others were killed during medical experiments.
At least 802 prisoners tried to escape, 144 successfully, and on 7 October 1944, two "Sonderkommando" units, consisting of prisoners who operated the gas chambers, launched an unsuccessful uprising. Only 789 Schutzstaffel personnel (no more than 15 percent) ever stood trial after the Holocaust ended; several were executed, including camp commandant Rudolf Höss. The Allies' failure to act on early reports of atrocities by bombing the camp or its railways remains controversial.
As the Soviet Red Army approached Auschwitz in January 1945, toward the end of the war, the SS sent most of the camp's population west on a death march to camps inside Germany and Austria. Soviet troops entered the camp on 27 January 1945, a day commemorated since 2005 as International Holocaust Remembrance Day. In the decades after the war, survivors such as Primo Levi, Viktor Frankl, and Elie Wiesel wrote memoirs of their experiences, and the camp became a dominant symbol of the Holocaust. In 1947, Poland founded the Auschwitz-Birkenau State Museum on the site of Auschwitz I and II, and in 1979 it was named a World Heritage Site by UNESCO.
Background.
The ideology of Nazism combined elements of "racial hygiene", eugenics, antisemitism, pan-Germanism, and territorial expansionism, Richard J. Evans writes. Adolf Hitler and his Nazi Party became obsessed by the "Jewish question". Both during and immediately after the Nazi seizure of power in Germany in 1933, acts of violence against German Jews became ubiquitous, and legislation was passed excluding them from certain professions, including the civil service and the law.
Harassment and economic pressure encouraged Jews to leave Germany; their businesses were denied access to markets, forbidden from advertising in newspapers, and deprived of government contracts. On 15 September 1935, the Reichstag passed the Nuremberg Laws. One, the Reich Citizenship Law, defined as citizens those of "German or related blood who demonstrate by their behaviour that they are willing and suitable to serve the German People and Reich faithfully", and the Law for the Protection of German Blood and German Honor prohibited marriage and extramarital relations between those with "German or related blood" and Jews.
When Germany invaded Poland in September 1939, triggering World War II, Hitler ordered that the Polish leadership and intelligentsia be destroyed. The area around Auschwitz was annexed to the German Reich, as part of first Gau Silesia and from 1941 Gau Upper Silesia. The camp at Auschwitz was established in April 1940, at first as a quarantine camp for Polish political prisoners. On 22 June 1941, in an attempt to obtain new territory, Hitler invaded the Soviet Union. The first gassing at Auschwitz—of a group of Soviet prisoners of war—took place around August 1941. By the end of that year, during what most historians regard as the first phase of the Holocaust, 500,000–800,000 Soviet Jews had been murdered in mass shootings by a combination of German "Einsatzgruppen", ordinary German soldiers, and local collaborators. At the Wannsee Conference in Berlin on 20 January 1942, Reinhard Heydrich outlined the Final Solution to the Jewish Question to senior Nazis, and from early 1942 freight trains delivered Jews from all over occupied Europe to German extermination camps in Poland: Auschwitz, Bełżec, Chełmno, Majdanek, Sobibór, and Treblinka. Most prisoners were gassed on arrival.
Camps.
Auschwitz I.
Growth.
A former World War I camp for transient workers and later a Polish army barracks, Auschwitz I was the main camp ("Stammlager") and administrative headquarters of the camp complex. Fifty km southwest of Kraków, the site was first suggested in February 1940 as a quarantine camp for Polish prisoners by Arpad Wigand, the inspector of the Sicherheitspolizei (security police) and deputy of Erich von dem Bach-Zelewski, the Higher SS and Police Leader for Silesia. Richard Glücks, head of the Concentration Camps Inspectorate, sent Walter Eisfeld, former commandant of the Sachsenhausen concentration camp in Oranienburg, Germany, to inspect it. Around 1,000 m long and 400 m wide, Auschwitz consisted at the time of 22 brick buildings, eight of them two-story. A second story was added to the others in 1943 and eight new blocks were built.
Reichsführer-SS Heinrich Himmler, head of the SS, approved the site in April 1940 on the recommendation of SS-Obersturmbannführer Rudolf Höss of the camps inspectorate. Höss oversaw the development of the camp and served as its first commandant. The first 30 prisoners arrived on 20 May 1940 from the Sachsenhausen camp. German "career criminals" ("Berufsverbrecher"), the men were known as "greens" ("Grünen") after the green triangles on their prison clothing. Brought to the camp as functionaries, this group did much to establish the sadism of early camp life, which was directed particularly at Polish inmates, until the political prisoners took over their roles. Bruno Brodniewicz, the first prisoner (who was given serial number 1), became "Lagerälteste" (camp elder). The others were given positions such as "kapo" and block supervisor.
First mass transport.
The first mass transport—of 728 Polish male political prisoners, including Catholic priests and Jews—arrived on 14 June 1940 from Tarnów, Poland. They were given serial numbers 31 to 758. In a letter on 12 July 1940, Höss told Glücks that the local population was "fanatically Polish, ready to undertake any sort of operation against the hated SS men". By the end of 1940, the SS had confiscated land around the camp to create a 40-square-kilometer (15 sq mi) "zone of interest" ("Interessengebiet") patrolled by the SS, Gestapo and local police. By March 1941, 10,900 were imprisoned in the camp, most of them Poles.
An inmate's first encounter with Auschwitz, if they were registered and not sent straight to the gas chamber, was at the prisoner reception center near the gate with the "Arbeit macht frei" sign, where they were tattooed, shaved, disinfected, and given a striped prison uniform. Built between 1942 and 1944, the center contained a bathhouse, laundry, and 19 gas chambers for delousing clothes. The prisoner reception center of Auschwitz I became the visitor reception center of the Auschwitz-Birkenau State Museum.
Crematorium I, first gassings.
Construction of crematorium I began at Auschwitz I at the end of June or beginning of July 1940. Initially intended not for mass murder but for prisoners who had been executed or had otherwise died in the camp, the crematorium was in operation from August 1940 until July 1943, by which time the crematoria at Auschwitz II had taken over. By May 1942 three ovens had been installed in crematorium I, which together could burn 340 bodies in 24 hours.
The first experimental gassing took place around August 1941, when Lagerführer Karl Fritzsch, at the instruction of Rudolf Höss, murdered a group of Soviet prisoners of war by throwing Zyklon B crystals into their basement cell in block 11 of Auschwitz I. A second group of 600 Soviet prisoners of war and around 250 sick Polish prisoners were gassed on 3–5 September. The morgue was later converted to a gas chamber able to hold at least 700–800 people. Zyklon B was dropped into the room through slits in the ceiling.
First mass transport of Jews.
Historians have disagreed about the date the all-Jewish transports began arriving in Auschwitz. At the Wannsee Conference in Berlin on 20 January 1942, the Nazi leadership outlined, in euphemistic language, its plans for the Final Solution. According to Franciszek Piper, the Auschwitz commandant Rudolf Höss offered inconsistent accounts after the war, suggesting the extermination began in December 1941, January 1942, or before the establishment of the women's camp in March 1942. In "Kommandant in Auschwitz", he wrote: "In the spring of 1942 the first transports of Jews, all earmarked for extermination, arrived from Upper Silesia." On 15 February 1942, according to Danuta Czech, a transport of Jews from Beuthen, Upper Silesia (Bytom, Poland), arrived at Auschwitz I and was sent straight to the gas chamber. In 1998 an eyewitness said the train contained "the women of Beuthen". Saul Friedländer wrote that the Beuthen Jews were from the Organization Schmelt labor camps and had been deemed unfit for work. According to Christopher Browning, transports of Jews unfit for work were sent to the gas chamber at Auschwitz from autumn 1941. The evidence for this and the February 1942 transport was contested in 2015 by Nikolaus Wachsmann.
Around 20 March 1942, according to Danuta Czech, a transport of Polish Jews from Silesia and Zagłębie Dąbrowskie was taken straight from the station to the Auschwitz II gas chamber, which had just come into operation. On 26 and 28 March, two transports of Slovakian Jews were registered as prisoners in the women's camp, where they were kept for slave labour; these were the first transports organized by Adolf Eichmann's department IV B4 (the Jewish office) in the Reich Security Head Office (RSHA). On 30 March the first RHSA transport arrived from France. "Selection", where new arrivals were chosen for work or the gas chamber, began in April 1942 and was conducted regularly from July. Piper writes that this reflected Germany's increasing need for labor. Those selected as unfit for work were gassed without being registered as prisoners.
There is also disagreement about how many were gassed in Auschwitz I. Perry Broad, an "SS-Unterscharführer", wrote that "transport after transport vanished in the Auschwitz [I] crematorium." In the view of Filip Müller, one of the Auschwitz I "Sonderkommando", tens of thousands of Jews were murdered there from France, Holland, Slovakia, Upper Silesia, and Yugoslavia, and from the Theresienstadt, Ciechanow, and Grodno ghettos. Against this, Jean-Claude Pressac estimated that up to 10,000 people had been murdered in Auschwitz I. The last inmates gassed there, in December 1942, were around 400 members of the Auschwitz II "Sonderkommando", who had been forced to dig up and burn the remains of that camp's mass graves, thought to hold over 100,000 corpses.
Auschwitz II–Birkenau.
Construction.
After visiting Auschwitz I in March 1941, it appears that Himmler ordered that the camp be expanded, although Peter Hayes notes that, on 10 January 1941, the Polish underground told the Polish government-in-exile in London: "the Auschwitz concentration camp ...can accommodate approximately 7,000 prisoners at present, and is to be rebuilt to hold approximately 30,000." Construction of Auschwitz II-Birkenau—called a "Kriegsgefangenenlager" (prisoner-of-war camp) on blueprints—began in October 1941 in Brzezinka, about three kilometers from Auschwitz I. The initial plan was that Auschwitz II would consist of four sectors (Bauabschnitte I–IV), each consisting of six subcamps (BIIa–BIIf) with their own gates and fences. The first two sectors were completed (sector BI was initially a quarantine camp), but the construction of BIII began in 1943 and stopped in April 1944, and the plan for BIV was abandoned.
SS-Sturmbannführer Karl Bischoff, an architect, was the chief of construction. Based on an initial budget of RM 8.9 million, his plans called for each barracks to hold 550 prisoners, but he later changed this to 744 per barracks, which meant the camp could hold 125,000, rather than 97,000. There were 174 barracks, each measuring , divided into 62 bays of . The bays were divided into "roosts", initially for three inmates and later for four. With personal space of to sleep and place whatever belongings they had, inmates were deprived, Robert-Jan van Pelt wrote, "of the minimum space needed to exist".
The prisoners were forced to live in the barracks as they were building them; in addition to working, they faced long roll calls at night. As a result, most prisoners in BIb (the men's camp) in the early months died of hypothermia, starvation or exhaustion within a few weeks. Some 10,000 Soviet prisoners of war arrived at Auschwitz I between 7 and 25 October 1941, but by 1 March 1942 only 945 were still registered; they were transferred to Auschwitz II, where most of them had died by May.
Crematoria II–V.
The first gas chamber at Auschwitz II was operational by March 1942. On or around 20 March, a transport of Polish Jews sent by the Gestapo from Silesia and Zagłębie Dąbrowskie was taken straight from the Oświęcim freight station to the Auschwitz II gas chamber, then buried in a nearby meadow. The gas chamber was located in what prisoners called the "little red house" (known as bunker 1 by the SS), a brick cottage that had been turned into a gassing facility; the windows had been bricked up and its four rooms converted into two insulated rooms, the doors of which said "Zur Desinfektion" ("to disinfection"). A second brick cottage, the "little white house" or bunker 2, was converted and operational by June 1942. When Himmler visited the camp on 17 and 18 July 1942, he was given a demonstration of a selection of Dutch Jews, a mass-murder in a gas chamber in bunker 2, and a tour of the building site of Auschwitz III, the new IG Farben plant being constructed at Monowitz. Use of bunkers 1 and 2 stopped in spring 1943 when the new crematoria were built, although bunker 2 became operational again in May 1944 for the murder of the Hungarian Jews. Bunker 1 was demolished in 1943 and bunker 2 in November 1944.
Plans for crematoria II and III show that both had an oven room on the ground floor, and an underground dressing room and gas chamber. The dressing rooms had wooden benches along the walls and numbered pegs for clothing. Victims would be led from these rooms to a five-yard-long narrow corridor, which in turn led to a space from which the gas chamber door opened. The chambers were white inside, and nozzles were fixed to the ceiling to resemble showerheads. The daily capacity of the crematoria (how many bodies could be burned in a 24-hour period) was 340 corpses in crematorium I; 1,440 each in crematoria II and III; and 768 each in IV and V. By June 1943 all four crematoria were operational, but crematorium I was not used after July 1943. This made the total daily capacity 4,416, although by loading three to five corpses at a time, the "Sonderkommando" were able to burn some 8,000 bodies a day. This maximum capacity was rarely needed; the average between 1942 and 1944 was 1,000 bodies burned every day.
Auschwitz III–Monowitz.
After examining several sites for a new plant to manufacture Buna-N, a type of synthetic rubber essential to the war effort, the German chemical conglomerate IG Farben chose a site near the towns of Dwory and Monowice (Monowitz in German), about east of Auschwitz I. Tax exemptions were available to corporations prepared to develop industries in the frontier regions under the Eastern Fiscal Assistance Law, passed in December 1940. In addition to its proximity to the concentration camp, a source of cheap labor, the site had good railway connections and access to raw materials. In February 1941, Himmler ordered that the Jewish population of Oświęcim be expelled to make way for skilled laborers; that all Poles able to work remain in the town and work on building the factory; and that Auschwitz prisoners be used in the construction work.
Auschwitz inmates began working at the plant, known as Buna Werke and IG-Auschwitz, in April 1941, demolishing houses in Monowitz to make way for it. By May, because of a shortage of trucks, several hundred of them were rising at 3 am to walk there twice a day from Auschwitz I. Because a long line of exhausted inmates walking through the town of Oświęcim might harm German-Polish relations, the inmates were told to shave daily, make sure they were clean, and sing as they walked. From late July they were taken to the factory by train on freight wagons. Given the difficulty of moving them, including during the winter, IG Farben decided to build a camp at the plant. The first inmates moved there on 30 October 1942. Known as "KL Auschwitz III–Aussenlager" (Auschwitz III subcamp), and later as the Monowitz concentration camp, it was the first concentration camp to be financed and built by private industry.
Measuring , the camp was larger than Auschwitz I. By the end of 1944, it housed 60 barracks measuring , each with a day room and a sleeping room containing 56 three-tiered wooden bunks. IG Farben paid the SS three or four Reichsmark for nine- to eleven-hour shifts from each worker. In 1943–1944, about 35,000 inmates worked at the plant; 23,000 (32 a day on average) were killed through malnutrition, disease, and the workload. Within three to four months at the camp, Peter Hayes writes, the inmates were "reduced to walking skeletons". Deaths and transfers to the gas chambers at Auschwitz II reduced the population by nearly a fifth each month. Site managers constantly threatened inmates with the gas chambers, and the smell from the crematoria at Auschwitz I and II hung heavy over the camp.
Although the factory had been expected to begin production in 1943, shortages of labor and raw materials meant start-up was postponed repeatedly. The Allies bombed the plant in 1944 on 20 August, 13 September, 18 December, and 26 December. On 19 January 1945, the SS ordered that the site be evacuated, sending 9,000 inmates, most of them Jews, on a death march to another Auschwitz subcamp at Gliwice. From Gliwice, prisoners were taken by rail in open freight wagons to the Buchenwald and Mauthausen concentration camps. The 800 inmates who had been left behind in the Monowitz hospital were liberated along with the rest of the camp on 27 January 1945 by the 1st Ukrainian Front of the Red Army.
Subcamps.
Several other German industrial enterprises, such as Krupp and Siemens-Schuckert, built factories with their own subcamps. There were around 28 camps near industrial plants, each camp holding hundreds or thousands of prisoners. Designated as "Aussenlager" (external camp), "Nebenlager" (extension camp), "Arbeitslager" (labor camp), or "Aussenkommando" (external work detail), camps were built at Blechhammer, Jawiszowice, Jaworzno, Lagisze, Mysłowice, Trzebinia, and as far afield as the Protectorate of Bohemia and Moravia in Czechoslovakia. Industries with satellite camps included coal mines, foundries and other metal works, and chemical plants. Prisoners were also made to work in forestry and farming. For example, "Wirtschaftshof Budy", in the Polish village of Budy near Brzeszcze, was a farming subcamp where prisoners worked 12-hour days in the fields, tending animals, and making compost by mixing human ashes from the crematoria with sod and manure. Incidents of sabotage to decrease production took place in several subcamps, including Charlottengrube, Gleiwitz II, and Rajsko. Living conditions in some of the camps were so poor that they were regarded as punishment subcamps.
Life in the camps.
SS garrison.
Rudolf Höss, born in Baden-Baden in 1900, was named the first commandant of Auschwitz when Heinrich Himmler ordered on 27 April 1940 that the camp be established. Living with his wife and children in a two-story stucco house near the commandant's and administration building, he served as commandant until 11 November 1943, with Josef Kramer as his deputy. Succeeded as commandant by Arthur Liebehenschel, Höss joined the SS Business and Administration Head Office in Oranienburg as director of Amt DI, a post that made him deputy of the camps inspectorate.
Richard Baer became commandant of Auschwitz I on 11 May 1944 and Fritz Hartjenstein of Auschwitz II from 22 November 1943, followed by Josef Kramer from 15 May 1944 until the camp's liquidation in January 1945. Heinrich Schwarz was commandant of Auschwitz III from the point at which it became an autonomous camp in November 1943 until its liquidation. Höss returned to Auschwitz between 8 May and 29 July 1944 as the local SS garrison commander ("Standortältester") to oversee the arrival of Hungary's Jews, which made him the superior officer of all the commandants of the Auschwitz camps.
According to Aleksander Lasik, about 6,335 people (6,161 of them men) worked for the SS at Auschwitz over the course of the camp's existence; 4.2 percent were officers, 26.1 percent non-commissioned officers, and 69.7 percent rank and file. In March 1941, there were 700 SS guards; in June 1942, 2,000; and in August 1944, 3,342. At its peak in January 1945, 4,480 SS men and 71 SS women worked in Auschwitz; the higher number is probably attributable to the logistics of evacuating the camp. Female guards were known as SS supervisors ("SS-Aufseherinnen").
Most of the staff were from Germany or Austria, but as the war progressed, increasing numbers of "Volksdeutsche" from other countries, including Czechoslovakia, Poland, Yugoslavia, and the Baltic states, joined the SS at Auschwitz. Not all were ethnically German. Guards were also recruited from Hungary, Romania, and Slovakia. Camp guards, around three quarters of the SS personnel, were members of the "SS-Totenkopfverbände" (death's head units). Other SS staff worked in the medical or political departments, or in the economic administration, which was responsible for clothing and other supplies, including the property of dead prisoners. The SS viewed Auschwitz as a comfortable posting; being there meant they had avoided the front and had access to the victims' property.
Functionaries and "Sonderkommando".
Certain prisoners, at first non-Jewish Germans but later Jews and non-Jewish Poles, were assigned positions of authority as "Funktionshäftlinge" (functionaries), which gave them access to better housing and food. The "Lagerprominenz" (camp elite) included "Blockschreiber" (barracks clerk), "Kapo" (overseer), "Stubendienst" (barracks orderly), and "Kommandierte" (trusties). Wielding tremendous power over other prisoners, the functionaries developed a reputation as sadists. Very few were prosecuted after the war, because of the difficulty of determining which atrocities had been performed by order of the SS.
Although the SS oversaw the murders at each gas chamber, the forced labor portion of the work was done by prisoners known from 1942 as the "Sonderkommando" (special squad). These were mostly Jews but they included groups such as Soviet POWs. In 1940–1941 when there was one gas chamber, there were 20 such prisoners, in late 1943 there were 400, and by 1944 during the Holocaust in Hungary the number had risen to 874. The "Sonderkommando" removed goods and corpses from the incoming trains, guided victims to the dressing rooms and gas chambers, removed their bodies afterwards, and took their jewelry, hair, dental work, and any precious metals from their teeth, all of which was sent to Germany. Once the bodies were stripped of anything valuable, the "Sonderkommando" burned them in the crematoria.
Because they were witnesses to the mass murder, the "Sonderkommando" lived separately from the other prisoners, although this rule was not applied to the non-Jews among them. Their quality of life was further improved by their access to the property of new arrivals, which they traded within the camp, including with the SS. Nevertheless, their life expectancy was short; they were regularly murdered and replaced. About 100 survived to the camp's liquidation. They were forced on a death march and by train to the camp at Mauthausen, where three days later they were asked to step forward during roll call. No one did, and because the SS did not have their records, several of them survived.
Tattoos and triangles.
Uniquely at Auschwitz, prisoners were tattooed with a serial number, on the left breast for Soviet prisoners of war and on the left arm for civilians. Categories of prisoner were distinguishable by triangular pieces of cloth (German: "Winkel") sewn onto their jackets below their prisoner number. Political prisoners ("Schutzhäftlinge" or Sch), mostly Poles, had a red triangle, while criminals ("Berufsverbrecher" or BV) were mostly German and wore green. Asocial prisoners ("Asoziale" or Aso), which included vagrants, prostitutes and the Roma, wore black. Purple was for Jehovah's Witnesses ("Internationale Bibelforscher-Vereinigung" or IBV) and pink for gay men, who were mostly German. An estimated 5,000–15,000 gay men prosecuted under German Penal Code Section 175 (proscribing sexual acts between men) were detained in concentration camps, of whom an unknown number were sent to Auschwitz. Jews wore a yellow badge in the shape of the Star of David, overlaid by a second triangle if they also belonged to a second category. The nationality of the inmate was indicated by a letter stitched onto the cloth. A racial hierarchy existed, with German prisoners at the top, followed by non-Jewish prisoners from other countries; Jewish prisoners were at the bottom.
Transports.
Deportees were brought to Auschwitz crammed in wretched conditions into goods or cattle wagons, arriving near a railway station or at one of several dedicated trackside ramps, including one next to Auschwitz I. The "Altejudenrampe" (old Jewish ramp), part of the Oświęcim freight railway station, was used from 1942 to 1944 for Jewish transports. The ramp was located between Auschwitz I and Auschwitz II; arriving there meant a 2.5 km journey to Auschwitz II and the gas chambers. Most deportees were forced to walk, accompanied by SS men and a car with a Red Cross symbol that carried the Zyklon B, as well as an SS doctor in case officers were poisoned by mistake. Inmates arriving at night, or who were too weak to walk, were taken by truck. Work on a new railway line and ramp between sectors BI and BII in Auschwitz II was completed in May 1944 for the arrival of Hungarian Jews between May and early July 1944. The rails led directly to the area around the gas chambers.
Life for the inmates.
The day began at 4:30 am for the men (an hour later in winter), and earlier for the women, when the block supervisor sounded a gong and started beating inmates with sticks to make them wash and use the latrines quickly. Sanitary arrangements were atrocious, with few latrines and a lack of clean water. Each washhouse had to service thousands of prisoners. In sectors BIa and BIb in Auschwitz II, two buildings containing latrines and washrooms were installed in 1943. These contained troughs for washing and 90 faucets; the toilet facilities were "sewage channels" covered by concrete with 58 holes for seating. There were three barracks with washing facilities or toilets to serve 16 residential barracks in BIIa, and six washrooms/latrines for 32 barracks in BIIb, BIIc, BIId, and BIIe. Primo Levi described a 1944 Auschwitz III washroom:
Prisoners received half a liter of coffee substitute or a herbal tea in the morning, but no food. A second gong heralded roll call, when inmates lined up outside in rows of ten to be counted. No matter the weather, they had to wait for the SS to arrive for the count; how long they stood there depended on the officers' mood, and whether there had been escapes or other events attracting punishment. Guards might force the prisoners to squat for an hour with their hands above their heads or hand out beatings or detention for infractions such as having a missing button or an improperly cleaned food bowl. The inmates were counted and re-counted.
After roll call, to the sound of "Arbeitskommandos formieren" ("form work details"), prisoners walked to their place of work, five abreast, to begin a working day that was normally 11 hours long—longer in summer and shorter in winter. A prison orchestra, such as the Women's Orchestra of Auschwitz, was forced to play cheerful music as the workers left the camp. "Kapos" were responsible for the prisoners' behavior while they worked, as was an SS escort. Much of the work took place outdoors at construction sites, gravel pits, and lumber yards. No rest periods were allowed. One prisoner was assigned to the latrines to measure the time the workers took to empty their bladders and bowels.
Lunch was three-quarters of a liter of watery soup at midday, reportedly foul-tasting, with meat in the soup four times a week and vegetables (mostly potatoes and rutabaga) three times. The evening meal was 300 grams of bread, often moldy, part of which the inmates were expected to keep for breakfast the next day, with a tablespoon of cheese or marmalade, or 25 grams of margarine or sausage. Prisoners engaged in hard labor were given extra rations.
A second roll call took place at seven in the evening, in the course of which prisoners might be hanged or flogged. If a prisoner was missing, the others had to remain standing until the absentee was found or the reason for the absence discovered, even if it took hours. On 6 July 1940, roll call lasted 19 hours because a Polish prisoner, Tadeusz Wiejowski, had escaped; following an escape in 1941, a group of prisoners was picked out from the escapee's barracks and sent to block 11 to be starved to death. After roll call, prisoners retired to their blocks for the night and received their bread rations. Then they had some free time to use the washrooms and receive their mail, unless they were Jews: Jews were not allowed to receive mail. Curfew ("nighttime quiet") was marked by a gong at nine o'clock. Inmates slept in long rows of brick or wooden bunks, or on the floor, lying in and on their clothes and shoes to prevent them from being stolen. The wooden bunks had blankets and paper mattresses filled with wood shavings; in the brick barracks, inmates lay on straw. According to Miklós Nyiszli:
Sunday was not a work day, but prisoners had to clean the barracks and take their weekly shower, and were allowed to write (in German) to their families, although the SS censored the mail. Inmates who did not speak German would trade bread for help. Observant Jews tried to keep track of the Hebrew calendar and Jewish holidays, including Shabbat, and the weekly Torah portion. No watches, calendars, or clocks were permitted in the camp. Only two Jewish calendars made in Auschwitz survived to the end of the war. Prisoners kept track of the days in other ways, such as obtaining information from newcomers.
Women's camp.
About 30 percent of the registered inmates were female. The first mass transport of women, 999 non-Jewish German women from the Ravensbrück concentration camp, arrived on 26 March 1942. Classified as criminal, asocial and political, they were brought to Auschwitz as founder functionaries of the women's camp. Rudolf Höss wrote of them: "It was easy to predict that these beasts would mistreat the women over whom they exercised power ... Spiritual suffering was completely alien to them." They were given serial numbers 1–999. The women's guard from Ravensbrück, Johanna Langefeld, became the first Auschwitz women's camp "Lagerführerin". A second mass transport of women, 999 Jews from Poprad, Slovakia, arrived on the same day. According to Danuta Czech, this was the first registered transport sent to Auschwitz by the Reich Security Head Office (RSHA) office IV B4, known as the Jewish Office, led by SS "Obersturmbannführer" Adolf Eichmann. (Office IV was the Gestapo.) A third transport of 798 Jewish women from Bratislava, Slovakia, followed on 28 March.
Women were at first held in blocks 1–10 of Auschwitz I, but from 6 August 1942, 13,000 inmates were transferred to a new women's camp ("Frauenkonzentrationslager" or FKL) in Auschwitz II. This consisted at first of 15 brick and 15 wooden barracks in sector ("Bauabschnitt") BIa; it was later extended into BIb, and by October 1943 it held 32,066 women. In 1943–1944, about 11,000 women were also housed in the Gypsy family camp, as were several thousand in the Theresienstadt family camp.
Conditions in the women's camp were so poor that when a group of male prisoners arrived to set up an infirmary in October 1942, their first task, according to researchers from the Auschwitz museum, was to distinguish the corpses from the women who were still alive. Gisella Perl, a Romanian-Jewish gynecologist and inmate of the women's camp, wrote in 1948:
Langefeld was succeeded as "Lagerführerin" in October 1942 by SS "Oberaufseherin" Maria Mandl, who developed a reputation for cruelty. Höss hired men to oversee the female supervisors, first SS "Obersturmführer" Paul Müller, then SS "Hauptsturmführer" Franz Hössler. Mandl and Hössler were executed after the war. Sterilization experiments were carried out in barracks 30 by a German gynecologist, Carl Clauberg, and another German doctor, Horst Schumann.
Medical experiments, block 10.
German doctors performed a variety of experiments on prisoners at Auschwitz. SS doctors tested the efficacy of X-rays as a sterilization device by administering large doses to female prisoners. Carl Clauberg injected chemicals into women's uteruses in an effort to glue them shut. Prisoners were infected with spotted fever for vaccination research and exposed to toxic substances to study the effects. In one experiment, Bayer—then part of IG Farben—paid RM 150 each for 150 female inmates from Auschwitz (the camp had asked for RM 200 per woman), who were transferred to a Bayer facility to test an anesthetic. A Bayer employee wrote to Rudolf Höss: "The transport of 150 women arrived in good condition. However, we were unable to obtain conclusive results because they died during the experiments. We would kindly request that you send us another group of women to the same number and at the same price." The Bayer research was led at Auschwitz by Helmuth Vetter of Bayer/IG Farben, who was also an Auschwitz physician and SS captain, and by Auschwitz physicians Friedrich Entress and Eduard Wirths.
The most infamous doctor at Auschwitz was Josef Mengele, the "Angel of Death", who worked in Auschwitz II from 30 May 1943, at first in the gypsy family camp. Interested in performing research on identical twins, dwarfs, and those with hereditary disease, Mengele set up a kindergarten in barracks 29 and 31 for children he was experimenting on, and for all Romani children under six, where they were given better food rations. From May 1944, he would select twins and dwarfs from among the new arrivals during "selection", reportedly calling for twins with "Zwillinge heraus!" ("twins step forward!"). He and other doctors (the latter prisoners) would measure the twins' body parts, photograph them, and subject them to dental, sight and hearing tests, x-rays, blood tests, surgery, and blood transfusions between them. Then he would have them killed and dissected. Kurt Heissmeyer, another German doctor and SS officer, took 20 Polish Jewish children from Auschwitz to use in pseudoscientific experiments at the Neuengamme concentration camp near Hamburg, where he injected them with the tuberculosis bacilli to test a cure for tuberculosis. In April 1945, the children were murdered by hanging to conceal the project.
A Jewish skeleton collection was obtained from among a pool of 115 Jewish inmates, chosen for their perceived stereotypical racial characteristics. Rudolf Brandt and Wolfram Sievers, general manager of the "Ahnenerbe" (a Nazi research institute), delivered the skeletons to the collection of the Anatomy Institute at the Reichsuniversität Straßburg in Alsace-Lorraine. The collection was sanctioned by Heinrich Himmler and under the direction of August Hirt. Ultimately 87 of the inmates were shipped to Natzweiler-Struthof and murdered in August 1943. Brandt and Sievers were executed in 1948 after being convicted during the Doctors' trial, part of the Subsequent Nuremberg trials.
Punishment, block 11.
Prisoners could be beaten and killed by guards and "kapos" for the slightest infraction of the rules. Polish historian Irena Strzelecka writes that "kapos" were given nicknames that reflected their sadism: "Bloody", "Iron", "The Strangler", "The Boxer". Based on the 275 extant reports of punishment in the Auschwitz archives, Strzelecka lists common infractions: returning a second time for food at mealtimes, removing their own gold teeth to buy bread, breaking into the pigsty to steal the pigs' food, and putting their hands in their pockets.
Flogging during roll-call was common. A flogging table called "the goat" immobilized prisoners' feet in a box while they stretched themselves across the table. Prisoners had to count out the lashes—"25 mit besten Dank habe ich erhalten" ("25 received with many thanks")—and if they got the figure wrong, the flogging resumed from the beginning. Punishment by "the post" involved tying prisoners' hands behind their backs with chains attached to hooks, then raising the chains so the prisoners were left dangling by the wrists. If their shoulders were too damaged afterwards to work, they might be sent to the gas chamber. Prisoners were subjected to the post for helping a prisoner who had been beaten, and for picking up a cigarette butt. To extract information from inmates, guards would force their heads onto the stove and hold them there, burning their faces and eyes.
Known as block 13 until 1941, block 11 of Auschwitz I was the prison within the prison, reserved for inmates suspected of resistance activities. Cell 22 in block 11 was a windowless standing cell ("Stehbunker"). Split into four tiny sections, each held four prisoners, who entered through a hatch near the floor. There was a 5 cm × 5 cm vent for air, covered by a perforated sheet. Strzelecka writes that prisoners might have to spend several nights in cell 22; Wiesław Kielar spent four weeks in it for breaking a pipe. Several rooms in block 11 were deemed the "Polizei-Ersatz-Gefängnis Myslowitz in Auschwitz" (Auschwitz branch of the police station at Mysłowice). There were also "Sonderbehandlung" ("special treatment") cases for Poles and others regarded as dangerous to Nazi Germany.
Death wall.
The courtyard between blocks 10 and 11, known as the "death wall", served as an execution area, including for Poles in the General Government area who had been sentenced to death by a criminal court. The first executions, by shooting inmates in the back of the head, took place at the death wall on 11 November 1941, Poland's National Independence Day. The 151 accused were led to the wall one at a time, stripped naked and with their hands tied behind their backs. Danuta Czech noted that a "clandestine Catholic mass" was said the following Sunday on the second floor of Block 4 in Auschwitz I, in a narrow space between bunks.
An estimated 4,500 Polish political prisoners were executed at the death wall, including members of the camp resistance. An additional 10,000 Poles were brought to the camp to be executed without being registered. About 1,000 Soviet prisoners of war died by execution, although this is a rough estimate. A Polish government-in-exile report stated that 11,274 prisoners and 6,314 prisoners of war had been executed. Rudolf Höss wrote that "execution orders arrived in an unbroken stream". According to SS officer Perry Broad, "[s]ome of these walking skeletons had spent months in the stinking cells, where not even animals would be kept, and they could barely manage to stand straight. And yet, at that last moment, many of them shouted 'Long live Poland', or 'Long live freedom'." The dead included Colonel Jan Karcz and Major Edward Gött-Getyński, executed on 25 January 1943 with 51 others suspected of resistance activities. Józef Noji, the Polish long-distance runner, was executed on 15 February that year. In October 1944, 200 "Sonderkommando" were executed for their part in the "Sonderkommando" revolt.
Family camps.
Gypsy family camp.
A separate camp for the Roma, the "Zigeunerfamilienlager" ("Gypsy family camp"), was set up in the BIIe sector of Auschwitz II-Birkenau in February 1943. For unknown reasons, they were not subject to selection and families were allowed to stay together. The first transport of German Roma arrived on 26 February that year. There had been a small number of Romani inmates before that; two Czech Romani prisoners, Ignatz and Frank Denhel, tried to escape in December 1942, the latter successfully, and a Polish Romani woman, Stefania Ciuron, arrived on 12 February 1943 and escaped in April. Josef Mengele, the Holocaust's most infamous physician, worked in the gypsy family camp from 30 May 1943 when he began his work in Auschwitz.
The Auschwitz registry ("Hauptbücher") shows that 20,946 Roma were registered prisoners, and another 3,000 are thought to have entered unregistered. On 22 March 1943, one transport of 1,700 Polish Sinti and Roma was gassed on arrival because of illness, as was a second group of 1,035 on 25 May 1943. The SS tried to liquidate the camp on 16 May 1944, but the Roma fought them, armed with knives and iron pipes, and the SS retreated. Shortly after this, the SS removed 2,908 from the family camp to work, and on 2 August 1944 gassed the other 2,897. Ten thousand remain unaccounted for.
Theresienstadt family camp.
The SS deported around 18,000 Jews to Auschwitz from the Theresienstadt ghetto in Terezin, Czechoslovakia, beginning on 8 September 1943 with a transport of 2,293 male and 2,713 female prisoners. Placed in sector BIIb as a "family camp", they were allowed to keep their belongings, wear their own clothes, and write letters to family; they did not have their hair shaved and were not subjected to selection. Correspondence between Adolf Eichmann's office and the International Red Cross suggests that the Germans set up the camp to cast doubt on reports, in time for a planned Red Cross visit to Auschwitz, that mass murder was taking place there. The women and girls were placed in odd-numbered barracks and the men and boys in even-numbered. An infirmary was set up in barracks 30 and 32, and barracks 31 became a school and kindergarten. The somewhat better living conditions were nevertheless inadequate; 1,000 members of the family camp were dead within six months. Two other groups of 2,491 and 2,473 Jews arrived from Theresienstadt in the family camp on 16 and 20 December 1943.
On 8 March 1944, 3,791 of the prisoners (men, women and children) were sent to the gas chambers; the men were taken to crematorium III and the women later to crematorium II. Some of the group were reported to have sung Hatikvah and the Czech national anthem on the way. Before they were murdered, they had been asked to write postcards to relatives, postdated to 25–27 March. Several twins were held back for medical experiments. The Czechoslovak government-in-exile initiated diplomatic maneuvers to save the remaining Czech Jews after its representative in Bern received the Vrba-Wetzler report, written by two escaped prisoners, Rudolf Vrba and Alfred Wetzler, which warned that the remaining family-camp inmates would be gassed soon. The BBC also became aware of the report; its German service broadcast news of the family-camp murders during its women's programme on 16 June 1944, warning: "All those responsible for such massacres from top downwards will be called to account." The Red Cross visited Theresienstadt in June 1944 and were persuaded by the SS that no one was being deported from there. The following month, about 2,000 women from the family camp were selected to be moved to other camps and 80 boys were moved to the men's camp; the remaining 7,000 were gassed between 10 and 12 July.
Selection and extermination process.
Gas chambers.
The first gassings at Auschwitz took place in early September 1941, when around 850 inmates—Soviet prisoners of war and sick Polish inmates—were killed with Zyklon B in the basement of block 11 in Auschwitz I. The building proved unsuitable, so gassings were conducted instead in crematorium I, also in Auschwitz I, which operated until December 1942. There, more than 700 victims could be killed at once. Tens of thousands were killed in crematorium I. To keep the victims calm, they were told they were to undergo disinfection and de-lousing; they were ordered to undress outside, then were locked in the building and gassed. After its decommissioning as a gas chamber, the building was converted to a storage facility and later served as an SS air raid shelter. The gas chamber and crematorium were reconstructed after the war. Dwork and van Pelt write that a chimney was recreated; four openings in the roof were installed to show where the Zyklon B had entered; and two of the three furnaces were rebuilt with the original components.
In early 1942, mass exterminations were moved to two provisional gas chambers (the "red house" and "white house", known as bunkers 1 and 2) in Auschwitz II, while the larger crematoria (II, III, IV, and V) were under construction. Bunker 2 was temporarily reactivated from May to November 1944, when large numbers of Hungarian Jews were gassed. In summer 1944 the combined capacity of the crematoria and outdoor incineration pits was 20,000 bodies per day. A planned sixth facility—crematorium VI—was never built.
From 1942, Jews were being transported to Auschwitz from all over German-occupied Europe by rail, arriving in daily convoys. The gas chambers worked to their fullest capacity from May to July 1944, during the Holocaust in Hungary. A rail spur leading to crematoria II and III in Auschwitz II was completed that May, and a new ramp was built between sectors BI and BII to deliver the victims closer to the gas chambers. On 29 April the first 1,800 Jews from Hungary arrived at the camp. From 14 May until early July 1944, 437,000 Hungarian Jews, half the pre-war population, were deported to Auschwitz, at a rate of 12,000 a day for a considerable part of that period. The crematoria had to be overhauled. Crematoria II and III were given new elevators leading from the stoves to the gas chambers, new grates were fitted, and several of the dressing rooms and gas chambers were painted. Cremation pits were dug behind crematorium V. The incoming volume was so great that the "Sonderkommando" resorted to burning corpses in open-air pits as well as in the crematoria.
Selection.
According to Polish historian Franciszek Piper, of the 1,095,000 Jews deported to Auschwitz, around 205,000 were registered in the camp and given serial numbers; 25,000 were sent to other camps; and 865,000 were murdered soon after arrival. Adding non-Jewish victims gives a figure of 900,000 who were murdered without being registered.
During "selection" on arrival, those deemed able to work were sent to the right and admitted into the camp (registered), and the rest were sent to the left to be gassed. The group selected to die included almost all children, women with small children, the elderly, and others who appeared on brief and superficial inspection by an SS doctor not to be fit for work. Practically any fault—scars, bandages, boils and emaciation—might provide reason enough to be deemed unfit. Children might be made to walk toward a stick held at a certain height; those who could walk under it were selected for the gas. Inmates unable to walk or who arrived at night were taken to the crematoria on trucks; otherwise the new arrivals were marched there. Their belongings were seized and sorted by inmates in the "Kanada" warehouses, an area of the camp in sector BIIg that housed 30 barracks used as storage facilities for plundered goods; it derived its name from the inmates' view of Canada as a land of plenty.
Inside the crematoria.
The crematoria consisted of a dressing room, gas chamber, and furnace room. In crematoria II and III, the dressing room and gas chamber were underground; in IV and V, they were on the ground floor. The dressing room had numbered hooks on the wall to hang clothes. In crematorium II, there was also a dissection room ("Sezierraum"). SS officers told the victims they had to take a shower and undergo delousing. The victims undressed in the dressing room and walked into the gas chamber; signs said "Bade" (bath) or "Desinfektionsraum" (disinfection room). A former prisoner testified that the language of the signs changed depending on who was being killed. Some inmates were given soap and a towel. A gas chamber could hold up to 2,000; one former prisoner said it was around 3,000.
The Zyklon B was delivered to the crematoria by a special SS bureau known as the Hygiene Institute. After the doors were shut, SS men dumped in the Zyklon B pellets through vents in the roof or holes in the side of the chamber. The victims were usually dead within 10 minutes; Rudolf Höss testified that it took up to 20 minutes. Leib Langfus, a member of the "Sonderkommando", buried his diary (written in Yiddish) near crematorium III in Auschwitz II. It was found in 1952, signed "A.Y.R.A":
Use of corpses.
"Sonderkommando" wearing gas masks dragged the bodies from the chamber. They removed glasses and artificial limbs and shaved off the women's hair; women's hair was removed before they entered the gas chamber at Bełżec, Sobibór, and Treblinka, but at Auschwitz it was done after death. By 6 February 1943, the Reich Economic Ministry had received 3,000 kg of women's hair from Auschwitz and Majdanek. The hair was first cleaned in a solution of sal ammoniac, dried on the brick floor of the crematoria, combed, and placed in paper bags. The hair was shipped to various companies, including one manufacturing plant in Bremen-Bluementhal, where workers found tiny coins with Greek letters on some of the braids, possibly from some of the 50,000 Greek Jews deported to Auschwitz in 1943. When they liberated the camp in January 1945, the Red Army found 7,000 kg of human hair in bags ready to ship.
Just before cremation, jewelry was removed, along with dental work and teeth containing precious metals. Gold was removed from the teeth of dead prisoners from 23 September 1940 onwards by order of Heinrich Himmler. The work was carried out by members of the "Sonderkommando" who were dentists; anyone overlooking dental work might themselves be cremated alive. The gold was sent to the SS Health Service and used by dentists to treat the SS and their families; 50 kg had been collected by 8 October 1942. By early 1944, 10–12 kg of gold were being extracted monthly from victims' teeth.
The corpses were burned in the nearby incinerators, and the ashes were buried, thrown in the Vistula river, or used as fertilizer. Any bits of bone that had not burned properly were ground down in wooden mortars.
Death toll.
At least 1.3 million people were sent to Auschwitz between 1940 and 1945, and at least 1.1 million died. Overall 400,207 prisoners were registered in the camp: 268,657 male and 131,560 female. A study in the late 1980s by Polish historian Franciszek Piper, published by Yad Vashem in 1991, used timetables of train arrivals combined with deportation records to calculate that, of the 1.3 million sent to the camp, 1,082,000 had died there, a figure (rounded up to 1.1 million) that Piper regarded as a minimum. That figure came to be widely accepted.
The Germans tried to conceal how many they had murdered. In July 1942, according to Rudolf Höss's post-war memoir, Höss received an order from Heinrich Himmler, via Adolf Eichmann's office and SS commander Paul Blobel, that "[a]ll mass graves were to be opened and the corpses burned. In addition the ashes were to be disposed of in such a way that it would be impossible at some future time to calculate the number of corpses burned."
Earlier estimates of the death toll were higher than Piper's. Following the camp's liberation, the Soviet government issued a statement, on 8 May 1945, that four million people had been murdered on the site, a figure based on the capacity of the crematoria. Höss told prosecutors at Nuremberg that at least 2,500,000 people had been gassed there, and that another 500,000 had died of starvation and disease. He testified that the figure of over two million had come from Eichmann. In his memoirs, written in custody, Höss wrote that Eichmann had given the figure of 2.5 million to Höss's superior officer Richard Glücks, based on records that had been destroyed. Höss regarded this figure as "far too high. Even Auschwitz had limits to its destructive possibilities," he wrote.
Around one in six Jews murdered in the Holocaust died in Auschwitz. By nation, the greatest number of Auschwitz's Jewish victims originated from Hungary, accounting for 430,000 deaths, followed by Poland (300,000), France (69,000), Netherlands (60,000), Greece (55,000), Protectorate of Bohemia and Moravia (46,000), Slovakia (27,000), Belgium (25,000), Germany and Austria (23,000), Yugoslavia (10,000), Italy (7,500), Norway (690), and others (34,000). Timothy Snyder writes that fewer than one percent of the million Soviet Jews murdered in the Holocaust were murdered in Auschwitz. Of the at least 387 Jehovah's Witnesses who were imprisoned at Auschwitz, 132 died in the camp.
Resistance, escapes, and liberation.
Camp resistance, flow of information.
Information about Auschwitz became available to the Allies as a result of reports by Captain Witold Pilecki of the Polish Home Army who, as "Tomasz Serafiński" (serial number 4859), allowed himself to be arrested in Warsaw and taken to Auschwitz. He was imprisoned there from 22 September 1940 until his escape on 27 April 1943. Michael Fleming writes that Pilecki was instructed to sustain morale, organize food, clothing and resistance, prepare to take over the camp if possible, and smuggle information out to the Polish military. Pilecki called his resistance movement Związek Organizacji Wojskowej (ZOW, "Union of Military Organization").
The resistance sent out the first oral message about Auschwitz with Aleksander Wielkopolski, a Polish engineer who was released in October 1940. The following month the Polish underground in Warsaw prepared a report on the basis of that information, "The camp in Auschwitz", part of which was published in London in May 1941 in a booklet, "The German Occupation of Poland", by the Polish Ministry of Foreign Affairs. The report said of the Jews in the camp that "scarcely any of them came out alive". According to Fleming, the booklet was "widely circulated amongst British officials". The "Polish Fortnightly Review" based a story on it, writing that "three crematorium furnaces were insufficient to cope with the bodies being cremated", as did "The Scotsman" on 8 January 1942, the only British news organization to do so.
On 24 December 1941, the resistance groups representing the various prisoner factions met in block 45 and agreed to cooperate. Fleming writes that it has not been possible to track Pilecki's early intelligence from the camp. Pilecki compiled two reports after he escaped in April 1943; the second, Raport W, detailed his life in Auschwitz I and estimated that 1.5 million people, mostly Jews, had been murdered. On 1 July 1942, the "Polish Fortnightly Review" published a report describing Birkenau, writing that "prisoners call this supplementary camp 'Paradisal', presumably because there is only one road, leading to Paradise". Reporting that inmates were being killed "through excessive work, torture and medical means", it noted the gassing of the Soviet prisoners of war and Polish inmates in Auschwitz I in September 1941, the first gassing in the camp. It said: "It is estimated that the Oswiecim camp can accommodate fifteen thousand prisoners, but as they die on a mass scale there is always room for new arrivals."
The Polish government-in-exile in London first reported the gassing of prisoners in Auschwitz on 21 July 1942, and reported the gassing of Soviet POWs and Jews on 4 September 1942. In 1943, the "Kampfgruppe Auschwitz" (Combat Group Auschwitz) was organized within the camp with the aim of sending out information about what was happening. The "Sonderkommando" buried notes in the ground, hoping they would be found by the camp's liberators. The group also smuggled out photographs; the "Sonderkommando" photographs, of events around the gas chambers in Auschwitz II, were smuggled out of the camp in September 1944 in a toothpaste tube.
According to Fleming, the British press responded, in 1943 and the first half of 1944, either by not publishing reports about Auschwitz or by burying them on the inside pages. The exception was the "Polish Jewish Observer", a "City and East London Observer" supplement edited by Joel Cang, a former Warsaw correspondent for the "Manchester Guardian". The British reticence stemmed from a Foreign Office concern that the public might pressure the government to respond or provide refuge for the Jews, and that British actions on behalf of the Jews might affect its relationships in the Middle East. There was similar reticence in the United States, and indeed within the Polish government-in-exile and the Polish resistance. According to Fleming, the scholarship suggests that the Polish resistance distributed information about the Holocaust in Auschwitz without challenging the Allies' reluctance to highlight it.
Escapes, "Auschwitz Protocols".
From the first escape on 6 July 1940 of Tadeusz Wiejowski, at least 802 prisoners (757 men and 45 women) tried to escape from the camp, according to Polish historian Henryk Świebocki. He writes that most escapes were attempted from work sites outside the camp's perimeter fence. Of the 802 escapes, 144 were successful, 327 were caught, and the fate of 331 is unknown.
Four Polish prisoners—Eugeniusz Bendera (serial number 8502), Kazimierz Piechowski (no. 918), Stanisław Gustaw Jaster (no. 6438), and Józef Lempart (no. 3419)—escaped successfully on 20 June 1942. After breaking into a warehouse, three of them dressed as SS officers and stole rifles and an SS staff car, which they drove out of the camp with the fourth handcuffed as a prisoner. They wrote later to Rudolf Höss apologizing for the loss of the vehicle. On 21 July 1944, Polish inmate Jerzy Bielecki dressed in an SS uniform and, using a faked pass, managed to cross the camp's gate with his Jewish girlfriend, Cyla Cybulska, pretending that she was wanted for questioning. Both survived the war. For having saved her, Bielecki was recognized by Yad Vashem as Righteous Among the Nations.
Jerzy Tabeau (no. 27273, registered as Jerzy Wesołowski) and Roman Cieliczko (no. 27089), both Polish prisoners, escaped on 19 November 1943; Tabeau made contact with the Polish underground and, between December 1943 and early 1944, wrote what became known as the "Polish Major's report" about the situation in the camp. On 27 April 1944, Rudolf Vrba (no. 44070) and Alfréd Wetzler (no. 29162) escaped to Slovakia, carrying detailed information to the Slovak Jewish Council about the gas chambers. The distribution of the Vrba-Wetzler report, and publication of parts of it in June 1944, helped to halt the deportation of Hungarian Jews to Auschwitz. On 27 May 1944, Arnost Rosin (no. 29858) and Czesław Mordowicz (no. 84216) also escaped to Slovakia; the Rosin-Mordowicz report was added to the Vrba-Wetzler and Tabeau reports to become what is known as the "Auschwitz Protocols". The reports were first published in their entirety in November 1944 by the United States War Refugee Board as "The Extermination Camps of Auschwitz (Oświęcim) and Birkenau in Upper Silesia".
Bombing proposal.
In January 1941, the Commander-in-Chief of the Polish Army and prime minister-in-exile, Władysław Sikorski, arranged for a report to be forwarded to Air Marshal Richard Pierse, head of RAF Bomber Command. Written by Auschwitz prisoners in or around December 1940, the report described the camp's atrocious living conditions and asked the Polish government-in-exile to bomb it:
Pierse replied that it was not technically feasible to bomb the camp without harming the prisoners. In May 1944 Slovak rabbi Michael Dov Weissmandl suggested that the Allies bomb the rails leading to the camp. Historian David Wyman published an essay in "Commentary" in 1978 entitled "Why Auschwitz Was Never Bombed", arguing that the United States Army Air Forces could and should have attacked Auschwitz. In his book "The Abandonment of the Jews: America and the Holocaust 1941–1945" (1984), Wyman argued that, since the IG Farben plant at Auschwitz III had been bombed three times between August and December 1944 by the US Fifteenth Air Force in Italy, it would have been feasible for the other camps or railway lines to be bombed too. Bernard Wasserstein's "Britain and the Jews of Europe" (1979) and Martin Gilbert's "Auschwitz and the Allies" (1981) raised similar questions about British inaction. Since the 1990s, other historians have argued that Allied bombing accuracy was not sufficient for Wyman's proposed attack, and that counterfactual history is an inherently problematic endeavor.
"Sonderkommando" revolt.
The "Sonderkommando" who worked in the crematoria were witnesses to the mass murder and were therefore regularly murdered themselves. On 7 October 1944, following an announcement that 300 of them were to be sent to a nearby town to clear away rubble—"transfers" were a common ruse for the murder of prisoners—the group, mostly Jews from Greece and Hungary, staged an uprising. They attacked the SS with stones and hammers, killing three of them, and set crematorium IV on fire with rags soaked in oil that they had hidden. Hearing the commotion, the "Sonderkommando" at crematorium II believed that a camp uprising had begun and threw their "Oberkapo" into a furnace. After escaping through a fence using wirecutters, they managed to reach Rajsko, where they hid in the granary of an Auschwitz satellite camp, but the SS pursued and killed them by setting the granary on fire.
By the time the rebellion at crematorium IV had been suppressed, 212 members of the "Sonderkommando" were still alive and 451 had been killed. The dead included Zalmen Gradowski, who kept notes of his time in Auschwitz and buried them near crematorium III; after the war, another "Sonderkommando" member showed the prosecutors where to dig. The notes were published in several formats, including in 2017 as "From the Heart of Hell".
Evacuation and death marches.
The last mass transports to arrive in Auschwitz were 60,000–70,000 Jews from the Łódź Ghetto, some 2,000 from Theresienstadt, and 8,000 from Slovakia. The last selection took place on 30 October 1944. On 1 or 2 November 1944, Heinrich Himmler ordered the SS to halt the mass murder by gas. On 25 November, he ordered that Auschwitz's gas chambers and crematoria be destroyed. The "Sonderkommando" and other prisoners began the job of dismantling the buildings and cleaning up the site. On 18 January 1945, Engelbert Marketsch, a German criminal transferred from Mauthausen, became the last prisoner to be assigned a serial number in Auschwitz, number 202499.
According to Polish historian Andrzej Strzelecki, the evacuation of the camp was one of its "most tragic chapters". Himmler ordered the evacuation of all camps in January 1945, telling camp commanders: "The Führer holds you personally responsible for ... making sure that not a single prisoner from the concentration camps falls alive into the hands of the enemy." The plundered goods from the "Kanada" barracks, together with building supplies, were transported to the German interior. Between 1 December 1944 and 15 January 1945, over one million items of clothing were packed to be shipped out of Auschwitz; 95,000 such parcels were sent to concentration camps in Germany.
Beginning on 17 January, some 58,000 Auschwitz detainees (about two-thirds Jews)—over 20,000 from Auschwitz I and II and over 30,000 from the subcamps—were evacuated under guard, at first heading west on foot, then by open-topped freight trains, to concentration camps in Germany and Austria: Bergen-Belsen, Buchenwald, Dachau, Flossenbürg, Gross-Rosen, Mauthausen, Dora-Mittelbau, Ravensbrück, and Sachsenhausen. Fewer than 9,000 remained in the camps, deemed too sick to move. During the marches, the SS shot or otherwise dispatched anyone unable to continue; "execution details" followed the marchers, killing prisoners who lagged behind. Peter Longerich estimated that a quarter of the detainees were thus killed. By December 1944 some 15,000 Jewish prisoners had made it from Auschwitz to Bergen-Belsen, where they were liberated by the British on 15 April 1945.
On 20 January, crematoria II and III were blown up, and on 23 January the "Kanada" warehouses were set on fire; they apparently burned for five days. Crematorium IV had been partly demolished after the "Sonderkommando" revolt in October, and the rest of it was destroyed later. On 26 January, one day ahead of the Red Army's arrival, crematorium V was blown up.
Liberation.
The first in the camp complex to be liberated was Auschwitz III, the IG Farben camp at Monowitz; a soldier from the 100th Infantry Division of the Red Army entered the camp around 9 am on Saturday, 27 January 1945. The 60th Army of the 1st Ukrainian Front (also part of the Red Army) arrived in Auschwitz I and II around 3 pm. They found 7,000 prisoners alive in the three main camps, 500 in the other subcamps, and over 600 corpses. Items found included 837,000 women's garments, 370,000 men's suits, 44,000 pairs of shoes, and 7,000 kg of human hair, estimated by the Soviet war crimes commission to have come from 140,000 people. Some of the hair was examined by the Forensic Science Institute in Kraków, where it was found to contain traces of hydrogen cyanide, the main ingredient of Zyklon B. Primo Levi described seeing the first four soldiers on horseback approach Auschwitz III, where he had been in the sick bay. They threw "strangely embarrassed glances at the sprawling bodies, at the battered huts and at us few still alive ...":
Georgii Elisavetskii, a Soviet soldier who entered one of the barracks, said in 1980 that he could hear other soldiers telling the inmates: "You are free, comrades!" But they did not respond, so he tried in Russian, Polish, German, Ukrainian. Then he used some Yiddish: "They think that I am provoking them. They begin to hide. And only when I said to them: 'Do not be afraid, I am a colonel of Soviet Army and a Jew. We have come to liberate you' ... Finally, as if the barrier collapsed ... they rushed toward us shouting, fell on their knees, kissed the flaps of our overcoats, and threw their arms around our legs."
The Soviet military medical service and Polish Red Cross (PCK) set up field hospitals that looked after 4,500 prisoners suffering from the effects of starvation (mostly diarrhea) and tuberculosis. Local volunteers helped until the Red Cross team arrived from Kraków in early February. In Auschwitz II, the layers of excrement on the barracks floors had to be scraped off with shovels. Water was obtained from snow and from fire-fighting wells. Before more help arrived, 2,200 patients there were looked after by a few doctors and 12 PCK nurses. All the patients were later moved to the brick buildings in Auschwitz I, where several blocks became a hospital, with medical personnel working 18-hour shifts.
The liberation of Auschwitz received little press attention at the time; the Red Army was focusing on its advance toward Germany and liberating the camp had not been one of its key aims. Boris Polevoi reported on the liberation in "Pravda" on 2 February 1945 but made no mention of Jews; inmates were described collectively as "victims of Fascism". It was when the Western Allies arrived in Buchenwald, Bergen-Belsen, and Dachau in April 1945 that the liberation of the camps received extensive coverage.
After the war.
Trials of war criminals.
Only 789 Auschwitz staff, up to 15 percent, ever stood trial; most of the cases were pursued in Poland and the Federal Republic of Germany. According to Aleksander Lasik, female SS officers were treated more harshly than male; of the 17 women sentenced, four received the death penalty and the others longer prison terms than the men. He writes that this may have been because there were only 200 women overseers, and therefore they were more visible and memorable to the inmates.
Camp commandant Rudolf Höss was arrested by the British on 11 March 1946 near Flensburg, northern Germany, where he had been working as a farmer under the pseudonym Franz Lang. He was imprisoned in Heide, then transferred to Minden, in the British occupation zone, for interrogation. From there he was taken to Nuremberg to testify for the defense in the trial of "SS-Obergruppenführer" Ernst Kaltenbrunner. Höss was straightforward about his own role in the mass murder and said he had followed the orders of Heinrich Himmler. Extradited to Poland on 25 May 1946, he wrote his memoirs in custody, first published in Polish in 1951 then in German in 1958 as "Kommandant in Auschwitz". His trial before the Supreme National Tribunal in Warsaw opened on 11 March 1947; he was sentenced to death on 2 April and hanged in Auschwitz I on 16 April, near crematorium I.
On 25 November 1947, the Auschwitz trial began in Kraków, when Poland's Supreme National Tribunal brought to court 40 former Auschwitz staff, including commandant Arthur Liebehenschel, women's camp leader Maria Mandel, and camp leader Hans Aumeier. The trials ended on 22 December 1947, with 23 death sentences, seven life sentences, and nine prison sentences ranging from three to 15 years. Hans Münch, an SS doctor who had several former prisoners testify on his behalf, was the only person to be acquitted.
Other former staff were hanged for war crimes in the Dachau Trials and the Belsen Trial, including camp leaders Josef Kramer, Franz Hössler, and Vinzenz Schöttl; doctor Friedrich Entress; and guards Irma Grese and Elisabeth Volkenrath. Bruno Tesch and Karl Weinbacher, the owner and chief executive officer of the firm Tesch & Stabenow, one of the suppliers of Zyklon B, were arrested by the British after the war and executed for knowingly supplying the chemical for use on humans. The 180-day Frankfurt Auschwitz trials, held in West Germany from 20 December 1963 to 20 August 1965, tried 22 defendants, including two dentists, a doctor, two camp adjutants and the camp's pharmacist. The 700-page indictment, presenting the testimony of 254 witnesses, was accompanied by a 300-page report about the camp, "Nationalsozialistische Konzentrationslager", written by historians from the "Institut für Zeitgeschichte" in Germany, including Martin Broszat and Helmut Krausnick. The report became the basis of their book, "Anatomy of the SS State" (1968), the first comprehensive study of the camp and the SS. The court convicted 19 of the defendants, giving six of them life sentences and the others between three and ten years. East Germany also tried several former staff members of Auschwitz, among them Horst Fischer, one of the highest-ranking SS physicians in the camp, who had personally selected at least 75,000 men, women, and children to be gassed. Arrested in 1965, he was convicted the following year of crimes against humanity, sentenced to death, and guillotined; Fischer was the highest-ranking SS physician from Auschwitz ever to be tried by a German court.
Legacy.
In the decades since its liberation, Auschwitz has become a primary symbol of the Holocaust. Historian Timothy D. Snyder attributes this to the camp's high death toll and "unusual combination of an industrial camp complex and a killing facility", which left behind far more witnesses than single-purpose killing facilities such as Chełmno or Treblinka. Seweryna Szmaglewska's 1945 autobiography "Dymy nad Birkenau" ("Smoke over Birkenau") has been credited with spreading knowledge about the camp to the general public. In 2005 the United Nations General Assembly designated 27 January, the date of the camp's liberation, as International Holocaust Remembrance Day. Helmut Schmidt visited the site in November 1977, the first West German chancellor to do so, followed by his successor, Helmut Kohl, in November 1989. In a statement on the 50th anniversary of the liberation, Kohl said that "[t]he darkest and most awful chapter in German history was written at Auschwitz." In January 2020, world leaders gathered at Yad Vashem in Jerusalem to commemorate the 75th anniversary. It was the city's largest-ever political gathering, with over 45 heads of state and world leaders, including royalty. At Auschwitz itself, Reuven Rivlin and Andrzej Duda, the presidents of Israel and Poland, laid wreaths.
Notable memoirists of the camp include Primo Levi, Elie Wiesel, and Tadeusz Borowski. Levi's "If This is a Man", first published in Italy in 1947 as "Se questo è un uomo", became a classic of Holocaust literature, an "imperishable masterpiece". Wiesel wrote about his imprisonment at Auschwitz in "Night" (1960) and other works, and became a prominent spokesman against ethnic violence; in 1986, he was awarded the Nobel Peace Prize. Camp survivor Simone Veil was elected President of the European Parliament, serving from 1979 to 1982. Two Auschwitz victims—Maximilian Kolbe, a priest who volunteered to die by starvation in place of a stranger, and Edith Stein, a Jewish convert to Catholicism—were named saints of the Catholic Church.
In 2017, a Körber Foundation survey found that 40 percent of 14-year-olds in Germany did not know what Auschwitz was. The following year a survey organized by the Claims Conference, United States Holocaust Memorial Museum and others found that 41 percent of 1,350 American adults surveyed, and 66 percent of millennials, did not know what Auschwitz was, while 22 percent said they had never heard of the Holocaust. A CNN-ComRes poll in 2018 found a similar situation in Europe.
Auschwitz-Birkenau State Museum.
On 2 July 1947, the Polish government passed a law establishing a state memorial to remember "the martyrdom of the Polish nation and other nations in Oswiecim". The museum established its exhibits at Auschwitz I; after the war, the barracks in Auschwitz II-Birkenau had been mostly dismantled and moved to Warsaw to be used on building sites. Dwork and van Pelt write that, in addition, Auschwitz I played a more central role in the persecution of the Polish people, in opposition to the importance of Auschwitz II to the Jews, including Polish Jews. An exhibition opened in Auschwitz I in 1955, displaying prisoner mug shots; hair, suitcases, and shoes taken from murdered prisoners; canisters of Zyklon B pellets; and other objects related to the killings. UNESCO added the camp to its list of World Heritage Sites in 1979. All the museum's directors were, until 1990, former Auschwitz prisoners. Visitors to the site have increased from 492,500 in 2001, to over one million in 2009, to two million in 2016.
There have been protracted disputes over the perceived Christianization of the site. Pope John Paul II celebrated mass over the train tracks leading to Auschwitz II-Birkenau on 7 June 1979 and called the camp "the Golgotha of our age", referring to the crucifixion of Jesus. More controversy followed when Carmelite nuns founded a convent in 1984 in a former theater outside the camp's perimeter, near block 11 of Auschwitz I, after which a local priest and some survivors erected a large cross—one that had been used during the pope's mass—behind block 11 to commemorate 152 Polish inmates shot by the Germans in 1941. After a long dispute, Pope John Paul II intervened and the nuns moved the convent elsewhere in 1993. The cross remained, triggering the "War of the Crosses", as more crosses were erected to commemorate Christian victims, despite international objections. The Polish government and Catholic Church eventually agreed to remove all but the original.
On 4 September 2003, despite a protest from the museum, three Israeli Air Force F-15 Eagles performed a fly-over of Auschwitz II-Birkenau during a ceremony at the camp below. All three pilots were descendants of Holocaust survivors, including the man who led the flight, Major-General Amir Eshel. On 27 January 2015, some 300 Auschwitz survivors gathered with world leaders under a giant tent at the entrance to Auschwitz II to commemorate the 70th anniversary of the camp's liberation.
Museum curators consider visitors who pick up items from the ground to be thieves, and local police will charge them as such; the maximum penalty is a 10-year prison sentence. In 2017 two British youths from the Perse School were fined in Poland after picking up buttons and shards of decorative glass in 2015 from the "Kanada" area of Auschwitz II, where camp victims' personal effects were stored. The "Arbeit Macht Frei" sign over the main camp's gate was stolen in December 2009 by a Swedish former neo-Nazi and two Polish men. The sign was later recovered.
In 2018 the Polish government passed an amendment to its Act on the Institute of National Remembrance, making it a criminal offence to violate the "good name" of Poland by accusing it of crimes committed by Germany in the Holocaust, which would include referring to Auschwitz and other camps as "Polish death camps". Staff at the museum were accused by nationalist media in Poland of focusing too much on the fate of the Jews in Auschwitz at the expense of ethnic Poles. The brother of the museum's director, Piotr Cywiński, wrote that Cywiński had experienced "50 days of incessant hatred". After discussions with Israel's prime minister, amid international concern that the new law would stifle research, the Polish government adjusted the amendment so that anyone accusing Poland of complicity would be guilty only of a civil offence.
Archery is the sport, practice, or skill of using a bow to shoot arrows. The word comes from the Latin "arcus", meaning bow. Historically, archery has been used for hunting and combat. In modern times, it is mainly a competitive sport and recreational activity. A person who practices archery is typically called an archer, bowman, or toxophilite.
History.
Origins and ancient archery.
The oldest known evidence of the bow and arrow comes from South African sites such as Sibudu Cave, where the remains of bone and stone arrowheads have been found dated to approximately 72,000 to 60,000 years ago.
Based on indirect evidence, the bow also seems to have appeared or reappeared later in Eurasia, near the transition from the Upper Paleolithic to the Mesolithic. The earliest definite remains of bow and arrow from Europe are possible fragments from Germany found at Mannheim-Vogelstang, dated 17,500 to 18,000 years ago, and at Stellmoor, dated 11,000 years ago. Azilian points found in Grotte du Bichon, Switzerland, alongside the remains of both a bear and a hunter, with flint fragments found in the bear's third vertebra, suggest the use of arrows 13,500 years ago. Other signs of its use in Europe come from the Stellmoor site in the north of Hamburg, Germany, and date from the late Paleolithic, about 10,000–9000 BC. The arrows were made of pine and consisted of a main shaft and a fore shaft with a flint point. There are no definite earlier bows; previous pointed shafts are known, but may have been launched by spear-throwers rather than bows. The oldest bows known so far come from the Holmegård swamp in Denmark.
At the site of Nataruk in Turkana County, Kenya, obsidian bladelets found embedded in a skull and within the thoracic cavity of another skeleton suggest the use of stone-tipped arrows as weapons about 10,000 years ago.
Bows eventually replaced the spear-thrower as the predominant means for launching shafted projectiles, on every continent except Australasia, though spear-throwers persisted alongside the bow in parts of the Americas, notably Mexico and among the Inuit.
Bows and arrows have been present in Egyptian and neighbouring Nubian culture since their respective predynastic and Pre-Kerma origins. In the Levant, artifacts that could be arrow-shaft straighteners are known from the Natufian culture (c. 10,800–8,300 BC) onwards. The Khiamian and PPN A shouldered Khiam-points may well be arrowheads.
Classical civilizations, notably the Assyrians, Greeks, Armenians, Persians, Parthians, Romans, Indians, Koreans, Chinese, and Japanese fielded large numbers of archers in their armies. Akkadians were the first to use composite bows in war, according to the victory stele of Naram-Sin of Akkad. Egyptians referred to Nubia as "Ta-Seti", or "The Land of the Bow", since the Nubians were known to be expert archers, and by the 16th century BC Egyptians were using the composite bow in warfare. The Bronze Age Aegean cultures were able to deploy a number of state-owned specialized bow makers for warfare and hunting purposes from the 15th century BC onwards. The Welsh longbow proved its worth for the first time in Continental warfare at the Battle of Crécy. In the Americas, archery was widespread at the time of European contact.
Archery was highly developed in Asia. The Sanskrit term for archery, dhanurvidya, came to refer to martial arts in general. In East Asia, Goguryeo, one of the Three Kingdoms of Korea was well known for its regiments of exceptionally skilled archers.
Medieval archery.
The medieval shortbow was technically identical to bows of the classical era, having a range of approximately . It was the primary ranged weapon of the battlefield through the early medieval period. Around the tenth century, the crossbow was introduced in Europe. Crossbows generally had a longer range, greater accuracy, and more penetration than the shortbow, but suffered from a much slower rate of fire. Crossbows were used in the early Crusades, with models having a range of and being able to penetrate armour or kill a horse.
During the late medieval period the English army famously relied on massed archers armed with the longbow; the French army relied more on the crossbow. Like their predecessors, archers were more likely to be peasants or yeomen than men-at-arms. The longbow had a range of up to . However, its lack of accuracy at long ranges made it a mass weapon rather than an individual one. Significant victories attributable to the longbow, such as the Battle of Crécy and the Battle of Agincourt, resulted in the English longbow becoming part of military lore.
Mounted archery.
Tribesmen of Central Asia (after the domestication of the horse) and American Plains Indians (after gaining access to horses from Europeans) became extremely adept at archery on horseback. Lightly armoured but highly mobile archers were excellently suited to warfare in the Central Asian steppes, and they formed a large part of armies that repeatedly conquered large areas of Eurasia. Shorter bows are more suited to use on horseback, and the composite bow enabled mounted archers to use powerful weapons. Seljuk Turks used mounted archers against the European First Crusade, especially at the Battle of Dorylaeum (1097). Their tactic was to shoot at the enemy infantry and use their superior mobility to prevent the enemy from closing with them. Empires throughout the Eurasian landmass often strongly associated their respective "barbarian" counterparts with the use of the bow and arrow, to the point where powerful states like the Han dynasty referred to their neighbours, the Xiong-nu, as "Those Who Draw the Bow". The Xiong-nu's mounted bowmen, for example, made them more than a match for the Han military, and their threat was at least partially responsible for Chinese expansion into the Ordos region, to create a stronger, more powerful buffer zone against them. It is possible that "barbarian" peoples were responsible for introducing archery or certain types of bows to their "civilized" counterparts; the Xiong-nu and the Han are one example. Similarly, short bows seem to have been introduced to Japan by northeast Asian groups.
Decline of archery.
The development of firearms rendered bows obsolete in warfare, although efforts were sometimes made to preserve archery practice. In England and Wales, for example, the government tried to enforce practice with the longbow until the end of the 16th century. This was because it was recognized that the bow had been instrumental to military success during the Hundred Years' War. Despite the high social status, ongoing utility, and widespread pleasure of archery in Armenia, China, Egypt, England and Wales, the Americas, India, Japan, Korea, Turkey and elsewhere, almost every culture that gained access to even early firearms used them widely, to the neglect of archery. Early firearms were inferior in rate of fire and were very sensitive to wet weather. However, they had longer effective range and were tactically superior in the common situation of soldiers shooting at each other from behind obstructions. They also required significantly less training to use properly; in particular, they could penetrate steel armor without the user needing to develop special musculature. Armies equipped with guns could thus provide superior firepower, and highly trained archers became obsolete on the battlefield. However, the bow and arrow is still an effective weapon, and archers have seen military action in the 21st century. Traditional archery remains in use for sport, and for hunting in many areas.
18th century revival as a sport.
Early recreational archery societies included the Finsbury Archers and the Ancient Society of Kilwinning Archers. The latter's annual Papingo event was first recorded in 1483. (In this event, archers shoot vertically from the base of an abbey tower to dislodge a wood pigeon placed approximately above.) The Royal Company of Archers was formed in 1676 and is one of the oldest sporting bodies in the world. Archery remained a small and scattered pastime, however, until the late 18th century when it experienced a fashionable revival among the aristocracy. Sir Ashton Lever, an antiquarian and collector, formed the Toxophilite Society in London in 1781, with the patronage of George, the Prince of Wales.
Archery societies were set up across the country, each with its own strict entry criteria and outlandish costumes. Recreational archery soon took the form of extravagant social and ceremonial events for the nobility, complete with flags, music and 21-gun salutes for the competitors. The clubs were "the drawing rooms of the great country houses placed outside" and thus came to play an important role in the social networks of the local upper class. As well as its emphasis on display and status, the sport was notable for its popularity with women. Young women could not only compete in the contests but also retain and show off their sexuality while doing so. Thus, archery came to act as a forum for introductions, flirtation and romance. It was often consciously styled in the manner of a medieval tournament, with titles and laurel wreaths being presented as a reward to the victor. General meetings were held from 1789, in which local lodges convened together to standardise the rules and ceremonies. Archery was also co-opted as a distinctively British tradition, dating back to the lore of Robin Hood, and it served as a patriotic form of entertainment at a time of political tension in Europe. The societies were also elitist, and the new middle-class bourgeoisie were excluded from the clubs due to their lack of social status.
After the Napoleonic Wars, the sport became increasingly popular among all classes, and it was framed as a nostalgic reimagining of preindustrial rural Britain. Particularly influential was Sir Walter Scott's 1819 novel "Ivanhoe", which depicted the heroic character Locksley winning an archery tournament.
A modern sport.
The 1840s saw a second attempt at turning the recreation into a modern sport. The first Grand National Archery Society meeting was held in York in 1844, and over the next decade the extravagant and festive practices of the past were gradually whittled away and the rules were standardized as the 'York Round', a series of shoots at , , and . Horace A. Ford helped to improve archery standards and pioneered new archery techniques. He won the Grand National 11 times in a row and published a highly influential guide to the sport in 1856.
Towards the end of the 19th century, the sport experienced declining participation as alternative sports such as croquet and tennis became more popular among the middle class. By 1889, just 50 archery clubs were left in Britain, but it was still included as a sport at the 1900 Paris Olympics.
The National Archery Association of the United States was organized in 1879, in part by Maurice Thompson (the author of the seminal text "The Witchery of Archery") and his brother Will Thompson. Maurice was president in its inaugural year and Will was president in 1882, 1903, and 1904. The 1910 president was Frank E. Canfield. Today it is known as USA Archery and is recognized by the United States Olympic & Paralympic Committee.
In the United States, primitive archery was revived in the early 20th century. The last of the Yahi Indian tribe, a native known as Ishi, came out of hiding in California in 1911. His doctor, Saxton Pope, learned many of Ishi's traditional archery skills, and popularized them.
From the 1920s, professional engineers took an interest in archery, previously the exclusive field of traditional craft experts. They led the commercial development of new forms of bow including the modern recurve and compound bow. These modern forms are now dominant in modern Western archery; traditional bows are in a minority. Archery returned to the Olympics in 1972. In the 1980s, the skills of traditional archery were revived by American enthusiasts, and combined with the new scientific understanding. Much of this expertise is available in the "Traditional Bowyer's Bibles" (see Further reading). Modern game archery owes much of its success to Fred Bear, an American bow hunter and bow manufacturer.
In 2021, five people were killed and three injured by an archer in Norway in the Kongsberg attack.
Mythology.
Deities and heroes in several mythologies are described as archers, including the Greek Artemis and Apollo, the Roman Diana and Cupid, the Germanic Agilaz, continuing in legends like those of Wilhelm Tell, Palnetoke, or Robin Hood. Armenian Hayk and Babylonian Marduk, Indian Karna (also known as Radheya/son of Radha), Abhimanyu, Eklavya, Arjuna, Bhishma, Drona, Rama, and Shiva were known for their shooting skills. The famous archery competition of hitting the eye of a rotating fish while watching its reflection in the water bowl was one of the many archery skills depicted in the "Mahabharata".
Persian Arash was a famous archer. Earlier Greek representations of Heracles normally depict him as an archer. Archery, and the bow, play an important part in the epic poem the "Odyssey": when Odysseus returns home in disguise, he bests the suitors in an archery competition after hinting at his identity by stringing and drawing his great bow, which only he can draw. A similar motif is present in the Turkic heroic poem "Alpamysh".
Three attendant deities of Artemis, worshipped on the Greek island of Delos, presided over aspects of archery: one represented distancing, one trajectory, and one aim.
Yi the archer and his apprentice Feng Meng appear in several early Chinese myths, and the historical character of Zhou Tong features in many fictional forms. Jumong, the first Taewang of the Goguryeo kingdom of the Three Kingdoms of Korea, is claimed by legend to have been a near-godlike archer. Archery features in the story of Oguz Khagan. Similarly, archery and the bow feature heavily in historical Korean identity.
In West African Yoruba belief, Osoosi is one of several deities of the hunt who are identified with bow and arrow iconography and other insignia associated with archery.
Equipment.
Types of bows.
While there is great variety in the construction details of bows (both historic and modern), all bows consist of a string attached to elastic limbs that store mechanical energy imparted by the user drawing the string. Bows may be broadly split into two categories: those drawn by pulling the string directly and those that use a mechanism to pull the string.
Directly drawn bows may be further divided based upon differences in the method of limb construction, notable examples being self bows, laminated bows and composite bows. Bows can also be classified by the bow shape of the limbs when unstrung; in contrast to traditional European straight bows, a recurve bow and some types of longbow have tips that curve away from the archer when the bow is unstrung. The cross-section of the limb also varies; the classic longbow is a tall bow with narrow limbs that are D-shaped in cross section, and the flatbow has flat wide limbs that are approximately rectangular in cross-section. Cable-backed bows use cords as the back of the bow; the draw weight of the bow can be adjusted by changing the tension of the cable. They were widespread among Inuit who lacked easy access to good bow wood. One variety of cable-backed bow is the Penobscot bow or Wabenaki bow, invented by Frank Loring (Chief Big Thunder) about 1900. It consists of a small bow attached by cables on the back of a larger main bow.
In different cultures, the arrows are released from either the left or right side of the bow, and this affects the hand grip and position of the bow. In Arab archery, Turkish archery and Kyūdō, the arrows are released from the right hand side of the bow, and this affects construction of the bow. In western archery, the arrow is usually released from the left hand side of the bow for a right-handed archer.
Compound bows are designed to reduce the force required to hold the string at full draw, hence allowing the archer more time to aim with less muscular stress. Most compound designs use cams or elliptical wheels on the ends of the limbs to achieve this. A typical let-off is anywhere from 65% to 80%. For example, a bow with 80% let-off only requires to hold at full draw. Up to 99% let-off is possible. The compound bow was invented by Holless Wilbur Allen in the 1960s (a US patent was filed in 1966 and granted in 1969) and it has become the most widely used type of bow for all forms of archery in North America.
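The let-off arithmetic is simple enough to sketch. A minimal illustration (the 60 lb peak weight is a hypothetical figure, not taken from the text):

```python
def holding_weight(peak_weight, let_off):
    """Force needed to hold a compound bow at full draw.

    let_off is the fraction of the peak draw weight that the cams
    remove once the bow is at full draw.
    """
    return peak_weight * (1 - let_off)

# Hypothetical 60 lb bow with 80% let-off: only 12 lb held at full draw.
held = holding_weight(60, 0.80)
```

With 99% let-off the same bow would require holding well under a pound, which is why high let-off allows extended aiming with little muscular strain.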
Mechanically drawn bows typically have a stock or other mounting, such as the crossbow. Crossbows typically have shorter draw lengths compared to compound bows. Because of this, heavier draw weights are required to achieve the same energy transfer to the arrow. These mechanically drawn bows also have devices to hold the tension when the bow is fully drawn. They are not limited by the strength of a single archer and larger varieties have been used as siege engines.
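The trade-off between power stroke and draw weight can be made concrete with a linear-draw approximation (the energy and stroke figures below are illustrative assumptions, not measured values):

```python
def required_draw_force(energy_j, power_stroke_m):
    """Linear-draw approximation: stored energy E = F * d / 2, so F = 2E / d."""
    return 2 * energy_j / power_stroke_m

# To store the same 60 J, a crossbow with a short 0.15 m power stroke
# needs a far heavier draw than a hand bow with a 0.5 m power stroke.
crossbow_force = required_draw_force(60, 0.15)  # 800 N
bow_force = required_draw_force(60, 0.5)        # 240 N
```

The inverse relationship between stroke length and required force is why crossbows carry mechanisms to hold the string at full draw rather than relying on the shooter's strength.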
Types of arrows and fletchings.
The most common form of arrow consists of a shaft, with an arrowhead at the front end, and fletchings and a nock at the other end. Arrows across time and history have normally been carried in a container known as a quiver, which can take many different forms. Shafts of arrows are typically composed of solid wood, bamboo, fiberglass, aluminium alloy, carbon fiber, or composite materials. Wooden arrows are prone to warping. Fiberglass arrows are brittle, but can be produced to uniform specifications easily. Aluminium shafts were a very popular high-performance choice in the latter half of the 20th century, due to their straightness, lighter weight, and subsequently higher speed and flatter trajectories. Carbon fiber arrows became popular in the 1990s because they are very light, flying even faster and flatter than aluminium arrows. Today, the most popular arrows at tournaments and Olympic events are made of composite materials.
The arrowhead is the primary functional component of the arrow. Some arrows may simply use a sharpened tip of the solid shaft, but separate arrowheads are far more common, usually made from metal, stone, or other hard materials. The most commonly used forms are target points, field points, and broadheads, although there are also other types, such as bodkin, judo, and blunt heads.
Fletching is traditionally made from bird feathers, but solid plastic vanes and thin sheet-like spin vanes are used. They are attached near the nock (rear) end of the arrow with thin double-sided tape, glue, or, traditionally, sinew. The most common configuration in all cultures is three fletches, though as many as six have been used. Two makes the arrow unstable in flight. When the arrow is "three-fletched", the fletches are equally spaced around the shaft, with one placed such that it is perpendicular to the bow when nocked on the string, though variations are seen with modern equipment, especially when using the modern spin vanes. This fletch is called the "index fletch" or "cock feather" (also known as "the odd vane out" or "the nocking vane"), and the others are sometimes called the "hen feathers". Commonly, the cock feather is of a different color. However, if archers are using fletching made of feather or similar material, they may use same-color vanes, as different dyes can give varying stiffness to vanes, resulting in less precision. When an arrow is "four-fletched", two opposing fletches are often cock feathers, and occasionally the fletches are not evenly spaced.
The fletching may be either "parabolic" cut (short feathers in a smooth parabolic curve) or "shield" cut (generally shaped like half of a narrow shield), and is often attached at an angle, known as "helical" fletching, to introduce a stabilizing spin to the arrow while in flight. Whether helical or straight fletched, when natural fletching (bird feathers) is used it is critical that all feathers come from the same side of the bird. Oversized fletchings can be used to accentuate drag and thus limit the range of the arrow significantly; these arrows are called "flu-flus". Misplacement of fletchings can change the arrow's flight path dramatically.
Bowstring.
Dacron and other modern materials offer high strength for their weight and are used on most modern bows. Linen and other traditional materials are still used on traditional bows. Several modern methods of making a bowstring exist, such as the 'endless loop' and 'Flemish twist'. Almost any fiber can be made into a bowstring. The author of "Arab Archery" suggests the hide of a young, emaciated camel. Njál's saga describes the refusal of a wife, Hallgerður, to cut her hair to make an emergency bowstring for her husband, Gunnar Hámundarson, who is then killed.
Protective equipment.
Most modern archers wear a bracer (also known as an arm-guard) to protect the inside of the bow arm from being hit by the string and prevent clothing from catching the bowstring. The bracer does not brace the arm; the word comes from the armoury term "brassard", meaning an armoured sleeve or badge. The Navajo people have developed highly ornamented bracers as non-functional items of adornment. Some archers (nearly all female archers) wear protection on their chests, called chestguards or plastrons. The myth of the Amazons was that they had one breast removed to solve this problem. Roger Ascham mentions one archer, presumably with an unusual shooting style, who wore a leather guard for his face.
The drawing digits are normally protected by a leather tab, glove, or thumb ring. A simple tab of leather is commonly used, as is a skeleton glove. Medieval Europeans probably used a complete leather glove.
Eurasiatic archers who used the thumb or Mongolian draw protected their thumbs, usually with leather according to the author of "Arab Archery", but also with special rings of various hard materials. Many surviving Turkish and Chinese examples are works of considerable art. Some are so highly ornamented that the users could not have used them to loose an arrow. Possibly these were items of personal adornment, and hence value, remaining extant whilst leather had virtually no intrinsic value and would also deteriorate with time. In traditional Japanese archery a special glove is used that has a ridge to assist in drawing the string.
Release aids.
A release aid is a mechanical device designed to give a crisp and precise loose of arrows from a compound bow. In the most commonly used, the string is released by a finger-operated trigger mechanism, held in the archer's hand or attached to their wrist. In another type, known as a back-tension release, the string is automatically released when drawn to a pre-determined tension.
Stabilizers.
Stabilizers are mounted at various points on the bow. Common with competitive archery equipment are special brackets that allow multiple stabilizers to be mounted at various angles to fine tune the bow's balance.
Stabilizers aid in aiming by improving the balance of the bow. Sights, quivers, rests, and the design of the riser (the central, non-bending part of the bow) make one side of the bow heavier. One purpose of stabilizers is to offset these forces. A reflex riser design will cause the top limb to lean towards the shooter. In this case a heavier front stabilizer is desired to offset this action. A deflex riser design has the opposite effect, and a lighter front stabilizer may be used.
Stabilizers can reduce noise and vibration. These energies are absorbed by viscoelastic polymers, gels, powders, and other materials used to build stabilizers.
Stabilizers improve forgiveness and accuracy by increasing the moment of inertia of the bow to resist movement during the shooting process. Lightweight carbon stabilizers with weighted ends are desirable because they improve the moment of inertia while minimizing the weight added.
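The benefit of an end-weighted rod can be sketched with basic rigid-body formulas. In this sketch (all masses and lengths are illustrative assumptions), the end weight is treated as a point mass and the rod as a uniform slender rod pivoting at the riser:

```python
def stabilizer_inertia(end_mass_kg, length_m, rod_mass_kg=0.0):
    """Moment of inertia about the riser, in kg*m^2.

    A point mass at the tip contributes m*L^2; a uniform rod pivoting
    at one end contributes (1/3)*m*L^2.
    """
    return end_mass_kg * length_m**2 + rod_mass_kg * length_m**2 / 3

# The same 150 g of added mass resists bow rotation far more when
# concentrated at the end of a long, light rod than when spread
# along a short, heavy one.
long_light = stabilizer_inertia(end_mass_kg=0.15, length_m=0.75)
short_heavy = stabilizer_inertia(end_mass_kg=0.0, length_m=0.25, rod_mass_kg=0.15)
```

Because inertia scales with the square of the distance from the pivot, moving mass to the tip of a longer rod multiplies the stabilizing effect without adding weight.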
Shooting technique and form.
The standard convention in teaching archery is to hold the bow according to eye dominance. (One exception is in modern kyūdō, where all archers are trained to hold the bow in the left hand.) Therefore, if one is right-eye dominant, they would hold the bow in the left hand and draw the string with the right hand. However, not everyone agrees with this line of thought. A smoother and more fluid release of the string will produce the most consistently repeatable shots, and therefore may provide greater accuracy of the arrow flight. Some believe that the hand with the greatest dexterity should therefore be the hand that draws and releases the string. Either eye can be used for aiming, and the less dominant eye can be trained over time to become more effective for use. To assist with this, an eye patch can be temporarily worn over the dominant eye.
The hand that holds the bow is referred to as the "bow hand" and its arm the "bow arm". The opposite hand is called the "drawing hand" or "string hand". Terms such as "bow shoulder" or "string elbow" follow the same convention.
If shooting according to eye dominance, right-eye-dominant archers shooting conventionally hold the bow with their left hand. If shooting according to hand dexterity, the archer draws the string with the hand that possesses the greatest dexterity, regardless of eye dominance.
Modern form.
To shoot an arrow, an archer first assumes the correct stance. The body should be at or nearly perpendicular to the target and the shooting line, with the feet placed shoulder-width apart. As an archer progresses from beginner to a more advanced level, other stances such as the "open stance" or the "closed stance" may be used, although many choose to stick with a "neutral stance". Each archer has a particular preference, but mostly this term indicates that the leg furthest from the shooting line is placed a half to a whole foot-length ahead of the other foot.
To load, the bow is pointed toward the ground, tipped slightly clockwise of vertical (for a right-handed shooter), and the shaft of the arrow is placed on the arrow rest or shelf. The back of the arrow is attached to the bowstring with the nock (a small locking groove located at the proximal end of the arrow). This step is called "nocking the arrow". Typical arrows with three vanes should be oriented such that a single vane, the "cock feather", is pointing away from the bow, to improve the clearance of the arrow as it passes the arrow rest.
A compound bow is fitted with a special type of arrow rest, known as a launcher, and the arrow is usually loaded with the cock feather/vane pointed either up, or down, depending upon the type of launcher being used.
The bowstring and arrow are held with three fingers, or with a mechanical arrow release. Most commonly, for finger shooters, the index finger is placed above the arrow and the next two fingers below, although several other techniques have their adherents around the world, involving three fingers below the arrow, or an arrow pinching technique. "Instinctive" shooting is a technique eschewing sights and is often preferred by traditional archers (shooters of longbows and recurves). In either the split finger or three finger under case, the string is usually placed in the first or second joint, or else on the pads of the fingers. When using a mechanical release aid, the release is hooked onto the D-loop.
Another type of string hold, used on traditional bows, is the type favoured by the Mongol warriors, known as the "thumb release" style. This involves using the thumb to draw the string, with the fingers curling around the thumb to add some support. To release the string, the fingers are opened out and the thumb relaxes to allow the string to slide off the thumb. When using this type of release, the arrow should rest on the same side of the bow as the drawing hand; for example, a left-hand draw places the arrow on the left side of the bow.
The archer then raises the bow and draws the string, with varying alignments for vertical versus slightly canted bow positions. This is often one fluid motion for shooters of recurves and longbows, though the details vary from archer to archer. Compound shooters often experience a slight jerk during the drawback, at around the last , where the draw weight is at its maximum, before relaxing into a comfortable, stable full-draw position. The archer draws the string hand towards the face, where it should rest lightly at a fixed "anchor point". This point is consistent from shot to shot, and is usually at the corner of the mouth, on the chin, on the cheek, or at the ear, depending on preferred shooting style. The archer holds the bow arm outwards, toward the target. The elbow of this arm should be rotated so that the inner elbow is perpendicular to the ground, though archers with hyperextendable elbows tend to angle the inner elbow toward the ground, as exemplified by the Korean archer Jang Yong-Ho. This keeps the forearm out of the way of the bowstring.
In modern form, the archer stands erect, forming a "T". The archer's lower trapezius muscles are used to pull the arrow to the anchor point. Some modern recurve bows are equipped with a mechanical device, called a clicker, which produces a clicking sound when the archer reaches the correct draw length. By contrast, traditional English longbow shooters step "into the bow", exerting force with both the bow arm and the string-hand arm simultaneously, especially when using bows having draw weights from to over . Heavily stacked traditional bows (recurves, longbows, and the like) are released immediately upon reaching full draw at maximum weight, whereas compound bows reach their maximum weight around the last , dropping holding weight significantly at full draw. Compound bows are often held at full draw for a short time to achieve maximum accuracy.
The arrow is typically released by relaxing the fingers of the drawing hand (see bow draw), or triggering the mechanical release aid. Usually the release aims to keep the drawing arm rigid, the bow hand relaxed, and the arrow is moved back using the back muscles, as opposed to using just arm motions. An archer should also pay attention to the recoil or "follow through" of his or her body, as it may indicate problems with form (technique) that affect accuracy.
Aiming methods.
There are two main forms of aiming in archery: using a mechanical or fixed sight, or barebow.
Mechanical sights can be affixed to the bow to aid in aiming. They can be as simple as a pin, or may use optics with magnification. They usually also have a peep sight (rear sight) built into the string, which aids in a consistent anchor point. Modern compound bows automatically limit the draw length to give a consistent arrow velocity, while traditional bows allow great variation in draw length. Some bows use mechanical methods to make the draw length consistent. Barebow archers often use a sight picture, which includes the target, the bow, the hand, the arrow shaft and the arrow tip, as seen at the same time by the archer. With a fixed "anchor point" (where the string is brought to, or close to, the face), and a fully extended bow arm, successive shots taken with the sight picture in the same position fall on the same point. This lets the archer adjust aim with successive shots to achieve accuracy.
Modern archery equipment usually includes sights. Instinctive aiming is used by many archers who use traditional bows. The two most common forms of non-mechanical release are split-finger and three-under. Split-finger aiming requires the archer to place the index finger above the nocked arrow, while the middle and ring fingers are both placed below. Three-under aiming places the index, middle, and ring fingers under the nocked arrow. This technique allows the archer to better look down the arrow, since the back of the arrow is closer to the dominant eye, and is commonly called "gun barreling" (referring to common aiming techniques used with firearms).
When using short bows or shooting from horseback, it is difficult to use the sight picture. The archer may look at the target, but without including the weapon in the field of accurate view. Aiming then involves hand-eye coordination, which includes proprioception and motor-muscle memory, similar to that used when throwing a ball. With sufficient practice, such archers can normally achieve good practical accuracy for hunting or for war. Aiming without a sight picture may allow more rapid shooting, though it does not improve accuracy.
Instinctive shooting is a style of shooting that includes the barebow aiming method and relies heavily upon the subconscious mind, proprioception, and motor/muscle memory to make aiming adjustments; the term formerly referred to a general category of archers who did not use a mechanical or fixed sight.
Physics.
When a projectile is thrown by hand, the speed of the projectile is determined by the kinetic energy imparted by the thrower's muscles performing work. However, the energy must be imparted over a limited distance (determined by arm length) and therefore (because the projectile is accelerating) over a limited time, so the limiting factor is not work but rather power, which determines how much energy can be added in the limited time available. Power generated by muscles, however, is limited by the force–velocity relationship, and even at the optimal contraction speed for power production, total work by the muscle is less than half of what it would be if the muscle contracted over the same distance at slow speeds, resulting in less than 1/4 the projectile launch velocity possible without the limitations of the force–velocity relationship.
When a bow is used, the muscles are able to perform work much more slowly, resulting in greater force and greater work done. This work is stored in the bow as elastic potential energy, and when the bowstring is released, this stored energy is imparted to the arrow much more quickly than can be delivered by the muscles, resulting in much higher velocity and, hence, greater distance. This same process is employed by frogs, which use elastic tendons to increase jumping distance. In archery, some energy dissipates through elastic hysteresis, reducing the overall amount released when the bow is shot. Of the remaining energy, some is dampened both by the limbs of the bow and the bowstring. Depending on the arrow's elasticity, some of the energy is also absorbed by compressing the arrow, primarily because the release of the bowstring is rarely in line with the arrow shaft, causing it to flex out to one side. This is because the bowstring accelerates faster than the archer's fingers can open, and consequently some sideways motion is imparted to the string, and hence arrow nock, as the power and speed of the bow pulls the string off the opening fingers.
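A back-of-the-envelope sketch of this energy chain, assuming a linear draw-force curve and a single efficiency factor lumping together hysteresis and limb/string damping losses (all figures below are illustrative assumptions, not measurements of any particular bow):

```python
import math

def arrow_speed(draw_force_n, draw_length_m, arrow_mass_kg, efficiency=0.75):
    """Estimate arrow launch speed (m/s) from energy stored in the bow.

    With a linear draw curve the stored work is F*d/2; the efficiency
    factor (assumed 75% here) covers hysteresis and damping losses.
    """
    stored_j = 0.5 * draw_force_n * draw_length_m
    kinetic_j = efficiency * stored_j
    return math.sqrt(2 * kinetic_j / arrow_mass_kg)

# Roughly 40 lbf (178 N) of peak draw force over a 0.5 m power stroke,
# shooting a 25 g arrow:
v = arrow_speed(178, 0.5, 0.025)
```

The point of the calculation is the leverage of slow muscular work: the muscles load the limbs over a second or more, while the limbs return that energy to the light arrow in milliseconds, producing a launch speed no human arm could impart directly.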
Even with a release aid mechanism, some of this effect is usually experienced, since the string always accelerates faster than the retaining part of the mechanism. This makes the arrow oscillate in flight, its center flexing to one side and then the other repeatedly, gradually reducing as the arrow's flight proceeds. This is clearly visible in high-speed photography of arrows at discharge. A direct effect of these energy transfers can clearly be seen when dry firing, that is, releasing the bowstring without a nocked arrow. Because there is no arrow to receive the stored potential energy, almost all of the energy stays in the bow. Dry firing may cause physical damage to the bow, such as cracks and fractures; because most bows are not specifically made to handle the high amounts of energy dry firing produces, it should never be done.
Modern arrows are made to a specified 'spine', or stiffness rating, to maintain matched flexing and hence accuracy of aim. This flexing can be a desirable feature, since, when the spine of the shaft is matched to the acceleration of the bow(string), the arrow bends or flexes around the bow and any arrow-rest, and consequently the arrow, and fletchings, have an un-impeded flight. This feature is known as the archer's paradox. It maintains accuracy, for if part of the arrow struck a glancing blow on discharge, some inconsistency would be present, and the excellent accuracy of modern equipment would not be achieved.
The accurate flight of an arrow depends on its fletchings. The arrow's manufacturer (a "fletcher") can arrange fletching to cause the arrow to rotate along its axis. This improves accuracy by evening out pressure buildups that would otherwise cause the arrow to "plane" on the air in a random direction after shooting. Even with a carefully made arrow, the slightest imperfection or air movement causes some unbalanced turbulence in the air flow; rotation evens out such turbulence and so maintains the intended direction of flight, i.e. accuracy. This rotation is not to be confused with the rapid gyroscopic rotation of a rifle bullet. Fletching that is not arranged to induce rotation still improves accuracy by causing a restoring drag any time the arrow tilts from its intended direction of travel.
The innovative aspect of the invention of the bow and arrow was the amount of power delivered to an extremely small area by the arrow. The huge ratio of length to cross-sectional area, coupled with velocity, made the arrow more powerful than any other hand-held weapon until firearms were invented. Arrows can spread or concentrate force, depending on the application. Practice arrows, for instance, have a blunt tip that spreads the force over a wider area to reduce the risk of injury or limit penetration. Arrows designed to pierce armor in the Middle Ages used a very narrow and sharp tip ("bodkinhead") to concentrate the force. Arrows used for hunting use a tip ("broadhead") that is narrow at the point but widens further back, to facilitate both penetration and a large wound.
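The force-concentration point can be made concrete with the pressure relation p = F / A over the tip's contact area. The 500 N impact force and the two contact diameters below are hypothetical round numbers, chosen only to illustrate how pressure scales with the inverse square of the tip diameter:

```python
import math

def tip_pressure_pa(force_n, contact_diameter_m):
    """Average pressure over a circular contact patch of the given diameter."""
    area_m2 = math.pi * (contact_diameter_m / 2.0) ** 2
    return force_n / area_m2

# The same hypothetical 500 N impact delivered through two tip geometries:
blunt_pa = tip_pressure_pa(500.0, 0.025)   # ~25 mm blunt practice tip
bodkin_pa = tip_pressure_pa(500.0, 0.003)  # ~3 mm bodkin contact point
ratio = bodkin_pa / blunt_pa               # (25/3)^2, about 69x the pressure
```

Halving the contact diameter quadruples the pressure, which is why a narrow bodkin point could defeat armor that a broader tip could not.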
Hunting.
Using archery to take game animals is known as "bow hunting". Bow hunting differs markedly from hunting with firearms, as distance between hunter and prey must be much shorter to ensure a humane kill. The skills and practices of bow hunting therefore emphasize very close approach to the prey, whether by still hunting, stalking, or waiting in a blind or tree stand. In many countries, including much of the United States, bow hunting for large and small game is legal. Bow hunters generally enjoy longer seasons than are allowed with other forms of hunting such as black powder, shotgun, or rifle. Usually, compound bows are used for large game hunting due to the relatively short time it takes to master them as opposed to the longbow or recurve bow. These compound bows may feature fiber optic sights, stabilizers, and other accessories designed to increase accuracy at longer distances. Using a bow and arrow to take fish is known as "bow fishing".
Modern competitive archery.
Competitive archery involves shooting arrows at a target for accuracy from a set distance or distances. This is the most popular form of competitive archery worldwide and is called target archery. A form particularly popular in Europe and America is field archery, shot at targets generally set at various distances in a wooded setting. Competitive archery in the United States is governed by USA Archery and the National Field Archery Association (NFAA), both of which also certify instructors.
Para-archery is an adaptation of archery for athletes with a disability, governed by the World Archery Federation (WA), and is one of the sports in the Summer Paralympic Games. There are also several other lesser-known and historical forms of archery, as well as archery novelty games and flight archery, where the aim is to shoot the greatest distance.
|
2009 | Alvar Aalto | Hugo Alvar Henrik Aalto (3 February 1898 – 11 May 1976) was a Finnish architect and designer. His work includes architecture, furniture, textiles and glassware, as well as sculptures and paintings. He never regarded himself as an artist, seeing painting and sculpture as "branches of the tree whose trunk is architecture." Aalto's early career ran in parallel with the rapid economic growth and industrialization of Finland during the first half of the 20th century. Many of his clients were industrialists, among them the Ahlström-Gullichsen family, who became his patrons. The span of his career, from the 1920s to the 1970s, is reflected in the styles of his work, ranging from the Nordic Classicism of his early work, to a rational International Style Modernism during the 1930s, to a more organic modernist style from the 1940s onwards.
His architectural work, throughout his entire career, is characterized by a concern for design as Gesamtkunstwerk—a "total work of art" in which he, together with his first wife Aino Aalto, would design not only the building but the interior surfaces, furniture, lamps, and glassware as well. His furniture designs are considered Scandinavian Modern, an aesthetic reflected in their elegant simplification and concern for materials, especially wood, but also in Aalto's technical innovations, which led to his receiving patents for various manufacturing processes, such as those used to produce bent wood. As a designer he is celebrated as a forerunner of midcentury modernism in design; his invention of bent plywood furniture had a profound impact on the aesthetics of Charles and Ray Eames and George Nelson. The Alvar Aalto Museum, designed by Aalto himself, is located in what is regarded as his home city, Jyväskylä.
The entry for him on the Museum of Modern Art website notes his "remarkable synthesis of romantic and pragmatic ideas," adding
His work reflects a deep desire to humanize architecture through an unorthodox handling of form and materials that was both rational and intuitive. Influenced by the so-called International Style modernism (or functionalism, as it was called in Finland) and his acquaintance with leading modernists in Europe, including Swedish architect Erik Gunnar Asplund and many of the artists and architects associated with the Bauhaus, Aalto created designs that had a profound impact on the trajectory of modernism before and after World War II.
Biography.
Life.
Hugo Alvar Henrik Aalto was born in Kuortane, Finland. His father, Johan Henrik Aalto, was a Finnish-speaking land-surveyor and his mother, Selma Matilda "Selly" (née Hackstedt) was a Swedish-speaking postmistress. When Aalto was 5 years old, the family moved to Alajärvi, and from there to Jyväskylä in Central Finland.
He completed his basic education at the Jyväskylä Lyceum school in 1916, taking drawing lessons from local artist Jonas Heiska, and then enrolled to study architecture at the Helsinki University of Technology. His studies were interrupted by the Finnish Civil War, in which he fought on the side of the "White Army" at the Battle of Länkipohja and the Battle of Tampere.
He built his first piece of architecture while still a student: a house for his parents at Alajärvi. He later continued his education, graduating in 1921. In the summer of 1922 he began military service, finishing at Hamina reserve officer training school, and was promoted to reserve second lieutenant in June 1923.
In 1920, while a student, Aalto made his first trip abroad, travelling via Stockholm to Gothenburg, where he briefly found work with architect Arvid Bjerke. In 1922, he completed his first independent work at the Industrial Exposition in Tampere. In 1923, he returned to Jyväskylä, where he opened an architectural office under the name 'Alvar Aalto, Architect and Monumental Artist'. At that time he wrote articles for the Jyväskylä newspaper "Sisä-Suomi" under the pseudonym Remus. He also designed a number of small single-family houses in Jyväskylä, and the office's workload steadily increased.
On 6 October 1924, Aalto married architect Aino Marsio. Their honeymoon in Italy was Aalto's first trip there, though Aino had previously made a study trip to the country. The journey together sealed an intellectual bond with the culture of the Mediterranean region that remained important to Aalto for the rest of his life.
On their return they continued with several local projects, notably the Jyväskylä Worker's Club, which incorporated a number of motifs that they had studied during their trip, most notably the decorations of the festival hall, modelled on the Rucellai Sepulchre in Florence by Leon Battista Alberti. After winning the architecture competition for the Southwest Finland Agricultural Cooperative building in 1927, the Aaltos moved their office to Turku. They had made contact with the city's most progressive architect, Erik Bryggman, before moving, and began collaborating with him, most notably on the Turku Fair of 1928–29. Aalto's biographer, Göran Schildt, claimed that Bryggman was the only architect with whom Aalto cooperated as an equal. With an increasing quantity of work in the Finnish capital, the Aaltos' office moved again in 1933 to Helsinki.
The Aaltos designed and built a joint house-office (1935–36) for themselves in Munkkiniemi, Helsinki, but later (1954–56) had a purpose-built office erected in the same neighbourhood – now the former is a "home museum" and the latter the premises of the Alvar Aalto Academy. In 1926, the young Aaltos designed and had built for themselves a summer cottage in Alajärvi, Villa Flora.
Aino and Alvar had two children, a daughter, Johanna "Hanni" (married surname Alanen; born 1925), and a son, Hamilkar Aalto (born 1928). Aino Aalto died of cancer in 1949.
In 1952, Aalto married architect Elissa Mäkiniemi (died 1994). In 1952, he designed and built a summer cottage, the so-called Experimental House, for himself and his second wife, now Elissa Aalto, in Muuratsalo in Central Finland. Alvar Aalto died on 11 May 1976, in Helsinki, and is buried in the Hietaniemi cemetery in Helsinki. Elissa Aalto became the director of the practice, running the office from 1976 to 1994. In 1978, the Museum of Finnish Architecture in Helsinki arranged a major exhibition of Aalto's works.
Architecture career.
Early career: classicism.
Although he is sometimes regarded as among the first and most influential architects of Nordic modernism, closer examination reveals that Aalto (while a pioneer in Finland) closely followed and had personal contacts with other pioneers in Sweden, in particular Gunnar Asplund and Sven Markelius. What they and many others of that generation in the Nordic countries shared was a classical education and an approach to classical architecture that historians now call Nordic Classicism, a style that arose as a reaction against the previously dominant National Romanticism and that, in the late 1920s, gave way to Modernism.
Upon returning to Jyväskylä in 1923 to establish his own architect's office, Aalto designed several single-family homes in the style of Nordic Classicism, for example the manor-like house for his mother's cousin Terho Manner in Töysä (1923), a summer villa for the Jyväskylä chief constable (also from 1923), and the Alatalo farmhouse in Tarvaala (1924). During this period he completed his first public buildings: the Jyväskylä Workers' Club in 1925, the Jyväskylä Defence Corps Building in 1926, and the Seinäjoki Civil Guard House building in 1924–29. He entered several architectural competitions for prestigious state public buildings, in Finland and abroad, including two competitions for the Finnish Parliament building in 1923 and 1924, the extension to the University of Helsinki in 1931, and the building to house the League of Nations in Geneva, Switzerland, in 1926–27.
Aalto's first church design to be completed, Muurame church, illustrates his transition from Nordic Classicism to Functionalism.
This was the period when Aalto was most prolific in his writings, with articles for professional journals and newspapers. Among his most well-known essays from this period are "Urban culture" (1924), "Temple baths on Jyväskylä ridge" (1925), "Abbé Coignard's sermon" (1925), and "From doorstep to living room" (1926).
Early career: functionalism.
The shift in Aalto's design approach from classicism to modernism is epitomised by the Viipuri Library in Vyborg (1927–35), which went through a transformation from an originally classical competition entry proposal to the completed high-modernist building. His humanistic approach is in full evidence in the library: the interior displays natural materials, warm colours, and undulating lines. Due to problems related to financing, compounded by a change of site, the Viipuri Library project lasted eight years. During that time, Aalto designed the Standard Apartment Building (1928–29) in Turku, the Turun Sanomat Building (1929–30), and the Paimio Sanatorium (1929–32), which he designed in collaboration with his first wife Aino Aalto. A number of factors contributed to Aalto's shift towards modernism: his increased familiarity with international trends, facilitated by his travels throughout Europe; the opportunity to experiment with concrete prefabrication in the Standard Apartment Building; the cutting-edge Le Corbusier-inspired formal language of the Turun Sanomat Building; and Aalto's application of both in the Paimio Sanatorium and in the ongoing design for the library. Although the Turun Sanomat Building and Paimio Sanatorium are comparatively pure modernist works, they carry the seeds of his questioning of such an orthodox modernist approach and a move to a more daring, synthetic attitude. It has been pointed out that the planning principle for Paimio Sanatorium – the splayed wings – was indebted to the Zonnestraal Sanatorium (1925–31) by Jan Duiker, which Aalto visited while it was under construction. While these early Functionalist works bear hallmarks of influences from Le Corbusier, Walter Gropius, and other key modernist figures of central Europe, Aalto nevertheless started to show his individuality in a departure from such norms with the introduction of organic references.
Through Sven Markelius, Aalto became a member of the Congres Internationaux d'Architecture Moderne (CIAM), attending the second congress in Frankfurt in 1929 and the fourth congress in Athens in 1933, where he established a close friendship with László Moholy-Nagy, Sigfried Giedion, and Philip Morton Shand. It was during this time that he closely followed the work of the main force driving the new modernism, Le Corbusier, visiting him in his Paris office several times in the following years.
It was not until the completion of the Paimio Sanatorium (1932) and Viipuri Library (1935) that Aalto first achieved world attention in architecture. His reputation grew in the US following the invitation to hold a retrospective exhibition of his works at MoMA in New York in 1938. (This was his first visit to the States.) The exhibition, which later went on a 12-city tour of the country, was a landmark: Aalto was the second-ever architect – after Le Corbusier – to have a solo exhibition at the museum. His reputation grew further in the US following the critical reception of his design for the Finnish Pavilion at the 1939 New York World's Fair, described by Frank Lloyd Wright as a "work of genius". It could be said that Aalto's international reputation was sealed with his inclusion in the second edition of Sigfried Giedion's influential book on Modernist architecture, "Space, Time and Architecture: The growth of a new tradition" (1949), in which Aalto received more attention than any other Modernist architect, including Le Corbusier. In his analysis of Aalto, Giedion gave primacy to qualities that depart from direct functionality, such as mood, atmosphere, intensity of life, and even national characteristics, declaring that "Finland is with Aalto wherever he goes."
Mid career: experimentation.
During the 1930s Alvar spent some time experimenting with laminated wood, sculpture and abstract relief, characterized by irregular curved forms. Utilizing this knowledge, he was able to solve technical problems concerning the flexibility of wood while at the same time working out spatial issues in his designs. Aalto's early experiments with wood and his move away from a purist modernism would be tested in built form with the commission to design Villa Mairea (1939) in Noormarkku, the luxury home of young industrialist couple Harry and Maire Gullichsen. It was Maire Gullichsen who acted as the main client, and she worked closely not only with Alvar but also with Aino Aalto on the design, encouraging them to be more daring in their work. The building forms a U-shape around a central inner 'garden' whose central feature is a kidney-shaped swimming pool. Adjacent to the pool is a sauna executed in a rustic style, alluding to both Finnish and Japanese precedents. The design of the house is a synthesis of numerous stylistic influences, from traditional Finnish vernacular to purist modernism, as well as influences from English and Japanese architecture. While the house is clearly intended for a wealthy family, Aalto nevertheless argued that it was also an experiment that would prove useful in the design of mass housing.
His increased fame led to offers and commissions outside Finland. In 1941, he accepted an invitation as a visiting professor to the Massachusetts Institute of Technology in the US. During the Second World War, he returned to Finland to direct the Reconstruction Office. After the war, he returned to MIT, where he designed the student dormitory Baker House, completed in 1949. The dormitory flanked the Charles River, and its undulating form provided maximum view and ventilation for each resident. This was the first building of Aalto's red-brick period. Having first used red brick in Baker House to signify the Ivy League university tradition, Aalto went on to use it in a number of key buildings after his return to Finland, most notably in several of the buildings in the new Helsinki University of Technology campus (starting in 1950), Säynätsalo Town Hall (1952), Helsinki Pensions Institute (1954), and Helsinki House of Culture (1958), as well as in his own summer house, the Experimental House in Muuratsalo (1957).
In the 1950s Aalto immersed himself in sculpting, exploring wood, bronze, marble, and mixed media. Among the notable works from this period is his memorial to the Battle of Suomussalmi (1960). Located on the battlefield, it consists of a leaning bronze pillar on a pedestal.
Mature career: monumentalism.
Foremost among Aalto's work from the early 1960s until his death in 1976 were his projects in Helsinki, in particular the huge town plan for the void in the centre of Helsinki adjacent to Töölö Bay and the vast railway yards, an area marked on the edges by significant buildings such as the National Museum and the main railway station, both by Eliel Saarinen. In his town plan, Aalto proposed a line of separate marble-clad buildings fronting the bay, which would house various cultural institutions, including a concert hall, opera, museum of architecture, and headquarters for the Finnish Academy. The scheme also extended into the Kamppi district with a series of tall office blocks. Aalto first presented his vision in 1961, but it went through various modifications during the early '60s. Only two fragments of the overall plan were realized: the Finlandia Hall concert hall (1976) fronting on Töölö Bay and an office building in the Kamppi district for the Helsinki Electricity Company (1975). Aalto also employed the Miesian formal language of geometric grids used in those buildings for other sites in Helsinki, including the Enso-Gutzeit headquarters building (1962), the Academic Bookstore (1962), and the SYP Bank building (1969).
Following Aalto's death in 1976, his office continued to operate under the direction of his widow Elissa, who oversaw the completion of works that had already been, at least in part, designed, among them the Jyväskylä City Theatre and the Essen opera house. Since the death of Elissa Aalto, the office has continued to operate as the Alvar Aalto Academy, giving advice on the restoration of Aalto buildings and organizing the practice's vast archives.
Furniture career.
Although Aalto was famous for his architecture, his furniture designs were admired and are still popular today. He studied with the architect-designer Josef Hoffmann at the Wiener Werkstätte ("Vienna Workshop") and worked, for a time, under Eliel Saarinen. He also drew inspiration from Gebrüder Thonet. During the late 1920s and 1930s, he worked closely with Aino Aalto on his furniture designs, a focus due in part to his decision to design many of the individual furniture pieces and lamps for the Paimio Sanatorium. Of particular significance was the Aaltos' experimentation in bent plywood chairs, most notably the so-called Paimio chair, designed for tuberculosis patients, and the Model 60 stacking stool. The Aaltos, together with visual arts promoter Maire Gullichsen and art historian Nils-Gustav Hahl, founded the Artek company in 1935, ostensibly to sell Aalto products but which also imported pieces by other designers. Aalto became the first furniture designer to use the cantilever principle in chair designs using wood.
Awards.
Aalto's awards included the Prince Eugen Medal in 1954, the Royal Gold Medal for Architecture from the Royal Institute of British Architects in 1957 and the Gold Medal from the American Institute of Architects in 1963. He was elected a Foreign Honorary Member of the American Academy of Arts and Sciences in 1957. He also was a member of the Academy of Finland, and was its president from 1963 to 1968. From 1925 to 1956 he was a member of the Congrès International d'Architecture Moderne. In 1960 he received an honorary doctorate at the Norwegian University of Science and Technology (NTNU).
Works.
Aalto's career spans the changes in style from Nordic Classicism to purist International Style Modernism to a more personal, synthetic, and idiosyncratic Modernism. Aalto's wide field of design activity ranges from large-scale projects such as city planning and architecture to more intimate, human-scale work in interior design, furniture and glassware design, and painting. It has been estimated that during his entire career Aalto designed over 500 individual buildings, approximately 300 of which were built. The vast majority of them are in Finland. He also has a few buildings in France, Germany, Italy, and the US.
Aalto's work with wood was influenced by early Scandinavian architects. His experiments and bold departures from aesthetic norms brought attention to his ability to make wood do things not previously done. His techniques in the way he cut beech wood, for example, and his ability to use plywood as a structural element while at the same time exploiting its aesthetic properties, were at once technically innovative and artistically inspired. Other examples of his boundary-pushing sensibility include the vertical placement of rough-hewn logs at his pavilion at the Lapua expo, a design element that evoked a medieval barricade. At the orchestra platform at Turku and the Paris expo at the World Fair, he used varying sizes and shapes of planks. Also at Paris (and at Villa Mairea), he utilized birch boards in a vertical arrangement. His Vyborg Library, built in what was then Viipuri (it became Vyborg after Soviet annexation in 1944), is acclaimed for its stunning ceiling, with its undulating waves of red-hearted pine (which grows in the region). In his roofing, he created massive spans, such as the 155-foot roof of the covered stadium at Otaniemi, all without tie rods. In his stairway at Villa Mairea, he evokes the feeling of a natural forest by binding beech wood with withes into columns.
Aalto claimed that his paintings were not made as individual artworks but as part of his process of architectural design, and many of his small-scale "sculptural" experiments with wood led to later larger architectural details and forms. These experiments also led to a number of patents: for example, he invented a new form of laminated bent-plywood furniture in 1932 (which was patented in 1933). His experimental method had been influenced by his meetings with various members of the Bauhaus design school, especially László Moholy-Nagy, whom he first met in 1930. Aalto's furniture was exhibited in London in 1935, to great critical acclaim. To cope with the consumer demand, Aalto, together with his wife Aino, Maire Gullichsen, and Nils-Gustav Hahl founded the company Artek that same year. Aalto glassware (Aino as well as Alvar) is manufactured by Iittala.
Aalto's 'High Stool' and 'Stool E60' (manufactured by Artek) are currently used in Apple Stores across the world as customer seating. Finished in black lacquer, the stools seat customers at the 'Genius Bar' and are also used in other areas of the store when seating is required for a product workshop or special event. Aalto was also influential in bringing modern art to the attention of the Finnish people, in particular the work of his friends Alexander Calder and Fernand Léger.
Critique of Aalto's architecture.
As mentioned above, Aalto's international reputation was sealed with his inclusion in the second edition of Sigfried Giedion's influential book on Modernist architecture, "Space, Time and Architecture: The growth of a new tradition" (1949), in which he received more attention than any other Modernist architect and in which Giedion gave primacy to qualities that depart from direct functionality, such as mood, atmosphere, intensity of life, and even national characteristics.
More recently, however, some architecture critics and historians have questioned Aalto's influence on the historical canon. The Italian Marxist architecture historians Manfredo Tafuri and Francesco Dal Co contend that Aalto's "historical significance has perhaps been rather exaggerated; with Aalto we are outside of the great themes that have made the course of contemporary architecture so dramatic. The qualities of his works have a meaning only as masterful distractions, not subject to reproduction outside the remote reality [sic] in which they have their roots." At the heart of their critique was the perception of Aalto's work as unsuited to the urban context: "Essentially, his architecture is not appropriate to urban typologies."
At the other end of the political spectrum (though similarly concerned with the appropriateness of Aalto's formal language), the American cultural theorist and architectural historian Charles Jencks singled out his Pensions Institute as an example of what he termed the architect's "soft paternalism": "Conceived as a fragmented mass to break up the feeling of bureaucracy, it succeeds all too well in being humane and killing the pensioner with kindness. The forms are familiar – red brick and ribbon-strip windows broken by copper and bronze elements – all carried through with a literal-mindedness that borders on the soporific."
During his lifetime, Aalto faced criticisms from his fellow architects in Finland, most notably Kirmo Mikkola and Juhani Pallasmaa. By the last decade of Aalto's life, his work was seen as unfashionably individualistic at a time when the opposing tendencies of rationalism and constructivism – often championed under left-wing politics – argued for anonymous, aggressively non-aesthetic architecture. Of Aalto's late works, Mikkola wrote, "Aalto has moved to [a] baroque line..."
Memorials.
Aalto has been commemorated in a number of ways.
Further reading.
Göran Schildt has written and edited many books on Aalto, the most well-known being the three-volume biography, usually referred to as the definitive biography on Aalto.
|
2011 | Comparison of American and British English | The English language was introduced to the Americas by British colonisation, beginning in the late 16th and early 17th centuries. The language also spread to numerous other parts of the world as a result of British trade and colonisation and the spread of the former British Empire, which, by 1921, included 470–570 million people, about a quarter of the world's population. Written forms of British and American English as found in newspapers and textbooks vary little in their essential features, with only occasional noticeable differences.
Over the past 400 years, the forms of the language used in the Americas—especially in the United States—and that used in the United Kingdom have diverged in a few minor ways, leading to the versions now often referred to as American English and British English. Differences between the two include pronunciation, grammar, vocabulary (lexis), spelling, punctuation, idioms, and formatting of dates and numbers. However, the differences in written and most spoken grammar structure tend to be much fewer than in other aspects of the language in terms of mutual intelligibility. A few words have completely different meanings in the two versions or are even unknown or not used in one of the versions. One particular contribution towards formalising these differences came from Noah Webster, who wrote the first American dictionary (published 1828) with the intention of showing that people in the United States spoke a different dialect from that spoken in the UK, much like a regional accent.
This divergence between American English and British English has provided opportunities for humorous comment: George Bernard Shaw is credited with saying that the United States and United Kingdom are "two countries divided by a common language", and Oscar Wilde wrote that "We have really everything in common with America nowadays, except, of course, the language" ("The Canterville Ghost", 1888). Henry Sweet incorrectly predicted in 1877 that within a century American English, Australian English and British English would be mutually unintelligible ("A Handbook of Phonetics"). Perhaps increased worldwide communication through radio, television, the Internet and globalisation has tended to reduce regional variation. This can lead to some variations becoming extinct (for instance "the wireless" being progressively superseded by "the radio") or to the acceptance of wide variations as "perfectly good English" everywhere.
Although spoken American and British English are generally mutually intelligible, there are occasional differences which might cause embarrassment—for example, in American English a "rubber" is usually interpreted as a "condom" rather than an "eraser"; and a British "fanny" refers to the female genitals, while the American "fanny" refers to a "butt" or "ass" (US) or an "arse" (UK).
Vocabulary.
The familiarity of speakers with words and phrases from different regions varies, and the difficulty of discerning an unfamiliar definition also depends on the context and the term. As expressions spread with the globalisation of telecommunication, they are often but not always recognised as foreign to the speaker's dialect, and words from other dialects may carry connotations with regard to register, social status, origin, and intelligence.
Words and phrases with different meanings.
Words such as "bill" and "biscuit" are used regularly in both AmE and BrE but can mean different things in each form. The word "bill" has several meanings, most of which are shared between AmE and BrE. However, in AmE "bill" often refers to a piece of paper money (as in a "dollar bill"), which in BrE is more commonly referred to as a note. In AmE it can also refer to the visor of a cap, though this is by no means common. In AmE a biscuit (from the French for "twice baked", as in the Italian "biscotto") is a soft bready product that is known in BrE as a scone or a specifically hard, sweet biscuit. Meanwhile, a BrE biscuit incorporates both dessert biscuits and AmE cookies (from the Dutch 'little cake').
As chronicled by Winston Churchill, the opposite meanings of the verb "to table" created a misunderstanding during a meeting of the Allied forces; in BrE to table an item on an agenda means to "open it up" for discussion whereas in AmE, it means to "remove" it from discussion, or at times, to suspend or delay discussion; e.g. "Let's table that topic for later".
The word "football" in BrE refers to association football, also known as soccer. In AmE, "football" means American football. The standard AmE term "soccer", a contraction of "association (football)", is actually of British origin, derived from the formalisation of different codes of football in the 19th century, and was a fairly unremarkable usage (possibly marked for class) in BrE until relatively recently; it has lately become perceived as an Americanism. In non-American and non-Canadian contexts, particularly in sports news from outside the United States and Canada, American (or US branches of foreign) news agencies and media organisations also use "football" to mean "soccer", especially in direct quotes.
Similarly, the word "hockey" in BrE refers to field hockey and in AmE, "hockey" means ice hockey.
Words with completely different meanings are relatively few; most of the time there are either (1) words with one or more shared meanings and one or more meanings unique to one variety (for example, bathroom and toilet) or (2) words the meanings of which are actually common to both BrE and AmE but that show differences in frequency, connotation or denotation (for example, "smart", "clever", "mad").
Some differences in usage and meaning can cause confusion or embarrassment. For example, the word "fanny" is a slang word for vulva in BrE but means buttocks in AmE—the AmE phrase "fanny pack" is "bum bag" in BrE. In AmE the word "pissed" means being annoyed or angry whereas in BrE it is a coarse word for being drunk (in both varieties, "pissed off" means irritated).
Similarly, in AmE the word "pants" is the common word for the BrE "trousers" and "knickers" refers to a variety of half-length trousers (though most AmE users would use the term "shorts" rather than knickers), while the majority of BrE speakers would understand "pants" to mean "underpants" and "knickers" to mean "female underpants".
Sometimes the confusion is more subtle. In AmE the word "quite" used as a qualifier is generally a reinforcement, though it is somewhat uncommon in actual colloquial American use today and carries an air of formality: for example, "I'm quite hungry" is a very polite way to say "I'm very hungry". In BrE "quite" (which is much more common in conversation) may have this meaning, as in "quite right" or "quite mad", but it more commonly means "somewhat", so that in BrE "I'm quite hungry" can mean "I'm somewhat hungry". This divergence of use can lead to misunderstanding.
Different terms in different dialects.
Most speakers of American English are aware of some uniquely British terms. It is generally very easy to guess what some words, such as BrE "driving licence", mean, the AmE equivalent being "driver's license". However, many other British words, such as "naff" (slang but commonly used to mean "not very good"), are unheard of in American English.
Speakers of BrE usually find it easy to understand most common AmE terms, such as "sidewalk (pavement or footpath)", "gas (gasoline/petrol)", "counterclockwise (anticlockwise)" or "elevator (lift)", thanks in large part to considerable exposure to American popular culture and literature. Terms heard less often, especially when rare or absent in American popular culture, such as "copacetic (very satisfactory)", are unlikely to be understood by most BrE speakers.
Holiday greetings.
It is increasingly common for Americans to say "Happy holidays", referring to all, or at least multiple, winter (in the Northern hemisphere) or summer (in the Southern hemisphere) holidays (Christmas, Hanukkah, Kwanzaa, etc.) especially when one's religious observances are not known; the phrase is rarely heard in the UK. In the UK, the phrases "holiday season" and "holiday period" refer to the period in the summer when most people take time off from work, and travel; AmE does not use "holiday" in this sense, instead using "vacation" for recreational excursions.
In AmE, the prevailing Christmas greeting is "Merry Christmas", which is the traditional English Christmas greeting, as found in the English Christmas carol "We Wish You a Merry Christmas", and which appears several times in Charles Dickens' "A Christmas Carol". In BrE, "Happy Christmas" is a common alternative to "Merry Christmas".
Idiosyncratic differences.
Omission of "and" and "on".
Generally in British English, numbers with a value over one hundred have the word "and" inserted before the last two digits. For example, the number 115, when written in words or spoken aloud, would be "one hundred "and" fifteen" in British English. In American English, numbers are typically said or written in words in the same way; however, if the word "and" is omitted ("one hundred fifteen"), this is also considered acceptable (in BrE this would be considered grammatically incorrect).
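The "and"-insertion rule above can be sketched as a small Python helper. This is a hypothetical illustration covering only three-digit numbers; the function name and its scope are assumptions for the example, not taken from any standard library.

```python
def number_to_words(n, dialect="BrE"):
    """Render 100 <= n < 1000 in words: BrE inserts "and" before the
    last two digits, AmE may omit it (hypothetical illustration)."""
    units = ["", "one", "two", "three", "four", "five", "six", "seven",
             "eight", "nine", "ten", "eleven", "twelve", "thirteen",
             "fourteen", "fifteen", "sixteen", "seventeen", "eighteen",
             "nineteen"]
    tens = ["", "", "twenty", "thirty", "forty", "fifty",
            "sixty", "seventy", "eighty", "ninety"]

    def under_hundred(m):
        # Numbers below twenty have their own words; above, hyphenate.
        if m < 20:
            return units[m]
        word = tens[m // 10]
        if m % 10:
            word += "-" + units[m % 10]
        return word

    hundreds, rest = divmod(n, 100)
    words = units[hundreds] + " hundred"
    if rest:
        joiner = " and " if dialect == "BrE" else " "
        words += joiner + under_hundred(rest)
    return words

print(number_to_words(115, "BrE"))  # one hundred and fifteen
print(number_to_words(115, "AmE"))  # one hundred fifteen
```

The only difference between the two renderings is the joiner string, which mirrors how the dialects diverge on a single word rather than on the overall structure of the number.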
Likewise, in the US, the word "on" can be left out when referring to events occurring on a particular day of the week. The US usage "The Cowboys won the game Saturday" would have as its UK equivalent "Sheffield United won the match on Saturday."
Figures of speech.
Both BrE and AmE use the expression "I couldn't care less", to mean that the speaker does not care at all. Some Americans use "I could care less" to mean the same thing. This variant is frequently derided as sloppy, as the literal meaning of the words is that the speaker "does" care to some extent.
In both areas, saying, "I don't mind" often means, "I'm not annoyed" (for example, by someone's smoking), while "I don't care" often means, "The matter is trivial or boring". However, in answering a question such as "Tea or coffee?", if either alternative is equally acceptable an American may answer, "I don't care", while a British person may answer, "I don't mind". Either can sound odd, confusing, or rude, to those accustomed to the other variant.
To be "all set" in both BrE and AmE can mean "to be prepared or ready", though it appears to be more common in AmE. It can also have an additional meaning in AmE of "to be finished or done", for example, a customer at a restaurant telling a waiter "I'm all set. I'll take the check."
Equivalent idioms.
A number of English idioms that have essentially the same meaning show lexical differences between the British and the American versions; for instance, BrE "a storm in a teacup" corresponds to AmE "a tempest in a teapot", and BrE "sweep under the carpet" to AmE "sweep under the rug".
In the US, a "carpet" typically refers to a fitted carpet, rather than a rug.
Social and cultural differences.
Lexical items that reflect separate social and cultural development.
Education.
Primary and secondary school.
The US has a more uniform nationwide system of terms than does the UK, where terminology and structure vary among constituent countries, but the division by grades varies somewhat among the states and even among local school districts. For example, "elementary school" often includes kindergarten and may include sixth grade, with "middle school" including only two grades or extending to ninth grade.
In the UK, the US equivalent of a "high school" is often referred to as a "secondary school" regardless of whether it is state funded or private. US secondary education also includes "middle school" or "junior high school", a two- or three-year transitional school between elementary school and high school. "Middle school" is sometimes used in the UK as a synonym for the younger "junior school", covering the second half of the primary curriculum, currently years four to six in some areas. However, in Dorset (South England), it is used to describe the second school in the three-tier system, which normally runs from year 5 to year 8. In other regions, such as Evesham and the surrounding area in Worcestershire, the second tier goes from year 6 to year 8, with pupils starting secondary school in year nine. In Kirklees, West Yorkshire, in the villages of the Dearne Valley there is a three-tier system: first schools from reception to year five, middle schools (Scissett/Kirkburton Middle School) from year 6 to year 8, and high schools from year 9 to year 13.
A "public school" has opposite meanings in the two countries. In American English this is a government-owned institution open to all students, supported by public funding. The British English use of the term is in the context of "private" education: to be educated privately with a tutor. In England and Wales the term strictly refers to an ill-defined group of prestigious private independent schools funded by students' fees, although it is often more loosely used to refer to any independent school. Independent schools are also known as "private schools", and the latter is the term used in Scotland and Northern Ireland for all such fee-funded schools. Strictly, the term "public school" is not used in Scotland and Northern Ireland in the same sense as in England, but nevertheless Gordonstoun, the Scottish private school, is sometimes referred to as a "public school", as are some other Scottish private schools. Government-funded schools in Scotland and Northern Ireland are properly referred to as "state schools" but are sometimes confusingly referred to as "public schools" (with the same meaning as in the US), and in the US, where most public schools are administered by local governments, a "state school" typically refers to a college or university run by one of the U.S. states.
Speakers in both the United States and the United Kingdom use several additional terms for specific types of secondary school. A US "prep school" or "preparatory school" is an independent school funded by tuition fees; the same term is used in the UK for a private school for pupils under 13, designed to prepare them for fee-paying public schools. In the US, "Catholic schools" cover costs through tuition and have affiliations with a religious institution, most often a Catholic church or diocese. In England, where the state-funded education system grew from parish schools organised by the local established church, the Church of England (C of E, or CE), and many schools, especially primary schools (up to age 11) retain a church connection and are known as "church schools", "CE schools" or "CE (aided) schools". There are also "faith schools" associated with the Roman Catholic Church and other major faiths, with a mixture of funding arrangements. In Scotland, Catholic schools are generally operated as government-funded state schools for Catholic communities, particularly in large cities such as Glasgow.
In the US, a "magnet school" receives government funding and has special admission requirements: in some cases pupils gain admission through superior performance on admission tests, while other magnet schools admit students through a lottery. The UK has city academies, which are independent privately sponsored schools run with public funding and which can select up to 10% of pupils by aptitude. Moreover, in the UK 36 local education authorities retain selection by ability at 11. They maintain grammar schools (state funded secondary schools), which admit pupils according to performance in an examination (known as the 11+) and comprehensive schools that take pupils of all abilities. Grammar schools select the most academically able 10% to 23% of those who sit the exam. Students who fail the exam go to a secondary modern school, sometimes called a "high school", or increasingly an "academy". In areas where there are no grammar schools the comprehensives likewise may term themselves high schools or academies. Nationally only 6% of pupils attend grammar schools, mainly in four distinct counties. Some private schools are called "grammar schools", chiefly those that were grammar schools long before the advent of state education.
University.
In the UK a university student is said to "study", to "read" or, informally, simply to "do" a subject. In the recent past the expression 'to read a subject' was more common at the older universities such as Oxford and Cambridge. In the US a student "studies" or "majors in" a subject (although a student's "major", "concentration" or, less commonly, "emphasis" is also used in US colleges or universities to refer to the major subject of study). "To major in" something refers to the student's principal course of study; "to study" may refer to any class being taken.
At university level in BrE, each "module" is taught or facilitated by a "lecturer" or "tutor"; "professor" is the job-title of a senior academic (in AmE, at some universities, the equivalent of the BrE lecturer is instructor, especially when the teacher has a lesser degree or no university degree, though the usage may become confusing according to whether the subject being taught is considered technical or not; it is also different from adjunct instructor/professor). In AmE each "class" is generally taught by a "professor" (although some US tertiary educational institutions follow the BrE usage), while the position of "lecturer" is occasionally given to individuals hired on a temporary basis to teach one or more classes and who may or may not have a doctoral degree.
The word "course" in American use typically refers to the study of a restricted topic or individual subject (for example, "a course in Early Medieval England", "a course in integral calculus") over a limited period of time (such as a semester or term) and is equivalent to a "module" or sometimes "unit" at a British university. In the UK, a "course of study" or simply "course" is likely to refer to the entire programme of study, which may extend over several years and be made up of any number of "modules," hence it is also practically synonymous to a degree programme. A few university-specific exceptions exist: for example, at Cambridge the word "paper" is used to refer to a "module", while the whole course of study is called "tripos".
A "dissertation" in AmE refers to the final written product of a doctoral student to fulfil the requirement of that program. In BrE, the same word refers to the final written product of a student in an undergraduate or taught master's programme. A dissertation in the AmE sense would be a thesis in BrE, though "dissertation" is also used.
Another source of confusion is the different usage of the word "college". (See a full international discussion of the various meanings at college.) In the US, it refers to a post-high school institution that grants either associate's or bachelor's degrees, and in the UK, it refers to any post-secondary institution that is not a university (including "sixth form college" after the name in secondary education for years 12 and 13, the "sixth form") where intermediary courses such as A levels or NVQs can be taken and GCSE courses can be retaken. College may sometimes be used in the UK or in Commonwealth countries as part of the name of a secondary or high school (for example, Dubai College). In the case of Oxford, Cambridge, Aberdeen, London, Lancaster, Durham, Kent and York universities, all members are also members of a college which is part of the university, for example, one is a member of King's College, Cambridge and hence of the university.
In both the US and UK "college" can refer to some division within a university that comprises related academic departments such as the "college of business and economics" though in the UK "faculty" is more often used. Institutions in the US that offer two to four years of post-high school education often have the word "college" as part of their name, while those offering more advanced degrees are called a "university". (There are exceptions: Boston College, Dartmouth College and the College of William & Mary are examples of colleges that offer advanced degrees, while Vincennes University is an unusual example of a "university" that offers only associate degrees in the vast majority of its academic programs.) American students who pursue a "bachelor's degree" (four years of higher education) or an "associate degree" (two years of higher education) are "college students" regardless of whether they attend a college or a university and refer to their educational institutions informally as "colleges." A student who pursues a master's degree or a doctorate degree in the arts and sciences is in AmE a "graduate student"; in BrE a "postgraduate student" although "graduate student" is also sometimes used. Students of advanced professional programs are known by their field ("business student", "law student", "medical student"). Some universities also have a residential college system, the details of which may vary but generally involve common living and dining spaces as well as college-organised activities. Nonetheless, when it comes to the level of education, AmE generally uses the word "college" (e.g., going to college) whereas BrE generally uses the word "university" (e.g., going to university) regardless of the institution's official designation/status in both countries.
In the context of higher education, the word "school" is used slightly differently in BrE and AmE. In BrE, except for the University of London, the word school is used to refer to an academic department in a university. In AmE, the word school is used to refer to a collection of related academic departments and is headed by a dean. When it refers to a division of a university, school is practically synonymous to a college.
"Professor" has different meanings in BrE and AmE. In BrE it is the highest academic rank, followed by reader, senior lecturer and lecturer. In AmE "professor" refers to academic staff of all ranks, with (full) professor (largely equivalent to the UK meaning) followed by associate professor and assistant professor.
"Tuition" has traditionally had a separate meaning in each variety. In BrE it is the educational content transferred from teacher to student at a university. In AmE it is the money (the fees) paid to receive that education (BrE: tuition fees).
General terms.
In both the US and the UK, a student "takes" an exam, but in BrE a student can also be said to "sit" an exam. When preparing for an exam students "revise" (BrE)/"review" (AmE) what they have studied; the BrE idiom "to revise for" has the equivalent "to review for" in AmE.
Examinations are supervised by "invigilators" in the UK and "proctors" (or "(exam) supervisors") in the US (a "proctor" in the UK is an official responsible for student discipline at the University of Oxford or Cambridge). In the UK a teacher first "sets" and then "administers" an exam, while in the US, a teacher first "writes", "makes", "prepares", etc. and then "gives" an exam. With the same basic meaning of the latter idea but with a more formal or official connotation, a teacher in the US may also "administer" or "proctor" an exam.
In BrE, students are awarded "marks" as credit for requirements (e.g., tests, projects) while in AmE, students are awarded "points" or "grades" for the same. Similarly, in BrE, a candidate's work is being "marked", while in AmE it is said to be "graded" to determine what mark or grade is given.
There is additionally a difference between American and British usage in the word "school". In British usage "school" by itself refers only to primary (elementary) and secondary (high) schools and to "sixth forms" attached to secondary schools—if one "goes to school", this type of institution is implied. By contrast an American student at a university may be "in/at school", "coming/going to school", etc. US and British law students and medical students both commonly speak in terms of going to "law school" and "med[ical] school", respectively. However, the word "school" is used in BrE in the context of higher education to describe a division grouping together several related subjects within a university, for example a "School of European Languages" containing "departments" for each language and also in the term "art school". It is also the name of some of the constituent colleges of the University of London, for example, School of Oriental and African Studies, London School of Economics.
Among high-school and college students in the United States, the words "freshman" (or the gender-neutral terms "first year" or sometimes "freshie"), "sophomore", "junior" and "senior" refer to the first, second, third, and fourth years respectively. It is important that the context of either high school or college first be established or else it must be stated directly (that is, "She is a high-school freshman". "He is a college junior."). Many institutions in both countries also use the term "first-year" as a gender-neutral replacement for "freshman", although in the US this is recent usage, formerly referring only to those in the first year as a graduate student. One exception is the University of Virginia; since its founding in 1819 the terms "first-year", "second-year", "third-year", and "fourth-year" have been used to describe undergraduate university students. At the United States service academies, at least those operated by the federal government directly, a different terminology is used, namely "fourth class", "third class", "second class" and "first class" (the order of numbering being the reverse of the number of years in attendance). In the UK first-year university students are sometimes called "freshers" early in the academic year; however, there are no specific names for those in other years nor for school pupils. Graduate and professional students in the United States are known by their year of study, such as a "second-year medical student" or a "fifth-year doctoral candidate." Law students are often referred to as "1L", "2L", or "3L" rather than "nth-year law students"; similarly, medical students are frequently referred to as "M1", "M2", "M3", or "M4".
While anyone in the US who finishes studying at any educational institution by passing relevant examinations is said to "graduate" and to be a "graduate", in the UK only degree and above level students can "graduate". "Student" itself has a wider meaning in AmE, meaning any person of any age studying any subject at any level (including those not doing so at an educational institution, such as a "piano student" taking private lessons in a home), whereas in BrE it tends to be used for people studying at a post-secondary educational institution, and the term "pupil" is more widely used for a young person at primary or secondary school, though "student" is increasingly used for secondary school pupils in the UK, particularly in the "sixth form" (years 12 and 13).
The names of individual institutions can be confusing. There are several high schools with the word "university" in their names in the United States that are not affiliated with any post-secondary institutions and cannot grant degrees, and there is one public high school, Central High School of Philadelphia, that does grant bachelor's degrees to the top ten per cent of graduating seniors. British secondary schools occasionally have the word "college" in their names.
When it comes to the admissions process, applicants are usually asked to solicit "letters of reference" or reference forms from referees in BrE. In AmE, these are called "letters of recommendation" or recommendation forms. Consequently, the writers of these letters are known as "referees" and "recommenders", respectively, in each country. In AmE, the word "referee" is nearly always understood to refer to an umpire of a sporting match.
In the context of education, for AmE, the word "staff" mainly refers to school personnel who are neither administrators nor have teaching loads or academic responsibilities; personnel who have academic responsibilities are referred to as members of their institution's "faculty." In BrE, the word "staff" refers to both academic and non-academic school personnel. As mentioned previously, the term "faculty" in BrE refers more to a collection of related academic departments.
Government and politics.
In the UK, political candidates "stand for election", while in the US, they "run for office". There is virtually no crossover between BrE and AmE in the use of these terms. Also, the document which contains a party's positions/principles is referred to as a "party platform" in AmE, whereas it is commonly known as a "party manifesto" in BrE. (In AmE, using the term "manifesto" may connote that the party is an extremist or radical organisation.) The term "general election" is used slightly differently in British and American English. In BrE, it refers exclusively to a nationwide parliamentary election and is differentiated from local elections (mayoral and council) and by-elections; whereas in AmE, it refers to a final election for any government position in the US, where the term is differentiated from the term "primary" (an election that determines a party's candidate for the position in question). Additionally, a "by-election" in BrE is called a "special election" in AmE.
In AmE, the term "swing state", "swing county", "swing district" is used to denote a jurisdiction/constituency where results are expected to be close but crucial to the overall outcome of the general election. In BrE, the term "marginal constituency" is more often used for the same and "swing" is more commonly used to refer to how much one party has gained (or lost) an advantage over another compared to the previous election.
In the UK, the term "government" only refers to what is commonly known in America as the "executive branch" or the particular "administration".
A local government in the UK is generically referred to as the "council," whereas in the United States, a local government will be generically referred to as the "City" (or county, village, etc., depending on what kind of entity the government serves).
Business and finance.
In financial statements, what is referred to in AmE as "revenue" or "sales" is known in BrE as "turnover." In AmE, having "high turnover" in a business context would generally carry negative implications, though the precise meaning would differ by industry.
A bankrupt firm "goes into administration" or liquidation in BrE; in AmE it "goes bankrupt", or "files for Chapter 7" (liquidation) or "Chapter 11" (reorganisation). An insolvent individual or partnership "goes bankrupt" in both BrE and AmE.
If a finance company takes possession of a mortgaged property from a debtor, it is called "foreclosure" in AmE and "repossession" in BrE. In some limited scenarios, "repossession" may be used in AmE, but it is much less common than "foreclosure". One common exception in AmE is for automobiles, which are always said to be "repossessed". Indeed, an agent who collects these cars for the bank is colloquially known in AmE as a "repo man".
Employment and recruitment.
In BrE, the term "curriculum vitae" (commonly abbreviated to "CV") is used to describe the document prepared by applicants containing the credentials required for a job. In AmE, the term "résumé" is more commonly used, with "CV" primarily reserved for academic or research contexts; a CV is usually more comprehensive than a résumé.
Insurance.
AmE distinguishes between "coverage" as a noun and "cover" as a verb; an American seeks to buy enough insurance coverage in order to adequately cover a particular risk. BrE uses the word "cover" for both the noun and verb forms.
Transport.
AmE speakers refer to "transportation" and BrE speakers to "transport". ("Transportation" in the UK has traditionally meant the punishment of criminals by deporting them to an overseas penal colony.) In AmE, the word "transport" is usually used only as a verb, seldom as a noun or adjective except in reference to certain specialised objects, such as a "tape transport" or a "military transport" (e.g., a troop transport, a kind of vehicle, not an act of transporting).
Road transport.
Differences in terminology are especially obvious in the context of roads. The British term "dual carriageway", in American parlance, would be "divided highway" or perhaps, simply "highway". The "central reservation" on a "motorway" or "dual carriageway" in the UK would be the "median" or "center divide" on a "freeway", "expressway", "highway" or "parkway" in the US. The one-way lanes that make it possible to enter and leave such roads at an intermediate point without disrupting the flow of traffic are known as "slip roads" in the UK but in the US, they are typically known as "ramps" and both further distinguish between "on-ramps" or "on-slips" (for entering onto a highway/carriageway) and "off-ramps" or "exit-slips" (for leaving a highway/carriageway). When American engineers speak of "slip roads", they are referring to a street that runs alongside the main road (separated by a berm) to allow off-the-highway access to the premises that are there; however, the term "frontage road" is more commonly used, as this term is the equivalent of "service road" in the UK. However, it is not uncommon for an American to use "service road" as well instead of "frontage road".
In the UK, the term "outside lane" refers to the higher-speed "overtaking lane" ("passing lane" in the US) closest to the centre of the road, while "inside lane" refers to the lane closer to the edge of the road. In the US, "outside lane" is used only in the context of a turn, in which case it depends in which direction the road is turning (i.e., if the road bends right, the left lane is the "outside lane", but if the road bends left, it is the right lane). Both also refer to "slow" and "fast" lanes (even though all actual traffic speeds may be at or around the legal speed limit).
In the UK "drink driving" refers to driving after having consumed alcoholic beverages, while in the US, the term is "drunk driving". The legal term in the US is "driving while intoxicated" (DWI) or "driving under the influence (of alcohol)" (DUI). The equivalent legal phrase in the UK is "drunk in charge of a motor vehicle" (DIC) or more commonly "driving with excess alcohol".
The UK term "hire car" is equivalent to the US "rental car". The term "hired car" can be especially misleading for those in the US, where the term "hire" is generally only applied to the employment of people and the term "rent" is applied to the temporary custody of goods. To an American, "hired car" would imply that the car has been brought into the employment of an organisation as if it were a person, which would sound nonsensical.
In the UK, a saloon is a vehicle that is equivalent to the American sedan. This is particularly confusing to Americans, because in the US the term "saloon" is used in only one context: describing an old bar (UK pub) in the American West (a Western saloon). "Coupé" is used by both to refer to a two-door car, but is usually pronounced with two syllables in the UK (coo-pay) and one syllable in the US (coop).
In the UK, "van" may refer to a lorry (UK) of any size, whereas in the US, "van" is only understood to be a very small, boxy truck (US) (such as a "moving van") or a long passenger automobile with several rows of seats (such as a "minivan"). A large, long vehicle used for cargo transport would nearly always be called a "truck" in the US, though alternate terms such as "eighteen-wheeler" may be occasionally heard (regardless of the actual number of tires on the truck).
In the UK, a silencer is the equivalent to the US muffler. In the US, the word silencer has only one meaning: an attachment on the barrel of a gun designed to stop the distinctive crack of a gunshot.
Specific auto parts and transport terms have different names in the two dialects; for example, UK "bonnet", "boot" and "windscreen" correspond to US "hood", "trunk" and "windshield".
Rail transport.
There are also differences in terminology in the context of rail transport. The best known is "railway" in the UK and "railroad" in North America, but there are several others. A "railway station" in the UK is a "railroad station" in the US, while "train station" is used in both; trains have "drivers" (often called "engine drivers") in the UK, while in America trains are driven by "engineers"; trains have "guards" in the UK and "conductors" in the US, though the latter is also common in the UK; a place where two tracks meet is called a set of "points" in the UK and a "switch" in the US; and a place where a road crosses a railway line at ground level is called a "level crossing" in the UK and a "grade crossing" or "railroad crossing" in America. In the UK, the term "sleeper" is used for the devices that bear the weight of the rails and are known as "ties" or "crossties" in the United States. In a rail context, "sleeper" (more often, "sleeper car") would be understood in the US as a rail car with sleeping quarters for its passengers. The British term "platform" in the sense "The train is at Platform 1" would be known in the US by the term "track", and used in the phrase "The train is on Track 1". The British term "brake van" or "guard's van" is a "caboose" in the US. The American English phrase "All aboard" when boarding a train is rarely used in the UK, and when the train reaches its final stop, in the UK the phrase used by rail personnel is "All change" while in the US it is "All out", though such announcements are uncommon in both regions.
For sub-surface rail networks, while "underground" is commonly used in the UK, only the London Underground actually carries this name: the UK's only other such system, the smaller Glasgow Subway, was in fact the first to be called "subway". Nevertheless, both "subway" and "metro" are now more common in the US, varying by city: in Washington D.C., for example, "metro" is used, while in New York City "subway" is preferred. Another variation is the "T" in Boston.
Television.
Traditionally, a "show" on British television would have referred to a light-entertainment program (BrE "programme") with one or more performers and a participative audience, whereas in American television, the term is used for any type of program. British English traditionally referred to other types of program by their type, such as drama, serial etc., but the term "show" has now taken on the generalised American meaning. In American television the episodes of a program first broadcast in a particular year constitute a "season", while the entire run of the program—which may span several seasons—is called a "series". In British television, on the other hand, the word "series" may apply to the episodes of a program in one particular year, for example, "The 1998 series of "Grange Hill", as well as to the entire run. However, the entire run may occasionally be referred to as a "show".
The term "telecast", meaning television broadcast and uncommon even in the US, is not used in British English. A television program would be "broadcast", "aired" or "shown" in both the UK and US.
Telecommunications.
A long-distance call is a "trunk call" in British English, but is a "toll call" in American English, though neither term is well known among younger Americans. The distinction is a result of historical differences in the way local service was billed; the Bell System traditionally flat-rated local calls in all but a few markets, subsidising local service by charging higher rates, or tolls, for intercity calls, allowing local calls to appear to be free. British Telecom (and the British Post Office before it) charged for all calls, local and long distance, so labelling one class of call as "toll" would have been meaningless.
Similarly, a toll-free number in America is a freephone number in the UK. The term "freefone" is a BT trademark.
Rivers.
In British English, the name of a river is usually placed after the word (River Thames); however, there are a small number of exceptions, such as Wick River. In American English, the name is placed before the word (Hudson River).
Grammar.
Subject-verb agreement.
In American English (AmE), collective nouns are almost always singular in construction: "the committee was unable to agree". However, when a speaker wishes to emphasize that the individuals are acting separately, a plural pronoun may be employed with a singular or plural verb: "the team takes their seats", rather than "the team takes its seats". Such a sentence would most likely be recast as "the team members take their seats". Despite exceptions such as usage in "The New York Times", the names of sports teams are usually treated as plurals even if the form of the name is singular.
In British English (BrE), collective nouns can take either singular ("formal agreement") or plural ("notional agreement") verb forms, according to whether the emphasis is on the body as a whole or on the individual members respectively; compare "a committee was appointed" with "the committee were unable to agree". The term "the Government" always takes a plural verb in British civil service convention, perhaps to emphasise the principle of cabinet collective responsibility. Compare also the following lines of Elvis Costello's song "Oliver's Army": "Oliver's Army is here to stay / Oliver's Army are on their way ". Some of these nouns, for example "staff", actually combine with plural verbs most of the time.
The difference occurs for all nouns of multitude, both general terms such as "team" and "company" and proper nouns (for example where a place name is used to refer to a sports team). For instance, BrE speakers would typically say "England are playing tonight", whereas AmE speakers would say "England is playing tonight".
Proper nouns that are plural in form take a plural verb in both AmE and BrE; for example, "The Beatles are a well-known band"; "The Diamondbacks are the champions", with one major exception: in American English, "the United States" is almost universally used with a singular verb. Although the construction "the United States are" was more common early in the history of the country, as the singular federal government exercised more authority and a singular national identity developed (especially following the American Civil War), it became standard to treat "the United States" as a singular noun.
Style.
Use of "that" and "which" in restrictive and non-restrictive relative clauses.
Generally, a non-restrictive relative clause (also called non-defining or supplementary) is one containing information that is supplementary, i.e. does not change the meaning of the rest of the sentence, while a restrictive relative clause (also called defining or integrated) contains information essential to the meaning of the sentence, effectively limiting the modified noun phrase to a subset that is defined by the relative clause.
An example of a restrictive clause is "The dog that bit the man was brown."
An example of a non-restrictive clause is "The dog, which bit the man, was brown."
In the former, "that bit the man" identifies which dog the statement is about.
In the latter, "which bit the man" provides supplementary information about a known dog.
A non-restrictive relative clause is typically set off by commas, whereas a restrictive relative clause is not, but this is not a rule that is universally observed. In speech, this is also reflected in the intonation.
Writers commonly use "which" to introduce a non-restrictive clause, and "that" to introduce a restrictive clause. "That" is rarely used to introduce a non-restrictive relative clause in prose. "Which" and "that" are both commonly used to introduce a restrictive clause; a study in 1977 reported that about 75 per cent of occurrences of "which" were in restrictive clauses.
H. W. Fowler, in "A Dictionary of Modern English Usage" of 1926, followed others in suggesting that it would be preferable to use "which" as the non-restrictive (what he calls "non-defining") pronoun and "that" as the restrictive (what he calls defining) pronoun, but he also stated that this rule was observed neither by most writers nor by the best writers.
He implied that his suggested usage was more common in American English.
Fowler notes that his recommended usage presents problems, in particular that "that" must be the first word of the clause, which means, for instance, that "which" cannot be replaced by "that" when it immediately follows a preposition (e.g. "the basic unit "from which" matter is constructed") – though this would not prevent a stranded preposition (e.g. "the basic unit "that" matter is constructed "from").
Style guides by American prescriptivists, such as Bryan Garner, typically insist, for stylistic reasons, that "that" be used for restrictive relative clauses and "which" be used for non-restrictive clauses, referring to the use of "which" in restrictive clauses as a "mistake". According to the 2015 edition of "Fowler's Dictionary of Modern English Usage", "In AmE "which" is 'not generally used in restrictive clauses, and that fact is then interpreted as the absolute rule that only "that" may introduce a restrictive clause', whereas in BrE 'either "that" or "which" may be used in restrictive clauses', but many British people 'believe that "that" is obligatory'".
Subjunctive.
The subjunctive mood is commoner in colloquial American English than in colloquial British English.
Writing.
Spelling.
Before the early 18th century English spelling was not standardised. Different standards became noticeable after the publishing of influential dictionaries. For the most part current BrE spellings follow those of Samuel Johnson's "Dictionary of the English Language" (1755), while AmE spellings follow those of Noah Webster's "An American Dictionary of the English Language" (1828). In the United Kingdom, the influences of those who preferred the French spellings of certain words proved decisive. In many cases AmE spelling deviated from mainstream British spelling; on the other hand it has also often retained older forms. Many of the now characteristic AmE spellings were popularised, although often not created, by Noah Webster. Webster chose already-existing alternative spellings "on such grounds as simplicity, analogy or etymology". Webster did attempt to introduce some reformed spellings, as did the Simplified Spelling Board in the early 20th century, but most were not adopted. Later spelling changes in the UK had little effect on present-day US spelling, and vice versa.
Punctuation.
Full stops and periods in abbreviations.
There have been some trends of transatlantic difference in use of periods in some abbreviations. These are discussed at "Abbreviation § Periods (full stops) and spaces". Unit symbols such as kg and Hz are never punctuated.
Parentheses/brackets.
In British English, "( )" marks are often referred to as brackets, whereas "[ ]" are called square brackets and "{ }" are called curly brackets. In formal British English and in American English "( )" marks are parentheses (singular: parenthesis), "[ ]" are called brackets or square brackets, and "{ }" can be called either curly brackets or braces. Despite the different names, these marks are used in the same way in both varieties.
Quoting.
British and American English differ in the preferred quotation mark style, including the placement of commas and periods. In American English, " and ' are called quotation marks, whereas in British English, " and ' are referred to as either inverted commas or speech marks. Additionally, in American English direct speech typically uses the double quote mark ( " ), whereas in British English it is common to use the inverted comma ( ' ).
Commas in headlines.
American newspapers commonly use a comma as a shorthand for "and" in headlines. For example, "The Washington Post" had the headline "A TRUE CONSERVATIVE: For McCain, Bush Has Both Praise, Advice."
Numerical expressions.
There are many differences in the writing and speaking of English numerals, most of which are matters of style, with the notable exception of different definitions for billion.
The two countries have different conventions for floor numbering. The UK uses a mixture of the metric system and Imperial units, whereas in the US, United States customary units are dominant in everyday life, with a few fields using the metric system.
Monetary amounts.
Monetary amounts in the range of one to two major currency units are often spoken differently. In AmE one may say "a dollar fifty" or "a pound eighty", whereas in BrE these amounts would be expressed "one dollar fifty" and "one pound eighty". For amounts over a dollar an American will generally either drop denominations or give both dollars and cents, as in "two-twenty" or "two dollars and twenty cents" for $2.20. An American would not say "two dollars twenty". On the other hand, in BrE, "two-twenty" or "two pounds twenty" would be most common.
It is more common to hear a British-English speaker say "one thousand two hundred dollars" than "a thousand and two hundred dollars", although the latter construct is common in AmE. In British English, the "and" comes after the hundreds ("one thousand, two hundred and thirty dollars"). The term "twelve hundred dollars", popular in AmE, is frequently used in BrE but only for exact multiples of 100 up to 1,900. Speakers of BrE very rarely hear amounts over 1,900 expressed in hundreds, for example, "twenty-three hundred". In AmE it would not be unusual to refer to a high, uneven figure such as 2,307 as "twenty-three hundred and seven".
In BrE, particularly in television or radio advertisements, integers can be pronounced individually in the expression of amounts. For example, "on sale for £399" might be expressed "on sale for three nine nine", though the full "three hundred and ninety-nine pounds" is at least as common. An American advertiser would almost always say "on sale for three ninety-nine", with context distinguishing $399 from $3.99. In British English the latter pronunciation implies a value in pounds and pence, so "three ninety-nine" would be understood as £3.99.
In spoken BrE the word "pound" is sometimes colloquially used for the plural as well. For example, "three pound forty" and "twenty pound a week" are both heard in British English. Some other currencies do not change in the plural; yen and rand being examples. This is in addition to normal adjectival use, as in "a twenty-pound-a-week pay-rise" (US "raise"). The euro most often takes a regular plural "-s" in practice despite the EU dictum that it should remain invariable in formal contexts; the invariable usage is more common in Ireland, where it is the official currency.
In BrE the use of "p" instead of "pence" is common in spoken usage. Each of the following has equal legitimacy: "3 pounds 12 p"; "3 pounds and 12 p"; "3 pounds 12 pence"; "3 pounds and 12 pence"; as well as just "8 p" or "8 pence". In everyday usage the amount is simply read as figures (£3.50 = three pounds fifty) as in AmE.
AmE uses words such as "nickel", "dime", and "quarter" for small coins. In BrE the usual usage is "a 10-pence piece" or "a 10p piece" or simply "a 10p", for any coin below £1, "pound coin" and "two-pound coin". BrE did have specific words for a number of coins before decimalisation. Formal coin names such as "half crown" (2/6) and "florin" (2/-), as well as slang or familiar names such as "bob" (1/-) and "tanner" (6d) for pre-decimalisation coins are still familiar to older BrE speakers but they are not used for modern coins. In older terms like "two-bob bit" (2/-) and "thrupenny bit" (3d), the word "bit" had common usage before decimalisation similar to that of "piece" today.
In order to make explicit the amount in words on a check (BrE "cheque"), Americans write "three and 24⁄100" (using this solidus construction or with a horizontal division line): they do not need to write the word "dollars" as it is usually already printed on the check. On a cheque UK residents would write "three pounds and 24 pence", "three pounds ‒ 24", or "three pounds ‒ 24p", since the currency unit is not preprinted. To make unauthorised amendment difficult, it is useful to have an expression terminator even when a whole number of dollars/pounds is in use: thus, Americans would write "three and 00⁄100" or "three and no⁄100" on a three-dollar check (so that it cannot easily be changed to, for example, "three million"), and UK residents would write "three pounds only".
Dates.
Dates are usually written differently in the short (numerical) form. Christmas Day 2000, for example, is 25/12/00 or 25.12.00 in the UK and 12/25/00 in the US, although the formats 25/12/2000, 25.12.2000, and 12/25/2000 now have more currency than they had before Y2K. Occasionally other formats are encountered, such as the ISO 8601 2000-12-25, popular among programmers, scientists and others seeking to avoid ambiguity, and to make alphanumerical order coincide with chronological order. The difference in short-form date order can lead to misunderstanding, especially when using software or equipment that uses the foreign format. For example, 06/04/05 could mean either June 4, 2005 (if read as US format), 6 April 2005 (if seen as in UK format) or even 5 April 2006 if taken to be an older ISO 8601-style format where 2-digit years were allowed.
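The ambiguity of short numeric dates can be demonstrated with a small illustrative snippet (Python is used here purely for illustration): the same string parses to three different dates depending on which convention is assumed.

```python
from datetime import datetime

s = "06/04/05"

# US convention: month/day/year
us = datetime.strptime(s, "%m/%d/%y")
# UK convention: day/month/year
uk = datetime.strptime(s, "%d/%m/%y")
# Older ISO 8601-style with a two-digit year: year/month/day
iso = datetime.strptime(s, "%y/%m/%d")

print(us.date())   # 2005-06-04 (June 4, 2005)
print(uk.date())   # 2005-04-06 (6 April 2005)
print(iso.date())  # 2006-04-05 (5 April 2006)
```

The unambiguous four-digit ISO 8601 form (2000-12-25) avoids this problem and sorts chronologically when sorted alphanumerically.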
When using the name of the month rather than the number to write a date in the UK, the recent standard style is for the day to precede the month, e.g., 21 April. Month preceding date is almost invariably the style in the US, and was common in the UK until the late twentieth century. British usage normally changes the day from an integer to an ordinal, i.e., 21st instead of 21. In speech, "of" and "the" are used in the UK, as in "the 21st of April". In written language, the words "the" and "of" are usually dropped, i.e., 21st April. The US would say this as "April 21st", and this form is still common in the UK. One of the few exceptions in American English is saying "the Fourth of July" as a shorthand for United States Independence Day. In the US military the British forms are used, but the day is read cardinally; these forms are also common, even in formal contexts, among some speakers of New England and Southern American English varieties, and among people from those regions living elsewhere.
Phrases such as the following are common in the UK but are generally unknown in the US: "a week today", "a week tomorrow", "a week (on) Tuesday" and "Tuesday week"; these all refer to a day which is more than a week into the future. "A fortnight Friday" and "Friday fortnight" refer to a day two weeks after the coming Friday. "A week on Tuesday" and "a fortnight on Friday" could refer either to a day in the past ("it's a week on Tuesday, you need to get another one") or in the future ("see you a week on Tuesday"), depending on context. In the US the standard construction is "a week from today", "a week from tomorrow", etc. BrE speakers may also say "Thursday last" or "Thursday gone" where AmE would prefer "last Thursday". "I'll see you (on) Thursday coming" or "let's meet this coming Thursday" in BrE refer to a meeting later this week, while "not until Thursday next" would refer to next week. BrE also commonly uses "Thursday after next" or "week after next" for two weeks in the future, and "Thursday before last" or "week before last" for two weeks in the past; for times more than two weeks away, or when anchoring to today, tomorrow or yesterday, BrE instead uses forms such as "five weeks on Tuesday" or "two weeks yesterday".
Time.
The 24-hour clock ("18:00", "18.00" or "1800") is considered normal in the UK and Europe in many applications including air, rail and bus timetables; it is largely unused in the US outside military, police, aviation and medical applications. As a result, many Americans refer to the 24-hour clock as "military time". Some British English style guides recommend the full stop (.) when telling time, compared to American English which uses colons (:) (i.e., 11:15 PM/pm/p.m. or 23:15 for AmE and 11.15 pm or 23.15 for BrE). Usually in the military (and sometimes in the police, aviation and medical) applications on both sides of the Atlantic "0800" and "1800" are read as ("oh/zero") "eight hundred" and "eighteen hundred" hours respectively. Even in the UK, "hundred" follows "twenty", "twenty-one", "twenty-two" and "twenty-three" when reading "2000", "2100", "2200" and "2300" according to those applications.
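As an illustrative sketch, the conventions above can be rendered with standard date formatting (the format strings here are just one way to produce each style, not prescribed by either dialect):

```python
from datetime import time

t = time(23, 15)

# 24-hour clock, normal in UK and European timetables (AmE "military time")
print(t.strftime("%H:%M"))                          # 23:15
# AmE style: 12-hour clock with a colon separator
print(t.strftime("%I:%M %p").lstrip("0").lower())   # 11:15 pm
# Some BrE style guides prefer a full stop as the separator
print(t.strftime("%I.%M %p").lstrip("0").lower())   # 11.15 pm
```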
Fifteen minutes after the hour is called "quarter past" in British usage and "a quarter after" or, less commonly, "a quarter past" in American usage. Fifteen minutes before the hour is usually called "quarter to" in British usage and "a quarter of", "a quarter to" or "a quarter 'til" in American usage; the form "a quarter to" is associated with parts of the Northern United States, while "a quarter 'til" or "till" is found chiefly in the Appalachian region. Thirty minutes after the hour is commonly called "half past" in both BrE and AmE; "half after" used to be more common in the US. In informal British speech, the preposition is sometimes omitted, so that 5:30 may be referred to as "half five"; this construction is entirely foreign to US speakers, who would possibly interpret "half five" as 4:30 (halfway to 5:00) rather than 5:30. The AmE formations "top of the hour" and "bottom of the hour" are not used in BrE. Forms such as "eleven forty" are common in both varieties. For simplicity and directness, times may also be told without such fractions, for example "nine fifteen" or "ten forty-five".
Sports percentages.
In sports statistics, certain percentages, such as those for winning or win–loss records and for saves in field or ice hockey and association football, are almost always expressed as a decimal proportion to three places in AmE, and are usually read aloud as if they were whole numbers, e.g. (0).500 or "five hundred", hence the phrase "games/matches over five hundred". In BrE the same figures are expressed as true percentages, obtained by multiplying the decimal by 100%, that is, 50% or "fifty per cent", hence "games/matches over 50% (or 50 per cent)". However, "games/matches over 50% or 50 percent" is also found in AmE, albeit sporadically, e.g. in volleyball hitting percentages.
The American practice of expressing so-called percentages in sports statistics as decimals originated with baseball's batting averages, developed by English-born statistician and historian Henry Chadwick.
Atomic semantics.
Atomic semantics is a type of guarantee provided by a data register shared by several processors in a parallel machine or in a network of computers working together.
Atomic semantics are very strong. An atomic register provides strong guarantees even when there is concurrency and failures.
A read/write register R stores a value and is accessed by two basic operations: read and write(v). A read returns the value stored in R and write(v) changes the value stored in R to v.
A register is called atomic if it satisfies the two following properties:
1) Each invocation op of a read or write operation:
•must appear as if it were executed at a single point τ(op) in time;
•τ(op) satisfies τb(op) ≤ τ(op) ≤ τe(op), where τb(op) and τe(op) denote the times at which op begins and ends;
•if op1 ≠ op2, then τ(op1) ≠ τ(op2).
2) Each read operation returns the value written by the last write operation before the read, in the sequence where all operations are ordered by their τ values.
Atomic/Linearizable register:
Termination: if a node is correct, each of its read and write operations eventually completes.
Safety property (linearization points for read, write and failed operations):
Read operation: it appears to take effect at every node at some point between its invocation and response times.
Write operation: similarly, it appears to take effect at every node at some point between its invocation and response times.
Failed operation (the term "atomic" comes from this notion): it appears either to have completed at every node or never to have happened at any node.
Example: an atomic register is one that is linearizable to a sequential safe register.
The following picture shows where we should put the linearization point for each operation:
An atomic register may be defined for a variable with a single writer and multiple readers (SWMR), a single writer and a single reader (SWSR), or multiple writers and multiple readers (MWMR). Here is an example of a multi-writer multi-reader atomic register accessed by three processes (P1, P2, P3). Note that R.read() → v means that the corresponding read operation returns v, the value of the register. The following execution of the register R satisfies the definition of an atomic register:
R.write(1), R.read()→1, R.write(3), R.write(2), R.read()→2, R.read()→2.
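As an illustrative sketch only (a single-machine, shared-memory analogue, not a distributed implementation), the example execution can be reproduced with a register whose operations take effect at a single linearization point, enforced here by a mutual-exclusion lock:

```python
import threading

class AtomicRegister:
    """Sketch of a read/write register whose operations appear to take
    effect atomically, at a single point in time (the linearization point)."""

    def __init__(self, initial=None):
        self._value = initial
        self._lock = threading.Lock()

    def write(self, v):
        # The critical section acts as the linearization point tau(write(v)):
        # the new value becomes visible to all subsequent reads at once.
        with self._lock:
            self._value = v

    def read(self):
        # A read returns the value of the last write linearized before it.
        with self._lock:
            return self._value

# Reproducing the example execution from the text:
r = AtomicRegister()
r.write(1)
assert r.read() == 1
r.write(3)
r.write(2)
assert r.read() == 2
assert r.read() == 2
```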
Antarctic Circumpolar Current.
The Antarctic Circumpolar Current (ACC) is an ocean current that flows clockwise (as seen from the South Pole) from west to east around Antarctica. An alternative name for the ACC is the West Wind Drift. The ACC is the dominant circulation feature of the Southern Ocean and has a mean transport estimated at 100–150 Sverdrups (Sv, million m3/s), or possibly even higher, making it the largest ocean current. The current is circumpolar due to the lack of any landmass connecting with Antarctica and this keeps warm ocean waters away from Antarctica, enabling that continent to maintain its huge ice sheet.
Associated with the Circumpolar Current is the Antarctic Convergence, where the cold Antarctic waters meet the warmer waters of the subantarctic, creating a zone of upwelling nutrients. These nurture high levels of phytoplankton with associated copepods and krill, and resultant food chains supporting fish, whales, seals, penguins, albatrosses, and a wealth of other species.
The ACC has been known to sailors for centuries; it greatly speeds up any travel from west to east, but makes sailing extremely difficult from east to west, although this is mostly due to the prevailing westerly winds. Jack London's story "Make Westing" and the circumstances preceding the mutiny on the "Bounty" poignantly illustrate the difficulty it caused for mariners seeking to round Cape Horn westbound on the clipper ship route from New York to California. The eastbound clipper route, which is the fastest sailing route around the world, follows the ACC around three continental capes – Cape Agulhas (Africa), South East Cape (Australia), and Cape Horn (South America).
The current creates the Ross and Weddell gyres.
Structure.
The ACC connects the Atlantic, Pacific, and Indian Oceans, and serves as a principal pathway of exchange among them. The current is strongly constrained by landform and bathymetric features. To trace it starting arbitrarily at South America, it flows through the Drake Passage between South America and the Antarctic Peninsula and then is split by the Scotia Arc to the east, with a shallow warm branch flowing to the north in the Falkland Current and a deeper branch passing through the Arc more to the east before also turning to the north. Passing through the Indian Ocean, the current first retroflects the Agulhas Current to form the Agulhas Return Current before it is split by the Kerguelen Plateau, and then moves northward again. Deflection is also seen as it passes over the mid-ocean ridge in the Southeast Pacific.
Fronts.
The current is accompanied by three fronts: the Subantarctic front (SAF), the Polar front (PF), and the Southern ACC front (SACC). Furthermore, the waters of the Southern Ocean are separated from the warmer and saltier subtropical waters by the subtropical front (STF).
The northern boundary of the ACC is defined by the northern edge of the SAF, this being the most northerly water to pass through Drake Passage and therefore be circumpolar. Much of the ACC transport is carried in this front, which is defined as the latitude at which a subsurface salinity minimum or a thick layer of unstratified Subantarctic mode water first appears, allowed by temperature dominating density stratification. Still further south lies the PF, which is marked by a transition to very cold, relatively fresh, Antarctic Surface Water at the surface. Here a temperature minimum is allowed by salinity dominating density stratification, due to the lower temperatures. Farther south still is the SACC, which is determined as the southernmost extent of Circumpolar Deep Water (temperature of about 2 °C at 400 m). This water mass flows along the shelfbreak of the western Antarctic Peninsula and thus marks the most southerly water flowing through Drake Passage and therefore circumpolar. The bulk of the transport is carried in the middle two fronts.
The total transport of the ACC at Drake Passage is estimated to be around 135 Sv, or about 135 times the transport of all the world's rivers combined. There is a relatively small addition of flow in the Indian Ocean, with the transport south of Tasmania reaching around 147 Sv, at which point the current is probably the largest on the planet.
Dynamics.
The circumpolar current is driven by the strong westerly winds in the latitudes of the Southern Ocean.
In latitudes where there are continents, winds blowing on light surface water can simply pile up light water against these continents. But in the Southern Ocean, the momentum imparted to the surface waters cannot be offset in this way. There are different theories on how the Circumpolar Current balances the momentum imparted by the winds. The increasing eastward momentum imparted by the winds causes water parcels to drift outward from the axis of the Earth's rotation (in other words, northward) as a result of the Coriolis force. This northward Ekman transport is balanced by a southward, pressure-driven flow below the depths of the major ridge systems. Some theories connect these flows directly, implying that there is significant upwelling of dense deep waters within the Southern Ocean, transformation of these waters into light surface waters, and a transformation of waters in the opposite direction to the north. Such theories link the magnitude of the Circumpolar Current with the global thermohaline circulation, particularly the properties of the North Atlantic.
Alternatively, ocean eddies, the oceanic equivalent of atmospheric storms, or the large-scale meanders of the Circumpolar Current may directly transport momentum downward in the water column. This is because such flows can produce a net southward flow in the troughs and a net northward flow over the ridges without requiring any transformation of density. In practice both the thermohaline and the eddy/meander mechanisms are likely to be important.
The current flows at a rate of about over the Macquarie Ridge south of New Zealand. The ACC varies with time. Evidence of this is the Antarctic Circumpolar Wave, a periodic oscillation that affects the climate of much of the southern hemisphere. There is also the Antarctic oscillation, which involves changes in the location and strength of Antarctic winds. Trends in the Antarctic Oscillation have been hypothesized to account for an increase in the transport of the Circumpolar Current over the past two decades.
Formation.
Published estimates of the onset of the Antarctic Circumpolar Current vary, but it is commonly considered to have started at the Eocene/Oligocene boundary. The isolation of Antarctica and formation of the ACC occurred with the openings of the Tasmanian Passage and the Drake Passage. The Tasmanian Seaway separates East Antarctica and Australia, and is reported to have opened to water circulation 33.5 Ma. The timing of the opening of the Drake Passage, between South America and the Antarctic Peninsula, is more disputed; tectonic and sediment evidence show that it could have been open as early as pre-34 Ma, and estimates of the opening of the Drake Passage range between 20 and 40 Ma. The isolation of Antarctica by the current is credited by many researchers with causing the glaciation of Antarctica and global cooling in the Eocene epoch. Oceanic models have shown that the opening of these two passages limited polar heat convergence and caused a cooling of sea surface temperatures by several degrees; other models have shown that CO2 levels also played a significant role in the glaciation of Antarctica.
Phytoplankton.
Antarctic sea ice cycles seasonally: in February–March the amount of sea ice is lowest, and in August–September the sea ice is at its greatest extent. Ice levels have been monitored by satellite since 1973. Upwelling of deep water under the sea ice brings substantial amounts of nutrients. As the ice melts, the melt water provides stability and the critical depth is well below the mixing depth, which allows for positive net primary production. As the sea ice recedes, epontic algae dominate the first phase of the bloom, and a strong bloom dominated by diatoms follows the ice melt southward.
Another phytoplankton bloom occurs further north, near the Antarctic convergence, where nutrients are supplied by the thermohaline circulation. Phytoplankton blooms are dominated by diatoms and grazed by copepods in the open ocean, and by krill closer to the continent. Diatom production continues through the summer, and populations of krill are sustained, bringing large numbers of cetaceans, cephalopods, seals, birds, and fish to the area.
Phytoplankton blooms are believed to be limited by irradiance in the austral (southern hemisphere) spring, and by biologically available iron in the summer. Much of the biology in the area occurs along the major fronts of the current (the Subtropical, Subantarctic, and Antarctic Polar fronts), which are areas associated with well-defined temperature changes. The size and distribution of phytoplankton are also related to the fronts: microphytoplankton (>20 μm) are found at fronts and at sea ice boundaries, while nanophytoplankton (<20 μm) are found between fronts.
Studies of phytoplankton stocks in the southern sea have shown that the Antarctic Circumpolar Current is dominated by diatoms, while the Weddell Sea has abundant coccolithophorids and silicoflagellates. Surveys of the southwest Indian Ocean have shown variation among phytoplankton groups based on their location relative to the Polar Front, with diatoms dominating south of the front, and dinoflagellates and flagellates present in higher populations north of it.
Some research has been conducted on Antarctic phytoplankton as a carbon sink. Areas of open water left by ice melt are good areas for phytoplankton blooms, and the phytoplankton take up carbon from the atmosphere during photosynthesis. As the blooms die and sink, the carbon can be stored in sediments for thousands of years. This natural carbon sink is estimated to remove 3.5 million tonnes of carbon from the ocean each year; that amount of carbon, taken from the ocean and atmosphere, is equivalent to 12.8 million tonnes of carbon dioxide.
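The carbon-to-CO2 equivalence follows from the molar masses of the two substances (each tonne of carbon corresponds to roughly 44/12 ≈ 3.67 tonnes of CO2). A minimal sketch of the conversion, using standard molar masses:

```python
# Convert a mass of carbon into the equivalent mass of CO2.
# Molar masses (g/mol): C ~ 12.011; CO2 = 12.011 + 2 * 15.999 ~ 44.009.
M_C = 12.011
M_CO2 = M_C + 2 * 15.999


def carbon_to_co2(tonnes_c: float) -> float:
    """Tonnes of CO2 that contain the given tonnes of carbon."""
    return tonnes_c * M_CO2 / M_C


annual_sink_c = 3.5e6  # tonnes of carbon removed per year (figure from the text)
print(f"{carbon_to_co2(annual_sink_c) / 1e6:.1f} million tonnes CO2")  # ~12.8
```

The 3.5-million-tonne input is the estimate quoted above; the molar masses are standard reference values, and the result matches the 12.8-million-tonne CO2 figure.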
Studies.
An expedition in May 2008 by 19 scientists studied the geology and biology of eight Macquarie Ridge seamounts, as well as the Antarctic Circumpolar Current, to investigate the effects of climate change on the Southern Ocean. The circumpolar current merges the waters of the Atlantic, Indian, and Pacific Oceans and carries up to 150 times the volume of water flowing in all of the world's rivers. The study found that any damage to the cold-water corals nourished by the current will have a long-lasting effect. After studying the circumpolar current it is clear that it strongly influences regional and global climate as well as underwater biodiversity. The subject has been characterized recently as "the spectral peak of the global extra-tropical circulation at ≈ 10^4 kilometers".
The current helps preserve wooden shipwrecks by preventing wood-boring "ship worms" from reaching targets such as Ernest Shackleton's ship, the "Endurance".
Arbor Day.
Arbor Day (or Arbour Day in some countries) is a secular day of observance in which individuals and groups are encouraged to plant trees. Today, many countries observe such a holiday. Though usually observed in the spring, the date varies, depending on climate and suitable planting season.
Origins and history.
First Arbor Day.
The Spanish village of Mondoñedo held the first documented arbor plantation festival in the world, organized by its mayor in 1594. The site, now the Alameda de los Remedios, is still planted with lime and horse-chestnut trees; a humble granite marker and a bronze plate recall the event. Additionally, the small Spanish village of Villanueva de la Sierra held the first modern Arbor Day, an initiative launched in 1805 by the local priest with the enthusiastic support of the entire population.
First American Arbor Day.
The first American Arbor Day originated with J. Sterling Morton of Nebraska City, Nebraska, at an annual meeting of the Nebraska State Board of Agriculture held in Lincoln. On April 10, 1872, an estimated one million trees were planted in Nebraska.
In 1883, the American Forestry Association made Birdsey Northrop of Connecticut chairman of the committee to campaign for Arbor Day nationwide. Northrop was responsible for globalizing the idea when he visited Japan in 1895 and delivered his Arbor Day and Village Improvement message. He also brought his enthusiasm for Arbor Day to Australia, Canada, and Europe.
McCreight and Theodore Roosevelt.
Beginning in 1906, Pennsylvania conservationist Major Israel McCreight of DuBois, Pennsylvania, argued that President Theodore Roosevelt’s conservation speeches were limited to businessmen in the lumber industry and recommended a campaign of youth education and a national policy on conservation education. McCreight urged Roosevelt to make a public statement to school children about trees and the destruction of American forests. Conservationist Gifford Pinchot, Chief of the United States Forest Service, embraced McCreight’s recommendations and asked the President to speak to the public school children of the United States about conservation. On April 15, 1907, Roosevelt issued an "Arbor Day Proclamation to the School Children of the United States" about the importance of trees and that forestry deserves to be taught in U.S. schools. Pinchot wrote McCreight, "we shall all be indebted to you for having made the suggestion."
Around the world.
Australia.
Arbour Day has been observed in Australia since 20 June 1889. National Schools Tree Day is held on the last Friday of July for schools and National Tree Day the last Sunday in July throughout Australia. Many states have Arbour Day, although Victoria has an Arbour Week, which was suggested by Premier Rupert (Dick) Hamer in the 1980s.
Belgium.
International Day of Treeplanting is celebrated in Flanders on or around 21 March as a theme-day/educational-day/observance, not as a public holiday. Tree planting is sometimes combined with awareness campaigns of the fight against cancer: "Kom Op Tegen Kanker".
Brazil.
The Arbor Day (Dia da Árvore) is celebrated on September 21. It is not a national holiday. However, schools nationwide celebrate this day with environment-related activities, namely tree planting.
British Virgin Islands.
Arbour Day is celebrated on November 22. It is sponsored by the National Parks Trust of the Virgin Islands. Activities include an annual national Arbour Day Poetry Competition and tree planting ceremonies throughout the territory.
Cambodia.
Cambodia celebrates Arbor Day on July 9 with a tree planting ceremony attended by the king.
Canada.
The day was founded by Sir George William Ross, later the premier of Ontario, when he was minister of education in Ontario (1883–1899). According to the Ontario Teachers' Manuals "History of Education" (1915), Ross established both Arbour Day and Empire Day—"the former to give the school children an interest in making and keeping the school grounds attractive, and the latter to inspire the children with a spirit of patriotism" (p. 222). This predates the claimed founding of the day by Don Clark of Schomberg, Ontario for his wife Margret Clark in 1906. In Canada, National Forest Week is the last full week of September, and National Tree Day (Maple Leaf Day) falls on the Wednesday of that week. Ontario celebrates Arbour Week from the last Friday in April to the first Sunday in May. Prince Edward Island celebrates Arbour Day on the third Friday in May during Arbour Week. Arbour Day is the longest running civic greening project in Calgary and is celebrated on the first Thursday in May. On this day, each grade 1 student in Calgary's schools receives a tree seedling to be taken home to be planted on private property.
Central African Republic.
National Tree Planting Day is on July 22.
Chile.
"Día del Árbol" was celebrated on June 28, 2022, as defined by Chile's Environment Ministry.
Greater China.
Republic of China.
Arbor Day (植樹節) was founded by the forester Ling Daoyang in 1915 and has been a traditional holiday in the Republic of China since 1916. The Beiyang government's Ministry of Agriculture and Commerce first commemorated Arbor Day in 1915 at Ling's suggestion. In 1916, the government announced that all provinces of the Republic of China would celebrate the holiday on the same day as the Qingming Festival, April 5, despite the differences in climate across China; this is the first day of the fifth solar term of the traditional Chinese lunisolar calendar. From 1929, by decree of the Nationalist government, Arbor Day was moved to March 12, to commemorate the death of Sun Yat-sen, who had been a major advocate of afforestation in his life. Following the retreat of the government of the Republic of China to Taiwan in 1949, the celebration of Arbor Day on March 12 was retained.
People's Republic of China.
In the People's Republic of China, the fourth session of the Fifth National People's Congress adopted the Resolution on the Unfolding of a Nationwide Voluntary Tree-planting Campaign in 1979. This resolution established Arbor Day (植树节), also on March 12, and stipulated that every able-bodied citizen between the ages of 11 and 60 should plant three to five trees per year or do the equivalent amount of work in seedling, cultivation, tree tending, or other services. Supporting documentation instructs all units to report population statistics to the local afforestation committees for workload allocation. Many couples choose to marry the day before the annual celebration, and they plant a tree to mark the beginning of their life together and the new life of the tree.
Republic of Congo.
National Tree Planting Day is on November 6.
Costa Rica.
"Día del Árbol" is on June 15.
Cuba.
"Día del Árbol" (Day of the Tree) was first observed on October 10, 1904, and is today observed in October of each year.
Czech Republic.
Arbor Day is celebrated on October 20.
Egypt.
Arbor Day is on January 15.
Germany.
Arbor Day ("Tag des Baumes") is on April 25. Its first celebration was in 1952.
India.
Van Mahotsav is an annual pan-Indian tree planting festival, occupying a week in the month of July. During this event millions of trees are planted. It was initiated in 1950 by K. M. Munshi, the then Union Minister for Agriculture and Food, to create an enthusiasm in the mind of the populace for the conservation of forests and planting of trees.
The name Van Mahotsava (the festival of trees) originated in July 1947 after a successful tree-planting drive was undertaken in Delhi, in which national leaders like Jawaharlal Nehru, Dr Rajendra Prasad and Abul Kalam Azad participated. Paryawaran Sachetak Samiti, a leading environmental organization conducts mass events and activities on this special day celebration each year. The week was simultaneously celebrated in a number of states in the country.
Iran.
In Iran, it is known as "National Tree Planting Day". By the Solar Hijri calendar, it is on the fifteenth day of the month Esfand, which usually corresponds with March 5. This day is the first day of the "Natural Recyclable Resources Week" (March 5 to 12).
This is the time when saplings of all kinds, suited to the different climates of the various parts of Iran, are distributed among the people, who are also taught how to plant trees.
Israel.
The Jewish holiday Tu Bishvat, the new year for trees, is on the 15th day of the month of Shvat, which usually falls in January or February. Originally based on the date used to calculate the age of fruit trees for tithing as mandated in Leviticus 19:23–25, the holiday now is most often observed by planting trees or raising money to plant trees, and by eating fruit, specifically grapes, figs, pomegranates, olives, and dates. Tu Bishvat is a semi-official holiday in Israel; schools are open but Hebrew-speaking schools often go on tree-planting excursions.
Japan.
Japan celebrates a similarly themed Greenery Day, held on May 4.
Kenya.
National Tree Planting Day is on April 21. Often people plant palm trees and coconut trees along the Indian Ocean that borders the east coast of Kenya. They plant trees to remember Prof. Wangari Maathai, who won a Nobel Peace Prize for planting of trees and caring for them all over Kenya.
Korea.
North Korea marks "Tree Planting Day" on March 2, when people across the country plant trees. This day is considered to combine traditional Asian cultural values with the country's dominant Communist ideology.
In South Korea, April 5, Singmogil or Sikmogil (식목일), the Arbor Day, was a public holiday until 2005. Even though Singmogil is no longer an official holiday, the day is still celebrated, with the South Korean public continuing to take part in tree-planting activities.
Lesotho.
National Tree Planting Day is usually on March 21 depending on the lunar cycle.
Luxembourg.
National Tree Planting Day is on the second Saturday in November.
Malawi.
National Tree Planting Day is on the 2nd Monday of December.
Mexico.
The "Día del Árbol" was established in Mexico in 1959 with President Adolfo López Mateos issuing a decree that it should be observed on the 2nd Thursday of July.
Mongolia.
National Tree Planting Day is on the 2nd Saturday of May and October. The first National Tree Planting Day was celebrated May 8, 2010.
Namibia.
Namibia's first Arbor Day was celebrated on October 8, 2004. It takes place annually on the second Friday of October.
Netherlands.
Following a conference and the Food and Agriculture Organization's publication "World Festival of Trees", and a resolution of the United Nations in 1954: "The Conference, recognising the need of arousing mass consciousness of the aesthetic, physical and economic value of trees, recommends a World Festival of Trees to be celebrated annually in each member country on a date suited to local conditions", the observance was adopted by the Netherlands. In 1957, the National Committee Day of Planting Trees/Foundation of National Festival of Trees ("Nationale Boomplantdag"/"Nationale Boomfeestdag") was created.
On the third Wednesday in March each year (near the spring equinox), three quarters of Dutch schoolchildren aged 10/11 and Dutch celebrities plant trees. Stichting Nationale Boomfeestdag organizes all the activities in the Netherlands for this day. Some municipalities however plant the trees around 21 September because of the planting season.
In 2007, the 50th anniversary was celebrated with special golden jubilee activities.
New Zealand.
New Zealand's first Arbor Day planting was on 3 July 1890 at Greytown, in the Wairarapa. The first official celebration was scheduled to take place in Wellington in August 1892, with the planting of pohutukawa and Norfolk pines along Thorndon Esplanade.
Prominent New Zealand botanist Dr Leonard Cockayne worked extensively on native plants throughout New Zealand and wrote many notable botanical texts. As early as the 1920s he held a vision for school students of New Zealand to be involved in planting native trees and plants in their school grounds. This vision bore fruit and schools in New Zealand have long planted native trees on Arbor Day.
Since 1977, New Zealand has celebrated Arbor Day on 5 June, which is also World Environment Day. Prior to then, Arbor Day was celebrated on 4 August, which is rather late in the year for tree planting in New Zealand, hence the date change.
Many of the Department of Conservation's Arbor Day activities focus on ecological restoration projects using native plants to restore habitats that have been damaged or destroyed by humans or invasive pests and weeds. There are great restoration projects underway around New Zealand and many organisations including community groups, landowners, conservation organisations, iwi, volunteers, schools, local businesses, nurseries and councils are involved in them. These projects are part of a vision to protect and restore the indigenous biodiversity.
Niger.
Since 1975, Niger has celebrated Arbor Day as part of its Independence Day, 3 August. On this day, to aid the fight against desertification, each Nigerien plants a tree.
North Macedonia.
Mindful of the poor condition of the country's forests, and in particular the catastrophic wildfires which occurred in the summer of 2007, a citizens' initiative for afforestation was started in North Macedonia. The campaign, named 'Tree Day-Plant Your Future', was first organized on 12 March 2008, when an official non-working day was declared and more than 150,000 Macedonians planted 2 million trees in one day (symbolically, one for each citizen). Six million more were planted in November of the same year, and another 12.5 million trees in 2009. This has been established as a tradition and takes place every year.
Pakistan.
National tree plantation day of Pakistan (قومی شجر کاری دن) is celebrated on 18 August.
Philippines.
Since 1947, Arbor Day in the Philippines has been institutionalized to be observed throughout the nation by planting trees and ornamental plants and other forms of relevant activities. Its practice was instituted through Proclamation No. 30. It was subsequently revised by Proclamation No. 41, issued in the same year. In 1955, the commemoration was extended from a day to a week and moved to the last full week of July. Over two decades later, its commemoration was moved to the second week of June. In 2003, the commemorations were reduced from a week to a day and was moved to June 25 per Proclamation No. 396. The same proclamation directed "the active participation of all government agencies, including government-owned and controlled corporations, private sector, schools, civil society groups and the citizenry in tree planting activity". It was subsequently revised by Proclamation 643 in the succeeding year.
In 2012, Republic Act 10176 was passed, which revived tree planting events "as [a] yearly event for local government units" and mandated the planting of at least one tree per year for able-bodied Filipino citizens aged 12 years old and above. Since 2012, many local arbor day celebrations have been commemorated, as in the cases of Natividad and Tayug in Pangasinan and Santa Rita in Pampanga.
Poland.
In Poland, Arbor Day has been celebrated since 2002. Each October 10, many Polish people plant trees as well as participate in events organized by ecological foundations. Moreover, Polish Forest Inspectorates and schools give special lectures and lead ecological awareness campaigns.
Portugal.
Arbor Day is celebrated on March 21. It is not a national holiday but instead schools nationwide celebrate this day with environment-related activities, namely tree planting.
Russia.
All-Russian Day of Forest Plantation was celebrated for the first time on 14 May 2011. It is now held in April–May, depending on the weather in different regions.
Samoa.
Arbor Day in Samoa is celebrated on the first Friday in November.
Saudi Arabia.
Arbor Day in Saudi Arabia is celebrated on April 29.
South Africa.
Arbor Day was celebrated from 1945 until 2000 in South Africa. After that, the national government extended it to National Arbor Week, which lasts annually from 1–7 September. Two trees, one common and one rare, are highlighted to increase public awareness of indigenous trees, while various "greening" activities are undertaken by schools, businesses and other organizations. For example, the social enterprise Greenpop, which focusses on sustainable urban greening, forest restoration and environmental awareness in Sub-Saharan Africa, leverages Arbor Day each year to call for tree planting action. During Arbor Month 2019, responding to recent studies that underscore the importance of tree restoration, the organization launched its new goal of planting 500,000 trees by 2025.
Spain.
In 1896 Mariano Belmás Estrada promoted the first "Festival of Trees" in Madrid.
In Spain there was an International Forest Day on 21 March, but a decree in 1915 also brought in an Arbor Day throughout Spain. Each municipality or collective decides the date for its Arbor Day, usually between February and May. In Villanueva de la Sierra (Extremadura), where the first Arbor Day in the world was held in 1805, it is celebrated, as on that occasion, on Carnival Tuesday, and it is a great day in the local festive calendar.
As an example of commitment to nature, the small town of Pescueza, with only 180 inhabitants, organizes every spring a large planting of holm oaks, called the "Festivalino", promoted by the city council, several foundations, and citizen participation.
Sri Lanka.
National Tree Planting Day is on November 15.
Tanzania.
National Tree Planting Day is on April 1.
Uganda.
National Tree Planting Day is on March 24.
United Kingdom.
First mounted in 1975, National Tree Week is a celebration of the start of the winter tree planting season, usually at the end of November. Around a million trees are planted each year by schools, community organizations and local authorities.
On 6 February 2020, Myerscough College in Lancashire, England, supported by the Arbor Day Foundation, celebrated the UK's first Arbor Day.
United States.
Arbor Day was founded in 1872 by J. Sterling Morton in Nebraska City, Nebraska. By the 1920s, each state in the United States had passed public laws that stipulated a certain day to be Arbor Day or "Arbor and Bird Day" observance.
National Arbor Day is celebrated every year on the last Friday in April; it is a civic holiday in Nebraska. Other states have selected their own dates for Arbor Day.
The customary observance is to plant a tree. On the first Arbor Day, April 10, 1872, an estimated one million trees were planted.
Venezuela.
Venezuela recognizes "Día del Arbol" (Day of the Tree) on the last Sunday of May.
A. J. Ayer.
Sir Alfred Jules "Freddie" Ayer (29 October 1910 – 27 June 1989), usually cited as A. J. Ayer, was an English philosopher known for his promotion of logical positivism, particularly in his books "Language, Truth, and Logic" (1936) and "The Problem of Knowledge" (1956).
Ayer was educated at Eton College and the University of Oxford, after which he studied the philosophy of logical positivism at the University of Vienna. From 1933 to 1940 he lectured on philosophy at Christ Church, Oxford.
During the Second World War, Ayer served as an agent for the Special Operations Executive and MI6.
Ayer was Grote Professor of the Philosophy of Mind and Logic at University College London from 1946 until 1959, after which he returned to Oxford to become Wykeham Professor of Logic at New College. He was president of the Aristotelian Society from 1951 to 1952 and knighted in 1970. He was known for his advocacy of humanism, and was the second president of the British Humanist Association (now known as Humanists UK).
Ayer was president of the Homosexual Law Reform Society for a time; he remarked, "as a notorious heterosexual I could never be accused of feathering my own nest."
Life.
Ayer was born in St John's Wood, in north west London, to Jules Louis Cyprien Ayer and Reine (née Citroen), wealthy parents from continental Europe. His mother was from the Dutch-Jewish family that founded the Citroën car company in France; his father was a Swiss Calvinist financier who worked for the Rothschild family, including for their bank and as secretary to Alfred Rothschild.
Ayer was educated at Ascham St Vincent's School, a former boarding preparatory school for boys in the seaside town of Eastbourne in Sussex, where he started boarding at the relatively early age of seven for reasons to do with the First World War, and at Eton College, where he was a King's Scholar. At Eton Ayer first became known for his characteristic bravado and precocity. Though primarily interested in his intellectual pursuits, he was very keen on sports, particularly rugby, and reputedly played the Eton Wall Game very well. In the final examinations at Eton, Ayer came second in his year, and first in classics. In his final year, as a member of Eton's senior council, he unsuccessfully campaigned for the abolition of corporal punishment at the school. He won a classics scholarship to Christ Church, Oxford. He graduated with a BA with first-class honours.
After graduating from Oxford, Ayer spent a year in Vienna, returned to England and published his first book, "Language, Truth and Logic", in 1936. The first exposition in English of logical positivism as newly developed by the Vienna Circle, this made Ayer at age 26 the "enfant terrible" of British philosophy. As a newly famous intellectual, he played a prominent role in the Oxford by-election campaign of 1938. Ayer campaigned first for the Labour candidate Patrick Gordon Walker, and then for the joint Labour-Liberal "Independent Progressive" candidate Sandie Lindsay, who ran on an anti-appeasement platform against the Conservative candidate, Quintin Hogg, who ran as the appeasement candidate. The by-election, held on 27 October 1938, was quite close, with Hogg winning narrowly.
In the Second World War, Ayer served as an officer in the Welsh Guards, chiefly in intelligence (Special Operations Executive (SOE) and MI6). He was commissioned second lieutenant into the Welsh Guards from Officer Cadet Training Unit on 21 September 1940.
After the war, Ayer briefly returned to the University of Oxford where he became a fellow and Dean of Wadham College. He then taught philosophy at London University from 1946 until 1959, when he also started to appear on radio and television. He was an extrovert and social mixer who liked dancing and attending clubs in London and New York. He was also obsessed with sport: he had played rugby for Eton, and was a noted cricketer and a keen supporter of Tottenham Hotspur football team, where he was for many years a season ticket holder. For an academic, Ayer was an unusually well-connected figure in his time, with close links to 'high society' and the establishment. Presiding over Oxford high-tables, he is often described as charming, but could also be intimidating.
Ayer was married four times to three women. His first marriage was from 1932 to 1941, to (Grace Isabel) Renée, with whom he had a son—allegedly in fact the son of Ayer's friend and colleague Stuart Hampshire—and a daughter. Renée subsequently married Hampshire. In 1960, Ayer married Alberta Constance (Dee) Wells, with whom he had one son. That marriage was dissolved in 1983, and the same year, Ayer married Vanessa Salmon, the former wife of politician Nigel Lawson. She died in 1985, and in 1989 Ayer remarried Wells, who survived him. He also had a daughter with Hollywood columnist Sheilah Graham Westbrook.
In 1950, Ayer attended the founding meeting of the Congress for Cultural Freedom in West Berlin, though he later said he went only because of the offer of a "free trip". He gave a speech on why John Stuart Mill's conceptions of liberty and freedom were still valid in the 20th century. Together with the historian Hugh Trevor-Roper, Ayer fought against Arthur Koestler and Franz Borkenau, arguing that they were far too dogmatic and extreme in their anti-communism, in fact proposing illiberal measures in the defense of liberty. Adding to the tension was the location in West Berlin, together with the fact that the Korean War began on 25 June 1950, the fourth day of the congress, giving a feeling that the world was on the brink of war.
From 1959 to his retirement in 1978, Ayer held the Wykeham Chair, Professor of Logic at Oxford. He was knighted in 1970. After his retirement, Ayer taught or lectured several times in the United States, including as a visiting professor at Bard College in 1987. At a party that same year held by fashion designer Fernando Sanchez, Ayer confronted Mike Tyson, who was forcing himself upon the then little-known model Naomi Campbell. When Ayer demanded that Tyson stop, Tyson reportedly asked, "Do you know who the fuck I am? I'm the heavyweight champion of the world", to which Ayer replied, "And I am the former Wykeham Professor of Logic. We are both preeminent in our field. I suggest that we talk about this like rational men". Ayer and Tyson then began to talk, allowing Campbell to slip out. Ayer was also involved in politics, including anti-Vietnam War activism, supporting the Labour Party (and later the Social Democratic Party), chairing the Campaign Against Racial Discrimination in Sport, and serving as president of the Homosexual Law Reform Society.
In 1988, a year before his death, Ayer wrote an article titled "What I saw when I was dead", describing an unusual near-death experience. Of the experience, he first said that it "slightly weakened my conviction that my genuine death ... will be the end of me, though I continue to hope that it will be." A few weeks later, he revised this, saying, "what I should have said is that my experiences have weakened, not my belief that there is no life after death, but my inflexible attitude towards that belief".
Ayer died on 27 June 1989. From 1980 to 1989 he lived at 51 York Street, Marylebone, where a memorial plaque was unveiled on 19 November 1995.
Philosophical ideas.
In "Language, Truth and Logic" (1936), Ayer presents the verification principle as the only valid basis for philosophy. Unless logical or empirical verification is possible, statements like "God exists" or "charity is good" are not true or untrue but meaningless, and may thus be excluded or ignored. Religious language in particular is unverifiable and as such literally nonsense. He also criticises C. A. Mace's opinion that metaphysics is a form of intellectual poetry. The stance that a belief in God denotes no verifiable hypothesis is sometimes referred to as igtheism (for example, by Paul Kurtz). In later years, Ayer reiterated that he did not believe in God and began to call himself an atheist. He followed in the footsteps of Bertrand Russell by debating religion with the Jesuit scholar Frederick Copleston.
Ayer's version of emotivism divides "the ordinary system of ethics" into four classes.
He focuses on propositions of the first class—moral judgments—saying that those of the second class belong to science, those of the third are mere commands, and those of the fourth (which are considered normative ethics as opposed to meta-ethics) are too concrete for ethical philosophy.
Ayer argues that moral judgments cannot be translated into non-ethical, empirical terms and thus cannot be verified; in this he agrees with ethical intuitionists. But he differs from intuitionists by discarding appeals to intuition of non-empirical moral truths as "worthless" since the intuition of one person often contradicts that of another. Instead, Ayer concludes that ethical concepts are "mere pseudo-concepts".
Between 1945 and 1947, together with Russell and George Orwell, Ayer contributed a series of articles to "Polemic", a short-lived British "Magazine of Philosophy, Psychology, and Aesthetics" edited by the ex-Communist Humphrey Slater.
Ayer was closely associated with the British humanist movement. He was an Honorary Associate of the Rationalist Press Association from 1947 until his death. He was elected a Foreign Honorary Member of the American Academy of Arts and Sciences in 1963. In 1965, he became the first president of the Agnostics' Adoption Society and in the same year succeeded Julian Huxley as president of the British Humanist Association, a post he held until 1970. In 1968 he edited "The Humanist Outlook", a collection of essays on the meaning of humanism. He was one of the signers of the Humanist Manifesto.
Works.
Ayer is best known for popularising the verification principle, in particular through his presentation of it in "Language, Truth, and Logic". The principle was at the time at the heart of the debates of the so-called Vienna Circle, which Ayer had visited as a young guest. Others, including the circle's leading light, Moritz Schlick, were already writing papers on the issue. Ayer's formulation was that a sentence can be meaningful only if it has verifiable empirical import; otherwise, it is either "analytical" if tautologous or "metaphysical" (i.e. meaningless, or "literally senseless"). He started to work on the book at the age of 23 and it was published when he was 26. Ayer's philosophical ideas were deeply influenced by those of the Vienna Circle and David Hume. His clear, vibrant and polemical exposition of them makes "Language, Truth and Logic" essential reading on the tenets of logical empiricism; the book is regarded as a classic of 20th-century analytic philosophy and is widely read in philosophy courses around the world. In it, Ayer also proposes that the distinction between a conscious man and an unconscious machine resolves itself into a distinction between "different types of perceptible behaviour", an argument that anticipates the Turing test published in 1950 to test a machine's capability to demonstrate intelligence.
Ayer wrote two books on the philosopher Bertrand Russell, "Russell and Moore: The Analytic Heritage" (1971) and "Russell" (1972). He also wrote an introductory book on the philosophy of David Hume and a short biography of Voltaire.
Ayer was a strong critic of the German philosopher Martin Heidegger. As a logical positivist, Ayer was in conflict with Heidegger's vast, overarching theories of existence. Ayer considered them completely unverifiable through empirical demonstration and logical analysis, and this sort of philosophy an unfortunate strain in modern thought. He considered Heidegger the worst example of such philosophy, which Ayer believed entirely useless. In "Philosophy in the Twentieth Century", Ayer accuses Heidegger of "surprising ignorance" or "unscrupulous distortion" and "what can fairly be described as charlatanism."
In 1972–73, Ayer gave the Gifford Lectures at the University of St Andrews, later published as "The Central Questions of Philosophy". In the book's preface, he defends his selection to hold the lectureship on the basis that Lord Gifford wished to promote "natural theology, in the widest sense of that term", and that non-believers are allowed to give the lectures if they are "able reverent men, true thinkers, sincere lovers of and earnest inquirers after truth". He still believed in the viewpoint he shared with the logical positivists: that large parts of what was traditionally called philosophyincluding metaphysics, theology and aestheticswere not matters that could be judged true or false, and that it was thus meaningless to discuss them.
In "The Concept of a Person and Other Essays" (1963), Ayer heavily criticized Wittgenstein's private language argument.
Ayer's sense-data theory in "Foundations of Empirical Knowledge" was famously criticised by fellow Oxonian J. L. Austin in "Sense and Sensibilia", a landmark work of ordinary language philosophy based on lectures Austin delivered in the 1950s. Ayer responded in the essay "Has Austin Refuted the Sense-datum Theory?", which can be found in his "Metaphysics and Common Sense" (1969).
Awards.
Ayer was appointed Knight Bachelor, the knighthood being announced in the London Gazette on 1 January 1970.
|
2019 | André Weil | André Weil (6 May 1906 – 6 August 1998) was a French mathematician, known for his foundational work in number theory and algebraic geometry. He was one of the most influential mathematicians of the twentieth century. His influence is due
both to his original contributions to a remarkably broad
spectrum of mathematical theories, and to the mark
he left on mathematical practice and style, through
some of his own works as well as through the Bourbaki group, of which he was one of the principal
founders.
Life.
André Weil was born in Paris to agnostic Alsatian Jewish parents who fled the annexation of Alsace-Lorraine by the German Empire after the Franco-Prussian War in 1870–71. Simone Weil, who would later become a famous philosopher, was Weil's younger sister and only sibling. He studied in Paris, Rome and Göttingen and received his doctorate in 1928. While in Germany, Weil befriended Carl Ludwig Siegel. Starting in 1930, he spent two academic years at Aligarh Muslim University in India. Aside from mathematics, Weil held lifelong interests in classical Greek and Latin literature, in Hinduism and Sanskrit literature: he had taught himself Sanskrit in 1920. After teaching for one year at Aix-Marseille University, he taught for six years at University of Strasbourg. He married Éveline de Possel (née Éveline Gillet) in 1937.
Weil was in Finland when World War II broke out; he had been traveling in Scandinavia since April 1939. His wife Éveline returned to France without him. Weil was arrested in Finland at the outbreak of the Winter War on suspicion of spying; however, accounts of his life having been in danger were shown to be exaggerated. Weil returned to France via Sweden and the United Kingdom, and was detained at Le Havre in January 1940. He was charged with failure to report for duty, and was imprisoned in Le Havre and then Rouen. It was in the military prison in Bonne-Nouvelle, a district of Rouen, from February to May, that Weil completed the work that made his reputation. He was tried on 3 May 1940. Sentenced to five years, he requested to be attached to a military unit instead, and was given the chance to join a regiment in Cherbourg. After the fall of France in June 1940, he met up with his family in Marseille, where he arrived by sea. He then went to Clermont-Ferrand, where he managed to join his wife Éveline, who had been living in German-occupied France.
In January 1941, Weil and his family sailed from Marseille to New York. He spent the remainder of the war in the United States, where he was supported by the Rockefeller Foundation and the Guggenheim Foundation. For two years, he taught undergraduate mathematics at Lehigh University, where he was unappreciated, overworked and poorly paid, although he did not have to worry about being drafted, unlike his American students. He quit the job at Lehigh and moved to Brazil, where he taught at the Universidade de São Paulo from 1945 to 1947, working with Oscar Zariski. Weil and his wife had two daughters, Sylvie (born in 1942) and Nicolette (born in 1946).
He then returned to the United States and taught at the University of Chicago from 1947 to 1958, before moving to the Institute for Advanced Study, where he would spend the remainder of his career. He was a Plenary Speaker at the ICM in 1950 in Cambridge, Massachusetts, in 1954 in Amsterdam, and in 1978 in Helsinki. Weil was elected Foreign Member of the Royal Society in 1966. In 1979, he shared the second Wolf Prize in Mathematics with Jean Leray.
Work.
Weil made substantial contributions in a number of areas, the most important being his discovery of profound connections between algebraic geometry and number theory. This began in his doctoral work leading to the Mordell–Weil theorem (1928; shortly afterwards applied in Siegel's theorem on integral points). Mordell's theorem had an "ad hoc" proof; Weil began the separation of the infinite descent argument into two types of structural approach, by means of height functions for sizing rational points, and by means of Galois cohomology, which would not be categorized as such for another two decades. Both aspects of Weil's work have steadily developed into substantial theories.
Among his major accomplishments were the 1940s proof of the Riemann hypothesis for zeta-functions of curves over finite fields, and his subsequent laying of proper foundations for algebraic geometry to support that result (from 1942 to 1946, most intensively). The so-called Weil conjectures were hugely influential from around 1950; these statements were later proved by Bernard Dwork, Alexander Grothendieck, Michael Artin, and finally by Pierre Deligne, who completed the most difficult step in 1973.
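The genus-1 case of this "Riemann hypothesis for curves over finite fields" is Hasse's earlier bound, which is easy to check numerically: the number N of points on an elliptic curve y² = x³ + ax + b over F_p (including the point at infinity) satisfies |N − (p + 1)| ≤ 2√p. A minimal sketch by naive enumeration, with the prime and curve coefficients chosen arbitrarily for illustration:

```python
import math

def count_points(a, b, p):
    """Naively count points on y^2 = x^3 + a*x + b over F_p,
    including the point at infinity (fine for small p)."""
    # For each residue r, record how many y in F_p satisfy y^2 = r.
    sq = {}
    for y in range(p):
        r = y * y % p
        sq[r] = sq.get(r, 0) + 1
    n = 1  # the point at infinity
    for x in range(p):
        rhs = (x * x * x + a * x + b) % p
        n += sq.get(rhs, 0)
    return n

# Hasse's bound |N - (p + 1)| <= 2*sqrt(p): the genus-1 case of the
# statement Weil proved for curves of arbitrary genus.
p, a, b = 101, 2, 3  # an arbitrary nonsingular example curve
N = count_points(a, b, p)
assert abs(N - (p + 1)) <= 2 * math.sqrt(p)
print(N, p + 1)
```

The deviation N − (p + 1) is the "error term" whose smallness is exactly what the Riemann hypothesis for the curve's zeta function asserts.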
Weil introduced the adele ring in the late 1930s, following Claude Chevalley's lead with the ideles, and gave a proof of the Riemann–Roch theorem with them (a version appeared in his "Basic Number Theory" in 1967). His 'matrix divisor' (vector bundle "avant la lettre") Riemann–Roch theorem from 1938 was a very early anticipation of later ideas such as moduli spaces of bundles. The Weil conjecture on Tamagawa numbers proved resistant for many years. Eventually the adelic approach became basic in automorphic representation theory. Around 1967 another conjecture came to be credited to him, which, under pressure from Serge Lang and from Serre, later became known as the Taniyama–Shimura conjecture (or the Taniyama–Weil conjecture); it grew out of a roughly formulated question posed by Taniyama at the 1955 Nikkō conference. His attitude towards conjectures was that one should not dignify a guess as a conjecture lightly, and in the Taniyama case, the evidence was only there after extensive computational work carried out from the late 1960s.
Other significant results were on Pontryagin duality and differential geometry. He introduced the concept of a uniform space in general topology, as a by-product of his collaboration with Nicolas Bourbaki (of which he was a Founding Father). His work on sheaf theory hardly appears in his published papers, but his correspondence with Henri Cartan in the late 1940s, reprinted in his collected papers, proved most influential. He also chose the symbol ∅, derived from the letter Ø in the Norwegian alphabet (which he alone among the Bourbaki group was familiar with), to represent the empty set.
Weil also made a well-known contribution in Riemannian geometry in his very first paper in 1926, when he showed that the classical isoperimetric inequality holds on non-positively curved surfaces. This established the 2-dimensional case of what later became known as the Cartan–Hadamard conjecture.
He discovered that the so-called Weil representation, previously introduced in quantum mechanics by Irving Segal and David Shale, gave a contemporary framework for understanding the classical theory of quadratic forms. This was also a beginning of a substantial development by others, connecting representation theory and theta functions.
Weil was a member of both the National Academy of Sciences and the American Philosophical Society.
As expositor.
Weil's ideas made an important contribution to the writings and seminars of Bourbaki, before and after World War II. He also wrote several books on the history of number theory.
Beliefs.
Indian (Hindu) thought had a great influence on Weil. He was an agnostic, but he respected religions.
Legacy.
Asteroid 289085 Andreweil, discovered by astronomers at the Saint-Sulpice Observatory in 2004, was named in his memory. The official naming citation was published by the Minor Planet Center on 14 February 2014.
Books.
Mathematical works:
Collected papers:
Autobiography:
Memoir by his daughter:
|
2020 | Achaeans (Homer) | The Achaeans or Akhaians (Ἀχαιοί, "the Achaeans" or "of Achaea") is one of the names in Homer which is used to refer to the Greeks collectively.
The term "Achaean" is believed to be related to the Hittite term Ahhiyawa and the Egyptian term Ekwesh which appear in texts from the Late Bronze Age and are believed to refer to the Mycenaean civilization or some part of it.
In the historical period, the term fell into disuse as a general term for Greek people, and was generally reserved for inhabitants of the region of Achaea, a region in the north-central part of the Peloponnese. The city-states of this region later formed a confederation known as the Achaean League, which was influential during the 3rd and 2nd centuries BC.
Etymology.
According to Margalit Finkelberg the name Ἀχαιοί (earlier Ἀχαιϝοί) is possibly derived, via an intermediate form *Ἀχαϝyοί, from a hypothetical older Greek form reflected in the Hittite form "Aḫḫiyawā"; the latter is attested in the Hittite archives, e.g. in the Tawagalawa letter. However, Robert S. P. Beekes doubted its validity and suggested a Pre-Greek "*Akaywa-".
Homeric versus later use.
In Homer, the term Achaeans is one of the primary terms used to refer to the Greeks as a whole. It is used 598 times in the "Iliad", often accompanied by the epithet "long-haired". Other common names used in Homer are Danaans ("Danaoi"; used 138 times in the "Iliad") and Argives ("Argeioi"; used 182 times in the "Iliad"), while Panhellenes ("all of the Greeks") and Hellenes each appear only once. All of the aforementioned terms were used synonymously to denote a common Greek identity. In some English translations of the "Iliad", the Achaeans are simply called the Greeks throughout.
Later, by the Archaic and Classical periods, the term "Achaeans" referred to inhabitants of the much smaller region of Achaea. Herodotus identified the Achaeans of the northern Peloponnese as descendants of the earlier, Homeric Achaeans. According to Pausanias, writing in the 2nd century AD, the term "Achaean" was originally given to those Greeks inhabiting the Argolis and Laconia.
Pausanias and Herodotus both recount the legend that the Achaeans were forced from their homelands by the Dorians, during the legendary Dorian invasion of the Peloponnese. They then moved into the region later called Achaea.
A scholarly consensus has not yet been reached on the origin of the historic Achaeans relative to the Homeric Achaeans; the question is still hotly debated. Former emphasis on presumed race, such as John A. Scott's article about the blond locks of the Achaeans as compared to the dark locks of "Mediterranean" Poseidon, on the basis of hints in Homer, has been rejected by some. The contrasting belief that "Achaeans", as understood through Homer, is "a name without a country", an "ethnos" created in the Epic tradition, has modern supporters among those who conclude that "Achaeans" were redefined in the 5th century BC, as contemporary speakers of Aeolic Greek.
Karl Beloch suggested there was no Dorian invasion, but rather that the Peloponnesian Dorians were the Achaeans. Eduard Meyer, disagreeing with Beloch, instead put forth the suggestion that the real-life Achaeans were mainland pre-Dorian Greeks. His conclusion is based on his research on the similarity between the languages of the Achaeans and pre-historic Arcadians. William Prentice disagreed with both, noting archeological evidence suggests the Achaeans instead migrated from "southern Asia Minor to Greece, probably settling first in lower Thessaly" probably prior to 2000 BC.
Hittite documents.
Some Hittite texts mention a nation to the west called Ahhiyawa. In the earliest reference to this land, a letter outlining the treaty violations of the Hittite vassal Madduwatta, it is called "Ahhiya". Another important example is the "Tawagalawa Letter" written by an unnamed Hittite king (most probably Hattusili III) of the empire period (14th–13th century BC) to the king of "Ahhiyawa", treating him as an equal and implying Miletus ("Millawanda") was under his control. It also refers to an earlier ""Wilusa" episode" involving hostility on the part of "Ahhiyawa". Ahhiya(wa) has been identified with the Achaeans of the Trojan War and the city of Wilusa with the legendary city of Troy (note the similarity with early Greek "Wilion", later "Ilion", the name of the acropolis of Troy).
Emil Forrer, a Swiss Hittitologist who worked on the Boghazköy tablets in Berlin, said the Achaeans of pre-Homeric Greece were directly associated with the term "Land of Ahhiyawa" mentioned in the Hittite texts. His conclusions at the time were challenged by other Hittitologists (i.e. Johannes Friedrich in 1927 and Albrecht Götze in 1930), as well as by Ferdinand Sommer, who published his ("The Ahhiyawa Documents") in 1932. The exact relationship of the term "Ahhiyawa" to the Achaeans beyond a similarity in pronunciation was hotly debated by scholars, even following the discovery that Mycenaean Linear B is an early form of Greek; the earlier debate was summed up in 1984 by Hans G. Güterbock of the Oriental Institute. More recent research based on new readings and interpretations of the Hittite texts, as well as of the material evidence for Mycenaean contacts with the Anatolian mainland, came to the conclusion that "Ahhiyawa" referred to the Mycenaean world, or at least to a part of it.
Egyptian sources.
It has been proposed that "Ekwesh" of the Egyptian records may relate to "Achaea" (compared to Hittite "Ahhiyawa"), whereas "Denyen" and "Tanaju" may relate to Classical Greek "Danaoi". The earliest textual reference to the Mycenaean world is in the Annals of Thutmosis III (ca. 1479–1425 BC), which refers to messengers from the king of the Tanaju, circa 1437 BC, offering greeting gifts to the Egyptian king, in order to initiate diplomatic relations, when the latter campaigned in Syria. Tanaju is also listed in an inscription at the Mortuary Temple of Amenhotep III. The latter ruled Egypt in circa 1382–1344 BC. Moreover, a list of the cities and regions of the Tanaju is also mentioned in this inscription; among the cities listed are Mycenae, Nauplion, Kythera, Messenia and the Thebaid (region of Thebes).
During the 5th year of Pharaoh Merneptah, a confederation of Libyan and northern peoples is supposed to have attacked the western delta. Included amongst the ethnic names of the repulsed invaders is the Ekwesh or Eqwesh, whom some have seen as Achaeans, although Egyptian texts specifically describe these Ekwesh as circumcised. Homer mentions an Achaean attack upon the delta, and Menelaus speaks of the same in Book IV of the "Odyssey" to Telemachus when he recounts his own return home from the Trojan War. Some ancient Greek authors also say that Helen had spent the time of the Trojan War in Egypt, and not at Troy, and that after Troy the Greeks went there to recover her.
Greek mythology.
In Greek mythology, the perceived cultural divisions among the Hellenes were represented as legendary lines of descent that identified kinship groups, with each line being derived from an eponymous ancestor. Each of the Greek "ethne" were said to be named in honor of their respective ancestors: Achaeus of the Achaeans, Danaus of the Danaans, Cadmus of the Cadmeans (the Thebans), Hellen of the Hellenes (not to be confused with Helen of Troy), Aeolus of the Aeolians, Ion of the Ionians, and Dorus of the Dorians.
Cadmus from Phoenicia, Danaus from Egypt, and Pelops from Anatolia each gained a foothold in mainland Greece and were assimilated and Hellenized. Hellen, Graikos, Magnes, and Macedon were sons of Deucalion and Pyrrha, the only people who survived the Great Flood; the "ethne" were said to have originally been named "Graikoi" after the elder son but later renamed "Hellenes" after Hellen who was proved to be the strongest. Sons of Hellen and the nymph Orseis were Dorus, Xuthos, and Aeolus. Sons of Xuthos and Kreousa, daughter of Erechtheus, were Ion and Achaeus.
According to Hyginus, 22 Achaeans killed 362 Trojans during their ten years at Troy.
|
2021 | Atle Selberg | Atle Selberg (14 June 1917 – 6 August 2007) was a Norwegian mathematician known for his work in analytic number theory and the theory of automorphic forms, and in particular for bringing them into relation with spectral theory. He was awarded the Fields Medal in 1950 and an honorary Abel Prize in 2002.
Early years.
Selberg was born in Langesund, Norway, the son of teacher Anna Kristina Selberg and mathematician Ole Michael Ludvigsen Selberg. Two of his three brothers, Sigmund and Henrik, were also mathematicians. His other brother, Arne, was a professor of engineering.
While he was still at school he was influenced by the work of Srinivasa Ramanujan and he found an exact analytical formula for the partition function as suggested by the works of Ramanujan; however, this result was first published by Hans Rademacher.
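As context for the partition function p(n): the exact analytical formula Selberg found (and Rademacher published) is a convergent infinite series, but the same values p(n) can be generated much more simply with Euler's pentagonal-number recurrence. A minimal illustrative sketch, not Selberg's or Rademacher's formula itself:

```python
def partitions(n_max):
    """Partition numbers p(0..n_max) via Euler's pentagonal-number
    recurrence: p(n) = sum_{k>=1} (-1)^(k+1) *
    (p(n - k(3k-1)/2) + p(n - k(3k+1)/2))."""
    p = [1] + [0] * n_max
    for n in range(1, n_max + 1):
        total, k, sign = 0, 1, 1
        # Iterate while the smaller generalized pentagonal number fits.
        while k * (3 * k - 1) // 2 <= n:
            for g in (k * (3 * k - 1) // 2, k * (3 * k + 1) // 2):
                if g <= n:
                    total += sign * p[n - g]
            sign, k = -sign, k + 1
        p[n] = total
    return p

print(partitions(10))  # [1, 1, 2, 3, 5, 7, 11, 15, 22, 30, 42]
```

The recurrence is quadratic-time; the point of the Hardy–Ramanujan–Rademacher-type series is that it gives p(n) directly, without computing all smaller values.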
He studied at the University of Oslo and completed his PhD in 1943.
World War II.
During World War II, Selberg worked in isolation due to the German occupation of Norway. After the war, his accomplishments became known, including a proof that a positive proportion of the zeros of the Riemann zeta function lie on the critical line $\operatorname{Re}(s) = \tfrac{1}{2}$.
During the war, he fought against the German invasion of Norway, and was imprisoned several times.
Post-war in Norway.
After the war, he turned to sieve theory, a previously neglected topic which Selberg's work brought into prominence. In a 1947 paper he introduced the Selberg sieve, a method well adapted in particular to providing auxiliary upper bounds, and which contributed to Chen's theorem, among other important results.
In 1948 Selberg submitted two papers to the "Annals of Mathematics" in which he proved by elementary means the theorems for primes in arithmetic progression and the density of primes. This challenged the widely held view of his time that certain theorems are only obtainable with the advanced methods of complex analysis. Both results were based on his work on the asymptotic formula
$$\vartheta(x)\log x + \sum_{p \le x} \log p \,\vartheta\!\left(\frac{x}{p}\right) = 2x\log x + O(x),$$
where
$$\vartheta(x) = \sum_{p \le x} \log p$$
for primes $p \le x$. He established this result by elementary means in March 1948, and by July of that year, Selberg and Paul Erdős each obtained elementary proofs of the prime number theorem, both using the asymptotic formula above as a starting point. Circumstances leading up to the proofs, as well as publication disagreements, led to a bitter dispute between the two mathematicians.
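Selberg's symmetry formula, ϑ(x) log x + Σ_{p≤x} log p · ϑ(x/p) = 2x log x + O(x) with ϑ(x) = Σ_{p≤x} log p, lends itself to a direct numerical check. A minimal sketch; the cutoff x = 10,000 is arbitrary:

```python
import math

def primes_up_to(n):
    """Sieve of Eratosthenes returning all primes <= n."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for i in range(2, int(n**0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(range(i * i, n + 1, i)))
    return [i for i, is_p in enumerate(sieve) if is_p]

def theta(x, primes):
    """Chebyshev's theta function: sum of log p over primes p <= x."""
    return sum(math.log(p) for p in primes if p <= x)

x = 10_000
primes = primes_up_to(x)
# Left-hand side of the symmetry formula.
lhs = theta(x, primes) * math.log(x) + sum(
    math.log(p) * theta(x / p, primes) for p in primes
)
main = 2 * x * math.log(x)
print(lhs / main)  # ratio tends to 1 as x grows; the error term is O(x)
```

Since the error term is of order x against a main term of order x log x, the ratio approaches 1 only logarithmically slowly, which is consistent with how delicate the elementary proof of the prime number theorem built on this formula is.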
For his fundamental accomplishments during the 1940s, Selberg received the 1950 Fields Medal.
Institute for Advanced Study.
Selberg moved to the United States and worked as an associate professor at Syracuse University and later settled at the Institute for Advanced Study in Princeton, New Jersey in the 1950s, where he remained until his death. During the 1950s he worked on introducing spectral theory into number theory, culminating in his development of the Selberg trace formula, the most famous and influential of his results. In its simplest form, this establishes a duality between the lengths of closed geodesics on a compact Riemann surface and the eigenvalues of the Laplacian, which is analogous to the duality between the prime numbers and the zeros of the zeta function.
He was awarded the 1986 Wolf Prize in Mathematics. He was also awarded an honorary Abel Prize in 2002, its founding year, before the awarding of the regular prizes began.
Selberg received many distinctions for his work, in addition to the Fields Medal, the Wolf Prize and the Gunnerus Medal. He was elected to the Norwegian Academy of Science and Letters, the Royal Danish Academy of Sciences and Letters and the American Academy of Arts and Sciences.
In 1972, he was awarded an honorary degree, doctor philos. honoris causa, at the Norwegian Institute of Technology, later part of Norwegian University of Science and Technology.
His first wife, Hedvig, died in 1995. With her, Selberg had two children: Ingrid Selberg (married to playwright Mustapha Matura) and Lars Selberg. In 2003 Atle Selberg married Betty Frances ("Mickey") Compton (born in 1929).
He died at home in Princeton, New Jersey on 6 August 2007 of heart failure.
|
2023 | Aeschylus | Aeschylus (c. 525/524 – c. 456/455 BC) was an ancient Greek tragedian, and is often described as the father of tragedy. Academic knowledge of the genre begins with his work, and understanding of earlier Greek tragedy is largely based on inferences made from reading his surviving plays. According to Aristotle, he expanded the number of characters in the theatre and allowed conflict among them. Formerly, characters interacted only with the chorus.
Only seven of his estimated seventy to ninety plays have survived. There is a long-standing debate regarding the authorship of one of them, "Prometheus Bound", with some scholars arguing that it may be the work of his son Euphorion. Fragments from other plays have survived in quotations, and more continue to be discovered on Egyptian papyri. These fragments often give further insights into Aeschylus' work. He was likely the first dramatist to present plays as a trilogy. His "Oresteia" is the only extant ancient example. At least one of his plays was influenced by the Persians' second invasion of Greece (480–479 BC). This work, "The Persians", is one of very few classical Greek tragedies concerned with contemporary events, and the only one extant. The significance of the war with Persia was so great to Aeschylus and the Greeks that his epitaph commemorates his participation in the Greek victory at Marathon while making no mention of his success as a playwright.
Life.
Aeschylus was born in c. 525/524 BC in Eleusis, a small town about 27 km northwest of Athens, in the fertile valleys of western Attica. Some scholars argue that his date of birth may be based on counting back forty years from his first victory in the Great Dionysia. His family was wealthy and well established. His father, Euphorion, was said to be a member of the Eupatridae, the ancient nobility of Attica, but this might be a fiction invented by the ancients to account for the grandeur of Aeschylus' plays.
As a youth, Aeschylus worked at a vineyard until, according to the 2nd-century AD geographer Pausanias, the god Dionysus visited him in his sleep and commanded him to turn his attention to the nascent art of tragedy. As soon as he woke, he began to write a tragedy, and his first performance took place in 499 BC, when he was 26 years old. He won his first victory at the City Dionysia in 484 BC.
In 510 BC, when Aeschylus was 15 years old, Cleomenes I expelled the sons of Peisistratus from Athens, and Cleisthenes came to power. Cleisthenes' reforms included a system of registration that emphasized the importance of the deme over family tradition. In the last decade of the 6th century, Aeschylus and his family were living in the deme of Eleusis.
The Persian Wars played a large role in Aeschylus' life and career. In 490 BC, he and his brother Cynegeirus fought to defend Athens against the invading army of Darius I of Persia at the Battle of Marathon. The Athenians emerged triumphant, and the victory was celebrated across the city-states of Greece. Cynegeirus was killed while trying to prevent a Persian ship retreating from the shore, for which his countrymen extolled him as a hero.
In 480 BC, Aeschylus was called into military service again, together with his younger brother Ameinias, against Xerxes I's invading forces at the Battle of Salamis. Aeschylus also fought at the Battle of Plataea in 479 BC. Ion of Chios was a witness for Aeschylus' war record and his contribution in Salamis. Salamis holds a prominent place in "The Persians", his oldest surviving play, which was performed in 472 BC and won first prize at the Dionysia.
Aeschylus was one of many Greeks who were initiated into the Eleusinian Mysteries, an ancient cult of Demeter based in his home town of Eleusis. According to Aristotle, Aeschylus was accused of asebeia (impiety) for revealing some of the cult's secrets on stage.
Other sources claim that an angry mob tried to kill Aeschylus on the spot but he fled the scene. Heracleides of Pontus asserts that the audience tried to stone Aeschylus. Aeschylus took refuge at the altar in the orchestra of the Theater of Dionysus. He pleaded ignorance at his trial. He was acquitted, with the jury sympathetic to the military service of him and his brothers during the Persian Wars. According to the 2nd-century AD author Aelian, Aeschylus' younger brother Ameinias helped to acquit Aeschylus by showing the jury the stump of the hand he had lost at Salamis, where he was voted bravest warrior. The truth is that the award for bravery at Salamis went not to Aeschylus' brother but to Ameinias of Pallene.
Aeschylus travelled to Sicily once or twice in the 470s BC, having been invited by Hiero I, tyrant of Syracuse, a major Greek city on the eastern side of the island. He produced "The Women of Aetna" during one of these trips (in honor of the city founded by Hieron), and restaged his "Persians". By 473 BC, after the death of Phrynichus, one of his chief rivals, Aeschylus was the yearly favorite in the Dionysia, winning first prize in nearly every competition. In 472 BC, Aeschylus staged the production that included the "Persians", with Pericles serving as "choregos".
Personal life.
Aeschylus married and had two sons, Euphorion and Euaeon, both of whom became tragic poets. Euphorion won first prize in 431 BC in competition against both Sophocles and Euripides. A nephew of Aeschylus, Philocles (his sister's son), was also a tragic poet, and won first prize in the competition against Sophocles' "Oedipus Rex". Aeschylus had at least two brothers, Cynegeirus and Ameinias.
Death.
In 458 BC, Aeschylus returned to Sicily for the last time, visiting the city of Gela, where he died in 456 or 455 BC. Valerius Maximus wrote that he was killed outside the city by a tortoise dropped by an eagle which had mistaken his head for a rock suitable for shattering the shell. Pliny, in his "Naturalis Historiæ", adds that Aeschylus had been staying outdoors to avoid a prophecy that he would be killed by a falling object, but this story may be legendary and due to a misunderstanding of the iconography on Aeschylus's tomb. Aeschylus' work was so respected by the Athenians that after his death his tragedies were the only ones allowed to be restaged in subsequent competitions. His sons Euphorion and Euæon and his nephew Philocles also became playwrights.
The inscription on Aeschylus' gravestone makes no mention of his theatrical renown, commemorating only his military achievements:
According to Castoriadis, the inscription on his grave signifies the primary importance of "belonging to the City" (polis), of the solidarity that existed within the collective body of citizen-soldiers.
Works.
The seeds of Greek drama were sown in religious festivals for the gods, chiefly Dionysus, the god of wine. During Aeschylus' lifetime, dramatic competitions became part of the City Dionysia, held in spring. The festival opened with a procession which was followed by a competition of boys singing dithyrambs, and all culminated in a pair of dramatic competitions. The first competition Aeschylus would have participated in involved three playwrights each presenting three tragedies and one satyr play. This format is called a continuous tragic tetralogy. It allowed Aeschylus to explore the human and theological and cosmic dimensions of a mythic sequence, developing it in successive phases. A second competition involving five comedic playwrights followed, and the winners of both competitions were chosen by a panel of judges.
Aeschylus entered many of these competitions, and various ancient sources attribute between seventy and ninety plays to him. Only seven tragedies attributed to him have survived intact: "The Persians", "Seven Against Thebes", "The Suppliants", the trilogy known as "The Oresteia" (the three tragedies "Agamemnon", "The Libation Bearers" and "The Eumenides"), and "Prometheus Bound" (whose authorship is disputed). With the exception of this last play – the success of which is uncertain – all of Aeschylus's extant tragedies are known to have won first prize at the City Dionysia.
The Alexandrian "Life of Aeschylus" claims that he won the first prize at the City Dionysia thirteen times. This compares favorably with Sophocles' reported eighteen victories (with a substantially larger catalogue, an estimated 120 plays), and dwarfs the five victories of Euripides, who is thought to have written roughly 90 plays.
Trilogies.
One hallmark of Aeschylean dramaturgy appears to have been his tendency to write connected trilogies in which each play serves as a chapter in a continuous dramatic narrative. The "Oresteia" is the only extant example of this type of connected trilogy, but there is evidence that Aeschylus often wrote such trilogies. The satyr plays that followed his tragic trilogies also drew from myth.
The satyr play Proteus, which followed the "Oresteia", treated the story of Menelaus' detour in Egypt on his way home from the Trojan War. It is assumed, based on the evidence provided by a catalogue of Aeschylean play titles, scholia, and play fragments recorded by later authors, that three other extant plays of his were components of connected trilogies: "Seven Against Thebes" was the final play in an Oedipus trilogy, and "The Suppliants" and "Prometheus Bound" were each the first play in a Danaid trilogy and Prometheus trilogy, respectively. Scholars have also suggested several completely lost trilogies, based on known play titles. A number of these treated myths about the Trojan War. One, collectively called the "Achilleis", comprised "Myrmidons", "Nereids" and "Phrygians" (alternately, "The Ransoming of Hector").
Another trilogy apparently recounted the entrance of the Trojan ally Memnon into the war, and his death at the hands of Achilles ("Memnon" and "The Weighing of Souls" being two components of the trilogy). "The Award of the Arms", "The Phrygian Women", and "The Salaminian Women" suggest a trilogy about the madness and subsequent suicide of the Greek hero Ajax. Aeschylus seems to have written about Odysseus' return to Ithaca after the war (including his killing of his wife Penelope's suitors and its consequences) in a trilogy consisting of "The Soul-raisers", "Penelope", and "The Bone-gatherers". Other suggested trilogies touched on the myth of Jason and the Argonauts ("Argô", "Lemnian Women", "Hypsipylê"), the life of Perseus ("The Net-draggers", "Polydektês", "Phorkides"), the birth and exploits of Dionysus ("Semele", "Bacchae", "Pentheus"), and the aftermath of the war portrayed in "Seven Against Thebes" ("Eleusinians", "Argives" (or "Argive Women"), "Sons of the Seven").
Surviving plays.
"The Persians" (472 BC).
"The Persians" ("Persai"), performed in 472 BC, is the earliest of Aeschylus' extant plays. It was based on Aeschylus' own experiences, specifically the Battle of Salamis. It is unique among surviving Greek tragedies in that it describes a recent historical event. "The Persians" focuses on the popular Greek theme of hubris and blames Persia's loss on the pride of its king.
It opens with the arrival of a messenger in Susa, the Persian capital, bearing news of the catastrophic Persian defeat at Salamis to Atossa, the mother of the Persian King Xerxes. Atossa then travels to the tomb of Darius, her husband, where his ghost appears to explain the cause of the defeat. It is, he says, the result of Xerxes' hubris in building a bridge across the Hellespont, an action which angered the gods. Xerxes appears at the end of the play, not realizing the cause of his defeat, and the play closes with lamentations by Xerxes and the chorus.
"Seven Against Thebes" (467 BC).
"Seven against Thebes" ("Hepta epi Thebas") was performed in 467 BC. It has the contrasting theme of the interference of the gods in human affairs. Another theme, with which Aeschylus would continually involve himself, makes its first known appearance in this play, namely that the polis was a key development of human civilization.
The play tells the story of Eteocles and Polynices, the sons of the shamed king of Thebes, Oedipus. Eteocles and Polynices agree to share and alternate the throne of the city. After the first year, Eteocles refuses to step down. Polynices therefore undertakes war. The pair kill each other in single combat, and the original ending of the play consisted of lamentations for the dead brothers. But a new ending was added to the play some fifty years later: Antigone and Ismene mourn their dead brothers, a messenger enters announcing an edict prohibiting the burial of Polynices, and Antigone declares her intention to defy this edict. The play was the third in a connected Oedipus trilogy. The first two plays were "Laius" and "Oedipus". The concluding satyr play was "The Sphinx".
"The Suppliants" (463 BC).
Aeschylus continued his emphasis on the polis with "The Suppliants" ("Hiketides") in 463 BC. The play gives tribute to the democratic undercurrents which were running through Athens and preceding the establishment of a democratic government in 461. The Danaids (50 daughters of Danaus, founder of Argos) flee a forced marriage to their cousins in Egypt. They turn to King Pelasgus of Argos for protection, but Pelasgus refuses until the people of Argos weigh in on the decision (a distinctly democratic move on the part of the king). The people decide that the Danaids deserve protection and are allowed within the walls of Argos despite Egyptian protests.
A Danaid trilogy had long been assumed because of "The Suppliants"' cliffhanger ending. This was confirmed by the 1952 publication of Oxyrhynchus Papyrus 2256 fr. 3. The constituent plays are generally agreed to be "The Suppliants", "The Egyptians", and "The Danaids". A plausible reconstruction of the trilogy's last two-thirds runs thus: In "The Egyptians", the Argive-Egyptian war threatened in the first play has transpired. King Pelasgus was killed during the war, and Danaus rules Argos. Danaus negotiates a settlement with Aegyptus, a condition of which requires his 50 daughters to marry the 50 sons of Aegyptus. Danaus secretly informs his daughters of an oracle which predicts that one of his sons-in-law would kill him. He therefore orders the Danaids to murder their husbands on their wedding night. His daughters agree. "The Danaids" would open the day after the wedding.
It is revealed that 49 of the 50 Danaids killed their husbands. Hypermnestra did not kill her husband, Lynceus, and helped him escape. Danaus is angered by his daughter's disobedience and orders her imprisonment and possibly execution. In the trilogy's climax and dénouement, Lynceus reveals himself to Danaus and kills him, thus fulfilling the oracle. He and Hypermnestra will establish a ruling dynasty in Argos. The other 49 Danaids are absolved of their murders, and married off to unspecified Argive men. The satyr play following this trilogy was titled "Amymone", after one of the Danaids.
"The Oresteia" (458 BC).
Besides a few missing lines, the "Oresteia" of 458 BC is the only complete trilogy of Greek plays by any playwright still extant (of "Proteus", the satyr play which followed, only fragments are known). "Agamemnon", "The Libation Bearers" ("Choephoroi"), and "The Eumenides" together tell the violent story of the family of Agamemnon, king of Argos.
"Agamemnon".
Aeschylus begins in Greece, describing the return of King Agamemnon from his victory in the Trojan War, from the perspective of the townspeople (the Chorus) and his wife, Clytemnestra. Dark foreshadowings build to the death of the king at the hands of his wife, who was angry that their daughter Iphigenia was killed so that the gods would restore the winds and allow the Greek fleet to sail to Troy. Clytemnestra was also unhappy that Agamemnon kept the Trojan prophetess Cassandra as his concubine. Cassandra foretells the murder of Agamemnon and of herself to the assembled townsfolk, who are horrified. She then enters the palace knowing that she cannot avoid her fate. The ending of the play includes a prediction of the return of Orestes, son of Agamemnon, who will seek to avenge his father.
"The Libation Bearers".
"The Libation Bearers" opens with Orestes' arrival at Agamemnon's tomb, from exile in Phocis. Electra meets Orestes there. They plan revenge against Clytemnestra and her lover, Aegisthus. Clytemnestra's account of a nightmare in which she gives birth to a snake is recounted by the chorus. This leads her to order her daughter, Electra, to pour libations on Agamemnon's tomb (with the assistance of libation bearers) in hope of making amends. Orestes enters the palace pretending to bear news of his own death. Clytemnestra calls in Aegisthus to learn the news. Orestes kills them both. Orestes is then beset by the Furies, who avenge the murders of kin in Greek mythology.
"The Eumenides".
The third play addresses the question of Orestes' guilt. The Furies drive Orestes from Argos and into the wilderness. He makes his way to the temple of Apollo and begs Apollo to drive the Furies away. Apollo had encouraged Orestes to kill Clytemnestra, so he bears some of the guilt for the murder. Apollo sends Orestes to the temple of Athena with Hermes as a guide.
The Furies track him down, and Athena steps in and declares that a trial is necessary. Apollo argues Orestes' case, and after the judges (including Athena) deliver a tie vote, Athena announces that Orestes is acquitted. She renames the Furies "The Eumenides" (The Good-spirited, or Kindly Ones), and extols the importance of reason in the development of laws. As in "The Suppliants", the ideals of a democratic Athens are praised.
"Prometheus Bound" (date disputed).
"Prometheus Bound" is attributed to Aeschylus by ancient authorities. Since the late 19th century, however, scholars have increasingly doubted this ascription, largely on stylistic grounds. Its production date is also in dispute, with theories ranging from the 480s BC to as late as the 410s.
The play consists mostly of static dialogue. The Titan Prometheus is bound to a rock throughout, which is his punishment from the Olympian Zeus for providing fire to humans. The god Hephaestus, the Titan Oceanus, and the chorus of Oceanids all express sympathy for Prometheus' plight. Prometheus is met by Io, a fellow victim of Zeus' cruelty. He prophesies her future travels, revealing that one of her descendants will free Prometheus. The play closes with Zeus sending Prometheus into the abyss because Prometheus will not tell him of a potential marriage which could prove Zeus' downfall.
"Prometheus Bound" seems to have been the first play in a trilogy, the "Prometheia". In the second play, "Prometheus Unbound", Heracles frees Prometheus from his chains and kills the eagle that had been sent daily to eat Prometheus' perpetually regenerating liver, then believed the source of feeling. We learn that Zeus has released the other Titans which he imprisoned at the conclusion of the Titanomachy, perhaps foreshadowing his eventual reconciliation with Prometheus.
In the trilogy's conclusion, "Prometheus the Fire-Bringer", it seems that the Titan finally warns Zeus not to sleep with the sea nymph Thetis, for she is fated to beget a son greater than the father. Not wishing to be overthrown, Zeus marries Thetis off to the mortal Peleus. The product of that union is Achilles, Greek hero of the Trojan War. After reconciling with Prometheus, Zeus probably inaugurates a festival in his honor at Athens.
Lost plays.
Of Aeschylus' other plays, only titles and assorted fragments are known. There are enough fragments (along with comments made by later authors and scholiasts) to produce rough synopses for some plays.
"Myrmidons".
This play was based on books 9 and 16 of the "Iliad". Achilles sits in silent indignation over his humiliation at Agamemnon's hands for most of the play. Envoys from the Greek army attempt to reconcile Achilles to Agamemnon, but he yields only to his friend Patroclus, who then battles the Trojans in Achilles' armour. The bravery and death of Patroclus are reported in a messenger's speech, which is followed by mourning.
"Nereids".
This play was based on books 18, 19, and 22 of the "Iliad". It follows the daughters of Nereus, the sea god, who lament Patroclus' death. A messenger tells how Achilles (perhaps reconciled to Agamemnon and the Greeks) slew Hector.
"Phrygians", or "Hector's Ransom".
After a brief discussion with Hermes, Achilles sits in silent mourning over Patroclus. Hermes then brings in King Priam of Troy, who wins over Achilles and ransoms his son's body in a spectacular coup de théâtre. A scale is brought on stage and Hector's body is placed in one scale and gold in the other. The dynamic dancing of the chorus of Trojans when they enter with Priam is reported by Aristophanes.
"Niobe".
The children of Niobe, the heroine, have been slain by Apollo and Artemis because Niobe had gloated that she had more children than their mother, Leto. Niobe sits in silent mourning on stage during most of the play. In the "Republic", Plato quotes the line "God plants a fault in mortals when he wills to destroy a house utterly."
These are the remaining 71 plays ascribed to Aeschylus which are known:
Influence.
Influence on Greek drama and culture.
The theatre was just beginning to evolve when Aeschylus started writing for it. Earlier playwrights such as Thespis had already expanded the cast to include an actor who was able to interact with the chorus. Aeschylus added a second actor, allowing for greater dramatic variety, while the chorus played a less important role. He is sometimes credited with introducing "skenographia", or scene-decoration, though Aristotle gives this distinction to Sophocles. Aeschylus is also said to have made the costumes more elaborate and dramatic, and made his actors wear platform boots ("cothurni") to make them more visible to the audience. According to a later account of Aeschylus' life, the chorus of Furies in the first performance of the "Eumenides" were so frightening when they entered that children fainted, patriarchs urinated, and pregnant women went into labour.
Aeschylus wrote his plays in verse. No violence is performed onstage. The plays have a remoteness from daily life in Athens, relating stories about the gods, or being set, like "The Persians", far away. Aeschylus' work has a strong moral and religious emphasis. The "Oresteia" trilogy concentrated on humans' position in the cosmos relative to the gods and divine law and divine punishment.
Aeschylus' popularity is evident in the praise that the comic playwright Aristophanes gives him in "The Frogs", produced some 50 years after Aeschylus' death. Aeschylus appears as a character in the play and claims, at line 1022, that his "Seven against Thebes" "made everyone watching it to love being warlike". He claims, at lines 1026–7, that with "The Persians" he "taught the Athenians to desire always to defeat their enemies." Aeschylus goes on to say, at lines 1039ff., that his plays inspired the Athenians to be brave and virtuous.
Influence outside Greek culture.
Aeschylus' works were influential beyond his own time. Hugh Lloyd-Jones draws attention to Richard Wagner's reverence of Aeschylus. Michael Ewans argues in his "Wagner and Aeschylus: The Ring and the Oresteia" (London: Faber, 1982) that the influence was so great as to merit a direct character by character comparison between Wagner's "Ring" and Aeschylus's "Oresteia". But a critic of that book, while not denying that Wagner read and respected Aeschylus, has described the arguments as unreasonable and forced.
J.T. Sheppard argues in the second half of his "Aeschylus and Sophocles: Their Work and Influence" that Aeschylus and Sophocles have played a major part in the formation of dramatic literature from the Renaissance to the present, specifically in French and Elizabethan drama. He also claims that their influence went beyond just drama and applies to literature in general, citing Milton and the Romantics.
Eugene O'Neill's "Mourning Becomes Electra" (1931), a trilogy of plays set in America after the Civil War, is modeled after the "Oresteia". Before writing his acclaimed trilogy, O'Neill had been developing a play about Aeschylus, and he noted that Aeschylus "so changed the system of the tragic stage that he has more claim than anyone else to be regarded as the founder (Father) of Tragedy."
During his presidential campaign in 1968, Senator Robert F. Kennedy quoted the Edith Hamilton translation of Aeschylus on the night of the assassination of Martin Luther King Jr. Kennedy was notified of King's murder before a campaign stop in Indianapolis, Indiana, and was warned not to attend the event due to fears of rioting from the mostly African-American crowd. Kennedy insisted on attending and delivered an impromptu speech that delivered news of King's death. Acknowledging the audience's emotions, Kennedy referred to his own grief at the murder of Martin Luther King and, quoting a passage from the play "Agamemnon" (in translation), said: "My favorite poet was Aeschylus. And he once wrote: 'Even in our sleep, pain which cannot forget falls drop by drop upon the heart, until in our own despair, against our will, comes wisdom through the awful grace of God.' What we need in the United States is not division; what we need in the United States is not hatred; what we need in the United States is not violence and lawlessness; but is love and wisdom, and compassion toward one another, and a feeling of justice toward those who still suffer within our country, whether they be white or whether they be black ... Let us dedicate ourselves to what the Greeks wrote so many years ago: to tame the savageness of man and make gentle the life of this world." The quotation from Aeschylus was later inscribed on a memorial at the gravesite of Robert Kennedy following his own assassination.
Editions.
The first translation of the seven plays into English was by Robert Potter in 1779, using blank verse for the iambic trimeters and rhymed verse for the choruses, a convention adopted by most translators for the next century.
Amber Road.
The Amber Road was an ancient trade route for the transfer of amber from coastal areas of the North Sea and the Baltic Sea to the Mediterranean Sea. Prehistoric trade routes between Northern and Southern Europe were defined by the amber trade.
As an important commodity, sometimes dubbed "the gold of the north", amber was transported from the North Sea and Baltic Sea coasts overland by way of the Vistula and Dnieper rivers to Italy, Greece, the Black Sea, Syria and Egypt over a period of thousands of years.
Antiquity.
The oldest trade in amber started from Sicily. The Sicilian amber trade was directed to Greece, North Africa and Spain. Sicilian amber was also discovered in Mycenae by the archaeologist Heinrich Schliemann, and it appeared in sites in southern Spain and Portugal. Its distribution is similar to that of ivory, so it is possible that amber from Sicily reached the Iberian Peninsula through contacts with North Africa. After a decline in the consumption and trade of amber at the beginning of the Bronze Age, around 2000 BC, the influence of Baltic amber gradually took the place of the Sicilian one throughout the Iberian Peninsula starting around 1000 BC. The new evidence comes from various archaeological and geological locations on the Iberian Peninsula.
From at least the 16th century BC, amber was moved from Northern Europe to the Mediterranean area. The breast ornament of the Egyptian Pharaoh Tutankhamen contains large Baltic amber beads. Schliemann found Sicilian amber beads at Mycenae, as shown by spectroscopic investigation. The quantity of amber in the Royal Tomb of Qatna, Syria, is unparalleled for known second millennium BC sites in the Levant and the Ancient Near East. Amber was sent from the North Sea to the Temple of Apollo at Delphi as an offering. From the Black Sea, trade could continue to Asia along the Silk Road, another ancient trade route.
In Roman times, a main route ran south from the Baltic coast (modern Lithuania), the entire north–south length of modern-day Poland (likely through the Iron Age settlement of Biskupin), through the land of the Boii (modern Czech Republic and Slovakia) to the head of the Adriatic Sea (Aquileia by the modern Gulf of Venice). Along with amber, commodities such as animal fur and skin, honey, and wax were exported to the Romans, in exchange for Roman glass, brass, gold, and non-ferrous metals such as tin and copper, which travelled back to the early Baltic region. As this road was a lucrative trade route connecting the Baltic Sea to the Mediterranean Sea, Roman military fortifications were constructed along the route to protect merchants and traders from Germanic raids.
The Old Prussian towns of Kaup and Truso on the Baltic were the starting points of the route to the south. In Scandinavia the amber road probably gave rise to the thriving Nordic Bronze Age culture, bringing influences from the Mediterranean Sea to the northernmost countries of Europe.
Kaliningrad Oblast is occasionally referred to in Russian by a name meaning "the amber region" (see Kaliningrad Regional Amber Museum).
Known roads by country.
Poland.
The shortest (and possibly oldest) road avoids alpine areas and led from the Baltic coastline (nowadays Lithuania and Poland), through Biskupin, Milicz, Wrocław, the Kłodzko Valley (less often through the Moravian Gate), crossed the Danube near Carnuntum in the Noricum province, headed southwest past Poetovio, Celeia, Emona, Nauportus, and reached Patavium and Aquileia at the Adriatic coast. One of the oldest directions of the last stage of the Amber Road to the south of the Danube, noted in the myth about the Argonauts, used the rivers Sava and Kupa, ending with a short continental road from Nauportus to Tarsatica in Rijeka on the coast of the Adriatic.
Germany.
Several roads connected the North Sea and Baltic Sea, especially the city of Hamburg to the Brenner Pass, proceeding southwards to Brindisi (nowadays Italy) and Ambracia (nowadays Greece).
Switzerland.
The Swiss region indicates a number of alpine roads, concentrating around the capital city Bern and probably originating from the banks of the Rhône and Rhine.
The Netherlands.
A small section, including Baarn, Barneveld, Amersfoort and Amerongen, connected the North Sea with the Lower Rhine.
Belgium.
A small section led southwards from Antwerp and Bruges to the towns Braine-l'Alleud and Braine-le-Comte, both originally named "Brennia-Brenna". The route continued by following the Meuse towards Bern in Switzerland.
Southern France and Spain.
Routes connected amber-finding locations at Ambares (near Bordeaux), leading to Béarn and the Pyrenees. Routes connecting the amber-finding locations in northern Spain and in the Pyrenees formed a trading route to the Mediterranean Sea.
Mongolia.
Archaeological sources also suggest that routes may have connected Mongolia to Eastern Europe during the Kitan/Liao Period.
Modern usage.
There is a tourist route stretching along the Baltic coast from Kaliningrad to Latvia called "Amber Road".
"Amber Road" sites are:
In Poland, the north–south motorway A1 is officially named Amber Highway.
EV9 The Amber Route is a long-distance cycling route between Gdańsk, Poland and Pula, Croatia which follows the course of the Amber Road.
Crandall University.
Crandall University is a Baptist Christian liberal arts university located in Moncton, New Brunswick, Canada. It is affiliated with the Canadian Baptists of Atlantic Canada (Canadian Baptist Ministries).
History.
The school was founded in 1949 under the name United Baptist Bible Training School (UBBTS), and served as both a secondary school and a Bible school by the Canadian Baptists of Atlantic Canada. Over two decades, the focus of the school gradually shifted toward post-secondary programs. In 1968, UBBTS became a Bible and junior Christian liberal arts college, and in 1970 the name was changed to Atlantic Baptist College (ABC). A sustained campaign to expand the school's faculty and improve the level of education resulted in ABC being able to grant full Bachelor of Arts degrees in 1983. Its campus at this time was located along the Salisbury Road, west of Moncton's central business district.
The institution moved to a new campus built on the Gorge Road, north of the central business district, in 1996. The name was changed to Atlantic Baptist University (ABU), a reflection of expanded student enrolment and academic accreditation. In 2003, the ABU sports teams adopted the name "The Blue Tide". The institution was the first, and thus far only, English-language university in Moncton. The "Atlantic Baptist University Act" was passed by the Legislative Assembly of New Brunswick in 2008.
On August 21, 2009, it was announced that the institution had changed its name to Crandall University in honour of Rev. Joseph Crandall, a pioneering Baptist minister in the maritime region. In conjunction with the university name change, Crandall Athletics took on a new identity as "The Crandall Chargers."
Controversy.
In 2012, Crandall University came under public scrutiny for receiving municipal funds while maintaining a scripturally based hiring policy consistent with its denomination's tradition, one that forbids the hiring of non-celibate LGBTQ people. This has been characterized by the press as an anti-gay hiring policy. That same year, the Crandall Student Association publicly broke with the university's administration over the policy, with the student president at the time telling the CBC, "The Christian faith does say do not judge others. And the Christian faith is all about love. So I feel that this policy – to me – doesn’t seem like it’s following those specific guidelines." In 2013, a year after the controversy erupted, the university opted to not apply for $150,000 in public funding that it had received annually. The university president also issued an apology, stating: "We wish to apologize for anything that Crandall University might possibly have communicated in the past that may have seemed unloving or disrespectful in any way toward any individual or groups."
Affiliations.
Crandall is an affiliate member of the Association of the Registrars of the Universities and Colleges of Canada (ARUCC); a full member of the ARUCC regional association, the Atlantic Association of Registrars and Admissions Officers (AARAO); an active member of Christian Higher Education Canada (CHEC); and an active member of the New Brunswick Association of Private Colleges and Universities. However, Crandall faculty are not members of the Canadian Association of University Teachers (CAUT). In a report, the CAUT found that "while the university has a statement on academic freedom, it is significantly inconsistent with that of the CAUT and the majority of universities across the western world, and assurances that free enquiry is still possible within its constraints are unconvincing." It therefore recommended that Crandall University "be placed on the list of institutions 'found to have imposed a requirement of a commitment to a particular ideology or statement of faith as a condition of employment.'"
The university is affiliated with the Canadian Baptists of Atlantic Canada (Canadian Baptist Ministries). It is a member of the Council for Christian Colleges and Universities.
Library and archives.
Crandall University houses the Baptist Heritage Center whose 300 artifacts preserve the material history of Atlantic Baptists, the Convention of Atlantic Baptist Churches, and its predecessor organizations. The collection and archives includes objects used in worship services, furniture, musical instruments, church building architecture pictures and printed material.
Athletics.
Crandall University is represented in the Atlantic Collegiate Athletic Association (ACAA) by 7 varsity teams. The Chargers teams include men's and women's soccer, basketball, and cross country, and women's volleyball. The Chargers also offer a boxing club program that competes internationally.
The Chargers have won four ACAA banners: women's soccer in 2003–04, men's cross country in 2021–22, and both men's and women's cross country in 2022–23.
Andrew Wiles.
Sir Andrew John Wiles (born 11 April 1953) is an English mathematician and a Royal Society Research Professor at the University of Oxford, specialising in number theory. He is best known for proving Fermat's Last Theorem, for which he was awarded the 2016 Abel Prize and the 2017 Copley Medal by the Royal Society. He was appointed Knight Commander of the Order of the British Empire in 2000, and in 2018, was appointed the first Regius Professor of Mathematics at Oxford. Wiles is also a 1997 MacArthur Fellow.
Education and early life.
Wiles was born on 11 April 1953 in Cambridge, England, the son of Maurice Frank Wiles (1923–2005) and Patricia Wiles (née Mowll). From 1952 to 1955, his father worked as the chaplain at Ridley Hall, Cambridge, and later became the Regius Professor of Divinity at the University of Oxford.
Wiles began his formal schooling in Nigeria, while living there as a very young boy with his parents. However, according to letters written by his parents, for at least the first several months after he was supposed to be attending classes, he refused to go. From that fact, Wiles himself concluded that in his earliest years he was not enthusiastic about spending time in academic institutions. He trusts the letters, though he himself could not remember a time when he did not enjoy solving mathematical problems.
Wiles attended King's College School, Cambridge, and The Leys School, Cambridge. Wiles states that he came across Fermat's Last Theorem on his way home from school when he was 10 years old. He stopped at his local library where he found a book "The Last Problem", by Eric Temple Bell, about the theorem. Fascinated by the existence of a theorem that was so easy to state that he, a ten-year-old, could understand it, but that no one had proven, he decided to be the first person to prove it. However, he soon realised that his knowledge was too limited, so he abandoned his childhood dream until it was brought back to his attention at the age of 33 by Ken Ribet's 1986 proof of the epsilon conjecture, which Gerhard Frey had previously linked to Fermat's famous equation.
Career and research.
In 1974, Wiles earned his bachelor's degree in mathematics at Merton College, Oxford. Wiles's graduate research was guided by John Coates, beginning in the summer of 1975. Together they worked on the arithmetic of elliptic curves with complex multiplication by the methods of Iwasawa theory. He further worked with Barry Mazur on the main conjecture of Iwasawa theory over the rational numbers, and soon afterward, he generalised this result to totally real fields.
In 1980, Wiles earned a PhD while at Clare College, Cambridge. After a stay at the Institute for Advanced Study in Princeton, New Jersey, in 1981, Wiles became a Professor of Mathematics at Princeton University.
In 1985–86, Wiles was a Guggenheim Fellow at the Institut des Hautes Études Scientifiques near Paris and at the École Normale Supérieure.
From 1988 to 1990, Wiles was a Royal Society Research Professor at the University of Oxford, and then he returned to Princeton.
From 1994 to 2009, Wiles was a Eugene Higgins Professor at Princeton. He rejoined Oxford in 2011 as Royal Society Research Professor.
In May 2018, Wiles was appointed Regius Professor of Mathematics at Oxford, the first in the university's history.
Proof of Fermat's Last Theorem.
Starting in mid-1986, based on successive progress of the previous few years of Gerhard Frey, Jean-Pierre Serre and Ken Ribet, it became clear that Fermat's Last Theorem could be proven as a corollary of a limited form of the modularity theorem (unproven at the time and then known as the "Taniyama–Shimura–Weil conjecture"). The modularity theorem involved elliptic curves, which was also Wiles's own specialist area.
The conjecture was seen by contemporary mathematicians as important, but extraordinarily difficult or perhaps impossible to prove. For example, Wiles's ex-supervisor John Coates stated that it seemed "impossible to actually prove", and Ken Ribet considered himself "one of the vast majority of people who believed [it] was completely inaccessible", adding that "Andrew Wiles was probably one of the few people on earth who had the audacity to dream that you can actually go and prove [it]."
Despite this, Wiles, with his from-childhood fascination with Fermat's Last Theorem, decided to undertake the challenge of proving the conjecture, at least to the extent needed for Frey's curve. He dedicated all of his research time to this problem for over six years in near-total secrecy, covering up his efforts by releasing prior work in small segments as separate papers and confiding only in his wife.
In June 1993, he presented his proof to the public for the first time at a conference in Cambridge.
In August 1993, it was discovered that the proof contained a flaw in one area. Wiles tried and failed for over a year to repair his proof. According to Wiles, the crucial idea for circumventing—rather than closing—this area came to him on 19 September 1994, when he was on the verge of giving up. Together with his former student Richard Taylor, he published a second paper which circumvented the problem and thus completed the proof. Both papers were published in May 1995 in a dedicated issue of the "Annals of Mathematics."
Awards and honours.
Wiles's proof of Fermat's Last Theorem has stood up to the scrutiny of the world's other mathematical experts. Wiles was interviewed for an episode of the BBC documentary series "Horizon" about Fermat's Last Theorem. This was broadcast as an episode of the PBS science television series "Nova" with the title "The Proof". His work and life are also described in great detail in Simon Singh's popular book "Fermat's Last Theorem".
Wiles has been awarded a number of major prizes in mathematics and science:
Wiles's 1987 certificate of election to the Royal Society reads:
Ambient.
Ambient, Ambiance, or Ambience may refer to:
Anne Brontë.
Anne Brontë (17 January 1820 – 28 May 1849) was an English novelist and poet, and the youngest member of the Brontë literary family.
Anne Brontë was the daughter of Maria ( Branwell) and Patrick Brontë, a poor Irish clergyman in the Church of England. Anne lived most of her life with her family at the parish of Haworth on the Yorkshire moors. Otherwise, she attended a boarding school in Mirfield between 1836 and 1837, and between 1839 and 1845 lived elsewhere working as a governess. In 1846 she published a book of poems with her sisters and later two novels, initially under the pen name Acton Bell. Her first novel, "Agnes Grey", was published in 1847 at the same time as "Wuthering Heights" by her sister Emily Brontë. Anne’s second novel, "The Tenant of Wildfell Hall", was published in 1848. "The Tenant of Wildfell Hall" is often considered one of the first feminist novels.
Anne died at 29, most likely of pulmonary tuberculosis. After her death, her sister Charlotte edited "Agnes Grey" to fix issues with its first edition, but prevented republication of "The Tenant of Wildfell Hall". As a result, Anne is not as well known as her sisters. Nonetheless, both of her novels are considered classics of English literature.
Family background.
Anne's father was Patrick Brontë (1777–1861). Patrick Brontë was born in a two-room cottage in Emdale, Loughbrickland, County Down, Ireland. He was the oldest of ten children born to Hugh Brunty and Eleanor McCrory, poor Irish peasant farmers. The family surname, "mac Aedh Ó Proinntigh", was Anglicised as Prunty or Brunty. Struggling against poverty, Patrick learned to read and write, and from 1798 taught others. In 1802, at 25, he won a place to study theology at St. John's College, Cambridge. There he changed his name from Brunty to the more distinguished-sounding Brontë. In 1807, he was ordained to the priesthood in the Church of England. He served as a curate in Essex and then in Wellington, Shropshire. In 1810, he published his first poem, "Winter Evening Thoughts", in a local newspaper. In 1811, he published a collection of moral verse, "Cottage Poems". Also in 1811, he became vicar of St. Peter's Church in Hartshead, Yorkshire. In 1812, he was appointed an examiner in Classics at Woodhouse Grove School, near Bradford. This was a Wesleyan academy where, at 35, he met his future wife, the headmaster's niece, Maria Branwell.
Maria Branwell (1783–1821), Anne's mother, was the daughter of Anne Carne, the daughter of a silversmith, and Thomas Branwell, a successful and property-owning grocer and tea merchant in Penzance. Maria was the eleventh of twelve children and enjoyed the benefits of a prosperous family in a small town. After the death of her parents, Maria went to help her aunt with housekeeping functions at the school. Maria was intelligent and well read, and her strong Methodist faith attracted Patrick Brontë, whose own leanings were similar.
Within three months, on 29 December 1812, though from considerably different backgrounds, Patrick Brontë and Maria Branwell were married. Their first child, Maria (1814–1825), was born after they moved to Hartshead. In 1815, Patrick was appointed curate of the chapel in Market Street Thornton, near Bradford. A second daughter, Elizabeth (1815–1825), was born shortly after. Four more children followed: Charlotte (1816–1855), Patrick Branwell (1817–1848), Emily (1818–1848), and Anne (1820–1849).
Early life.
Anne was the youngest of the Brontë children. She was born on 17 January 1820 on the outskirts of Bradford, where her father, Patrick, was curate, and she was baptised there on 25 March 1820. Later Patrick was appointed to the perpetual curacy in Haworth, a small town nearby. In April 1820 the family moved into the five-roomed Haworth Parsonage.
When Anne was barely a year old her mother, Maria, became ill, probably with uterine cancer. Maria Branwell died on 15 September 1821. Patrick tried to remarry, without success. Maria's sister, Elizabeth Branwell (1776–1842), had moved to the parsonage initially for Maria, but spent the rest of her life there raising Maria's children. She did it from a sense of duty. She was stern and expected respect, not love. There was little affection between her and the older children. According to tradition Anne was her favourite.
In Elizabeth Gaskell's biography of Charlotte, Patrick remembered Anne as precocious. Patrick said that when Anne was four years old he had asked her what a child most wanted and that she had said: "age and experience".
In summer 1824 Patrick sent daughters Maria, Elizabeth, Charlotte, and Emily to Crofton Hall in Crofton, West Yorkshire, and subsequently to the Clergy Daughters' School at Cowan Bridge in Lancashire. Maria and Elizabeth Brontë died of consumption on 6 May and 15 June 1825 respectively, and Charlotte and Emily were brought home. The unexpected deaths distressed the family so much that Patrick could not face sending them away again. They were educated at home for the next five years, largely by Elizabeth Branwell and Patrick. The children made little attempt to mix with others outside the parsonage and relied on each other for company. The bleak moors surrounding Haworth became their playground. Anne shared a room with her aunt, Elizabeth. They were close, and her aunt may have influenced Anne's personality and religious beliefs.
Education.
Anne's studies at home included music and drawing. The Keighley church organist gave piano lessons to Anne, Emily, and Branwell, and John Bradley of Keighley gave them art lessons. Each drew with some skill. Their aunt tried to teach the girls how to run a household, but they inclined more to literature. They read much from their father's well-stocked library. Their reading included the Bible, Homer, Virgil, Shakespeare, Milton, Byron, Scott, articles from "Blackwood's Edinburgh Magazine", "Fraser's Magazine" and "The Edinburgh Review", and miscellaneous books of history, geography, and biography.
Their reading fed their imaginations, and their creativity soared after their father gave Branwell a set of toy soldiers in June 1826. They gave names to the soldiers, or the "Twelves", and developed their characters. This led to the creation of an imaginary world: the African kingdom of "Angria", which was illustrated with maps and watercolour renderings. The children devised plots about the inhabitants of Angria and its capital city, "Glass Town", later called Verreopolis or Verdopolis.
Their fantastical worlds and kingdoms gradually acquired characteristics from their historical world, drawing from its sovereigns, armies, heroes, outlaws, fugitives, inns, schools, and publishers. The characters and lands created by the children were given newspapers and magazines and chronicles written in tiny books with writing so small that it was difficult to read without a magnifying glass. These creations and writings were an apprenticeship for their later literary talents.
Juvenilia.
Around 1831, when Anne was eleven, she and Emily broke away from Charlotte and Branwell to create and develop their own fantasy world, "Gondal". Anne and Emily were particularly close, especially after Charlotte left for Roe Head School in January 1831. Charlotte's friend Ellen Nussey visited Haworth in 1833 and reported that Emily and Anne were "like twins" and "inseparable companions". She described Anne so:
Anne took lessons from Charlotte after Charlotte had returned from Roe Head. Charlotte returned to Roe Head as a teacher on 29 July 1835, accompanied by Emily as a pupil. Emily's tuition was largely financed by Charlotte's teaching. Emily was unable to adapt to life at school and was physically ill from homesickness within a few months. She was withdrawn from school by October and replaced by Anne.
Anne was 15 and it was her first time away from home. She made few friends at Roe Head. She was quiet and hardworking and determined to stay to acquire the education which she would need to support herself. She stayed for two years and returned home only during Christmas and summer holidays. She won a good-conduct medal in December 1836. Charlotte's letters almost never mention Anne while Anne was at Roe Head, which might imply that they were not close, but Charlotte was at least concerned about Anne's health. By December 1837 Anne had become seriously ill with gastritis and was embroiled in a religious crisis. A Moravian minister was called to see her several times during her illness, suggesting her distress was caused, in part, by conflict with the local Anglican clergy. Charlotte wrote to their father and he brought Anne home.
Employment at Blake Hall.
A year after leaving the school, and aged 19, Anne was seeking a teaching position. She was the daughter of a poor clergyman and needed to earn money. Her father had no private income and the parsonage would revert to the church on his death. Teaching or working as a governess were among few options for a poor and educated woman. In April 1839 Anne started work as a governess for the Ingham family at Blake Hall, near Mirfield.
The children in her charge were spoiled and disobedient. Anne had great difficulty controlling them and little success in educating them. She was not allowed to punish them, and when she complained about their behaviour she received no support and was criticised for being incapable. The Inghams were dissatisfied with their children's progress and dismissed Anne. She returned home in 1839 at Christmas. At home also were Charlotte and Emily, who had left their positions, and Branwell. Anne's time at Blake Hall was so traumatic that she reproduced it in almost perfect detail in her novel "Agnes Grey".
William Weightman.
Anne returned to Haworth and met William Weightman (1814–1842), her father's new curate, who had started work in the parish in August 1839. Weightman was 25 and had obtained a two-year licentiate in theology from the University of Durham. He was welcome at the parsonage. Anne's acquaintance with him coincided with her writing a number of poems, which may suggest she fell in love with him, although there is disagreement over this possibility. Little evidence exists beyond a brief anecdote Charlotte related to Ellen Nussey in January 1842.
In "Agnes Grey", Agnes' interest in the curate refreshes her interest in poetry. Outside fiction, William Weightman aroused much curiosity. It seems that he was good-looking and engaging, and that his easy humour and kindness towards the sisters made an impression. It is such a character that she portrays in Edward Weston, and that her heroine Agnes Grey finds deeply appealing.
Weightman died of cholera in the same year. Anne expressed her grief for his death in her poem "I will not mourn thee, lovely one", in which she called him "our darling".
Governess.
From 1840 to 1845 Anne worked at Thorp Green Hall, a comfortable country house near York. Here she was governess to the children of the Reverend Edmund Robinson and his wife, Lydia. The house appeared as Horton Lodge in "Agnes Grey". Anne had four pupils: Lydia (15), Elizabeth (13), Mary (12), and Edmund (8). She initially had problems similar to those at Blake Hall. Anne missed her home and family. In a diary paper in 1841 she wrote that she did not like her situation and wished to leave it. Her quiet and gentle disposition did not help. But Anne was determined and made a success of her position, becoming well-liked by her employers. Her charges, the Robinson girls, became lifelong friends.
Anne spent only five or six weeks a year with her family, during holidays at Christmas and in June. The rest of her time was spent with the Robinsons. She accompanied the Robinsons on annual holidays to Scarborough. Between 1840 and 1844 Anne spent around five weeks each summer at the coastal town and loved it. A number of locations in Scarborough were used for her novels. She also had opportunities there to collect semi-precious stones, and her novels suggest an interest in geology, which she presented as a pursuit in which men and women could engage as equals.
Anne and her sisters considered setting up a school while she was still working for the Robinsons. Various locations were considered, including the parsonage, but the project never materialised. Anne came home on the death of her aunt in early November 1842 while her sisters were in Brussels. Elizabeth Branwell left a £350 legacy to each of her nieces.
It was at the Long Plantation at Thorp Green in 1842 that Anne wrote her three-verse poem "Lines Composed in a Wood on a Windy Day", which was published in 1846 under the name Acton Bell.
In January 1843 Anne returned to Thorp Green and secured a position for Branwell. He was to tutor Edmund, who was growing too old to be in Anne's care. Branwell did not live in the house as Anne did. Anne's vaunted calm appears to have been the result of hard-fought battles, balancing deeply felt emotions with careful thought, a sense of responsibility and resolute determination. All three Brontë sisters worked as governesses or teachers, and all experienced problems controlling their charges, gaining support from their employers, and coping with homesickness, but Anne was the only one who persevered and made a success of her work.
Back at the parsonage.
Anne and Branwell taught at Thorp Green for the next three years. Branwell entered into a secret relationship with his employer's wife, Lydia Robinson. When Anne and Branwell returned home for the holidays in June 1845 Anne resigned. She gave no reason, but the reason may have been the relationship between her brother and Mrs Robinson. Branwell was dismissed when his employer found out about the relationship. Anne continued to exchange letters with Elizabeth and Mary Robinson, who came to visit her in December 1848.
Anne took Emily to visit some of the places which Anne had become fond of. A plan to visit Scarborough fell through, but they went to York and saw York Minster.
A book of poems.
The Brontës were at home with their father during the summer of 1845. None had any immediate prospect of employment. Charlotte found Emily's poems, which had been shared only with Anne, and said that they should be published. Anne showed her own poems to Charlotte, and Charlotte "thought that these verses too had a sweet sincere pathos of their own". The sisters eventually reached an agreement. They told nobody what they were doing. With the money from Elizabeth Branwell they paid for publication of a collection of poems: 21 each from Anne and Emily, and 19 from Charlotte.
The book was published under pen names which retained their initials but concealed their sex. Anne's pseudonym was Acton Bell. "Poems by Currer, Ellis, and Acton Bell" was available for sale in May 1846. The cost of publication was 31 pounds and 10 shillings, about three-quarters of Anne's salary at Thorp Green. On 7 May 1846 the first three copies were delivered to Haworth Parsonage. The book achieved three somewhat favourable reviews, but was a commercial failure, with only two copies sold in the first year. Anne nonetheless found a market for her later poetry. The "Leeds Intelligencer" and "Fraser's Magazine" published her poem "The Narrow Way" under her pseudonym in December 1848. Four months earlier, Fraser's Magazine had published her poem "The Three Guides".
Novels.
"Agnes Grey".
By July 1846 a package containing the manuscripts of each sister's first novel was making the rounds of London publishers. Charlotte had written "The Professor", Emily had written "Wuthering Heights", and Anne had written "Agnes Grey".
After some rejections "Wuthering Heights" and "Agnes Grey" were accepted by the publisher Thomas Cautley Newby. "The Professor" was rejected. It was not long before Charlotte had completed her second novel, "Jane Eyre". "Jane Eyre" was accepted immediately by Smith, Elder & Co. It was the first published of the sisters' novels, and an immediate and resounding success. Meanwhile, Anne and Emily's novels "lingered in the press". Anne and Emily were obliged to pay fifty pounds to help meet their publishing costs. Their publisher was galvanised by the success of "Jane Eyre" and published "Wuthering Heights" and "Agnes Grey" together in December 1847. They sold well, but "Agnes Grey" was outshone by Emily's more dramatic "Wuthering Heights".
"The Tenant of Wildfell Hall".
Anne's second novel, "The Tenant of Wildfell Hall", was published in the last week of June 1848.
The novel challenged contemporary social and legal structures. In 1913 May Sinclair said that the slamming of Helen Huntingdon's bedroom door against her husband reverberated throughout Victorian England.
In the book Helen has left her husband to protect their son from his influence. She supports herself and her son in hiding by painting. She has violated social conventions and English law. Until the Married Women's Property Act 1870 was passed, a married woman had no legal existence independent from her husband and could not own property nor sue for divorce nor control the custody of her children. Helen's husband had a right to reclaim her and charge her with kidnapping. By subsisting on her own income she was stealing her husband's property since this income was legally his.
Anne stated her intentions in the second edition, published in August 1848. She presented a forceful rebuttal to critics (among them Charlotte) who considered her portrayal of Huntingdon overly graphic and disturbing. Anne "wished to tell the truth". She explained further that
Anne also castigated reviewers who speculated on the sex of authors and the perceived appropriateness of their writing. She was
London visit.
In July 1848 Anne and Charlotte went to Charlotte's publisher George Smith in London to dispel the rumour that the "Bell brothers" were one person. Emily refused to go. Anne and Charlotte spent several days with Smith. Many years after Anne's death, he wrote in the "Cornhill Magazine" his impressions of her:
The increasing popularity of the Bells' works led to renewed interest in "Poems by Currer, Ellis, and Acton Bell", originally published by Aylott and Jones. The remaining print run was bought by Smith and Elder, and reissued under new covers in November 1848. It still sold poorly.
Family tragedies.
Branwell's persistent drunkenness disguised the decline of his health and he died on 24 September 1848. His sudden death shocked the family. He was 31. The cause was recorded as chronic bronchitis and marasmus, but was probably tuberculosis.
The family suffered from coughs and colds during the winter of 1848, and Emily became very ill. She worsened over two months and rejected medical aid until the morning of 19 December. She was very weak and said that "if you will send for a doctor, I will see him now". But Emily died at about two o'clock that afternoon, aged 30.
Emily's death deeply affected Anne. Her grief undermined her physical health. Over Christmas Anne had influenza. Her symptoms intensified and in early January her father sent for a Leeds physician. The doctor diagnosed advanced consumption with little hope of recovery. Anne met the news with characteristic determination and self-control. However, in her letter to Ellen Nussey she expressed her frustrated ambitions:
Unlike Emily, Anne took all the recommended medicines and followed the advice she was given. That same month she wrote her last poem, "A dreadful darkness closes in", in which she deals with being terminally ill. Her health fluctuated for months, but she grew thinner and weaker.
Death.
Anne seemed somewhat better in February. She decided to visit Scarborough to see if the change of location and the fresh sea air might benefit her. Charlotte was initially against the journey, fearing that it would be too stressful, but changed her mind after the doctor's approval and Anne's assurance that it was her last hope.
On 24 May 1849 Anne set off for Scarborough with Charlotte and Ellen Nussey. They spent a day and night in York en route. Here they escorted Anne in a wheelchair and did some shopping and visited York Minster. It was clear that Anne had little strength left.
On Sunday 27 May Anne asked Charlotte whether it would be easier to return home and die instead of remaining in Scarborough. A doctor was consulted the next day and said that death was close. Anne received the news quietly. She expressed her love and concern for Ellen and Charlotte, and whispered for Charlotte to "take courage". Anne died at about two o'clock in the afternoon on Monday 28 May 1849, aged 29.
Charlotte decided to "lay the flower where it had fallen", so Anne was buried in Scarborough. The funeral was held on 30 May; Patrick Brontë could not have made the journey even had he wished to. The former schoolmistress at Roe Head, Miss Wooler, was in Scarborough, and she was the only other mourner at Anne's funeral. Anne was buried in St Mary's churchyard, beneath the castle walls and overlooking the bay. Charlotte commissioned a stone to be placed over the grave. When she visited the grave three years later she discovered multiple errors on the headstone and had it refaced, but even this was not free of error: Anne was 29 when she died, not 28 as inscribed.
In 2011 the Brontë Society installed a new plaque at Anne Brontë's grave. The original gravestone had become illegible at places and could not be restored. It was left undisturbed while the new plaque was laid horizontally, interpreting the fading words of the original and correcting its error. In April 2013 the Brontë Society held a dedication and blessing service at the gravesite to mark the installation of the new plaque.
Reputation.
After Anne's death Charlotte addressed issues with the first edition of "Agnes Grey" for its republication, but she prevented republication of "The Tenant of Wildfell Hall". In 1850 Charlotte wrote a biographical notice of her sisters for a new edition of their works. Subsequent critics paid less attention to Anne's work and some dismissed her as "a Brontë without genius".
But since the mid-20th century her life and works have received more attention. Biographies by Winifred Gérin (1959), Elizabeth Langland (1989) and Edward Chitham (1991), as well as Juliet Barker's group biography, "The Brontës" (1994; revised edition 2000), and work by critics such as Inga-Stina Ewbank, Marianne Thormählen, Laura C Berry, Jan B Gordon, Mary Summers, and Juliet McMaster, have led to acceptance of Anne Brontë as a major literary figure. Sally McDonald of the Brontë Society said in 2013 that in some ways Anne "is now viewed as the most radical of the sisters, writing about tough subjects such as women's need to maintain independence and how alcoholism can tear a family apart." In 2016 Lucy Mangan championed Anne Brontë in the BBC's "Being the Brontës", declaring that "her time has come".
2030 | Augustine of Hippo | Augustine of Hippo (13 November 354 – 28 August 430), also known as Saint Augustine, was a theologian and philosopher of Berber origin and the bishop of Hippo Regius in Numidia, Roman North Africa. His writings influenced the development of Western philosophy and Western Christianity, and he is viewed as one of the most important Church Fathers of the Latin Church in the Patristic Period. His many important works include "The City of God", "On Christian Doctrine", and "Confessions".
According to his contemporary, Jerome, Augustine "established anew the ancient Faith". In his youth he was drawn to the eclectic Manichaean faith, and later to the Hellenistic philosophy of Neoplatonism. After his conversion to Christianity and baptism in 386, Augustine developed his own approach to philosophy and theology, accommodating a variety of methods and perspectives. Believing the grace of Christ was indispensable to human freedom, he helped formulate the doctrine of original sin and made significant contributions to the development of just war theory. When the Western Roman Empire began to disintegrate, Augustine imagined the Church as a spiritual City of God, distinct from the material Earthly City. The segment of the Church that adhered to the concept of the Trinity as defined by the Council of Nicaea and the Council of Constantinople closely identified with Augustine's "On the Trinity".
Augustine is recognized as a saint in the Catholic Church, the Eastern Orthodox Church, the Lutheran Churches and the Anglican Communion. He is also a preeminent Catholic Doctor of the Church and the patron of the Augustinians. His memorial is celebrated on 28 August, the day of his death. Augustine is the patron saint of brewers, printers, theologians, and a number of cities and dioceses. His thoughts profoundly influenced the medieval worldview. Many Protestants, especially Calvinists and Lutherans, consider him one of the theological fathers of the Protestant Reformation due to his teachings on salvation and divine grace. Protestant Reformers generally, and Martin Luther in particular, held Augustine in preeminence among early Church Fathers. From 1505 to 1521, Luther was a member of the Order of the Augustinian Eremites.
In the East, his teachings are more disputed, and were notably attacked by John Romanides, but other theologians and figures of the Eastern Orthodox Church have shown significant approbation of his writings, chiefly Georges Florovsky. The most controversial doctrine associated with him, the filioque, was rejected by the Eastern Orthodox Church. Other disputed teachings include his views on original sin, the doctrine of grace, and predestination. Though considered to be mistaken on some points, he is still considered a saint and has influenced some Eastern Church Fathers, most notably Gregory Palamas. In the Greek and Russian Orthodox Churches, his feast day is celebrated on 15 June. The historian Diarmaid MacCulloch has written: "Augustine's impact on Western Christian thought can hardly be overstated; only his beloved example, Paul of Tarsus, has been more influential, and Westerners have generally seen Paul through Augustine's eyes."
Life.
Background.
Augustine of Hippo, also known as "Saint Augustine" or "Saint Austin", is known by various cognomens throughout the many denominations of the Christian world, including "Blessed Augustine" and the "Doctor of Grace".
Hippo Regius, where Augustine was the bishop, was in modern-day Annaba, Algeria.
Childhood and education.
Augustine was born in 354 in the municipium of Thagaste (now Souk Ahras, Algeria) in the Roman province of Numidia. His mother, Monica or Monnica, was a devout Christian; his father Patricius was a pagan who converted to Christianity on his deathbed. He had a brother named Navigius and a sister whose name is lost but is conventionally remembered as Perpetua.
Scholars generally agree that Augustine and his family were Berbers, an ethnic group indigenous to North Africa, but were heavily Romanized, speaking only Latin at home as a matter of pride and dignity. In his writings, Augustine leaves some information as to the consciousness of his African heritage, at least geographically and perhaps ethnically. For example, he refers to Apuleius as "the most notorious of us Africans," to Ponticianus as "a country man of ours, insofar as being African," and to Faustus of Mileve as "an African Gentleman".
Augustine's family name, Aurelius, suggests his father's ancestors were freedmen of the "gens Aurelia" given full Roman citizenship by the Edict of Caracalla in 212. Augustine's family had been Roman, from a legal standpoint, for at least a century when he was born. It is assumed that his mother, Monica, was of Berber origin, on the basis of her name, but as his family were "honestiores", an upper class of citizens known as honorable men, Augustine's first language was likely Latin.
At the age of 11, Augustine was sent to school at Madaurus (now M'Daourouch), a small Numidian city south of Thagaste. There he became familiar with Latin literature, as well as pagan beliefs and practices. His first insight into the nature of sin occurred when he and a number of friends stole fruit they did not want from a neighborhood garden. He tells this story in his autobiography, "Confessions". He remembers he stole the fruit, not because he was hungry, but because "it was not permitted." His very nature, he says, was flawed. "It was foul, and I loved it. I loved my own error—not that for which I erred, but the error itself." From this incident he concluded the human person is naturally inclined to sin, and in need of the grace of Christ.
At the age of 17, through the generosity of his fellow citizen Romanianus, Augustine went to Carthage to continue his education in rhetoric, though it was above the financial means of his family. In spite of his mother's warnings, as a youth Augustine lived a hedonistic lifestyle for a time, associating with young men who boasted of their sexual exploits. The need to gain their acceptance encouraged inexperienced boys like Augustine to seek or make up stories about sexual experiences. Although some have claimed otherwise, it has been suggested that Augustine's actual sexual experiences were likely with members of the opposite sex only.
It was while he was a student in Carthage that he read Cicero's dialogue "Hortensius" (now lost), which he described as leaving a lasting impression, enkindling in his heart the love of wisdom and a great thirst for truth. It started his interest in philosophy. Although raised Christian, Augustine became a Manichaean, much to his mother's chagrin.
At about the age of 17, Augustine began a relationship with a young woman in Carthage. Though his mother wanted him to marry a person of his class, the woman remained his lover. He was warned by his mother to avoid fornication (sex outside marriage), but Augustine persisted in the relationship for over fifteen years, and the woman gave birth to his son Adeodatus (372–388), whose name means "Gift from God" and who was viewed as extremely intelligent by his contemporaries. In 385, Augustine ended his relationship with his lover in order to prepare to marry a teenaged heiress. By the time he was able to marry her, however, he had decided to become a Christian priest and the marriage did not happen.
Augustine was, from the beginning, a brilliant student, with an eager intellectual curiosity, but he never mastered Greek. His first Greek teacher was a brutal man who constantly beat his students, and Augustine rebelled and refused to study. By the time he realized he needed to know Greek, it was too late, and although he acquired a smattering of the language, he was never eloquent with it. He did, however, become a master of Latin.
Move to Carthage, Rome, and Milan.
Augustine taught grammar at Thagaste during 373 and 374. The following year he moved to Carthage to conduct a school of rhetoric and remained there for the next nine years. Disturbed by unruly students in Carthage, in 383 he moved to Rome, where he believed the best and brightest rhetoricians practiced, to establish a school there. However, Augustine was disappointed with the apathetic reception. It was the custom for students to pay their fees to the professor on the last day of the term, and many students attended faithfully all term and then did not pay.
Manichaean friends introduced him to the prefect of the City of Rome, Symmachus, who had been asked by the imperial court at Milan to provide a rhetoric professor. Augustine won the job and headed north to take his position in Milan in late 384. Thirty years old, he had won the most visible academic position in the Latin world at a time when such posts gave ready access to political careers.
Although Augustine spent ten years as a Manichaean, he was never an initiate or "elect", but an "auditor", the lowest level in this religion's hierarchy. While still at Carthage, a disappointing meeting with the Manichaean bishop Faustus of Mileve, a key exponent of Manichaean theology, started Augustine's scepticism of Manichaeanism. In Rome, he reportedly turned away from Manichaeanism, embracing the scepticism of the New Academy movement. Because of his education, Augustine had great rhetorical prowess and was very knowledgeable about the philosophies behind many faiths. At Milan, his mother's religiosity, Augustine's own studies in Neoplatonism, and his friend Simplicianus all urged him towards Christianity. This was shortly after the Roman emperor Theodosius I declared Christianity to be the only legitimate religion for the Roman Empire on 27 February 380 by the Edict of Thessalonica and then issued a decree of death for all Manichaean monks in 382. Initially Augustine was not strongly influenced by Christianity and its ideologies, but after coming in contact with Ambrose of Milan, Augustine reevaluated himself and was forever changed.
Augustine arrived in Milan and visited Ambrose, having heard of his reputation as an orator. Like Augustine, Ambrose was a master of rhetoric, but older and more experienced. Soon, their relationship grew, as Augustine wrote, "And I began to love him, of course, not at the first as a teacher of the truth, for I had entirely despaired of finding that in thy Church—but as a friendly man." Augustine was very much influenced by Ambrose, even more than by his own mother and others he admired. In his "Confessions", Augustine states, "That man of God received me as a father would, and welcomed my coming as a good bishop should." Ambrose adopted Augustine as a spiritual son after the death of Augustine's father.
Augustine's mother had followed him to Milan and arranged a respectable marriage for him. Although Augustine acquiesced, he had to dismiss his concubine and grieved for having forsaken his lover. He wrote, "My mistress being torn from my side as an impediment to my marriage, my heart, which clave to her, was racked, and wounded, and bleeding." Augustine confessed he had not been a lover of wedlock so much as a slave of lust, so he procured another concubine since he had to wait two years until his fiancée came of age. However, his emotional wound was not healed. It was during this period that he uttered his famously insincere prayer, "Grant me chastity and continence, but not yet."
There is evidence Augustine may have considered this former relationship to be equivalent to marriage. In his "Confessions", he admitted the experience eventually produced a decreased sensitivity to pain. Augustine eventually broke off his engagement to his eleven-year-old fiancée, but never renewed his relationship with either of his concubines. Alypius of Thagaste steered Augustine away from marriage, saying they could not live a life together in the love of wisdom if he married. Augustine looked back years later on the life at Cassiciacum, a villa outside of Milan where he gathered with his followers, and described it as "Christianae vitae otium" – the leisure of Christian life.
Conversion to Christianity and priesthood.
In late August of 386, at the age of 31, having heard of Ponticianus's and his friends' first reading of the life of Anthony of the Desert, Augustine converted to Christianity. As Augustine later told it, his conversion was prompted by hearing a child's voice say "take up and read" ("tolle, lege"). Resorting to the "sortes biblicae", he opened a book of St. Paul's writings (codex apostoli, 8.12.29) at random and read Romans 13:13–14: "Not in rioting and drunkenness, not in chambering and wantonness, not in strife and envying, but put on the Lord Jesus Christ, and make no provision for the flesh to fulfill the lusts thereof."
He later wrote an account of his conversion in his "Confessions", which has since become a classic of Christian theology and a key text in the history of autobiography. This work is an outpouring of thanksgiving and penitence. Although it is written as an account of his life, the "Confessions" also talks about the nature of time, causality, free will, and other important philosophical topics. The following is taken from that work:
Ambrose baptized Augustine and his son Adeodatus in Milan on Easter Vigil, 24–25 April 387. A year later, in 388, Augustine completed his apology "On the Holiness of the Catholic Church". That year, also, Adeodatus and Augustine returned home to Africa. Augustine's mother Monica died at Ostia, Italy, as they prepared to embark for Africa. Upon their arrival, they began a life of aristocratic leisure at Augustine's family's property. Soon after, Adeodatus, too, died. Augustine then sold his patrimony and gave the money to the poor. He kept only the family house, which he converted into a monastic foundation for himself and a group of friends. While he was known for his contributions to Christian rhetoric, another major contribution was his preaching style.
After converting to Christianity, Augustine turned against his profession as a rhetoric professor in order to devote more time to preaching. In 391 Augustine was ordained a priest in Hippo Regius (now Annaba), in Algeria. He was especially interested in discovering how his previous rhetorical training in Italian schools would help the Christian Church achieve its objective of discovering and teaching the different scriptures in the Bible. He became a famous preacher (more than 350 preserved sermons are believed to be authentic), and was noted for combating the Manichaean religion, to which he had formerly adhered. He is thought to have preached between 6,000 and 10,000 sermons during his life; however, only around 500 are accessible today. When Augustine preached his sermons, they were recorded by stenographers. Some of his sermons would last over one hour, and he would preach multiple times throughout a given week. When talking to his audience, he would stand on an elevated platform, but would walk towards the audience during his sermons. When he was preaching, he used a variety of rhetorical devices, including analogies, word pictures, similes, metaphors, repetition, and antithesis, when trying to explain more about the Bible. He also used questions and rhymes when talking about the differences between people's life on Earth and in Heaven, as seen in one of his sermons preached in 412 AD. Augustine believed that a preacher's ultimate goal is to ensure the salvation of their audience.
In 395, he was made coadjutor Bishop of Hippo and became full Bishop shortly thereafter, hence the name "Augustine of Hippo"; and he gave his property to the church of Thagaste. He remained in that position until his death in 430. Bishops were the only individuals allowed to preach when he was alive and he scheduled time to preach after being ordained despite a busy schedule made up of preparing sermons and preaching at other churches besides his own. When serving as the Bishop of Hippo, his goal was to minister to individuals in his congregation and he would choose the passages that the church planned to read every week. As bishop, he believed that it was his job to interpret the work of the Bible. He wrote his autobiographical "Confessions" in 397–398. His work "The City of God" was written to console his fellow Christians shortly after the Visigoths had sacked Rome in 410. Augustine worked tirelessly to convince the people of Hippo to convert to Christianity. Though he had left his monastery, he continued to lead a monastic life in the episcopal residence.
Much of Augustine's later life was recorded by his friend Possidius, bishop of Calama (present-day Guelma, Algeria), in his "Sancti Augustini Vita". During this latter part of Augustine's life, he helped lead a large community of Christians against different political and religious factors which had major influence on his writings. Possidius admired Augustine as a man of powerful intellect and a stirring orator who took every opportunity to defend Christianity against its detractors. Possidius also described Augustine's personal traits in detail, drawing a portrait of a man who ate sparingly, worked tirelessly, despised gossip, shunned the temptations of the flesh, and exercised prudence in the financial stewardship of his see.
Death and sainthood.
Shortly before Augustine's death, the Vandals, a Germanic tribe that had converted to Arianism, invaded Roman Africa. The Vandals besieged Hippo in the spring of 430, when Augustine entered his final illness. According to Possidius, one of the few miracles attributed to Augustine, the healing of an ill man, took place during the siege. Augustine is said to have excommunicated himself upon the approach of his death, in an act of public penance and solidarity with sinners. Spending his final days in prayer and repentance, he requested that the penitential Psalms of David be hung on his walls so he could read them, which, according to Possidius's biography, led him to "[weep] freely and constantly". He directed that the library of the church in Hippo and all the books therein should be carefully preserved. He died on 28 August 430. Shortly after his death, the Vandals lifted the siege of Hippo, but they returned soon after and burned the city. They destroyed all but Augustine's cathedral and library, which they left untouched.
Augustine was canonized by popular acclaim, and later recognized as a Doctor of the Church in 1298 by Pope Boniface VIII. His feast day is 28 August, the day on which he died. He is considered the patron saint of brewers, printers, theologians, and a number of cities and dioceses. He is invoked against sore eyes.
Augustine is remembered in the Church of England's calendar of saints with a lesser festival on 28 August.
Relics.
According to Bede's "True Martyrology", Augustine's body was later translated or moved to Cagliari, Sardinia, by the Catholic bishops expelled from North Africa by Huneric. Around 720, his remains were transported again by Peter, bishop of Pavia and uncle of the Lombard king Liutprand, to the church of San Pietro in Ciel d'Oro in Pavia, in order to save them from frequent coastal raids by Saracens. In January 1327, Pope John XXII issued the papal bull "Veneranda Santorum Patrum", in which he appointed the Augustinians guardians of the tomb of Augustine (called "Arca"), which was remade in 1362 and elaborately carved with bas-reliefs of scenes from Augustine's life.
In October 1695, some workmen in the Church of San Pietro in Ciel d'Oro in Pavia discovered a marble box containing human bones (including part of a skull). A dispute arose between the Augustinian hermits (Order of Saint Augustine) and the regular canons (Canons Regular of Saint Augustine) as to whether these were the bones of Augustine. The hermits did not believe so; the canons affirmed they were. Eventually Pope Benedict XIII (1724–1730) directed the Bishop of Pavia, Monsignor Pertusati, to make a determination. The bishop declared that, in his opinion, the bones were those of Augustine.
The Augustinians were expelled from Pavia in 1700, taking refuge in Milan with the relics of Augustine, and the disassembled "Arca", which were removed to the cathedral there. San Pietro fell into disrepair, but was finally rebuilt in the 1870s, under the urging of Agostino Gaetano Riboldi, and reconsecrated in 1896 when the relics of Augustine and the shrine were once again reinstalled.
In 1842, a portion of Augustine's right arm (cubitus) was secured from Pavia and returned to Annaba. It now rests in the Saint Augustin Basilica within a glass tube inserted into the arm of a life-size marble statue of the saint.
Views and thought.
Augustine's large body of writings covered diverse fields including theology, philosophy and sociology. Along with John Chrysostom, Augustine was among the most prolific scholars of the early church by quantity.
Theology.
Christian anthropology.
Augustine was one of the first Christian ancient Latin authors with a very clear vision of theological anthropology. He saw the human being as a perfect unity of soul and body. In his late treatise "On the Care of the Dead" ("De cura pro mortuis gerenda", 420) he exhorted respect for the body on the grounds it belonged to the very nature of the human person. Augustine's favourite figure to describe "body-soul" unity is marriage: "caro tua, coniunx tua – your body is your wife".
Initially, the two elements were in perfect harmony. After the fall of humanity they are now experiencing dramatic combat between one another. They are two categorically different things. The body is a three-dimensional object composed of the four elements, whereas the soul has no spatial dimensions. Soul is a kind of substance, participating in reason, fit for ruling the body.
Augustine was not preoccupied, as Plato and Descartes were, in detailed efforts to explain the metaphysics of the soul-body union. It sufficed for him to admit they are metaphysically distinct: to be a human is to be a composite of soul and body, with the soul superior to the body. The latter statement is grounded in his hierarchical classification of things into those that merely exist, those that exist and live, and those that exist, live, and have intelligence or reason.
Like other Church Fathers such as Athenagoras, Tertullian, Clement of Alexandria and Basil of Caesarea, Augustine "vigorously condemned the practice of induced abortion", and although he disapproved of an abortion during any stage of pregnancy, he made a distinction between early and later abortions. He acknowledged the distinction between "formed" and "unformed" fetuses mentioned in the Septuagint translation of Exodus 21:22–23, which incorrectly translates the word "harm" (from the original Hebrew text) as "form" in the Koine Greek of the Septuagint. His view was based on the Aristotelian distinction "between the fetus before and after its supposed 'vivification'". Therefore, he did not classify as murder the abortion of an "unformed" fetus, since he thought it could not be known with certainty that the fetus had received a soul.
Augustine held that "the timing of the infusion of the soul was a mystery known to God alone". However, he considered procreation as "one of the goods of marriage; abortion figured as a means, along with drugs which cause sterility, of frustrating this good. It lay along a continuum which included infanticide as an instance of 'lustful cruelty' or 'cruel lust.' Augustine called the use of means to avoid the birth of a child an 'evil work:' a reference to either abortion or contraception or both."
Creation.
In "City of God", Augustine rejected both the contemporary ideas of ages (such as those of certain Greeks and Egyptians) that differed from the Church's sacred writings. In "The Literal Interpretation of Genesis", Augustine argued that God had created everything in the universe simultaneously and not over a period of six days. He argued the six-day structure of creation presented in the Book of Genesis represents a logical framework, rather than the passage of time in a physical way – it would bear a spiritual, rather than physical, meaning, which is no less literal. One reason for this interpretation is the passage in Sirach 18:1, "creavit omnia simul" ("He created all things at once"), which Augustine took as proof that the days of Genesis 1 had to be taken non-literalistically. As an additional support for describing the six days of creation as a heuristic device, Augustine thought the actual event of creation would be incomprehensible by humans and therefore needed to be translated.
Augustine also does not envision original sin as causing structural changes in the universe, and even suggests that the bodies of Adam and Eve were already created mortal before the Fall.
Ecclesiology.
Augustine developed his doctrine of the Church principally in reaction to the Donatist sect. He taught there is one Church, but within this Church there are two realities, namely, the visible aspect (the institutional hierarchy, the Catholic sacraments, and the laity) and the invisible (the souls of those in the Church, who are either dead, sinful members or elect predestined for Heaven). The former is the institutional body established by Christ on earth which proclaims salvation and administers the sacraments, while the latter is the invisible body of the elect, made up of genuine believers from all ages, and who are known only to God. The Church, which is visible and societal, will be made up of "wheat" and "tares", that is, good and wicked people (as per Mat. 13:30), until the end of time. This concept countered the Donatist claim that only those in a state of grace were the "true" or "pure" church on earth, and that priests and bishops who were not in a state of grace had no authority or ability to confect the sacraments.
Augustine's ecclesiology was more fully developed in "City of God". There he conceives of the church as a heavenly city or kingdom, ruled by love, which will ultimately triumph over all earthly empires which are self-indulgent and ruled by pride. Augustine followed Cyprian in teaching that bishops and priests of the Church are the successors of the Apostles, and their authority in the Church is God-given.
The concept of Church invisible was advocated by Augustine as part of his refutation of the Donatist sect, though he, as other Church Fathers before him, saw the invisible Church and visible Church as one and the same thing, unlike the later Protestant reformers who did not identify the Catholic Church as the true church. He was strongly influenced by the Platonist belief that true reality is invisible and that, if the visible reflects the invisible, it does so only partially and imperfectly (see Theory of Forms). Others question whether Augustine really held to some form of an "invisible true Church" concept.
Eschatology.
Augustine originally believed in premillennialism, namely that Christ would establish a literal 1,000-year kingdom prior to the general resurrection, but later rejected the belief, viewing it as carnal. During the medieval period, the Catholic Church built its system of eschatology on Augustinian amillennialism, where Christ rules the earth spiritually through his triumphant church.
During the Reformation, theologians such as John Calvin accepted amillennialism. Augustine taught that the eternal fate of the soul is determined at death, and that purgatorial fires of the intermediate state purify only those who died in communion with the Church. His teaching provided fuel for later theology.
Mariology.
Although Augustine did not develop an independent Mariology, his statements on Mary surpass in number and depth those of other early writers. Even before the Council of Ephesus, he defended the Ever-Virgin Mary as the Mother of God, believing her to be "full of grace" (following earlier Latin writers such as Jerome) on account of her sexual integrity and innocence. Likewise, he affirmed that the Virgin Mary "conceived as virgin, gave birth as virgin and stayed virgin forever".
Natural knowledge and biblical interpretation.
Augustine took the view that, if a literal interpretation contradicts science and humans' God-given reason, the biblical text should be interpreted metaphorically. While each passage of Scripture has a literal sense, this "literal sense" does not always mean the Scriptures are mere history; at times they are rather an extended metaphor.
Original sin.
Augustine taught that the sin of Adam and Eve was either an act of foolishness ("insipientia") followed by pride and disobedience to God or that pride came first. The first couple disobeyed God, who had told them not to eat of the Tree of the knowledge of good and evil (Gen 2:17). The tree was a symbol of the order of creation. Self-centeredness made Adam and Eve eat of it, thus failing to acknowledge and respect the world as it was created by God, with its hierarchy of beings and values.
They would not have fallen into pride and lack of wisdom if Satan had not sown into their senses "the root of evil" ("radix Mali"). Their nature was wounded by concupiscence or libido, which affected human intelligence and will, as well as affections and desires, including sexual desire. In terms of metaphysics, concupiscence is not a state of being but a bad quality, the privation of good or a wound.
Augustine's understanding of the consequences of original sin and the necessity of redeeming grace was developed in the struggle against Pelagius and his Pelagian disciples, Caelestius and Julian of Eclanum, who had been inspired by Rufinus of Syria, a disciple of Theodore of Mopsuestia. They refused to agree original sin wounded human will and mind, insisting human nature was given the power to act, to speak, and to think when God created it. Human nature cannot lose its moral capacity for doing good, but a person is free to act or not act in a righteous way. Pelagius gave an example of eyes: they have capacity for seeing, but a person can make either good or bad use of it.
Pelagians insisted human affections and desires were not touched by the fall either. Immorality, e.g. fornication, is exclusively a matter of will, i.e. a person does not use natural desires in a proper way. In opposition, Augustine pointed out the apparent disobedience of the flesh to the spirit, and explained it as one of the results of original sin, punishment of Adam and Eve's disobedience to God.
Augustine had served as a "Hearer" for the Manichaeans, who taught that the original sin was carnal knowledge, for about nine years. But his struggle to understand the cause of evil in the world started before that, at the age of nineteen. By "malum" (evil) he understood most of all concupiscence, which he interpreted as a vice dominating people and causing moral disorder in men and women. Agostino Trapè insists Augustine's personal experience cannot be credited for his doctrine about concupiscence. He considers Augustine's marital experience to be quite normal, and even exemplary, aside from the absence of Christian wedding rites. As J. Brachtendorf showed, Augustine used the Ciceronian Stoic concept of passions to interpret Paul's doctrine of universal sin and redemption.
The view that not only human soul but also senses were influenced by the fall of Adam and Eve was prevalent in Augustine's time among the Fathers of the Church. It is clear the reason for Augustine's distancing from the affairs of the flesh was different from that of Plotinus, a Neoplatonist who taught that only through disdain for fleshly desire could one reach the ultimate state of mankind. Augustine taught the redemption, i.e. transformation and purification, of the body in the resurrection.
Some authors perceive Augustine's doctrine as directed against human sexuality and attribute his insistence on continence and devotion to God as coming from Augustine's need to reject his own highly sensual nature as described in the "Confessions". Augustine taught that human sexuality has been wounded, together with the whole of human nature, and requires redemption of Christ. That healing is a process realized in conjugal acts. The virtue of continence is achieved thanks to the grace of the sacrament of Christian marriage, which becomes therefore a "remedium concupiscentiae" – remedy of concupiscence. The redemption of human sexuality will be, however, fully accomplished only in the resurrection of the body.
The sin of Adam is inherited by all human beings. Already in his pre-Pelagian writings, Augustine taught that Original Sin is transmitted to his descendants by concupiscence, which he regarded as the passion of both soul and body, making humanity a "massa damnata" (mass of perdition, condemned crowd) and much enfeebling, though not destroying, the freedom of the will. Although earlier Christian authors taught the elements of physical death, moral weakness, and a sin propensity within original sin, Augustine was the first to add the concept of inherited guilt ("reatus") from Adam whereby an infant was eternally damned at birth.
Although Augustine's anti-Pelagian defense of original sin was confirmed at numerous councils, i.e. Carthage (418), Ephesus (431), Orange (529), Trent (1546) and by popes, i.e. Pope Innocent I (401–417) and Pope Zosimus (417–418), his inherited guilt eternally damning infants was omitted by these councils and popes. Anselm of Canterbury established in his "Cur Deus Homo" the definition that was followed by the great 13th-century Schoolmen, namely that Original Sin is the "privation of the righteousness which every man ought to possess," thus separating it from "concupiscence," with which some of Augustine's disciples had identified it, as later did Luther and Calvin. In 1567, Pope Pius V condemned the identification of Original Sin with concupiscence.
Predestination.
Augustine taught that God orders all things while preserving human freedom. Prior to 396, he believed predestination was based on God's foreknowledge of whether individuals would believe in Christ, that God's grace was "a reward for human assent". Later, in response to Pelagius, Augustine said that the sin of pride consists in assuming "we are the ones who choose God or that God chooses us (in his foreknowledge) because of something worthy in us", and argued that God's grace causes the individual act of faith.
Scholars are divided over whether Augustine's teaching implies double predestination, or the belief God chooses some people for damnation as well as some for salvation. Catholic scholars tend to deny he held such a view while some Protestants and secular scholars have held that Augustine did believe in double predestination. About 412, Augustine became the first Christian to understand predestination as a divine unilateral pre-determination of individuals' eternal destinies independently of human choice, although his prior Manichaean sect did teach this concept. Some Protestant theologians, such as Justo L. González and Bengt Hägglund, interpret Augustine's teaching that grace is irresistible, results in conversion, and leads to perseverance.
In "On Rebuke and Grace" ("De correptione et gratia"), Augustine wrote: "And what is written, that He wills all men to be saved, while yet all men are not saved, may be understood in many ways, some of which I have mentioned in other writings of mine; but here I will say one thing: He wills all men to be saved, is so said that all the predestinated may be understood by it, because every kind of men is among them."
Speaking of the twins Jacob and Esau, Augustine wrote in his book "On the Gift of Perseverance", "[I]t ought to be a most certain fact that the former is of the predestinated, the latter is not."
Sacramental theology.
Also in reaction to the Donatists, Augustine developed a distinction between the "regularity" and "validity" of the sacraments. Regular sacraments are performed by clergy of the Catholic Church, while sacraments performed by schismatics are considered irregular. Nevertheless, the validity of the sacraments does not depend upon the holiness of the priests who perform them ("ex opere operato"); therefore, irregular sacraments are still accepted as valid provided they are done in the name of Christ and in the manner prescribed by the Church. On this point, Augustine departs from the earlier teaching of Cyprian, who taught that converts from schismatic movements must be re-baptised. Augustine taught that sacraments administered outside the Catholic Church, though true sacraments, avail nothing. However, he also stated that baptism, while it does not confer any grace when done outside the Church, does confer grace as soon as one is received into the Catholic Church.
Augustine is said to have held an understanding of the real presence of Christ in the Eucharist, saying that Christ's statement, "This is my body" referred to the bread he carried in his hands, and that Christians must have faith the bread and wine are in fact the body and blood of Christ, despite what they see with their eyes. For instance, he stated that "He [Jesus] walked here in the same flesh, and gave us the same flesh to be eaten unto salvation. But no one eats that flesh unless first he adores it; and thus it is discovered how such a footstool of the Lord's feet is adored; and not only do we not sin by adoring, we do sin by not adoring."
John Riggs argued that Augustine held that Christ is really present in the elements of the Eucharist, but not in a bodily manner, because his body remains in Heaven.
Augustine, in his work "On Christian Doctrine", referred to the Eucharist as a "figure" and a "sign".
Against the Pelagians, Augustine strongly stressed the importance of infant baptism. About the question whether baptism is an absolute necessity for salvation, however, Augustine appears to have refined his beliefs during his lifetime, causing some confusion among later theologians about his position. He said in one of his sermons that only the baptized are saved. This belief was shared by many early Christians. However, a passage from his "City of God", concerning the Apocalypse, may indicate Augustine did believe in an exception for children born to Christian parents.
Philosophy.
Astrology.
Augustine's contemporaries often believed astrology to be an exact and genuine science. Its practitioners were regarded as true men of learning and called "mathematici". Astrology played a prominent part in Manichaean doctrine, and Augustine himself was attracted by their books in his youth, being particularly fascinated by those who claimed to foretell the future. Later, as a bishop, he warned that one should avoid astrologers who combine science and horoscopes. (Augustine's term "mathematici", meaning "astrologers", is sometimes mistranslated as "mathematicians".) According to Augustine, they were not genuine students of Hipparchus or Eratosthenes but "common swindlers".
Epistemology.
Epistemological concerns shaped Augustine's intellectual development. His early dialogues "Contra academicos" (386) and "De Magistro" (389), both written shortly after his conversion to Christianity, reflect his engagement with sceptical arguments and show the development of his doctrine of divine illumination. The doctrine of illumination claims God plays an active and regular part in human perception and understanding by illuminating the mind so human beings can recognize intelligible realities God presents (as opposed to God designing the human mind to be reliable consistently, as in, for example, Descartes's idea of clear and distinct perceptions). According to Augustine, illumination is obtainable to all rational minds and is different from other forms of sense perception. It is meant to be an explanation of the conditions required for the mind to have a connection with intelligible entities.
Augustine also posed the problem of other minds throughout different works, most famously perhaps in "On the Trinity" (VIII.6.9), and developed what has come to be a standard solution: the argument from analogy to other minds. In contrast to Plato and other earlier philosophers, Augustine recognized the centrality of testimony to human knowledge, and argued that what others tell us can provide knowledge even if we do not have independent reasons to believe their testimonial reports.
Just war.
Augustine asserted Christians should be pacifists as a personal, philosophical stance. However, peacefulness in the face of a grave wrong that could only be stopped by violence would be a sin. Defence of one's self or others could be a necessity, especially when authorized by a legitimate authority. While not breaking down the conditions necessary for war to be just, Augustine nonetheless coined the phrase "just war" in his work "The City of God". In essence, the pursuit of peace must include the option of fighting for its long-term preservation. Such a war could not be pre-emptive, but defensive, to restore peace. Thomas Aquinas, centuries later, used the authority of Augustine's arguments in an attempt to define the conditions under which a war could be just.
Free will.
Included in Augustine's earlier theodicy is the claim God created humans and angels as rational beings possessing free will. Free will was not intended for sin, meaning it is not equally predisposed to both good and evil. A will defiled by sin is not considered as "free" as it once was because it is bound by material things, which could be lost or be difficult to part with, resulting in unhappiness. Sin impairs free will, while grace restores it. Only a will that was once free can be subjected to sin's corruption. After 412, Augustine changed his theology, teaching that humanity had no free will to believe in Christ but only a free will to sin: "I in fact strove on behalf of the free choice of the human 'will,' but God's grace conquered" ("Retract". 2.1).
The early Christians opposed the deterministic views (e.g., fate) of Stoics, Gnostics, and Manichaeans prevalent in the first four centuries. Christians championed the concept of a relational God who interacts with humans rather than a Stoic or Gnostic God who unilaterally foreordained every event (yet Stoics still claimed to teach free will). Patristics scholar Ken Wilson argues that every early Christian author with extant writings who wrote on the topic prior to Augustine of Hippo (412) advanced human free choice rather than a deterministic God. According to Wilson, Augustine taught traditional free choice until 412, when he reverted to his earlier Manichaean and Stoic deterministic training when battling the Pelagians. Only a few Christians accepted Augustine's view of free will until the Protestant Reformation when both Luther and Calvin embraced Augustine's deterministic teachings wholeheartedly.
The Catholic Church considers Augustine's teaching to be consistent with free will. He often said that anyone can be saved if they wish. While God knows who will and will not be saved, with no possibility for the latter to be saved in their lives, this knowledge represents God's perfect knowledge of how humans will freely choose their destinies.
Sociology, morals and ethics.
Natural law.
Augustine was among the earliest to examine the legitimacy of the laws of man, and attempt to define the boundaries of what laws and rights occur naturally, instead of being arbitrarily imposed by mortals. All who have wisdom and conscience, he concludes, are able to use reason to recognize the "lex naturalis", natural law. Mortal law should not attempt to force people to do what is right or avoid what is wrong, but simply to remain just. Therefore "an unjust law is no law at all". People are not obligated to obey laws that are unjust, those that their conscience and reason tell them violate natural law and rights.
Slavery.
Augustine led many clergy under his authority at Hippo to free their slaves as a "pious and holy" act. He boldly wrote a letter urging the emperor to set up a new law against slave traders and was very much concerned about the sale of children. For 25 years, the Christian emperors of his time had permitted the sale of children, not because they approved of the practice, but as a way of preventing infanticide when parents were unable to care for a child. Augustine noted that tenant farmers in particular were driven to hire out or to sell their children as a means of survival.
In his book, "The City of God", he presents the development of slavery as a product of sin and as contrary to God's divine plan. He wrote that God "did not intend that this rational creature, who was made in his image, should have dominion over anything but the irrational creation – not man over man, but man over the beasts". Thus he wrote that righteous men in primitive times were made shepherds of cattle, not kings over men. "The condition of slavery is the result of sin", he declared. In "The City of God", Augustine wrote he felt the existence of slavery was a punishment for the existence of sin, even if an individual enslaved person committed no sin meriting punishment. He wrote: "Slavery is, however, penal, and is appointed by that law which enjoins the preservation of the natural order and forbids its disturbance." Augustine believed slavery did more harm to the slave owner than to the enslaved person himself: "the lowly position does as much good to the servant as the proud position does harm to the master." Augustine proposes as a solution to sin a type of cognitive reimagining of one's situation, where slaves "may themselves make their slavery in some sort free, by serving not in crafty fear, but in faithful love," until the end of the world eradicates slavery for good: "until all unrighteousness pass away, and all principality and every human power be brought to nothing, and God be all in all."
Jews.
Against certain Christian movements, some of which rejected the use of Hebrew Scripture, Augustine countered that God had chosen the Jews as a special people, and he considered the scattering of Jewish people by the Roman Empire to be a fulfilment of prophecy. He rejected homicidal attitudes, quoting part of the same prophecy, namely "Slay them not, lest they should at last forget Thy law" (Psalm 59:11). Augustine, who believed Jewish people would be converted to Christianity at "the end of time", argued God had allowed them to survive their dispersion as a warning to Christians; as such, he argued, they should be permitted to dwell in Christian lands. The sentiment sometimes attributed to Augustine that Christians should let the Jews "survive but not thrive" (it is repeated by author James Carroll in his book "Constantine's Sword", for example) is apocryphal and is not found in any of his writings.
Sexuality.
For Augustine, the evil of sexual immorality was not in the sexual act itself, but in the emotions that typically accompany it. In "On Christian Doctrine" Augustine contrasts love, which is enjoyment on account of God, and lust, which is not on account of God. Augustine claims that, following the Fall, sexual lust ("concupiscentia") has become necessary for copulation (as required to stimulate male erection), sexual lust is an evil result of the Fall, and therefore, evil must inevitably accompany sexual intercourse ("On marriage and concupiscence" 1.19). Therefore, following the Fall, even marital sex carried out merely to procreate inevitably perpetuates evil ("On marriage and concupiscence" 1.27; "A Treatise against Two Letters of the Pelagians" 2.27). For Augustine, proper love exercises a denial of selfish pleasure and the subjugation of corporeal desire to God. The only way to avoid evil caused by sexual intercourse is to take the "better" way ("Confessions" 8.2) and abstain from marriage ("On marriage and concupiscence" 1.31). Sex within marriage is not, however, for Augustine a sin, although necessarily producing the evil of sexual lust. Based on the same logic, Augustine also declared the pious virgins raped during the sack of Rome to be innocent because they did not intend to sin nor enjoy the act.
Before the Fall, Augustine believed sex was a passionless affair, "just like many a laborious work accomplished by the compliant operation of our other limbs, without any lascivious heat", that the seed "might be sown without any shameful lust, the genital members simply obeying the inclination of the will". After the Fall, by contrast, the penis cannot be controlled by mere will, subject instead to both unwanted impotence and involuntary erections: "Sometimes the urge arises unwanted; sometimes, on the other hand, it forsakes the eager lover, and desire grows cold in the body while burning in the mind... It arouses the mind, but it does not follow through what it has begun and arouse the body also" ("City of God" 14.16).
Augustine censured those who try to prevent the creation of offspring when engaging in sexual relations, saying that though they may be nominally married they are not really, but are using that designation as a cloak for turpitude. When they allow their unwanted children to die of exposure, they unmask their sin. Sometimes they use drugs to produce sterility, or other means to try to destroy the fetus before it is born. Their marriage is not wedlock but debauchery.
Augustine believed Adam and Eve had both already chosen in their hearts to disobey God's command not to eat of the Tree of Knowledge before Eve took the fruit, ate it, and gave it to Adam. Accordingly, Augustine did not believe Adam was any less guilty of sin. Augustine praises women and their role in society and in the Church. In his "Tractates on the Gospel of John", Augustine, commenting on the Samaritan woman from John 4:1–42, uses the woman as a figure of the Church in agreement with the New Testament teaching that the Church is the bride of Christ. "Husbands, love your wives, as Christ loved the church and gave himself up for her."
Pedagogy.
Augustine is considered an influential figure in the history of education. An early work of his, "De Magistro" (On the Teacher), contains insights into education. His ideas changed as he found better directions or better ways of expressing his ideas. In the last years of his life, Augustine wrote his "Retractationes" ("Retractations"), reviewing his writings and improving specific texts. Henry Chadwick believes an accurate translation of "retractationes" may be "reconsiderations". Reconsiderations can be seen as an overarching theme of the way Augustine learned. Augustine's understanding of the search for understanding, meaning, and truth as a restless journey leaves room for doubt, development, and change.
Augustine was a strong advocate of critical thinking skills. Because written works were limited during this time, spoken communication of knowledge was very important. His emphasis on the importance of community as a means of learning distinguishes his pedagogy from some others. Augustine believed dialectic is the best means for learning and that this method should serve as a model for learning encounters between teachers and students. Augustine's dialogue writings model the need for lively interactive dialogue among learners.
He recommended adapting educational practices to fit the students' educational backgrounds:
If a student has been well educated in a wide variety of subjects, the teacher must be careful not to repeat what they have already learned, but to challenge the student with material they do not yet know thoroughly. With the student who has had no education, the teacher must be patient, willing to repeat things until the student understands, and sympathetic. Perhaps the most difficult student, however, is the one with an inferior education who believes he understands something when he does not. Augustine stressed the importance of showing this type of student the difference between "having words and having understanding" and of helping the student to remain humble with his acquisition of knowledge.
Under the influence of Bede, Alcuin, and Rabanus Maurus, "De catechizandis rudibus" came to exercise an important role in the education of clergy at the monastic schools, especially from the eighth century onwards.
Augustine believed students should be given an opportunity to apply learned theories to practical experience. Yet another of Augustine's major contributions to education is his study on the styles of teaching. He claimed there are two basic styles a teacher uses when speaking to the students. The "mixed style" includes complex and sometimes showy language to help students see the beautiful artistry of the subject they are studying. The "grand style" is not quite as elegant as the mixed style, but is exciting and heartfelt, with the purpose of igniting the same passion in the students' hearts. Augustine balanced his teaching philosophy with the traditional Bible-based practice of strict discipline.
Augustine knew Latin and Ancient Greek. He had a long correspondence with St Jerome regarding the textual differences between the Hebrew Bible and the Greek Septuagint, concluding that the original Greek manuscripts were closely similar to the Hebrew ones, and that even the differences between the two versions of Holy Scripture could illuminate its spiritual meaning, the two having been unitarily inspired by God.
Coercion.
Augustine of Hippo had to deal with issues of violence and coercion throughout his entire career, due largely to the Donatist-Catholic conflict. He is one of the very few authors in Antiquity to have examined the ideas of religious freedom and coercion theoretically. Augustine handled the infliction of punishment and the exercise of power over law-breakers by analyzing these issues in ways similar to modern debates on penal reform.
His teaching on coercion has "embarrassed his modern defenders and vexed his modern detractors," because it is seen as making him appear "to generations of religious liberals" as "le prince et patriarche de persecuteurs" ("the prince and patriarch of persecutors"). Yet Brown asserts that, at the same time, Augustine becomes "an eloquent advocate of the ideal of corrective punishment" and reformation of the wrongdoer. Russell says Augustine's theory of coercion "was not crafted from dogma, but in response to a unique historical situation" and is, therefore, context-dependent, while others see it as inconsistent with his other teachings.
The context.
During the Great Persecution, "When Roman soldiers came calling, some of the [Catholic] officials handed over the sacred books, vessels, and other church goods rather than risk legal penalties" over a few objects. Maureen Tilley says this was a problem by 305, that became a schism by 311, because many of the North African Christians had a long established tradition of a "physicalist approach to religion." The sacred scriptures were not simply books to them, but were the Word of God in physical form, therefore they saw handing over the Bible, and handing over a person to be martyred, as "two sides of the same coin." Those who cooperated with the authorities became known as "traditores." The term originally meant "one who hands over a physical object", but it came to mean "traitor".
According to Tilley, after the persecution ended, those who had apostatized wanted to return to their positions in the church. The North African Christians (the rigorists who became known as Donatists) refused to accept them. Catholics were more tolerant and wanted to wipe the slate clean. For the next 75 years, both parties existed, often directly alongside each other, with a double line of bishops for the same cities. Competition for the loyalty of the people included multiple new churches and violence. No one is exactly sure when the Circumcellions and the Donatists allied, but for decades, they fomented protests and street violence, accosted travellers and attacked random Catholics without warning, often doing serious and unprovoked bodily harm such as beating people with clubs, cutting off their hands and feet, and gouging out eyes.
Augustine became coadjutor Bishop of Hippo in 395, and since he believed that conversion must be voluntary, his appeals to the Donatists were verbal. For several years, he used popular propaganda, debate, personal appeal, General Councils, appeals to the emperor and political pressure to bring the Donatists back into union with the Catholics, but all attempts failed. The harsh realities Augustine faced can be found in his Letter 28 written to bishop Novatus around 416. Donatists had attacked, cut out the tongue and cut off the hands of a Bishop Rogatus who had recently converted to Catholicism. An unnamed count of Africa had sent his agent with Rogatus, and he too had been attacked; the count was "inclined to pursue the matter." Russell says Augustine demonstrates a "hands on" involvement with the details of his bishopric, but at one point in the letter, he confesses he does not know what to do. "All the issues that plague him are there: stubborn Donatists, Circumcellion violence, the vacillating role of secular officials, the imperative to persuade, and his own trepidations." The empire responded to the civil unrest with the law and its enforcement, and thereafter, Augustine changed his mind about using verbal arguments alone. Instead, he came to support the state's use of coercion. Augustine did not believe the empire's enforcement would "make the Donatists more virtuous" but he did believe it would make them "less vicious."
The theology.
The primary 'proof text' of what Augustine thought concerning coercion is from Letter 93, written in 408, as a reply to bishop Vincentius, of Cartenna (Mauretania, North Africa). This letter shows that both practical and biblical reasons led Augustine to defend the legitimacy of coercion. He confesses that he changed his mind because of "the ineffectiveness of dialogue and the proven efficacy of laws." He had been worried about false conversions if force was used, but "now," he says, "it seems imperial persecution is working." Many Donatists had converted. "Fear had made them reflect, and made them docile." Augustine continued to assert that coercion could not directly convert someone, but concluded it could make a person ready to be reasoned with.
According to Mar Marcos, Augustine made use of several biblical examples to legitimize coercion, but the primary analogy in Letter 93 and in Letter 185, is the parable of the Great Feast in Luke 14.15–24 and its statement "compel them to come in." Russell says, Augustine uses the Latin term "cogo", instead of the "compello" of the Vulgate, since to Augustine, "cogo" meant to "gather together" or "collect" and was not simply "compel by physical force."
In 1970, Robert Markus argued that, for Augustine, a degree of external pressure being brought for the purpose of reform was compatible with the exercise of free will. Russell asserts that "Confessions 13" is crucial to understanding Augustine's thought on coercion; using Peter Brown's explanation of Augustine's view of salvation, he explains that Augustine's past, his own sufferings and "conversion through God's pressures," along with his biblical hermeneutics, is what led him to see the value in suffering for discerning truth. According to Russell, Augustine saw coercion as one among many conversion strategies for forming "a pathway to the inner person."
In Augustine's view, there is such a thing as just and unjust persecution. Augustine explains that when the purpose of persecution is to lovingly correct and instruct, then it becomes discipline and is just. He said the church would discipline its people out of a loving desire to heal them, and that, "once compelled to come in, heretics would gradually give their voluntary assent to the truth of Christian orthodoxy." Frederick H. Russell describes this as "a pastoral strategy in which the church did the persecuting with the dutiful assistance of Roman authorities," adding that it is "a precariously balanced blend of external discipline and inward nurturance."
Augustine placed limits on the use of coercion, recommending fines, imprisonment, banishment, and moderate floggings, preferring beatings with rods, which were a common practice in the ecclesial courts. He opposed severity, maiming, and the execution of heretics. While these limits were mostly ignored by Roman authorities, Michael Lamb says that in doing this, "Augustine appropriates republican principles from his Roman predecessors..." and maintains his commitment to liberty, legitimate authority, and the rule of law as a constraint on arbitrary power. He continues to advocate holding authority accountable to prevent domination but affirms the state's right to act.
Herbert A. Deane, on the other hand, says there is a fundamental inconsistency between Augustine's political thought and "his final position of approval of the use of political and legal weapons to punish religious dissidence" and others have seconded this view. Brown asserts that Augustine's thinking on coercion is more of an attitude than a doctrine, since it is "not in a state of rest," but is instead marked by "a painful and protracted attempt to embrace and resolve tensions."
According to Russell, it is possible to see how Augustine himself had evolved from his earlier "Confessions" to this teaching on coercion and the latter's strong patriarchal nature: "Intellectually, the burden has shifted imperceptibly from discovering the truth to disseminating the truth." The bishops had become the church's elite with their own rationale for acting as "stewards of the truth." Russell points out that Augustine's views are limited to time and place and his own community, but later, others took what he said and applied it outside those parameters in ways Augustine never imagined or intended.
Works.
Augustine was one of the most prolific Latin authors in terms of surviving works, and the list of his works consists of more than one hundred separate titles. They include apologetic works against the heresies of the Arians, Donatists, Manichaeans and Pelagians; texts on Christian doctrine, notably "De Doctrina Christiana" ("On Christian Doctrine"); exegetical works such as commentaries on Genesis, the Psalms and Paul's Letter to the Romans; many sermons and letters; and the "Retractationes", a review of his earlier works which he wrote near the end of his life.
Apart from those, Augustine is probably best known for his "Confessions", which is a personal account of his earlier life, and for "De civitate Dei" ("The City of God", consisting of 22 books), which he wrote to restore the confidence of his fellow Christians, which was badly shaken by the sack of Rome by the Visigoths in 410. His "On the Trinity", in which he developed what has become known as the 'psychological analogy' of the Trinity, is also considered to be among his masterpieces, and arguably of more doctrinal importance than the "Confessions" or the "City of God". He also wrote "On Free Choice of the Will" ("De libero arbitrio"), addressing why God gives humans free will that can be used for evil.
Legacy.
In both his philosophical and theological reasoning, Augustine was greatly influenced by Stoicism, Platonism and Neoplatonism, particularly by the work of Plotinus, author of the "Enneads", probably through the mediation of Porphyry and Victorinus (as Pierre Hadot has argued). Some Neoplatonic concepts are still visible in Augustine's early writings. His early and influential writing on the human will, a central topic in ethics, would become a focus for later philosophers such as Schopenhauer, Kierkegaard, and Nietzsche. He was also influenced by the works of Virgil (known for his teaching on language), and Cicero (known for his teaching on argument).
In philosophy.
Philosopher Bertrand Russell was impressed by Augustine's meditation on the nature of time in the "Confessions", comparing it favourably to Kant's version of the view that time is subjective. Catholic theologians generally subscribe to Augustine's belief that God exists outside of time in the "eternal present"; that time only exists within the created universe because only in space is time discernible through motion and change. His meditations on the nature of time are closely linked to his consideration of the human ability of memory. Frances Yates, in her 1966 study "The Art of Memory", argues that a brief passage of the "Confessions" (10.8.12), in which Augustine writes of walking up a flight of stairs and entering the vast fields of memory, clearly indicates that the ancient Romans were aware of how to use explicit spatial and architectural metaphors as a mnemonic technique for organizing large amounts of information.
Augustine's philosophical method, especially demonstrated in his "Confessions", had a continuing influence on Continental philosophy throughout the 20th century. His descriptive approach to intentionality, memory, and language as these phenomena are experienced within consciousness and time anticipated and inspired the insights of modern phenomenology and hermeneutics. Edmund Husserl writes: "The analysis of time-consciousness is an age-old crux of descriptive psychology and theory of knowledge. The first thinker to be deeply sensitive to the immense difficulties to be found here was Augustine, who laboured almost to despair over this problem."
Martin Heidegger refers to Augustine's descriptive philosophy at several junctures in his influential work "Being and Time". Hannah Arendt began her philosophical writing with a dissertation on Augustine's concept of love, "Der Liebesbegriff bei Augustin" (1929): "The young Arendt attempted to show that the philosophical basis for "vita socialis" in Augustine can be understood as residing in neighbourly love, grounded in his understanding of the common origin of humanity."
Jean Bethke Elshtain in "Augustine and the Limits of Politics" tried to associate Augustine with Arendt in their concept of evil: "Augustine did not see evil as glamorously demonic but rather as absence of good, something which paradoxically is really nothing. Arendt ... envisioned even the extreme evil which produced the Holocaust as merely banal [in "Eichmann in Jerusalem"]." Augustine's philosophical legacy continues to influence contemporary critical theory through the contributions and inheritors of these 20th-century figures. Historically, three main perspectives on the political thought of Augustine have emerged: first, political Augustinianism; second, Augustinian political theology; and third, Augustinian political theory.
In theology.
Thomas Aquinas was influenced heavily by Augustine. On the topic of original sin, Aquinas proposed a more optimistic view of man than that of Augustine, in that his conception leaves to the reason, will, and passions of fallen man their natural powers even after the Fall, without "supernatural gifts". While in his pre-Pelagian writings Augustine taught that Adam's guilt, as transmitted to his descendants, much enfeebles, though does not destroy, the freedom of their will, the Protestant reformers Martin Luther and John Calvin affirmed that Original Sin completely destroyed liberty (see total depravity).
According to Leo Ruickbie, Augustine's arguments against magic, differentiating it from a miracle, were crucial in the early Church's fight against paganism and became a central thesis in the later denunciation of witches and witchcraft. According to Professor Deepak Lal, Augustine's vision of the heavenly city has influenced the secular projects and traditions of the Enlightenment, Marxism, Freudianism and eco-fundamentalism. Post-Marxist philosophers Antonio Negri and Michael Hardt rely heavily on Augustine's thoughts, particularly "The City of God", in their book of political philosophy "Empire".
Augustine has influenced many modern-day theologians and authors such as John Piper. Hannah Arendt, an influential 20th-century political theorist, wrote her doctoral dissertation in philosophy on Augustine, and continued to rely on his thought throughout her career. Ludwig Wittgenstein extensively quotes Augustine in "Philosophical Investigations" for his approach to language, both admiringly, and as a sparring partner to develop his own ideas, including an extensive opening passage from the "Confessions". Contemporary linguists have argued that Augustine has significantly influenced the thought of Ferdinand de Saussure, who did not 'invent' the modern discipline of semiotics, but rather built upon Aristotelian and Neoplatonic knowledge from the Middle Ages, via an Augustinian connection: "as for the constitution of Saussurian semiotic theory, the importance of the Augustinian thought contribution (correlated to the Stoic one) has also been recognized. Saussure did not do anything but reform an ancient theory in Europe, according to the modern conceptual exigencies."
In his autobiographical book "Milestones", Pope Benedict XVI claims Augustine as one of the deepest influences in his thought.
Oratorio, music.
Marc-Antoine Charpentier, Motet "Pour St Augustin mourant", H.419, for 2 voices and continuo (1687), and "Pour St Augustin", H.307, for 2 voices and continuo (1670s).
Much of Augustine's conversion is dramatized in the oratorio "La conversione di Sant'Agostino" (1750) composed by Johann Adolph Hasse. The libretto for this oratorio, written by Duchess Maria Antonia of Bavaria, draws upon the influence of Metastasio (the finished libretto having been edited by him) and is based on an earlier five-act play, "Idea perfectae conversionis dive Augustinus", written by the Jesuit priest Franz Neumayr. In the libretto, Augustine's mother Monica is presented as a prominent character who is worried that Augustine might not convert to Christianity. As Dr. Andrea Palent observes, throughout the oratorio Augustine shows his willingness to turn to God, but the burden of the act of conversion weighs heavily on him. This is displayed by Hasse through extended recitative passages.
In popular art.
In his poem "Confessional", Frank Bidart compares the relationship between Augustine and his mother, Saint Monica, to the relationship between the poem's speaker and his mother.
In the 2010 TV miniseries "", Augustine is played by Matteo Urzia (aged 15), Alessandro Preziosi (aged 25) and Franco Nero (aged 76).
English pop/rock musician, singer and songwriter Sting wrote a song related to Saint Augustine entitled "Saint Augustine in Hell" which was part of his fourth solo studio album "Ten Summoner's Tales" released in March 1993.
Acting.

Acting is an activity in which a story is told by means of its enactment by an actor or actress who adopts a character—in theatre, television, film, radio, or any other medium that makes use of the mimetic mode.
Acting involves a broad range of skills, including a well-developed imagination, emotional facility, physical expressivity, vocal projection, clarity of speech, and the ability to interpret drama. Acting also demands an ability to employ dialects, accents, improvisation, observation and emulation, mime, and stage combat. Many actors train at length in specialist programs or colleges to develop these skills. The vast majority of professional actors have undergone extensive training. Actors and actresses will often have many instructors and teachers for a full range of training involving singing, scene-work, audition techniques, and acting for camera.
Most early sources in the West that examine the art of acting ("hypokrisis") discuss it as part of rhetoric.
History.
One of the first known actors was an ancient Greek called Thespis of Icaria in Athens. Writing two centuries after the event, Aristotle in his "Poetics" suggests that Thespis stepped out of the dithyrambic chorus and addressed it as a separate character. Before Thespis, the chorus narrated (for example, "Dionysus did this, Dionysus said that"). When Thespis stepped out from the chorus, he spoke as if he were the character (for example, "I am Dionysus, I did this"). To distinguish between these different modes of storytelling—enactment and narration—Aristotle uses the terms "mimesis" (via enactment) and "diegesis" (via narration). From Thespis' name derives the word "thespian".
Training.
Conservatories and drama schools typically offer two- to four-year training on all aspects of acting. Universities mostly offer three- to four-year programs, in which a student is often able to choose to focus on acting, whilst continuing to learn about other aspects of theatre. Schools vary in their approach, but in North America the most popular method taught derives from the 'system' of Konstantin Stanislavski, which was developed and popularised in America as method acting by Lee Strasberg, Stella Adler, Sanford Meisner, and others.
Other approaches may include a more physically based orientation, such as that promoted by theatre practitioners as diverse as Anne Bogart, Jacques Lecoq, Jerzy Grotowski, or Vsevolod Meyerhold. Classes may also include psychotechnique, mask work, physical theatre, improvisation, and acting for camera.
Regardless of a school's approach, students should expect intensive training in textual interpretation, voice, and movement. Applications to drama programmes and conservatories usually involve extensive auditions. Anybody over the age of 18 can usually apply. Training may also start at a very young age. Acting classes and professional schools targeted at under-18s are widespread. These classes introduce young actors to different aspects of acting and theatre, including scene study.
Increased training and exposure to public speaking allow people to remain calmer and more physiologically relaxed. Measuring a public speaker's heart rate may be one of the easiest ways to judge shifts in stress, since heart rate increases with anxiety. As actors perform more often, their heart rate and other evidence of stress can decrease. This is important in training for actors, as adaptive strategies gained from increased exposure to public speaking can regulate implicit and explicit anxiety. By attending an institution with a specialization in acting, an actor gains increased opportunities to perform, leading to more relaxed physiology and a decrease in stress and its effects on the body. These effects can vary from hormonal to cognitive health and can impact quality of life and performance.
Improvisation.
Some classical forms of acting involve a substantial element of improvised performance. Most notable is its use by the troupes of the "commedia dell'arte", a form of masked comedy that originated in Italy.
Improvisation as an approach to acting formed an important part of the Russian theatre practitioner Konstantin Stanislavski's 'system' of actor training, which he developed from the 1910s onwards. Late in 1910, the playwright Maxim Gorky invited Stanislavski to join him in Capri, where they discussed training and Stanislavski's emerging "grammar" of acting. Inspired by a popular theatre performance in Naples that utilised the techniques of the "commedia dell'arte", Gorky suggested that they form a company, modelled on the medieval strolling players, in which a playwright and group of young actors would devise new plays together by means of improvisation. Stanislavski would develop this use of improvisation in his work with his First Studio of the Moscow Art Theatre. Stanislavski's use was extended further in the approaches to acting developed by his students, Michael Chekhov and Maria Knebel.
In the United Kingdom, the use of improvisation was pioneered by Joan Littlewood from the 1930s onwards and, later, by Keith Johnstone and Clive Barker. In the United States, it was promoted by Viola Spolin, after working with Neva Boyd at Hull House in Chicago, Illinois (Spolin was Boyd's student from 1924 to 1927). Like the British practitioners, Spolin felt that playing games was a useful means of training actors and helped to improve an actor's performance. With improvisation, she argued, people may find expressive freedom, since they do not know how an improvised situation will turn out. Improvisation demands an open mind in order to maintain spontaneity, rather than pre-planning a response. A character is created by the actor, often without reference to a dramatic text, and a drama is developed out of the spontaneous interactions with other actors. This approach to creating new drama has been developed most substantially by the British filmmaker Mike Leigh, in films such as "Secrets & Lies" (1996), "Vera Drake" (2004), "Another Year" (2010), and "Mr. Turner" (2014).
Improvisation is also used to cover up an actor's mistakes during a performance.
Physiological effects.
Acting in front of an audience many times can cause "stage fright", a form of stress in which someone becomes anxious in front of an audience. This is common among actors, especially new actors, and can cause symptoms such as increased heart rate, increased blood pressure, and sweating.
In a 2017 study of American university students, actors of various experience levels all showed similarly elevated heart rates throughout their performances, which agrees with previous studies of professional and amateur actors' heart rates. While all of the actors experienced stress-related elevation of heart rate, the more experienced actors displayed less heart rate variability than the less experienced actors in the same play, suggesting they experienced less stress while performing. In short, the more experienced an actor is, the more stable their heart rate will be while performing, though it remains elevated.
Semiotics.
The semiotics of acting involves a study of the ways in which aspects of a performance come to operate for its audience as signs. This process largely involves the production of meaning, whereby elements of an actor's performance acquire significance, both within the broader context of the dramatic action and in the relations each establishes with the real world.
Following the ideas proposed by the Surrealist theorist Antonin Artaud, however, it may also be possible to understand communication with an audience that occurs 'beneath' significance and meaning (which the semiotician Félix Guattari described as a process involving the transmission of "a-signifying signs"). In his "The Theatre and its Double" (1938), Artaud compared this interaction to the way in which a snake charmer communicates with a snake, a process which he identified as "mimesis"—the same term that Aristotle in his "Poetics" used to describe the mode in which drama communicates its story, by virtue of its embodiment by the actor enacting it, as distinct from "diegesis", or the way in which a narrator may describe it. These "vibrations" passing from the actor to the audience may not necessarily precipitate into significant elements as such (that is, consciously perceived "meanings"), but rather may operate by means of the circulation of "affects".
The approaches to acting adopted by other theatre practitioners involve varying degrees of concern with the semiotics of acting. Konstantin Stanislavski, for example, addresses the ways in which an actor, building on what he calls the "experiencing" of a role, should also shape and adjust a performance in order to support the overall significance of the drama—a process that he calls establishing the "perspective of the role". The semiotics of acting plays a far more central role in Bertolt Brecht's epic theatre, in which an actor is concerned to bring out clearly the socio-historical significance of behaviour and action by means of specific performance choices—a process that he describes as establishing the "not/but" element in a performed physical "gestus" within the context of the play's overall "Fabel". Eugenio Barba argues that actors ought not to concern themselves with the significance of their performance behaviour; this aspect is the responsibility, he claims, of the director, who weaves the signifying elements of an actor's performance into the director's dramaturgical "montage".
The theatre semiotician Patrice Pavis, alluding to the contrast between Stanislavski's 'system' and Brecht's demonstrating performer—and, beyond that, to Denis Diderot's foundational essay on the art of acting, "Paradox of the Actor" (–78)—argues that:
Elements of a semiotics of acting include the actor's gestures, facial expressions, intonation and other vocal qualities, rhythm, and the ways in which these aspects of an individual performance relate to the drama and the theatrical event (or film, television programme, or radio broadcast, each of which involves different semiotic systems) considered as a whole. A semiotics of acting recognises that all forms of acting involve conventions and codes by means of which performance behaviour acquires significance—including those approaches, such as Stanislavski's or the closely related method acting developed in the United States, that offer themselves as "a natural kind of acting that can do without conventions and be received as self-evident and universal." Pavis goes on to argue that:
The conventions that govern acting in general are related to structured forms of play, which involve, in each specific experience, "rules of the game." This aspect was first explored by Johan Huizinga (in "Homo Ludens", 1938) and Roger Caillois (in "Man, Play and Games", 1958). Caillois, for example, distinguishes four aspects of play relevant to acting: "mimesis" (simulation), "agon" (conflict or competition), "alea" (chance), and "ilinx" (vertigo, or "vertiginous psychological situations" involving the spectator's identification or catharsis). This connection with play as an activity was first proposed by Aristotle in his "Poetics", in which he defines the desire to imitate in play as an essential part of being human and our first means of learning as children:
This connection with play also informed the words used in English (as was the analogous case in many other European languages) for drama: the word "play" or "game" (translating the Anglo-Saxon "plèga" or Latin "ludus") was the standard term used until William Shakespeare's time for a dramatic entertainment—just as its creator was a "play-maker" rather than a "dramatist", the person acting was known as a "player", and, when in the Elizabethan era specific buildings for acting were built, they were known as "play-houses" rather than "theatres."
Resumes and auditions.
Actors and actresses need to make a resume when applying for roles. The acting resume is very different from a normal resume; it is generally shorter, with lists instead of paragraphs, and it should have a head shot on the back. Sometimes a resume is accompanied by a short 30-second to one-minute reel displaying the actor's abilities, so that the casting director can see any previous performances. An actor's resume should list the projects they have acted in, such as plays, movies, or shows, as well as special skills and their contact information.
Auditioning is the act of performing either a monologue or "sides" (lines for one character) as sent by the casting director. An audition lets actors show their skill at presenting themselves as a different person; it may be as brief as two minutes. Theater auditions can run longer, or require more than one monologue, as each casting director can have different requirements for actors. Actors should go to auditions dressed for the part, to make it easier for the casting director to visualize them as the character. For television or film, actors usually have to undergo more than one audition. Oftentimes actors are called into an audition at the last minute and are sent the sides either that morning or the night before. Auditioning can be a stressful part of acting, especially for those who have not been trained to audition.
Rehearsal.
Rehearsal is a process in which actors prepare and practice a performance, exploring the vicissitudes of conflict between characters, testing specific actions in the scene, and finding means to convey a particular sense. Some actors continue to rehearse a scene throughout the run of a show in order to keep the scene fresh in their minds and exciting for the audience.
Audience.
A critical audience with evaluative spectators is known to induce stress in actors during performance (see Bode & Brutten). Being in front of an audience sharing a story makes actors intensely vulnerable. Surprisingly, an actor will typically rate the quality of their performance higher than their spectators do. Heart rates are generally higher during a performance with an audience than during rehearsal, yet an audience also seems to elicit a higher quality of performance. Simply put, while public performances cause very high stress levels in actors (more so in amateurs), the stress actually improves the performance, supporting the idea of "positive stress in challenging situations".
Heart rate.
An actor's heart rate varies depending on what they are doing; this is the body's way of responding to stress. Prior to a show, heart rate rises with anticipatory anxiety. While performing, an actor has an increased sense of exposure, which heightens performance anxiety and the associated physiological arousal, such as heart rate. Heart rate increases more during shows than during rehearsals because of the added pressure, since a performance has a potentially greater impact on an actor's career. After the show, as the stress-inducing activity concludes, heart rate usually returns to normal; however, during the applause after the performance there is a rapid spike in heart rate. This pattern is seen not only in actors but also in public speakers and musicians.
Stress.
There is a correlation between heart rate and stress when actors perform in front of an audience. Actors may claim that having an audience causes no change in their stress level, but as soon as they come on stage their heart rate rises quickly. A 2017 study at an American university that measured actors' heart rates showed that individual heart rates rose just before the performance began, particularly for the actors opening the show. Many factors can add to an actor's stress, for example the length of monologues, experience level, and actions done on stage, including moving the set. Throughout the performance, heart rate rises most just before an actor speaks; the stress, and thus the actor's heart rate, then drops significantly at the end of a monologue, big action scene, or performance.
|
2037 | Delian League | The Delian League, founded in 478 BC, was an association of Greek city-states, numbering between 150 and 330, under the leadership of Athens, whose purpose was to continue fighting the Persian Empire after the Greek victory in the Battle of Plataea at the end of the Second Persian invasion of Greece.
The League's modern name derives from its official meeting place, the island of Delos, where congresses were held in the temple and where the treasury stood until, in a symbolic gesture, Pericles moved it to Athens in 454 BC.
Shortly after its inception, Athens began to use the League's funds for its own purposes, which led to conflicts between Athens and the less powerful members of the League. By 431 BC, the threat the League presented to Spartan hegemony combined with Athens's heavy-handed control of the Delian League prompted the outbreak of the Peloponnesian War; the League was dissolved upon the war's conclusion in 404 BC under the direction of Lysander, the Spartan commander.
Background.
The Greco-Persian Wars had their roots in the conquest of the Greek cities of Asia Minor, and particularly Ionia, by the Achaemenid Persian Empire of Cyrus the Great shortly after 550 BC. The Persians found the Ionians difficult to rule, eventually settling for sponsoring a tyrant in each Ionian city. While Greek states had in the past often been ruled by tyrants, this form of government was on the decline. By 500 BC, Ionia appears to have been ripe for rebellion against these Persian clients. The simmering tension finally broke into open revolt due to the actions of the tyrant of Miletus, Aristagoras. Attempting to save himself after a disastrous Persian-sponsored expedition in 499 BC, Aristagoras chose to declare Miletus a democracy. This triggered similar revolutions across Ionia, extending to Doris and Aeolis, beginning the Ionian Revolt.
The Greek states of Athens and Eretria allowed themselves to be drawn into this conflict by Aristagoras, and during their only campaigning season (498 BC) they contributed to the capture and burning of the Persian regional capital of Sardis. After this, the Ionian revolt carried on (without further outside aid) for a further five years, until it was finally completely crushed by the Persians. However, in a decision of great historic significance, the Persian king Darius the Great decided that, despite having subdued the revolt, there remained the unfinished business of exacting punishment on Athens and Eretria for supporting the revolt. The Ionian revolt had severely threatened the stability of Darius's empire, and the states of mainland Greece would continue to threaten that stability unless dealt with. Darius thus began to contemplate the complete conquest of Greece, beginning with the destruction of Athens and Eretria.
In the next two decades, there would be two Persian invasions of Greece, occasioning, thanks to Greek historians, some of the most famous battles in history. During the first invasion, Thrace, Macedon and the Aegean Islands were added to the Persian Empire, and Eretria was duly destroyed. However, the invasion ended in 490 BC with the decisive Athenian victory at the Battle of Marathon. After this invasion, Darius died, and responsibility for the war passed to his son Xerxes I.
Xerxes then personally led a second Persian invasion of Greece in 480 BC, taking an enormous (although oft-exaggerated) army and navy to Greece. Those Greeks who chose to resist (the 'Allies') were defeated in the twin simultaneous battles of Thermopylae on land and Artemisium at sea. With all of Greece except the Peloponnesus thus fallen into Persian hands, the Persians sought to destroy the Allied navy once and for all, but instead suffered a decisive defeat at the Battle of Salamis. The following year, 479 BC, the Allies assembled the largest Greek army yet seen and defeated the Persian invasion force at the Battle of Plataea, ending the invasion and the threat to Greece.
The Allied fleet defeated the remnants of the Persian fleet in the Battle of Mycale near the island of Samos—on the same day as Plataea, according to tradition. This action marks the end of the Persian invasion, and the beginning of the next phase in the Greco-Persian wars, the Greek counterattack. After Mycale, the Greek cities of Asia Minor again revolted, with the Persians now powerless to stop them. The Allied fleet then sailed to the Thracian Chersonese, still held by the Persians, and besieged and captured the town of Sestos. The following year, 478 BC, the Allies sent a force to capture the city of Byzantion (modern day Istanbul). The siege was successful, but the behaviour of the Spartan general Pausanias alienated many of the Allies, and resulted in Pausanias's recall.
Formation.
After Byzantion, Sparta was eager to end its involvement in the war. The Spartans greatly feared the rise of the Athenians as a challenge to their power. Additionally, the Spartans were of the view that, with the liberation of mainland Greece, and the Greek cities of Asia Minor, the war's purpose had already been achieved. There was also perhaps a feeling that establishing long-term security for the Asian Greeks would prove impossible. In the aftermath of Mycale, the Spartan king Leotychidas had proposed transplanting all the Greeks from Asia Minor to Europe as the only method of permanently freeing them from Persian dominion.
Xanthippus, the Athenian commander at Mycale, had furiously rejected this; the Ionian cities had been Athenian colonies, and the Athenians, if no one else, would protect the Ionians. This marked the point at which the leadership of the Greek alliance effectively passed to the Athenians. With the Spartan withdrawal after Byzantion, the leadership of the Athenians became explicit.
The loose alliance of city states which had fought against Xerxes's invasion had been dominated by Sparta and the Peloponnesian league. With the withdrawal of these states, a congress was called on the holy island of Delos to institute a new alliance to continue the fight against the Persians; hence the modern designation "Delian League". According to Thucydides, the official aim of the League was to "avenge the wrongs they suffered by ravaging the territory of the king."
In reality, this goal was divided into three main efforts—to prepare for future invasion, to seek revenge against Persia, and to organize a means of dividing spoils of war. The members were given a choice of either offering armed forces or paying a tax to the joint treasury; most states chose the tax. League members swore to have the same friends and enemies, and dropped ingots of iron into the sea to symbolize the permanence of their alliance. The Athenian politician Aristides would spend the rest of his life occupied in the affairs of the alliance, dying (according to Plutarch) a few years later in Pontus, whilst determining what the tax of new members was to be.
Composition and expansion.
In the first ten years of the league's existence, Cimon forced Karystos in Euboea to join the league, conquered the island of Skyros, and sent Athenian colonists there.
Over time, especially with the suppression of rebellions, Athens exercised hegemony over the rest of the league. Thucydides describes how Athens's control over the League grew:
Of all the causes of defection, that connected with arrears of tribute and vessels, and with failure of service, was the chief; for the Athenians were very severe and exacting, and made themselves offensive by applying the screw of necessity to men who were not used to and in fact not disposed for any continuous labor. In some other respects the Athenians were not the old popular rulers they had been at first; and if they had more than their fair share of service, it was correspondingly easy for them to reduce any that tried to leave the confederacy. The Athenians also arranged for the other members of the league to pay its share of the expense in money instead of in ships and men, and for this the subject city-states had themselves to blame, their wish to get out of giving service making most leave their homes. Thus while Athens was increasing her navy with the funds they contributed, a revolt always found itself without enough resources or experienced leaders for war.
Rebellion.
Naxos.
The first member of the league to attempt to secede was the island of Naxos in c. 471 BC. After being defeated, Naxos is believed (based on similar, later revolts) to have been forced to tear down its walls and to surrender its fleet and its vote in the League.
Thasos.
In 465 BC, Athens founded the colony of Amphipolis on the Strymon river. Thasos, a member of the League, saw her interests in the mines of Mt. Pangaion threatened and defected from the League to Persia. She called to Sparta for assistance but was denied, as Sparta was facing the largest helot revolt in its history.
After more than two years of siege, Thasos surrendered to the Athenian leader Aristides and was forced back into the league. As a result, the fortification walls of Thasos were torn down, and they had to pay yearly tribute and fines. Additionally, their land, naval ships, and the mines of Thasos were confiscated by Athens. The siege of Thasos marks the transformation of the Delian league from an alliance into, in the words of Thucydides, a hegemony.
Policies of the League.
In 461 BC, Cimon was ostracized and was succeeded in his influence by democrats such as Ephialtes and Pericles. This signaled a complete change in Athenian foreign policy, neglecting the alliance with the Spartans and instead allying with her enemies, Argos and Thessaly. Megara deserted the Spartan-led Peloponnesian League and allied herself with Athens, allowing construction of a double line of walls across the Isthmus of Corinth and protecting Athens from attack from that quarter. Roughly a decade earlier, due to encouragement from influential speaker Themistocles, the Athenians had also constructed the Long Walls connecting their city to the Piraeus, its port, making it effectively invulnerable to attack by land.
In 454 BC, the Athenian general Pericles moved the Delian League's treasury from Delos to Athens, allegedly to keep it safe from Persia. However, Plutarch indicates that many of Pericles's rivals viewed the transfer to Athens as usurping monetary resources to fund elaborate building projects. Athens also switched from accepting ships, men and weapons as dues from league members, to only accepting money.
The new treasury established in Athens was used for many purposes, not all relating to the defence of members of the league. It was from tribute paid to the league that Pericles set to building the Parthenon on the Acropolis, replacing an older temple, as well as many other non-defense related expenditures. The Delian League was turning from an alliance into an empire.
Wars against Persia.
War with the Persians continued. In 460 BC, Egypt revolted under local leaders the Hellenes called Inaros and Amyrtaeus, who requested aid from Athens. Pericles led 250 ships, intended to attack Cyprus, to their aid because it would further damage Persia. After four years, however, the Egyptian rebellion was defeated by the Achaemenid general Megabyzus, who captured the greater part of the Athenian forces. In fact, according to Isocrates, the Athenians and their allies lost some 20,000 men in the expedition, while modern estimates place the figure at 50,000 men and 250 ships including reinforcements. The remainder escaped to Cyrene and thence returned home.
This was the Athenians' main (public) reason for moving the treasury of the League from Delos to Athens, further consolidating their control over the League. The Persians followed up their victory by sending a fleet to re-establish their control over Cyprus, and 200 ships were sent out to counter them under Cimon, who returned from ostracism in 451 BC. He died during the blockade of Citium, though the fleet won a double victory by land and sea over the Persians off Salamis, Cyprus.
This battle was the last major one fought against the Persians. Many writers report that a peace treaty, known as the Peace of Callias, was formalized in 450 BC, but some writers believe that the treaty was a myth created later to inflate the stature of Athens. However, an understanding was definitely reached, enabling the Athenians to focus their attention on events in Greece proper.
Wars in Greece.
Soon, war with the Peloponnesians broke out. In 458 BC, the Athenians blockaded the island of Aegina, and simultaneously defended Megara from the Corinthians by sending out an army composed of those too young or old for regular military service. The following year, Sparta sent an army into Boeotia, reviving the power of Thebes in order to help hold the Athenians in check. Their return was blocked, and they resolved to march on Athens, where the Long Walls were not yet completed, winning a victory at the Battle of Tanagra. All this accomplished, however, was to allow them to return home via the Megarid. Two months later, the Athenians under Myronides invaded Boeotia, and winning the Battle of Oenophyta gained control of the whole country except Thebes.
Reverses followed peace with Persia in 449 BC. The Battle of Coronea, in 447 BC, led to the abandonment of Boeotia. Euboea and Megara revolted, and while the former was restored to its status as a tributary ally, the latter was a permanent loss. The Delian and Peloponnesian Leagues signed a peace treaty, which was set to endure for thirty years. It only lasted until 431 BC, when the Peloponnesian War broke out.
Those who revolted unsuccessfully during the war saw the example made of the Mytilenians, the principal people on Lesbos. After an unsuccessful revolt, the Athenians ordered the death of the entire male population. After some thought, they rescinded this order and put to death only the 1,000 ringleaders of the revolt, redistributing the land of the entire island to Athenian shareholders, who were sent out to reside on Lesbos.
This type of treatment was not reserved solely for those who revolted. Thucydides documents the example of Melos, a small island, neutral in the war, though founded by Spartans. The Melians were offered a choice to join the Athenians, or be conquered. Choosing to resist, their town was besieged and conquered; the males were put to death and the women sold into slavery (see Melian dialogue).
Athenian Empire (454–404 BC).
By 454 BC, the Delian League could be fairly characterised as an Athenian Empire; a key event of 454 BC was the moving of the treasury of the Delian League from Delos to Athens. This is often seen as a key marker of the transition from alliance to empire, but while it is significant, it is important to view the period as a whole when considering the development of Athenian imperialism, and not to focus on a single event as being the main contributor to it. At the start of the Peloponnesian War, only Chios and Lesbos were left to contribute ships, and these states were by now far too weak to secede without support. Lesbos tried to revolt first, and failed completely. Chios, the most powerful of the original members of the Delian League save Athens, was the last to revolt, and in the aftermath of the Syracusan Expedition enjoyed success for several years, inspiring all of Ionia to revolt. Athens was nonetheless eventually able to suppress these revolts.
To further strengthen Athens's grip on its empire, Pericles in 450 BC began a policy of establishing "kleruchiai"—quasi-colonies that remained tied to Athens and which served as garrisons to maintain control of the League's vast territory. Furthermore, Pericles employed a number of offices to maintain Athens' empire: "proxenoi", who fostered good relations between Athens and League members; "episkopoi" and "archontes", who oversaw the collection of tribute; and "hellenotamiai", who received the tribute on Athens' behalf.
Athens's empire was not very stable and after 27 years of war, the Spartans, aided by the Persians and Athenian internal strife, were able to defeat it. However, it did not remain defeated for long. The Second Athenian League, a maritime self-defense league, was founded in 377 BC and was led by Athens. The Athenians would never recover the full extent of their power, and their enemies were now far stronger and more varied.
|
2038 | August Horch | August Horch (12 October 1868 – 3 February 1951) was a German engineer and automobile pioneer, the founder of the manufacturing giant which would eventually become Audi.
Beginnings.
Horch was born in Winningen, Rhenish Prussia. His initial trade was as a blacksmith; he was then educated at Mittweida Technical College. After receiving a degree in engineering, he worked in shipbuilding. Horch worked for Karl Benz from 1896, before founding "A. Horch & Co." in November 1899, in Ehrenfeld, Cologne, Germany.
Manufacturing.
The first Horch automobile was built in 1901. The company moved to Reichenbach in 1902 and Zwickau in 1904. Horch left the company in 1909 after a dispute, and set up in competition in Zwickau. His new firm was initially called "Horch Automobil-Werke GmbH", but following a legal dispute over the "Horch" name, he decided to found another automobile company. (The court decided that "Horch" was a registered trademark on behalf of August's former partners and August was not entitled to use it any more.) Consequently, Horch named his new company "Audi Automobilwerke GmbH" in 1910, "Audi" being the Latinization of "Horch", which in German means "hark!" or "listen".
Post Audi.
Horch left Audi in 1920 and moved to Berlin, where he took various jobs. He published his autobiography, "I Built Cars" ("Ich baute Autos"), in 1937. He also served on the board of Auto Union, the successor to the Audi Automobilwerke GmbH he had founded. Horch remained an honorary executive at Auto Union during and after its reincorporation in Ingolstadt, Bavaria, in the late 1940s until his death in 1951, and so did not live to see the resurrection of his Audi brand a decade later under the ownership of Volkswagen.
He was an honorary citizen of Zwickau and had a street named for his Audi cars in both Zwickau and his birthplace Winningen. He was made an honorary professor at Braunschweig University of Technology. An "August Horchstrasse" (August Horch Street) also exists at Audi's main manufacturing plant in Ingolstadt.
|
2039 | Avionics | Avionics (a blend of "aviation" and "electronics") are the electronic systems used on aircraft. Avionic systems include communications, navigation, the display and management of multiple systems, and the hundreds of systems that are fitted to aircraft to perform individual functions. These can be as simple as a searchlight for a police helicopter or as complicated as the tactical system for an airborne early warning platform.
History.
The term "avionics" was coined in 1949 by Philip J. Klass, senior editor at "Aviation Week & Space Technology" magazine as a portmanteau of "aviation electronics".
Radio communication was first used in aircraft just prior to World War I. The first airborne radios were carried in zeppelins, but the military sparked development of light radio sets that could be carried by heavier-than-air craft, so that aerial reconnaissance biplanes could report their observations immediately in case they were shot down. The first experimental radio transmission from an airplane was conducted by the U.S. Navy in August 1910. The first aircraft radios transmitted by radiotelegraphy, so they required two-seat aircraft with a second crewman to tap on a telegraph key to spell out messages in Morse code. During World War I, the development of the triode vacuum tube made AM voice two-way radio sets possible by 1917; these were simple enough that the pilot of a single-seat aircraft could use one while flying.
Radar, the central technology used today in aircraft navigation and air traffic control, was developed by several nations, mainly in secret, as an air defense system in the 1930s during the runup to World War II. Many modern avionics have their origins in World War II wartime developments. For example, autopilot systems that are commonplace today began as specialized systems to help bomber planes fly steadily enough to hit precision targets from high altitudes. Britain's 1940 decision to share its radar technology with its U.S. ally, particularly the magnetron vacuum tube, in the famous Tizard Mission, significantly shortened the war. Modern avionics is a substantial portion of military aircraft spending. Aircraft like the F-15E and the now retired F-14 have roughly 20 percent of their budget spent on avionics. Most modern helicopters now have budget splits of 60/40 in favour of avionics.
The civilian market has also seen a growth in the cost of avionics. Flight control systems (fly-by-wire) and new navigation needs brought on by tighter airspace have pushed up development costs. The major change has been the recent boom in consumer flying: as more people use aircraft as their primary method of transportation, more elaborate methods of safely controlling aircraft in this highly restricted airspace have been developed.
Modern avionics.
Avionics plays a heavy role in modernization initiatives like the Federal Aviation Administration's (FAA) Next Generation Air Transportation System project in the United States and the Single European Sky ATM Research (SESAR) initiative in Europe. The Joint Planning and Development Office put forth a roadmap for avionics covering six areas.
Market.
The Aircraft Electronics Association reported $1.73 billion in avionics sales for the first three quarters of 2017 in business and general aviation, a 4.1% yearly improvement: 73.5% came from North America, and forward-fit represented 42.3% versus 57.7% for retrofits, as the U.S. deadline of January 1, 2020, for mandatory ADS-B Out approached.
Aircraft avionics.
The cockpit of an aircraft is a typical location for avionic equipment, including control, monitoring, communication, navigation, weather, and anti-collision systems. The majority of aircraft power their avionics using 14- or 28‑volt DC electrical systems; however, larger, more sophisticated aircraft (such as airliners or military combat aircraft) have AC systems operating at 115 volts, 400 Hz. There are several major vendors of flight avionics, including The Boeing Company, Panasonic Avionics Corporation, Honeywell (which now owns Bendix/King), Universal Avionics Systems Corporation, Rockwell Collins (now Collins Aerospace), Thales Group, GE Aviation Systems, Garmin, Raytheon, Parker Hannifin, UTC Aerospace Systems (now Collins Aerospace), Selex ES (now Leonardo S.p.A.), Shadin Avionics, and Avidyne Corporation.
International standards for avionics equipment are prepared by the Airlines Electronic Engineering Committee (AEEC) and published by ARINC.
Communications.
Communications connect the flight deck to the ground and the flight deck to the passengers. On‑board communications are provided by public-address systems and aircraft intercoms.
The VHF aviation communication system works on the airband of 118.000 MHz to 136.975 MHz. Each channel is spaced from the adjacent ones by 8.33 kHz in Europe, 25 kHz elsewhere. VHF is also used for line-of-sight communication such as aircraft-to-aircraft and aircraft-to-ATC. Amplitude modulation (AM) is used, and the conversation is performed in simplex mode. Aircraft communication can also take place using HF (especially for trans-oceanic flights) or satellite communication.
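As a back-of-the-envelope illustration, the channel counts follow directly from the band edges and spacings given above; each 8.33 kHz channel is really 25/3 kHz wide, so every legacy 25 kHz channel splits into three:

```python
# Channel arithmetic for the 118.000-136.975 MHz airband described above.

BAND_START_MHZ = 118.000
BAND_END_MHZ = 136.975

def channels_at_25khz() -> int:
    """Channel centres from band start to band end inclusive, 25 kHz apart."""
    span_khz = (BAND_END_MHZ - BAND_START_MHZ) * 1000.0
    return round(span_khz / 25.0) + 1

legacy = channels_at_25khz()
print(legacy)        # 760 channels at 25 kHz spacing
print(legacy * 3)    # 2280 channels under 8.33 kHz spacing
```

This is why Europe's move to 8.33 kHz spacing tripled the number of assignable voice channels in congested airspace.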
Navigation.
Air navigation is the determination of position and direction on or above the surface of the Earth. Avionics can use satellite navigation systems (such as GPS and WAAS), an inertial navigation system (INS), ground-based radio navigation systems (such as VOR or LORAN), or any combination thereof. Older ground-based navigation systems such as VOR or LORAN require a pilot or navigator to plot the intersection of signals on a paper map to determine an aircraft's location; modern systems, such as GPS, calculate the position automatically and display it to the flight crew on moving map displays.
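The paper-chart procedure described above amounts to intersecting two bearing lines. The sketch below models it on a flat chart; the station positions and radials are invented for illustration, and a real fix would also account for magnetic variation and earth curvature:

```python
import math

def radial_fix(station_a, bearing_a_deg, station_b, bearing_b_deg):
    """Intersect two VOR radials on a flat chart.

    Bearings are degrees clockwise from north; positions are
    (east, north) chart coordinates."""
    ax, ay = station_a
    bx, by = station_b
    # Unit direction of each radial: north-referenced bearing -> (east, north)
    dax, day = math.sin(math.radians(bearing_a_deg)), math.cos(math.radians(bearing_a_deg))
    dbx, dby = math.sin(math.radians(bearing_b_deg)), math.cos(math.radians(bearing_b_deg))
    denom = dax * dby - day * dbx          # 2-D cross product of the directions
    if abs(denom) < 1e-9:
        raise ValueError("radials are parallel; no unique fix")
    # Solve station_a + t*dA = station_b + s*dB for t, then step along radial A
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (ax + t * dax, ay + t * day)

# Station A at the origin, aircraft on its 090 radial (due east);
# station B 10 NM east / 10 NM north, aircraft on its 180 radial (due south).
fix = radial_fix((0, 0), 90, (10, 10), 180)
print(fix)  # approximately (10.0, 0.0)
```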
Monitoring.
The first hints of glass cockpits emerged in the 1970s when flight-worthy cathode ray tube (CRT) screens began to replace electromechanical displays, gauges and instruments. A "glass" cockpit refers to the use of computer monitors instead of gauges and other analog displays. Aircraft were getting progressively more displays, dials and information dashboards that eventually competed for space and pilot attention. In the 1970s, the average aircraft had more than 100 cockpit instruments and controls.
Glass cockpits started to come into being with the Gulfstream G‑IV private jet in 1985. One of the key challenges in glass cockpits is to balance how much control is automated and how much the pilot should do manually. Generally they try to automate flight operations while keeping the pilot constantly informed.
Aircraft flight-control system.
Aircraft have means of automatically controlling flight. The autopilot was invented by Lawrence Sperry during World War I to fly bomber planes steadily enough to hit precision targets from 25,000 feet. When it was first adopted by the U.S. military, a Honeywell engineer sat in the back seat with bolt cutters to disconnect the autopilot in case of emergency. Nowadays most commercial planes are equipped with aircraft flight control systems in order to reduce pilot error and workload at landing or takeoff.
The first simple commercial auto-pilots were used to control heading and altitude and had limited authority on things like thrust and flight control surfaces. In helicopters, auto-stabilization was used in a similar way. The first systems were electromechanical. The advent of fly-by-wire and electro-actuated flight surfaces (rather than the traditional hydraulic) has increased safety. As with displays and instruments, critical devices that were electro-mechanical had a finite life. With safety critical systems, the software is very strictly tested.
Fuel Systems.
Fuel Quantity Indication System (FQIS) monitors the amount of fuel aboard. Using various sensors, such as capacitance tubes, temperature sensors, densitometers and level sensors, the FQIS computer calculates the mass of fuel remaining on board.
Fuel Control and Monitoring System (FCMS) reports fuel remaining on board in a similar manner, but, by controlling pumps and valves, also manages fuel transfers around various tanks.
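A minimal sketch of the FQIS computation described above: per-tank sensors yield volume, a densitometer (or a temperature-compensated density model) yields density, and mass is their product summed over the tanks. All figures here are illustrative, not from any real aircraft:

```python
JET_A_DENSITY_15C = 0.804   # kg/L at 15 degrees C (nominal Jet A figure)
DENSITY_SLOPE = -0.0007     # kg/L per degree C (approximate thermal expansion)

def fuel_density(temp_c: float) -> float:
    """Temperature-compensated density estimate for Jet A."""
    return JET_A_DENSITY_15C + DENSITY_SLOPE * (temp_c - 15.0)

def fuel_mass_kg(tank_volumes_l, temp_c: float) -> float:
    """Total fuel mass: sum of tank volumes (litres) times density (kg/L)."""
    return sum(tank_volumes_l) * fuel_density(temp_c)

# Three tanks (left, centre, right) with cold-soaked fuel at altitude:
print(fuel_mass_kg([5200.0, 8100.0, 5200.0], -30.0))  # roughly 15.5 tonnes
```

Note that mass, not volume, is what matters for range and balance calculations, which is why the density compensation step exists at all.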
Collision-avoidance systems.
To supplement air traffic control, most large transport aircraft and many smaller ones use a traffic alert and collision avoidance system (TCAS), which can detect the location of nearby aircraft, and provide instructions for avoiding a midair collision. Smaller aircraft may use simpler traffic alerting systems such as TPAS, which are passive (they do not actively interrogate the transponders of other aircraft) and do not provide advisories for conflict resolution.
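At the heart of TCAS-style threat detection is the "tau" test: time to closest approach is estimated as range divided by closure rate, and an alert fires when tau drops below a threshold. The thresholds below are placeholders; real TCAS II uses altitude-dependent sensitivity levels and considerably more elaborate logic:

```python
TA_TAU_S = 40.0   # traffic advisory threshold in seconds (illustrative)
RA_TAU_S = 25.0   # resolution advisory threshold in seconds (illustrative)

def advisory(range_nm: float, closure_rate_kt: float) -> str:
    """Classify an intruder from range (NM) and closure rate (knots)."""
    if closure_rate_kt <= 0:
        return "CLEAR"   # diverging traffic never triggers the tau test
    tau_s = range_nm / closure_rate_kt * 3600.0   # hours -> seconds
    if tau_s <= RA_TAU_S:
        return "RA"      # resolution advisory: commanded avoidance manoeuvre
    if tau_s <= TA_TAU_S:
        return "TA"      # traffic advisory: crew alerted, no manoeuvre yet
    return "CLEAR"

print(advisory(5.0, 400.0))   # tau about 45 s -> CLEAR
print(advisory(4.0, 400.0))   # tau about 36 s -> TA
print(advisory(2.0, 400.0))   # tau about 18 s -> RA
```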
To help avoid controlled flight into terrain (CFIT), aircraft use systems such as ground-proximity warning systems (GPWS), which use radar altimeters as a key element. One of the major weaknesses of GPWS is the lack of "look-ahead" information, because it only provides altitude above terrain "look-down". In order to overcome this weakness, modern aircraft use a terrain awareness warning system (TAWS).
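The look-down versus look-ahead distinction can be sketched as follows: GPWS only knows the radar-altimeter height directly beneath the aircraft, while a TAWS-style check predicts clearance over terrain-database cells the aircraft has not reached yet. The terrain profile and thresholds are invented for illustration:

```python
def gpws_warn(radar_alt_ft: float, floor_ft: float = 500.0) -> bool:
    """Look-down only: warn when current terrain clearance is low."""
    return radar_alt_ft < floor_ft

def taws_warn(alt_ft, ground_speed_kt, terrain_profile_ft,
              spacing_nm=1.0, lookahead_s=60.0, floor_ft=500.0):
    """Look-ahead: warn if any terrain cell reachable within the
    look-ahead time leaves less than floor_ft of clearance
    (level flight assumed for simplicity)."""
    reach_nm = ground_speed_kt * lookahead_s / 3600.0
    cells = int(reach_nm / spacing_nm) + 1
    return any(alt_ft - elev < floor_ft for elev in terrain_profile_ft[:cells])

# Level at 3000 ft over low ground, but a 2800 ft ridge a few NM ahead:
profile = [100, 150, 200, 2800, 2900, 400]            # one sample per NM
print(gpws_warn(radar_alt_ft=3000 - 100))             # False: clearance below is fine
print(taws_warn(3000, 300, profile, lookahead_s=60))  # True: ridge ahead detected
```

The ridge scenario is exactly the CFIT case GPWS misses: the terrain directly below looks harmless until moments before impact.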
Flight recorders.
Commercial aircraft cockpit data recorders, commonly known as "black boxes", store flight information and audio from the cockpit. They are often recovered from an aircraft after a crash to determine control settings and other parameters during the incident.
Weather systems.
Weather systems such as weather radar (typically Arinc 708 on commercial aircraft) and lightning detectors are important for aircraft flying at night or in instrument meteorological conditions, where it is not possible for pilots to see the weather ahead. Heavy precipitation (as sensed by radar) and lightning activity are both indications of strong convective activity and severe turbulence, and weather systems allow pilots to deviate around these areas.
Lightning detectors like the Stormscope or Strikefinder have become inexpensive enough that they are practical for light aircraft. In addition to radar and lightning detection, observations and extended radar pictures (such as NEXRAD) are now available through satellite data connections, allowing pilots to see weather conditions far beyond the range of their own in-flight systems. Modern displays allow weather information to be integrated with moving maps, terrain, and traffic onto a single screen, greatly simplifying navigation.
Modern weather systems also include wind shear and turbulence detection and terrain and traffic warning systems. In‑plane weather avionics are especially popular in Africa, India, and other countries where air-travel is a growing market, but ground support is not as well developed.
Aircraft management systems.
There has been a progression towards centralized control of the multiple complex systems fitted to aircraft, including engine monitoring and management. Health and usage monitoring systems (HUMS) are integrated with aircraft management computers to give maintainers early warnings of parts that will need replacement.
The integrated modular avionics concept proposes an integrated architecture with application software portable across an assembly of common hardware modules. It has been used in fourth generation jet fighters and the latest generation of airliners.
Mission or tactical avionics.
Military aircraft have been designed either to deliver a weapon or to be the eyes and ears of other weapon systems. The vast array of sensors available to the military is used for whatever tactical means required. As with aircraft management, the bigger sensor platforms (like the E‑3D, JSTARS, ASTOR, Nimrod MRA4, Merlin HM Mk 1) have mission-management computers.
Police and EMS aircraft also carry sophisticated tactical sensors.
Military communications.
While aircraft communications provide the backbone for safe flight, tactical systems are designed to withstand the rigors of the battlefield. UHF, VHF Tactical (30–88 MHz) and SatCom systems, combined with ECCM methods and cryptography, secure the communications. Data links such as Link 11, 16, 22 and BOWMAN, JTRS and even TETRA provide the means of transmitting data (such as images, targeting information etc.).
Radar.
Airborne radar was one of the first tactical sensors. The benefit of altitude providing range has meant a significant focus on airborne radar technologies. Radars include airborne early warning (AEW), anti-submarine warfare (ASW), and even weather radar (Arinc 708) and ground tracking/proximity radar.
The military uses radar in fast jets to help pilots fly at low levels. While the civil market has had weather radar for a while, there are strict rules about using it to navigate the aircraft.
Sonar.
Dipping sonar fitted to a range of military helicopters allows the helicopter to protect shipping assets from submarines or surface threats. Maritime support aircraft can drop active and passive sonar devices (sonobuoys) and these are also used to determine the location of enemy submarines.
Electro-optics.
Electro-optic systems include devices such as the head-up display (HUD), forward looking infrared (FLIR), infrared search and track and other passive infrared devices (Passive infrared sensor). These are all used to provide imagery and information to the flight crew. This imagery is used for everything from search and rescue to navigational aids and target acquisition.
ESM/DAS.
Electronic support measures and defensive aids systems are used extensively to gather information about threats or possible threats. They can be used to launch devices (in some cases automatically) to counter direct threats against the aircraft. They are also used to determine the state of a threat and identify it.
Aircraft networks.
The avionics systems in military, commercial and advanced models of civilian aircraft are interconnected using an avionics databus. Common avionics databus protocols, with their primary application, include ARINC 429 (widely used on commercial transport aircraft), MIL-STD-1553 (military aircraft and vehicles), and ARINC 664/AFDX (recent airliners such as the Airbus A380 and Boeing 787).
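As a concrete example of what travels on such a bus, the sketch below splits a 32-bit ARINC 429 word into its standard fields: bits 1-8 label (an octal equipment/parameter identifier), bits 9-10 SDI, bits 11-29 data, bits 30-31 SSM, bit 32 odd parity. The example word is made up, and a transmission quirk is ignored here (on the wire the label is sent most-significant-bit first, unlike the rest of the word) since we start from an already-received integer:

```python
def decode_a429(word: int) -> dict:
    """Split a received 32-bit ARINC 429 word into its fields.

    Bit 1 (the first bit on the wire) is the least significant bit
    of the integer here."""
    return {
        "label_octal": format(word & 0xFF, "03o"),   # bits 1-8
        "sdi":  (word >> 8)  & 0x3,                  # bits 9-10
        "data": (word >> 10) & 0x7FFFF,              # bits 11-29 (19 bits)
        "ssm":  (word >> 29) & 0x3,                  # bits 30-31
        "parity_ok": bin(word).count("1") % 2 == 1,  # bit 32 makes parity odd
    }

# Example: label 203 (octal), SDI 1, data 12345, SSM 3, parity forced odd.
word = 0o203 | (1 << 8) | (12345 << 10) | (3 << 29)
if bin(word).count("1") % 2 == 0:
    word |= 1 << 31
print(decode_a429(word))
```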
2041 | Ares | Ares ("Árēs") is the Greek god of war and courage. He is one of the Twelve Olympians, and the son of Zeus and Hera. The Greeks were ambivalent towards him. He embodies the physical valor necessary for success in war but can also personify sheer brutality and bloodlust, in contrast to his sister, the armored Athena, whose martial functions include military strategy and generalship. An association with Ares endows places, objects, and other deities with a savage, dangerous, or militarized quality.
Although Ares' name shows his origins as Mycenaean, his reputation for savagery was thought by some to reflect his likely origins as a Thracian deity. Some cities in Greece and several in Asia Minor held annual festivals to bind and detain him as their protector. In parts of Asia Minor, he was an oracular deity. Still further away from Greece, the Scythians were said to ritually kill one in a hundred prisoners of war as an offering to their equivalent of Ares. The later belief that ancient Spartans had offered human sacrifice to Ares may owe more to mythical prehistory, misunderstandings, and reputation than to reality.
Though there are many literary allusions to Ares' love affairs and children, he has a limited role in Greek mythology. When he does appear, he is often humiliated. In the Trojan War, Aphrodite, protector of Troy, persuades Ares to take the Trojans' side. The Trojans lose, while Ares' sister Athena helps the Greeks to victory. Most famously, when the craftsman-god Hephaestus discovers his wife Aphrodite is having an affair with Ares, he traps the lovers in a net and exposes them to the ridicule of the other gods.
Ares' nearest counterpart in Roman religion is Mars, who was given a more important and dignified place in ancient Roman religion as ancestral protector of the Roman people and state. During the Hellenization of Latin literature, the myths of Ares were reinterpreted by Roman writers under the name of Mars, and in later Western art and literature, the mythology of the two figures became virtually indistinguishable.
Names.
The etymology of the name "Ares" is traditionally connected with the Greek word ("arē"), the Ionic form of the Doric ("ara"), "bane, ruin, curse, imprecation". Walter Burkert notes that "Ares is apparently an ancient abstract noun meaning throng of battle, war." R. S. P. Beekes has suggested a Pre-Greek origin of the name. The earliest attested form of the name is the Mycenaean Greek , "a-re", written in the Linear B syllabic script.
The adjectival epithet, "Areios" ("warlike") was frequently appended to the names of other gods when they took on a warrior aspect or became involved in warfare: "Zeus Areios", "Athena Areia", even Aphrodite Areia ("Aphrodite within Ares" or "feminine Ares"), who was warlike, fully armoured and armed, partnered with Athena in Sparta, and represented at Kythira's temple to Aphrodite Urania.
In the "Iliad", the word "ares" is used as a common noun synonymous with "battle."
In the Classical period, Ares is given the epithet Enyalios, which seems to appear on the Mycenaean KN V 52 tablet as , "e-nu-wa-ri-jo". Enyalios was sometimes identified with Ares and sometimes differentiated from him as another war god with separate cult, even in the same town; Burkert describes them as "doubles almost".
Cult.
In mainland Greece and the Peloponnese, only a few places are known to have had a formal temple and cult of Ares. Pausanias (2nd century AD) notes an altar to Ares at Olympia, and the moving of a Temple of Ares to the Athenian agora during the reign of Augustus, essentially rededicating it (2 AD) as a Roman temple to the Augustan Mars Ultor. The Areopagus ("mount of Ares"), a natural rock outcrop in Athens, some distance from the Acropolis, was supposedly where Ares was tried and acquitted by the gods for his revenge-killing of Poseidon's son, Halirrhothius, who had raped Ares' daughter Alcippe. Its name was used for the court that met there, mostly to investigate and try potential cases of treason.
Numismatist M. Jessop Price states that Ares "typified the traditional Spartan character", but had no important cult in Sparta; and he never occurs on Spartan coins. Pausanias gives two examples of his cult, both of them conjointly with or "within" a warlike Aphrodite, on the Spartan acropolis. Gonzalez observes, in his 2005 survey of Ares' cults in Asia Minor, that cults to Ares on the Greek mainland may have been more common than some sources assert. Wars between Greek states were endemic; war and warriors provided Ares's tribute, and fed his insatiable appetite for battle.
Ares' attributes are instruments of war: a helmet, shield, and sword or spear. Libanius "makes the apple sacred to Ares", but "offers no further comment", nor connections to any aetiological myth. Apples are one of Aphrodite's sacred or symbolic fruits. Littlewood follows Artemidorus' claim that to dream of sour apples presages conflict, and lists Ares alongside Eris and the mythological "Apples of Discord".
Chained statues.
Gods were immortal but could be bound and restrained, both in mythic narrative and in cult practice. There was an archaic Spartan statue of Ares in chains in the temple of Enyalios (sometimes regarded as the son of Ares, sometimes as Ares himself), which Pausanias claimed meant that the spirit of war and victory was to be kept in the city. The Spartans are known to have ritually bound the images of other deities, including Aphrodite and Artemis (cf Ares and Aphrodite bound by Hephaestus), and in other places there were chained statues of Artemis and Dionysos.
Statues of Ares in chains are described in the instructions given by an oracle of the late Hellenistic era to various cities of Pamphylia (in Anatolia) including Syedra, Lycia and Cilicia, places almost perpetually under threat from pirates. Each was told to set up a statue of "bloody, man-slaying Ares" and provide it with an annual festival in which it was ritually bound with iron fetters ("by Dike and Hermes") as if a supplicant for justice, put on trial and offered sacrifice. The oracle promises that "thus will he become a peaceful deity for you, once he has driven the enemy horde far from your country, and he will give rise to prosperity much prayed for." This Ares "karpodotes" ("giver of Fruits") is well attested in Lycia and Pisidia.
Sacrifices.
Like most Greek deities, Ares was given animal sacrifice; in Sparta, after battle, he was given an ox for a victory by stratagem, or a rooster for victory through onslaught. The usual recipient of sacrifice before battle was Athena. Reports of historic human sacrifice to Ares in an obscure rite known as the "Hekatomphonia" represent a very long-standing error, repeated through several centuries and well into the modern era. The "hekatomphonia" was an animal sacrifice to Zeus; it could be offered by any warrior who had personally slain one hundred of the enemy. Pausanias reports that in Sparta, each company of youths sacrificed a puppy to Enyalios before engaging in a hand-to-hand "fight without rules" at the Phoebaeum. The chthonic night-time sacrifice of a dog to Enyalios became assimilated to the cult of Ares. Porphyry claims, without detail, that Apollodorus of Athens (circa second century BC) says the Spartans made human sacrifices to Ares, but this may be a reference to mythic pre-history.
Thrace and Scythia.
A Thracian god identified by Herodotus ( – ) as Ares, through "interpretatio Graeca", was one of three otherwise unnamed deities that Thracian commoners were said to worship. Herodotus recognises and names the other two as "Dionysus" and "Artemis", and claims that the Thracian aristocracy exclusively worshiped "Hermes". In Herodotus' "Histories", the Scythians worship an indigenous form of Greek Ares, who is otherwise unnamed, but ranked beneath Tabiti (whom Herodotus claims as a form of Hestia), Api and Papaios in Scythia's divine hierarchy. His cult object was an iron sword. The "Scythian Ares" was offered blood-sacrifices (or ritual killings) of cattle, horses and "one in every hundred human war-captives", whose blood was used to douse the sword. Statues, and complex platform-altars made of heaped brushwood were devoted to him. This sword-cult, or one very similar, is said to have persisted among the Alans. Some have posited that the "Sword of Mars" in later European history alludes to the Huns having adopted Ares.
Asia Minor.
In some parts of Asia Minor, Ares was a prominent oracular deity, something not found in any Hellenic cult to Ares or Roman cult to Mars. Ares was linked in some regions or polities with a local god or cultic hero, and recognised as a higher, more prestigious deity than in mainland Greece. His cults in southern Asia Minor are attested from the 5th century BC and well into the later Roman Imperial era, at 29 different sites, and on over 70 local coin issues. He is sometimes represented on coinage of the region by the "Helmet of Ares" or carrying a spear and a shield, or as a fully armed warrior, sometimes accompanied by a female deity. In what is now western Turkey, the Hellenistic city of Metropolis built a monumental temple to Ares as the city's protector, not before the 3rd century BC. It is now lost, but the names of some of its priests and priestesses survive, along with the temple's likely depictions on coins of the province.
Crete.
A sanctuary of Aphrodite was established at Sta Lenika, on Crete, between the cities of Lato and Olus, possibly during the Geometric period. It was rebuilt in the late 2nd century BC as a double-sanctuary to Ares and Aphrodite. Inscriptions record disputes over the ownership of the sanctuary. The names of Ares and Aphrodite appear as witness to sworn oaths, and there is a Victory thanks-offering to Aphrodite, whom Millington believes had capacity as a "warrior-protector acting in the realm of Ares". There were cultic links between the Sta Lenika sanctuary, Knossos and other Cretan states, and perhaps with Argos on the mainland. While the Greek literary and artistic record from both the Archaic and Classical eras connects Ares and Aphrodite as complementary companions and ideal though adulterous lovers, their cult pairing and Aphrodite as warrior-protector is localised to Crete.
Aksum.
In Africa, Maḥrem, the principal god of the kings of Aksum prior to the 4th century AD, was invoked as Ares in Greek inscriptions. The anonymous king who commissioned the Monumentum Adulitanum in the late 2nd or early 3rd century refers to "my greatest god, Ares, who also begat me, through whom I brought under my sway [various peoples]". The monumental throne celebrating the king's conquests was itself dedicated to Ares. In the early 4th century, the last pagan king of Aksum, Ezana, referred to "the one who brought me forth, the invincible Ares".
Characterisation.
Ares was one of the Twelve Olympians in the archaic tradition represented by the "Iliad" and "Odyssey." In Greek literature, Ares often represents the physical or violent and untamed aspect of war and is the personification of sheer brutality and bloodlust ("overwhelming, insatiable in battle, destructive, and man-slaughtering", as Burkert puts it), in contrast to his sister, the armored Athena, whose functions as a goddess of intelligence include military strategy and generalship. An association with Ares endows places and objects with a savage, dangerous, or militarized quality; but when Ares does appear in myths, he typically faces humiliation.
In the Iliad, Zeus expresses a recurring Greek revulsion toward the god when Ares returns wounded and complaining from the battlefield at Troy:
This ambivalence is expressed also in the Greeks' association of Ares with the Thracians, whom they regarded as a barbarous and warlike people. Thrace was considered to be Ares's birthplace and his refuge after the affair with Aphrodite was exposed to the general mockery of the other gods.
A late-6th-century BC funerary inscription from Attica emphasizes the consequences of coming under Ares's sway:
Mythology.
Birth.
He is one of the Twelve Olympians, and the son of Zeus and Hera.
Argonautica.
In the "Argonautica", the "Golden Fleece" hangs in a grove sacred to Ares, until its theft by Jason. The Birds of Ares ("Ornithes Areioi") drop feather darts in defense of the Amazons' shrine to Ares, as father of their queen, on a coastal island in the Black Sea.
Founding of Thebes.
Ares plays a central role in the founding myth of Thebes, as the progenitor of the water-dragon slain by Cadmus. The dragon's teeth were sown into the ground as if a crop and sprang up as the fully armored autochthonic Spartoi. Cadmus placed himself in the god's service for eight years to atone for killing the dragon. To further propitiate Ares, Cadmus married Harmonia, a daughter of Ares's union with Aphrodite. In this way, Cadmus harmonized all strife and founded the city of Thebes. In reality, Thebes came to dominate Boeotia's great and fertile plain, which in both history and myth was a battleground for competing polities. According to Plutarch, the plain was anciently described as "The dancing-floor of Ares".
Aphrodite.
In Homer's "Odyssey", in the tale sung by the bard in the hall of Alcinous, the Sun-god Helios once spied Ares and Aphrodite having sex secretly in the hall of Hephaestus, her husband. Helios reported the incident to Hephaestus. Contriving to catch the illicit couple in the act, Hephaestus fashioned a finely-knitted and nearly invisible net with which to snare them. At the appropriate time, this net was sprung, and trapped Ares and Aphrodite locked in very private embrace.
But Hephaestus was not satisfied with his revenge, so he invited the Olympian gods and goddesses to view the unfortunate pair. For the sake of modesty, the goddesses demurred, but the male gods went to witness the sight. Some commented on the beauty of Aphrodite, others remarked that they would eagerly trade places with Ares, but all who were present mocked the two. Once the couple was released, the embarrassed Ares returned to his homeland, Thrace, and Aphrodite went to Paphos.
In a much later interpolated detail, Ares posted the young soldier Alectryon, his companion in drinking and even love-making, by his door to warn them of Helios's arrival, as Helios would tell Hephaestus of Aphrodite's infidelity if the two were discovered; but Alectryon fell asleep on guard duty. Helios discovered the two and alerted Hephaestus. The furious Ares turned the sleepy Alectryon into a rooster, which now always announces the arrival of the sun in the morning, as a way of apologizing to Ares.
The Chorus of Aeschylus' "Suppliants" (written 463 BC) refers to Ares as Aphrodite's "mortal-destroying bedfellow". In the "Iliad", Ares helps the Trojans because of his affection for their divine protector, Aphrodite; she thus redirects his innate destructive savagery to her own purposes.
Giants.
In one archaic myth, related only in the "Iliad" by the goddess Dione to her daughter Aphrodite, two chthonic giants, the Aloadae, named Otus and Ephialtes, bound Ares in chains and imprisoned him in a bronze urn, where he remained for thirteen months, a lunar year. "And that would have been the end of Ares and his appetite for war, if the beautiful Eriboea, the young giants' stepmother, had not told Hermes what they had done," she related. In this, Burkert suspects "a festival of licence which is unleashed in the thirteenth month." Ares was held screaming and howling in the urn until Hermes rescued him, and Artemis tricked the Aloadae into slaying each other. In Nonnus's "Dionysiaca", in the war between Cronus and Zeus, Ares killed an unnamed giant son of Echidna who was allied with Cronus, and described as spitting "horrible poison" and having "snaky" feet.
In the 2nd century AD "Metamorphoses" of Antoninus Liberalis, when the monstrous Typhon attacked Olympus the gods transformed into animals and fled to Egypt; Ares changed into a fish, the Lepidotus (sacred to the Egyptian war-god Anhur). Liberalis's koine Greek text is a "completely inartistic" epitome of Nicander's now lost "Heteroeumena" (2nd century BC).
"Iliad".
In Homer's "Iliad", Ares has no fixed allegiance. He promises Athena and Hera that he will fight for the Achaeans but Aphrodite persuades him to side with the Trojans. During the war, Diomedes fights Hector and sees Ares fighting on the Trojans' side. Diomedes calls for his soldiers to withdraw. Zeus grants Athena permission to drive Ares from the battlefield. Encouraged by Hera and Athena, Diomedes thrusts with his spear at Ares. Athena drives the spear home, and all sides tremble at Ares's cries. Ares flees to Mount Olympus, forcing the Trojans to fall back. Ares overhears that his son Ascalaphus has been killed and wants to change sides again, rejoining the Achaeans for vengeance, disregarding Zeus's order that no Olympian should join the battle. Athena stops him. Later, when Zeus allows the gods to fight in the war again, Ares attacks Athena to avenge his previous injury. Athena overpowers him by striking him with a boulder.
Attendants.
Deimos ("Terror" or "Dread") and Phobos ("Fear") are Ares' companions in war, and according to Hesiod, are also his children by Aphrodite. Eris, the goddess of discord, or Enyo, the goddess of war, bloodshed, and violence, was considered the sister and companion of the violent Ares. In at least one tradition, Enyalius, rather than another name for Ares, was his son by Enyo.
Ares may also be accompanied by Kydoimos, the daemon of the din of battle; the Makhai ("Battles"); the "Hysminai" ("Acts of manslaughter"); Polemos, a minor spirit of war, or only an epithet of Ares, since it has no specific dominion; and Polemos's daughter, Alala, the goddess or personification of the Greek war-cry, whose name Ares uses as his own war-cry. Ares's sister Hebe ("Youth") also draws baths for him.
According to Pausanias, local inhabitants of Therapne, Sparta, recognized Thero, "feral, savage," as a nurse of Ares.
Offspring and affairs.
Though Ares plays a relatively limited role in Greek mythology as represented in literary narratives, his numerous love affairs and abundant offspring are often alluded to.
The union of Ares and Aphrodite created the gods Eros, Anteros, Phobos, Deimos, and Harmonia. Other versions include Alcippe as one of his daughters.
Cycnus (Κύκνος) of Macedonia was a son of Ares who tried to build a temple to his father with the skulls and bones of guests and travellers. Heracles fought him and, in one account, killed him. In another account, Ares fought his son's killer but Zeus parted the combatants with a thunderbolt.
Ares had a romantic liaison with Eos, the goddess of the dawn. Aphrodite discovered them, and in anger she cursed Eos with insatiable lust for men.
By a woman named Teirene he had a daughter named Thrassa, who in turn had a daughter named Polyphonte. Polyphonte was cursed by Aphrodite to love and mate with a bear, producing two sons, Agrius and Oreius, who were hubristic toward the gods and had a habit of eating their guests. Zeus sent Hermes to punish them, and he chose to chop off their hands and feet. Since Polyphonte was descended from him, Ares stopped Hermes, and the two brothers came into an agreement to turn Polyphonte's family into birds instead. Oreius became an eagle owl, Agrius a vulture, and Polyphonte a strix, possibly a small owl, certainly a portent of war; Polyphonte's servant prayed not to become a bird of evil omen and Ares and Hermes fulfilled her wish by choosing the woodpecker for her, a good omen for hunters.
List of offspring and their mothers.
Sometimes poets and dramatists recounted ancient traditions, which varied, and sometimes they invented new details; later scholiasts might draw on either or simply guess. Thus while Phobos and Deimos were regularly described as offspring of Ares, others listed here such as Meleager, Sinope and Solymus were sometimes said to be children of Ares and sometimes given other fathers.
Mars.
The nearest counterpart of Ares among the Roman gods is Mars, a son of Jupiter and Juno, pre-eminent among the Roman army's military gods but originally an agricultural deity. As a father of Romulus, Rome's legendary founder, Mars was given an important and dignified place in ancient Roman religion, as a guardian deity of the entire Roman state and its people. Under the influence of Greek culture, Mars was identified with Ares, but the character and dignity of the two deities differed fundamentally. Mars was represented as a means to secure peace, and he was a father "(pater)" of the Roman people. In one tradition, he fathered Romulus and Remus through his rape of Rhea Silvia. In another, his lover, the goddess Venus, gave birth to Aeneas, the Trojan prince and refugee who "founded" Rome several generations before Romulus.
In the Hellenization of Latin literature, the myths of Ares were reinterpreted by Roman writers under the name of Mars. Greek writers under Roman rule also recorded cult practices and beliefs pertaining to Mars under the name of Ares. Thus in the classical tradition of later Western art and literature, the mythology of the two figures later became virtually indistinguishable.
Renaissance and later depictions.
In Renaissance and Neoclassical works of art, Ares's symbols are a spear and helmet, his animal is a dog, and his bird is the vulture. In literary works of these eras, Ares is replaced by the Roman Mars, a romantic emblem of manly valor rather than the cruel and blood-thirsty god of Greek mythology.
|
2042 | Alexander Grothendieck | Alexander Grothendieck (28 March 1928 – 13 November 2014) was a stateless mathematician (naturalized French in 1971) who became the leading figure in the creation of modern algebraic geometry. His research extended the scope of the field and added elements of commutative algebra, homological algebra, sheaf theory, and category theory to its foundations, while his so-called "relative" perspective led to revolutionary advances in many areas of pure mathematics. He is considered by many to be the greatest mathematician of the twentieth century.
Grothendieck began his productive and public career as a mathematician in 1949. In 1958, he was appointed a research professor at the Institut des hautes études scientifiques (IHÉS) and remained there until 1970, when, driven by personal and political convictions, he left following a dispute over military funding. He received the Fields Medal in 1966 for advances in algebraic geometry, homological algebra, and K-theory. He later became professor at the University of Montpellier and, while still producing relevant mathematical work, he withdrew from the mathematical community and devoted himself to political and religious pursuits (first Buddhism and later, a more Christian vision). In 1991, he moved to the French village of Lasserre in the Pyrenees, where he lived in seclusion, still working tirelessly on mathematics and his philosophical and religious thoughts until his death in 2014.
Life.
Family and childhood.
Grothendieck was born in Berlin to anarchist parents. His father, Alexander "Sascha" Schapiro (also known as Alexander Tanaroff), had Hasidic Jewish roots and had been imprisoned in Russia before moving to Germany in 1922, while his mother, Johanna "Hanka" Grothendieck, came from a Protestant German family in Hamburg and worked as a journalist. As teenagers, both of his parents had broken away from their early backgrounds. At the time of his birth, Grothendieck's mother was married to the journalist Johannes Raddatz, and his birth name was initially recorded as "Alexander Raddatz." That marriage was dissolved in 1929 and Schapiro acknowledged his paternity, but he never married Hanka Grothendieck. Grothendieck had a half-sister on his mother's side, Maidi.
Grothendieck lived with his parents in Berlin until the end of 1933, when his father moved to Paris to evade Nazism. His mother followed soon thereafter. Grothendieck was left in the care of Wilhelm Heydorn, a Lutheran pastor and teacher in Hamburg. According to Winfried Scharlau, during this time, his parents took part in the Spanish Civil War as non-combatant auxiliaries. However, others state that Schapiro fought in the anarchist militia.
World War II.
In May 1939, Grothendieck was put on a train in Hamburg for France. Shortly afterward his father was interned in Le Vernet. He and his mother were then interned in various camps from 1940 to 1942 as "undesirable dangerous foreigners." The first was the Rieucros Camp, where his mother contracted the tuberculosis that would eventually cause her death in 1957. While there, Grothendieck managed to attend the local school in Mende. Once, he escaped from the camp, intending to assassinate Hitler. Later, his mother Hanka was transferred to the Gurs internment camp for the remainder of World War II. Grothendieck was permitted to live apart from his mother.
In the village of Le Chambon-sur-Lignon, he was sheltered and hidden in local boarding houses or pensions, although he occasionally had to seek refuge in the woods during Nazi raids, surviving at times without food or water for several days.
His father was arrested under the Vichy anti-Jewish legislation and sent to the Drancy internment camp; in 1942 the Vichy government handed him over to the Germans, who murdered him at the Auschwitz concentration camp.
In Le Chambon, Grothendieck attended the Collège Cévenol (now known as the Le Collège-Lycée Cévenol International), a unique secondary school founded in 1938 by local Protestant pacifists and anti-war activists. Many of the refugee children hidden in Le Chambon attended Collège Cévenol, and it was at this school that Grothendieck apparently first became fascinated with mathematics.
Studies and contact with research mathematics.
After the war, the young Grothendieck studied mathematics in France, initially at the University of Montpellier where at first he did not perform well, failing such classes as astronomy. Working on his own, he rediscovered the Lebesgue measure. After three years of increasingly independent studies there, he went to continue his studies in Paris in 1948.
Initially, Grothendieck attended Henri Cartan's seminar at the École Normale Supérieure, but he lacked the necessary background to follow the high-powered seminar. On the advice of Cartan and André Weil, he moved to the University of Nancy, where two leading experts were working on Grothendieck's area of interest, topological vector spaces: Jean Dieudonné and Laurent Schwartz. The latter had recently won a Fields Medal. Schwartz showed his new student his latest paper, which ended with a list of 14 open questions concerning locally convex spaces. Grothendieck introduced new mathematical methods that enabled him to solve all of these problems within a few months.
In Nancy, he wrote his dissertation under those two professors on functional analysis, from 1950 to 1953. At this time he was a leading expert in the theory of topological vector spaces. In 1953 he moved to the University of São Paulo in Brazil, traveling on a Nansen passport, given that he had refused to take French nationality (as that would have entailed military service against his convictions). He stayed in São Paulo (apart from a lengthy visit to France from October 1953 to March 1954) until the end of 1954. His published work from the time spent in Brazil is still in the theory of topological vector spaces; it is there that he completed his last major work on that topic (on the "metric" theory of Banach spaces).
Grothendieck moved to Lawrence, Kansas at the beginning of 1955, and there he set his old subject aside in order to work in algebraic topology and homological algebra, and increasingly in algebraic geometry. It was in Lawrence that Grothendieck developed his theory of Abelian categories and the reformulation of sheaf cohomology based on them, leading to the very influential "Tôhoku paper".
In 1957 he was invited to visit Harvard by Oscar Zariski, but the offer fell through when he refused to sign a pledge promising not to work to overthrow the United States government—a refusal which, he was warned, threatened to land him in prison. The prospect of prison did not worry him, so long as he could have access to books.
Comparing Grothendieck during his Nancy years to the École Normale Supérieure-trained students at that time (Pierre Samuel, Roger Godement, René Thom, Jacques Dixmier, Jean Cerf, Yvonne Bruhat, Jean-Pierre Serre, and Bernard Malgrange), Leila Schneps said:
His early work on topological vector spaces from 1953 has since been successfully applied in physics and computer science, culminating in a relation between the Grothendieck inequality and the Einstein–Podolsky–Rosen paradox in quantum physics.
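The Grothendieck inequality mentioned above has a concise standard statement; in the sketch below, K_G denotes Grothendieck's constant, a universal constant whose exact value remains unknown:

```latex
% Grothendieck's inequality: for every real matrix (a_{ij}), the bilinear
% optimum over unit vectors x_i, y_j in a Hilbert space H is controlled,
% up to the universal constant K_G, by the optimum over signs.
\left| \sum_{i,j} a_{ij}\, \langle x_i, y_j \rangle \right|
  \;\le\; K_G \cdot \max_{s_i,\, t_j \in \{-1,+1\}}
  \left| \sum_{i,j} a_{ij}\, s_i t_j \right|
  \qquad \text{for all unit vectors } x_i, y_j \in H.
```

It is this bound on vector-valued versus sign-valued bilinear forms that Tsirelson later connected to the gap between quantum and classical correlations in Bell-type (EPR) experiments.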
IHÉS years.
In 1958, Grothendieck was installed at the Institut des hautes études scientifiques (IHÉS), a new privately funded research institute that, in effect, had been created for Jean Dieudonné and Grothendieck. Grothendieck attracted attention by an intense and highly productive activity of seminars there ("de facto" working groups drafting into foundational work some of the ablest French and other mathematicians of the younger generation). Grothendieck practically ceased publication of papers through the conventional, learned journal route. He was, however, able to play a dominant role in mathematics for approximately a decade, gathering a strong school.
Officially during this time, he had as students Michel Demazure (who worked on SGA3, on group schemes), Luc Illusie (cotangent complex), Michel Raynaud, Jean-Louis Verdier (co-founder of the derived category theory), and Pierre Deligne. Collaborators on the SGA projects also included Michael Artin (étale cohomology) and Nick Katz (monodromy theory and Lefschetz pencils). Jean Giraud worked out torsor theory extensions of nonabelian cohomology there as well. Many others such as David Mumford, Robin Hartshorne, Barry Mazur and C.P. Ramanujam were also involved.
"Golden Age".
Alexander Grothendieck's work during what is described as the "Golden Age" period at the IHÉS established several unifying themes in algebraic geometry, number theory, topology, category theory, and complex analysis. His first (pre-IHÉS) discovery in algebraic geometry was the Grothendieck–Hirzebruch–Riemann–Roch theorem, a generalisation of the Hirzebruch–Riemann–Roch theorem proved algebraically; in this context he also introduced K-theory. Then, following the programme he outlined in his talk at the 1958 International Congress of Mathematicians, he introduced the theory of schemes, developing it in detail in his "Éléments de géométrie algébrique" ("EGA") and providing new, more flexible and general foundations for algebraic geometry that have been adopted in the field since that time. He went on to introduce the étale cohomology theory of schemes, providing the key tools for proving the Weil conjectures, as well as crystalline cohomology and algebraic de Rham cohomology to complement it. Closely linked to these cohomology theories, he originated topos theory as a generalisation of topology (relevant also in categorical logic). He also provided an algebraic definition of fundamental groups of schemes and more generally the main structures of a categorical Galois theory. As a framework for his coherent duality theory, he also introduced derived categories, which were further developed by Verdier.
The results of his work on these and other topics were published in the "EGA" and in less polished form in the notes of the "Séminaire de géométrie algébrique" ("SGA") that he directed at the IHÉS.
Political activism.
Grothendieck's political views were radical and pacifistic. He strongly opposed both United States intervention in Vietnam and Soviet military expansionism. To protest against the Vietnam War, he gave lectures on category theory in the forests surrounding Hanoi while the city was being bombed. In 1966, he had declined to attend the International Congress of Mathematicians (ICM) in Moscow, where he was to receive the Fields Medal. He retired from scientific life around 1970 after he had found out that IHÉS was partly funded by the military. He returned to academia a few years later as a professor at the University of Montpellier.
While the issue of military funding was perhaps the most obvious explanation for Grothendieck's departure from the IHÉS, those who knew him say that the causes of the rupture ran more deeply. Pierre Cartier, a "visiteur de longue durée" ("long-term guest") at the IHÉS, wrote a piece about Grothendieck for a special volume published on the occasion of the IHÉS's fortieth anniversary. In that publication, Cartier notes that as the son of an antimilitary anarchist and one who grew up among the disenfranchised, Grothendieck always had a deep compassion for the poor and the downtrodden. As Cartier puts it, Grothendieck came to find Bures-sur-Yvette as "une cage dorée" ("a gilded cage"). While Grothendieck was at the IHÉS, opposition to the Vietnam War was heating up, and Cartier suggests that this also reinforced Grothendieck's distaste at having become a mandarin of the scientific world. In addition, after several years at the IHÉS, Grothendieck seemed to cast about for new intellectual interests. By the late 1960s, he had started to become interested in scientific areas outside mathematics. David Ruelle, a physicist who joined the IHÉS faculty in 1964, said that Grothendieck came to talk to him a few times about physics. Biology interested Grothendieck much more than physics, and he organized some seminars on biological topics.
In 1970, Grothendieck, with two other mathematicians, Claude Chevalley and Pierre Samuel, created a political group entitled "Survivre"—the name later changed to "Survivre et vivre". The group published a bulletin and was dedicated to antimilitary and ecological issues. It also developed strong criticism of the indiscriminate use of science and technology. Grothendieck devoted the next three years to this group and served as the main editor of its bulletin.
Although Grothendieck continued with mathematical enquiries, his standard mathematical career mostly ended when he left the IHÉS. After leaving the IHÉS, Grothendieck became a temporary professor at Collège de France for two years. He then became a professor at the University of Montpellier, where he became increasingly estranged from the mathematical community. He formally retired in 1988, a few years after having accepted a research position at the CNRS.
Manuscripts written in the 1980s.
While not publishing mathematical research in conventional ways during the 1980s, he produced several influential manuscripts with limited distribution, with both mathematical and biographical content.
Produced during 1980 and 1981, "La Longue Marche à travers la théorie de Galois" ("The Long March Through Galois Theory") is a 1600-page handwritten manuscript containing many of the ideas that led to the "Esquisse d'un programme". It also includes a study of Teichmüller theory.
In 1983, stimulated by correspondence with Ronald Brown and Tim Porter at Bangor University, Grothendieck wrote a 600-page manuscript entitled "Pursuing Stacks". It began with a letter addressed to Daniel Quillen. This letter and successive parts were distributed from Bangor (see External links below). Within these, in an informal, diary-like manner, Grothendieck explained and developed his ideas on the relationship between algebraic homotopy theory and algebraic geometry and prospects for a noncommutative theory of stacks. The manuscript, which is being edited for publication by G. Maltsiniotis, later led to another of his monumental works, "Les Dérivateurs". Written in 1991, this latter opus of approximately 2,000 pages further developed the homotopical ideas begun in "Pursuing Stacks". Much of this work anticipated the subsequent development during the mid-1990s of the motivic homotopy theory of Fabien Morel and Vladimir Voevodsky.
In 1984, Grothendieck wrote the proposal "Esquisse d'un Programme" ("Sketch of a Programme") for a position at the Centre National de la Recherche Scientifique (CNRS). It describes new ideas for studying the moduli space of complex curves. Although Grothendieck never published his work in this area, the proposal inspired other mathematicians to work in the area, becoming the source of dessin d'enfant theory and anabelian geometry. It was later published in two volumes as "Geometric Galois Actions" (Cambridge University Press, 1997).
During this period, Grothendieck also gave his consent to publishing some of his drafts for EGA on Bertini-type theorems ("EGA" V, published in Ulam Quarterly in 1992–1993 and later made available on the Grothendieck Circle web site in 2004).
In the 1,000-page autobiographical manuscript "Récoltes et semailles" (1986), Grothendieck describes his approach to mathematics and his experiences in the mathematical community, a community that initially accepted him in an open and welcoming manner but which he progressively perceived to be governed by competition and status. He complains about what he saw as the "burial" of his work and betrayal by his former students and colleagues after he had left the community. "Récoltes et semailles" is now available on the internet in the French original, and an English translation is underway. A Japanese translation in four volumes was prepared by Tsuji Yuichi, a friend of Grothendieck from the "Survivre" period; its first three volumes were published between 1989 and 1993, while the fourth volume was completed but has never been published. Grothendieck helped with the translation and wrote a preface for it. Parts of "Récoltes et semailles" have been translated into Spanish and into Russian; the Russian translation was published in Moscow. The French original was finally published in two volumes in January 2022, with additional texts by people of various professions who discuss certain aspects of the book.
In 1988, Grothendieck declined the Crafoord Prize with an open letter to the media. He wrote that he and other established mathematicians had no need for additional financial support and criticized what he saw as the declining ethics of the scientific community that was characterized by outright scientific theft that he believed had become commonplace and tolerated. The letter also expressed his belief that totally unforeseen events before the end of the century would lead to an unprecedented collapse of civilization. Grothendieck added however that his views were "in no way meant as a criticism of the Royal Academy's aims in the administration of its funds" and he added, "I regret the inconvenience that my refusal to accept the Crafoord prize may have caused you and the Royal Academy."
"La Clef des Songes", a 315-page manuscript written in 1987, is Grothendieck's account of how his consideration of the source of dreams led him to conclude that a deity exists. As part of the notes to this manuscript, Grothendieck described the life and the work of 18 "mutants", people whom he admired as visionaries far ahead of their time and heralding a new age. The only mathematician on his list was Bernhard Riemann. Influenced by the Catholic mystic Marthe Robin who was claimed to have survived on the Holy Eucharist alone, Grothendieck almost starved himself to death in 1988. His growing preoccupation with spiritual matters was also evident in a letter entitled "Lettre de la Bonne Nouvelle" sent to 250 friends in January 1990. In it, he described his encounters with a deity and announced that a "New Age" would commence on 14 October 1996.
The "Grothendieck Festschrift", published in 1990, was a three-volume collection of research papers to mark his sixtieth birthday in 1988.
More than 20,000 pages of Grothendieck's mathematical and other writings are held at the University of Montpellier and remain unpublished. They have been digitized for preservation and are freely available in open access through the Institut Montpelliérain Alexander Grothendieck portal.
Retirement into reclusion and death.
In 1991, Grothendieck moved to a new address that he did not share with his previous contacts in the mathematical community. Very few people visited him afterward. Local villagers helped sustain him with a more varied diet after he tried to live on a staple of dandelion soup. At some point, Leila Schneps and Pierre Lochak located him, then carried on a brief correspondence. Thus they became among "the last members of the mathematical establishment to come into contact with him". After his death, it was revealed that he lived alone in a house in Lasserre, Ariège, a small village at the foot of the Pyrenees.
In January 2010, Grothendieck wrote a letter entitled "Déclaration d'intention de non-publication" to Luc Illusie, claiming that all materials published in his absence had been published without his permission. He asked that none of his work be reproduced in whole or in part and that copies of this work be removed from libraries. He characterized a website devoted to his work as "an abomination". He may have reversed this position later in 2010.
On 13 November 2014, aged 86, Grothendieck died in the hospital of Saint-Girons, Ariège.
Citizenship.
Grothendieck was born in Weimar Germany. In 1938, aged ten, he moved to France as a refugee. Records of his nationality were destroyed in the fall of Nazi Germany in 1945, and he did not apply for French citizenship after the war. He thus remained stateless for most of his working life, traveling on a Nansen passport. Part of his reluctance to hold French nationality is attributed to his not wishing to serve in the French military, particularly during the Algerian War (1954–62). He eventually applied for French citizenship in the early 1980s, well past the age that exempted him from military service.
Family.
Grothendieck was very close to his mother to whom he dedicated his dissertation. She died in 1957 from the tuberculosis that she contracted in camps for displaced persons.
He had five children: a son with his landlady during his time in Nancy; three children, Johanna (1959), Alexander (1961), and Mathieu (1965) with his wife Mireille Dufour; and one child with Justine Skalba, with whom he lived in a commune in the early 1970s.
Mathematical work.
Grothendieck's early mathematical work was in functional analysis. Between 1949 and 1953 he worked on his doctoral thesis in this subject at Nancy, supervised by Jean Dieudonné and Laurent Schwartz. His key contributions include topological tensor products of topological vector spaces, the theory of nuclear spaces as foundational for Schwartz distributions, and the application of Lp spaces in studying linear maps between topological vector spaces. In a few years, he had become a leading authority on this area of functional analysis—to the extent that Dieudonné compares his impact in this field to that of Banach.
It is, however, in algebraic geometry and related fields where Grothendieck did his most important and influential work. From approximately 1955 he started to work on sheaf theory and homological algebra, producing the influential "Tôhoku paper" ("Sur quelques points d'algèbre homologique", published in the Tohoku Mathematical Journal in 1957) where he introduced abelian categories and applied their theory to show that sheaf cohomology may be defined as certain derived functors in this context.
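The Tôhoku reformulation can be summarized in a formula. The global-sections functor is left exact, and sheaf cohomology arises as its right derived functors, computed from an injective resolution (the paper shows that the category of sheaves of abelian groups on a space has enough injectives); the sketch below states this in standard modern notation:

```latex
% Sheaf cohomology as a derived functor: \Gamma(X, -) is left exact, and
% H^i is its i-th right derived functor, computed by applying \Gamma to an
% injective resolution 0 \to \mathcal{F} \to I^0 \to I^1 \to \cdots
H^i(X, \mathcal{F}) \;=\; R^i \Gamma(X, \mathcal{F})
  \;=\; H^i\big( \Gamma(X, I^{\bullet}) \big).
```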
Homological methods and sheaf theory had already been introduced in algebraic geometry by Jean-Pierre Serre and others, after sheaves had been defined by Jean Leray. Grothendieck took them to a higher level of abstraction and turned them into a key organising principle of his theory. He shifted attention from the study of individual varieties to his "relative point of view" (pairs of varieties related by a morphism), allowing a broad generalization of many classical theorems. The first major application was the relative version of Serre's theorem showing that the cohomology of a coherent sheaf on a complete variety is finite-dimensional; Grothendieck's theorem shows that the higher direct images of coherent sheaves under a proper map are coherent; this reduces to Serre's theorem over a one-point space.
In 1956, he applied the same thinking to the Riemann–Roch theorem, which recently had been generalized to any dimension by Hirzebruch. The Grothendieck–Riemann–Roch theorem was announced by Grothendieck at the initial Mathematische Arbeitstagung in Bonn, in 1957. It appeared in print in a paper written by Armand Borel with Serre. This result was his first work in algebraic geometry. Grothendieck went on to plan and execute a programme for rebuilding the foundations of algebraic geometry, which at the time were in a state of flux and under discussion in Claude Chevalley's seminar. He outlined his programme in his talk at the 1958 International Congress of Mathematicians.
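In the now-standard formulation (as in the Borel–Serre write-up), the Grothendieck–Riemann–Roch theorem reads as follows, for a proper morphism f: X → Y of smooth quasi-projective varieties, with ch the Chern character, td the Todd class, and T_f the relative tangent sheaf:

```latex
% Grothendieck–Riemann–Roch in its relative form:
\operatorname{ch}\big(f_{!}\mathcal{F}\big)
  \;=\; f_{*}\big( \operatorname{ch}(\mathcal{F}) \cdot \operatorname{td}(T_f) \big).
% Taking Y to be a point recovers the Hirzebruch–Riemann–Roch theorem:
\chi(X, \mathcal{F}) \;=\; \int_X \operatorname{ch}(\mathcal{F}) \cdot \operatorname{td}(T_X).
```

The relative statement illustrates Grothendieck's "relative point of view" described above: the classical absolute theorem is the special case of a map to a point.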
His foundational work on algebraic geometry is at a higher level of abstraction than all prior versions. He adapted the use of non-closed generic points, which led to the theory of schemes. Grothendieck also pioneered the systematic use of nilpotents. As 'functions' these can take only the value 0, but they carry infinitesimal information, in purely algebraic settings. His "theory of schemes" has become established as the best universal foundation for this field, because of its expressiveness as well as its technical depth. In that setting one can use birational geometry, techniques from number theory, Galois theory, commutative algebra, and close analogues of the methods of algebraic topology, all in an integrated way.
Grothendieck is noted for his mastery of abstract approaches to mathematics and his perfectionism in matters of formulation and presentation. Relatively little of his work after 1960 was published by the conventional route of the learned journal, circulating initially in duplicated volumes of seminar notes; his influence was to a considerable extent personal. His influence spilled over into many other branches of mathematics, for example the contemporary theory of D-modules. Although lauded as "the Einstein of mathematics", his work also provoked adverse reactions, with many mathematicians seeking out more concrete areas and problems.
"EGA", "SGA", "FGA".
The bulk of Grothendieck's published work is collected in the monumental, yet incomplete, "Éléments de géométrie algébrique" ("EGA") and "Séminaire de géométrie algébrique" ("SGA"). The collection, "Fondements de la Géometrie Algébrique" ("FGA"), which gathers together talks given in the Séminaire Bourbaki, also contains important material.
Grothendieck's work includes the invention of the étale and ℓ-adic cohomology theories, which explain an observation made by André Weil that there is a deep connection between the topological characteristics of a variety and its diophantine (number-theoretic) properties. For example, the number of solutions of an equation over a finite field reflects the topological nature of its solutions over the complex numbers. Weil had realized that to prove such a connection, one needed a new cohomology theory, but neither he nor any other expert saw how to accomplish this until such a theory was found by Grothendieck.
This program culminated in the proofs of the Weil conjectures, the last of which was settled by Grothendieck's student Pierre Deligne in the early 1970s after Grothendieck had largely withdrawn from mathematics.
Major mathematical contributions.
In Grothendieck's retrospective "Récoltes et Semailles", he identified twelve of his contributions that he believed qualified as "great ideas". In chronological order, they are:
Here the term "yoga" denotes a kind of "meta-theory" that may be used heuristically; Michel Raynaud describes the other terms "Ariadne's thread" and "philosophy" as effective equivalents.
Grothendieck wrote that, of these themes, the largest in scope was topoi, as they synthesized algebraic geometry, topology, and arithmetic. The theme that had been most extensively developed was schemes, which were the framework "par excellence" for eight of the other themes (all but 1, 5, and 12). Grothendieck wrote that the first and last themes, topological tensor products and regular configurations, were of more modest size than the others. Topological tensor products had played the role of a tool rather than of a source of inspiration for further developments; but he expected that regular configurations could not be exhausted within the lifetime of a mathematician who devoted himself to them. He believed that the deepest themes were motives, anabelian geometry, and Galois–Teichmüller theory.
Influence.
Grothendieck is considered by many to be the greatest mathematician of the twentieth century. In an obituary David Mumford and John Tate wrote:
Although mathematics became more and more abstract and general throughout the 20th century, it was Alexander Grothendieck who was the greatest master of this trend. His unique skill was to eliminate all unnecessary hypotheses and burrow into an area so deeply that its inner patterns on the most abstract level revealed themselves–and then, like a magician, show how the solution of old problems fell out in straightforward ways now that their real nature had been revealed.
By the 1970s, Grothendieck's work was seen as influential not only in algebraic geometry and the allied fields of sheaf theory and homological algebra, but also in logic, in the field of categorical logic.
Geometry.
Grothendieck approached algebraic geometry by clarifying the foundations of the field, and by developing mathematical tools intended to prove a number of notable conjectures. Algebraic geometry has traditionally meant the understanding of geometric objects, such as algebraic curves and surfaces, through the study of the algebraic equations for those objects. Properties of algebraic equations are in turn studied using the techniques of ring theory. In this approach, the properties of a geometric object are related to the properties of an associated ring. The space (e.g., real, complex, or projective) in which the object is defined, is extrinsic to the object, while the ring is intrinsic.
Grothendieck laid a new foundation for algebraic geometry by making intrinsic spaces ("spectra") and associated rings the primary objects of study. To that end, he developed the theory of schemes that informally can be thought of as topological spaces on which a commutative ring is associated to every open subset of the space. Schemes have become the basic objects of study for practitioners of modern algebraic geometry. Their use as a foundation allowed geometry to absorb technical advances from other fields.
His generalization of the classical Riemann–Roch theorem related topological properties of complex algebraic curves to their algebraic structure and now bears his name, being called "the Grothendieck–Hirzebruch–Riemann–Roch theorem". The tools he developed to prove this theorem started the study of algebraic and topological K-theory, which explores the topological properties of objects by associating them with rings. After direct contact with Grothendieck's ideas at the Bonn Arbeitstagung, topological K-theory was founded by Michael Atiyah and Friedrich Hirzebruch.
Cohomology theories.
Grothendieck's construction of new cohomology theories, which use algebraic techniques to study topological objects, has influenced the development of algebraic number theory, algebraic topology, and representation theory. As part of this project, his creation of topos theory, a category-theoretic generalization of point-set topology, has influenced the fields of set theory and mathematical logic.
The Weil conjectures were formulated in the later 1940s as a set of mathematical problems in arithmetic geometry. They describe properties of local zeta functions, analytic invariants that encode the number of points on an algebraic curve or higher-dimensional variety over finite fields. Grothendieck's discovery of ℓ-adic étale cohomology, the first example of a Weil cohomology theory, opened the way for a proof of the Weil conjectures, ultimately completed in the 1970s by his student Pierre Deligne. Grothendieck's large-scale approach has been called a "visionary program". The ℓ-adic cohomology then became a fundamental tool for number theorists, with applications to the Langlands program.
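The link between point counts and cohomology described above is made precise by the Grothendieck–Lefschetz trace formula. Writing N_m for the number of points of a smooth projective variety X of dimension n over the field with q^m elements, and F for the Frobenius acting on ℓ-adic étale cohomology, the local zeta function factors as:

```latex
% The local zeta function packages the point counts N_m, and the trace
% formula expresses it through characteristic polynomials of Frobenius
% on the \ell-adic étale cohomology of \bar{X}.
Z(X, t) \;=\; \exp\!\Big( \sum_{m \ge 1} N_m \frac{t^m}{m} \Big)
  \;=\; \prod_{i=0}^{2n}
  \det\!\big( 1 - t\,F \;\big|\; H^i_{\text{ét}}(\bar{X}, \mathbb{Q}_\ell) \big)^{(-1)^{i+1}}.
```

In particular Z(X, t) is a rational function of t, one of the Weil conjectures; the remaining conjectures concern the functional equation and the absolute values of the Frobenius eigenvalues.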
Grothendieck's conjectural theory of motives was intended to be the "ℓ-adic" theory but without the choice of "ℓ", a prime number. It did not provide the intended route to the Weil conjectures, but has been behind modern developments in algebraic K-theory, motivic homotopy theory, and motivic integration. This theory, Daniel Quillen's work, and Grothendieck's theory of Chern classes, are considered the background to the theory of algebraic cobordism, another algebraic analogue of topological ideas.
Category theory.
Grothendieck's emphasis on the role of universal properties across varied mathematical structures brought category theory into the mainstream as an organizing principle for mathematics in general. Among its uses, category theory creates a common language for describing similar structures and techniques seen in many different mathematical systems. His notion of abelian category is now the basic object of study in homological algebra. The emergence of a separate mathematical discipline of category theory has been attributed to Grothendieck's influence, although unintentional.
In popular culture.
The novel "Colonel Lágrimas" ("Colonel Tears" in English, published by Restless Books) by Puerto Rican–Costa Rican writer Carlos Fonseca is a semi-biographical novel about Grothendieck.
The band "Stone Hill All Stars" have a song named after Alexander Grothendieck.
In the novel "When We Cease to Understand the World", Benjamin Labatut dedicates one chapter to the story of Grothendieck.
In the novel "The Passenger" and its sequel "Stella Maris" by Cormac McCarthy, one of the main characters is a student of Grothendieck.
Alcoholics Anonymous (AA) is a global peer-led mutual aid fellowship dedicated to abstinence-based recovery from alcoholism through its spiritually-inclined twelve-step program. Following its twelve traditions, AA is non-professional, non-denominational, apolitical and unaffiliated. In 2020 AA estimated its worldwide membership to be over two million with 75% of those in the U.S.—its country of origin—and Canada.
AA’s traditions keep it from officially addressing the disease model of alcoholism even though its program is a sympathetic response to it. Nevertheless, many AA members have helped make the model popular. Regarding its effectiveness, a 2020 scientific review showed that within all observed demographic groups, clinical interventions increasing AA participation (AA Twelve Step Facilitation, AA/TSF) had higher abstinence rates compared to other well-established treatments. Most studies in the review also found that AA/TSF led to lower health costs.
In 1935, the recognized start of AA, Bill Wilson (Bill W.) first commiserated alcoholic-to-alcoholic with Bob Smith (Dr. Bob). Meeting through AA's immediate precursor, the Christian revivalist Oxford Group, they aided other alcoholics there until forming AA. In 1939 the new fellowship, then mostly male and white, published "Alcoholics Anonymous: The Story of How More Than One Hundred Men Have Recovered From Alcoholism," also known as the Big Book, from which AA draws its name. Since its first edition the Big Book has contained AA's twelve steps, and beginning in 1950 with its second edition it has also included the twelve traditions, created so that AA would remain what Wilson called a "benign anarchy".
AA’s twelve steps are a suggested and continuing program of spiritual improvement and of better conduct that goes beyond simply abstaining from alcohol. In the early steps members admit to being alcoholic with unmanageable lives and ready for the help of a "higher power". In the middle steps resentments, character defects and misdeeds are cataloged. Corrections and atonements, if possible, are then attempted. The penultimate step has members pray and meditate to seek guidance from their chosen higher power. Finally, the 12th step has members preserving their sobriety by working with other alcoholics. Though not explicitly prescribed, this is typically done by taking on sponsees. Throughout the steps, divining and following the will of God "as we understood Him" is urged, but differing practices and beliefs, including those of atheists and other non-theists, are accepted and accommodated.
AA's twelve traditions are advisory guidelines for members, groups and the rest of its organization. The traditions set a desire to stop drinking as the only requirement for AA membership, and recovery from alcoholism as AA's stated primary purpose. Also, avoidance of dogma, hierarchies and entanglements in public controversies is suggested. Without threat of retribution or means of enforcement, the traditions urge members to remain anonymous in public media and not to use AA for gaining personal wealth, property or public prestige. Moreover, all AA groups are autonomous and supported solely by their members' voluntary contributions. As with all of AA, groups should reject outside donations and not lend AA's name to other organizations or causes.
With AA's permission, other fellowships such as Narcotics Anonymous and Al-Anon have adapted the twelve steps and the twelve traditions to their addiction recovery programs.
History.
AA was founded on 10 June 1935 but AA's origins are said to have begun when the renowned psychotherapist Carl Jung inspired Rowland H., an otherwise hopeless drunk, to seek a spiritual solution by sending him to the Oxford Group— a non-denominational, altruistic Christian movement modeled after first-century Christianity. Ebby Thacher got sober in that same Oxford Group and reached out to help his drinking buddy Bill Wilson. Thacher approached Wilson saying that he had "got religion", was sober, and that Wilson could do the same if he set aside objections and instead formed a personal idea of God, "another power" or "higher power". Feeling a "kinship of common suffering", Wilson attended his first group gathering, although he was drunk. Within days, Wilson admitted himself to the Charles B. Towns Hospital after drinking four beers on the way—the last alcohol he ever drank. Under the care of Dr. William Duncan Silkworth (an early benefactor of AA), Wilson's detox included the deliriant belladonna. At the hospital, a despairing Wilson experienced a bright flash of light, which he felt to be God revealing himself.
Following his hospital discharge, Wilson joined the Oxford Group and tried to recruit other alcoholics to the group. These early efforts to help others kept him sober, but were ineffective in getting anyone else to join the group and get sober. Dr. Silkworth suggested that Wilson place less stress on religion (as required by The Oxford Group) and more on the science of treating alcoholism.
Wilson's first success came during a business trip to Akron, Ohio, where he was introduced to Robert Smith, a surgeon and Oxford Group member who was unable to stay sober. After thirty days of working with Wilson, Smith drank his last drink on 10 June 1935, the date marked by AA for its anniversaries.
The first female member, Florence Rankin, joined AA in March 1937, and the first non-Protestant member, a Roman Catholic, joined in 1939. The first Black AA group was established in 1945 in Washington, D.C. by Jim S., an African-American physician from Virginia.
While writing the Big Book in the several years after 1935, Wilson developed the Twelve Steps, which were influenced by the Oxford Group's 6 steps and various readings, including William James's "The Varieties of Religious Experience".
The Big Book, the Twelve Steps, and the Twelve Traditions.
To share their method, Wilson and other members wrote the initially-titled book, "Alcoholics Anonymous: The Story of How More Than One Hundred Men Have Recovered from Alcoholism", from which AA drew its name. Informally known as "The Big Book" (with its first 164 pages virtually unchanged since the 1939 edition), it suggests a twelve-step program in which members admit that they are powerless over alcohol and need help from a "higher power". They seek guidance and strength through prayer and meditation from God or a higher power of their own understanding; take a moral inventory with care to include resentments; list and become ready to remove character defects; list and make amends to those harmed; continue to take a moral inventory, pray, meditate, and try to help other alcoholics recover. The second half of the book, "Personal Stories" (subject to additions, removal, and retitling in subsequent editions), is made of AA members' redemptive autobiographical sketches.
In 1941, interviews on American radio and favorable articles in US magazines, including a piece by Jack Alexander in "The Saturday Evening Post", led to increased book sales and membership. By 1946, as the growing fellowship quarreled over structure, purpose, authority, finances and publicity, Wilson began to form and promote what became known as AA's "Twelve Traditions," which are guidelines for an altruistic, unaffiliated, non-coercive, and non-hierarchical structure that limited AA's purpose to only helping alcoholics on a non-professional level while shunning publicity. Eventually, he gained formal adoption and inclusion of the Twelve Traditions in all future editions of the Big Book. At the 1955 conference in St. Louis, Missouri, Wilson relinquished stewardship of AA to the General Service Conference, as AA had grown to millions of members internationally.
Organization and finances.
AA says it is "not organized in the formal or political sense", and Wilson, borrowing the phrase from anarchist theorist Peter Kropotkin, called it a "benign anarchy". In Ireland, Shane Butler said that AA "looks like it couldn't survive as there's no leadership or top-level telling local cumanns what to do, but it has worked and proved itself extremely robust". Butler explained that "AA's 'inverted pyramid' style of governance has helped it to avoid many of the pitfalls that political and religious institutions have encountered since it was established here in 1946."
In 2018, AA had 2,087,840 members and 120,300 AA groups worldwide. The Twelve Traditions informally guide how individual AA groups function, and the Twelve Concepts for World Service guide how the organization is structured globally.
A member who accepts a service position or an organizing role is a "trusted servant" with terms rotating and limited, typically lasting three months to two years and determined by group vote and the nature of the position. Each group is a self-governing entity, with AA World Services acting only in an advisory capacity. AA is served entirely by alcoholics, except for seven "nonalcoholic friends of the fellowship" of the 21-member AA Board of Trustees.
AA groups are self-supporting, relying on voluntary donations from members to cover expenses. The AA General Service Office (GSO) limits contributions to US$3,000 a year. Above the group level, AA may hire outside professionals for services that require specialized expertise or full-time responsibilities.
Like individual groups, the GSO is self-supporting. AA receives proceeds from books and literature that constitute more than 50% of the income for its GSO. In keeping with AA's Seventh Tradition, the Central Office is fully self-supporting through the sale of literature and related products, and the voluntary donations of AA members and groups. It does not accept donations from people or organizations outside of AA.
In keeping with AA's Eighth Tradition, the Central Office employs special workers who are compensated financially for their services, but their services do not include working with alcoholics in need (the "12th Step"). All 12th Step calls that come to the Central Office are handed to sober AA members who have volunteered to handle these calls. It also maintains service centers, which coordinate activities such as printing literature, responding to public inquiries, and organizing conferences. Other International General Service Offices (Australia, Costa Rica, Russia, etc.) are independent of AA World Services in New York.
Program.
AA's program extends beyond abstaining from alcohol. Its goal is to effect enough change in the alcoholic's thinking "to bring about recovery from alcoholism" through "an entire psychic change," or spiritual awakening. A spiritual awakening is meant to be achieved by taking the Twelve Steps, and sobriety is furthered by volunteering for AA and regular AA meeting attendance or contact with AA members. Members are encouraged to find an experienced fellow alcoholic, called a sponsor, to help them understand and follow the AA program. The sponsor should preferably have experienced all twelve of the steps, be the same sex as the sponsored person, and refrain from imposing personal views on the sponsored person. Following the helper therapy principle, sponsors in AA may benefit from their relationship with their charges, as "helping behaviors" correlate with increased abstinence and lower probabilities of binge drinking.
AA shares the view that acceptance of one's inherent limitations is critical to finding one's proper place among other humans and God. Such ideas are described as "Counter-Enlightenment" because they are contrary to the Enlightenment's ideal that humans have the capacity to make their lives and societies a heaven on Earth using their own power and reason. After evaluating AA's literature and observing AA meetings for sixteen months, sociologists David R. Rudy and Arthur L. Greil found that for an AA member to remain sober, a high level of commitment is necessary. This commitment is facilitated by a change in the member's worldview. They argue that to help members stay sober, AA must provide an all-encompassing worldview while creating and sustaining an atmosphere of transcendence in the organization. To be all-encompassing, AA's ideology emphasizes tolerance rather than a narrow religious worldview that may make the organization unpalatable to potential members and thereby limit its effectiveness. AA's emphasis on the spiritual nature of its program, however, is necessary to institutionalize a feeling of transcendence. A tension results from the risk that the necessity of transcendence, if taken too literally, would compromise AA's efforts to maintain a broad appeal. As this tension is an integral part of AA, Rudy and Greil argue that AA is best described as a "quasi-religious organization".
Meetings.
AA meetings are gatherings where recovery from alcoholism is discussed. One perspective sees them as "quasi-ritualized therapeutic sessions run by and for alcoholics". There are a variety of meeting types, some of which are listed below. At some point during the meeting a basket is passed around for voluntary donations. AA's 7th tradition requires that groups be self-supporting, "declining outside contributions". Weekly meetings are listed in local AA directories in print, online and in apps.
Open vs Closed Meetings.
"Open" meetings welcome anyone—nonalcoholics can attend as observers. Meetings listed as "closed" welcome those with a self-professed "desire to stop drinking," which cannot be challenged by another member on any grounds.
Speaker Meetings.
At speaker meetings one or more members come to tell their stories.
Big Book Meetings.
At Big Book meetings, attendees read from the AA Big Book and discuss it.
Discussion Meetings.
There are also meetings with or without a topic that allow participants to speak up or "share".
Online vs. Offline Meetings.
Online meetings are digital meetings held on platforms such as Zoom. Offline meetings, also called "face to face," "brick and mortar," or "in-person" meetings, are held in a shared physical real-world location. Some meetings are hybrid meetings, where people can meet in a specified physical location, but people can also join the meeting virtually.
Specialized Meetings.
AA meetings do not exclude other alcoholics, though some meetings cater to specific demographics such as gender, profession, age, sexual orientation, or culture. Meetings in the United States are held in a variety of languages including Armenian, English, Farsi, Finnish, French, Japanese, Korean, Russian, and Spanish.
Meeting formats.
While AA has pamphlets that suggest meeting formats, groups have the autonomy to hold and conduct meetings as they wish "except in matters affecting other groups or AA as a whole". Different cultures affect ritual aspects of meetings, but around the world "many particularities of the AA meeting format can be observed at almost any AA gathering".
Confidentiality.
In the Fifth Step, AA members typically reveal their own past misconduct to their sponsors. US courts have not extended the status of privileged communication, such as physician-patient privilege or clergy–penitent privilege, to communications between an AA member and their sponsor.
Spirituality.
Some medical professionals have criticized 12-step programs as "a cult that relies on God as the mechanism of action" and as "overly theistic and outdated". Others have cited the necessity of a "higher power" in formal AA as creating dependence on outside factors rather than internal efficacy. A 2010 study found increased attendance at AA meetings was associated with increased spirituality and decreased frequency and intensity of alcohol use. Since the mid-1970s, several 'agnostic' or 'no-prayer' AA groups have begun across the US, Canada, and other parts of the world, which hold meetings that adhere to a tradition allowing alcoholics to freely express their doubts or disbelief that spirituality will help their recovery, and these meetings forgo the use of opening or closing prayers.
Disease concept of alcoholism.
Informally if not officially, AA's membership has helped popularize the disease concept of alcoholism, which had appeared in the eighteenth century. Though AA usually avoids the term "disease", 1973 conference-approved literature said "we had the disease of alcoholism." Regardless of official positions, since AA's inception, most members have believed alcoholism to be a disease.
AA's Big Book calls alcoholism "an illness which only a spiritual experience will conquer." Ernest Kurtz says this is "The closest the book Alcoholics Anonymous comes to a definition of alcoholism." Somewhat divergently in his introduction to The Big Book, non-member and early benefactor William Silkworth said those unable to moderate their drinking suffer from an allergy. In presenting the doctor's postulate, AA said "The doctor's theory that we have an allergy to alcohol interests us. As laymen, our opinion as to its soundness may, of course, mean little. But as ex-problem drinkers, we can say that his explanation makes good sense. It explains many things for which we cannot otherwise account." AA later acknowledged that "alcoholism is not a true allergy, the experts now inform us." Wilson explained in 1960 why AA had refrained from using the term "disease":
Since then the medical and scientific communities have defined alcoholism as an "addictive disease" (aka Alcohol Use Disorder, Severe, Moderate, or Mild). The ten criteria are: first, alcoholism is a primary illness not caused by other illnesses nor by personality or character defects; second, an addiction gene is part of its etiology; third, alcoholism has predictable symptoms; fourth, it is progressive, becoming more severe even after long periods of abstinence; fifth, it is chronic and incurable; sixth, alcoholic drinking or other drug use persists in spite of negative consequences and efforts to quit; seventh, brain chemistry and neural functions change so that alcohol is perceived as necessary for survival; eighth, it produces physical dependence and life-threatening withdrawal; ninth, it is a terminal illness; tenth, alcoholism can be treated and can be kept in remission.
Canadian and United States demographics.
AA's New York General Service Office regularly surveys AA members in North America. Its 2014 survey of over 6,000 members in Canada and the United States concluded that, in North America, AA members who responded to the survey were 62% male and 38% female. The survey found that 89% of AA members were white.
Average member sobriety is slightly under 10 years with 36% sober more than ten years, 13% sober from five to ten years, 24% sober from one to five years, and 27% sober less than one year. Before coming to AA, 63% of members received some type of treatment or counseling, such as medical, psychological, or spiritual. After coming to AA, 59% received outside treatment or counseling. Of those members, 84% said that outside help played an important part in their recovery.
The same survey showed that AA received 32% of its membership from other members, another 32% from treatment facilities, 30% were self-motivated to attend AA, 12% of its membership from court-ordered attendance, and only 1% of AA members decided to join based on information obtained from the Internet. People taking the survey were allowed to select multiple answers for what motivated them to join AA.
Relationship with institutions.
Hospitals.
Many AA meetings take place in treatment facilities. Carrying the message of AA into hospitals was how the co-founders of AA first remained sober. They discovered great value in working with alcoholics who are still suffering, and that even if the alcoholic they were working with did not stay sober, they did. Bill Wilson wrote, "Practical experience shows that nothing will so much insure immunity from drinking as intensive work with other alcoholics". Bill Wilson visited Towns Hospital in New York City in an attempt to help the alcoholics who were patients there in 1934. At St. Thomas Hospital in Akron, Ohio, Smith worked with still more alcoholics. In 1939, a New York mental institution, Rockland State Hospital, was one of the first institutions to allow AA hospital groups. Service to corrections and treatment facilities used to be combined until the General Service Conference, in 1977, voted to dissolve its Institutions Committee and form two separate committees, one for treatment facilities, and one for correctional facilities.
Prisons.
In the United States and Canada, AA meetings are held in hundreds of correctional facilities. The AA General Service Office has published a workbook with detailed recommendations for methods of approaching correctional-facility officials with the intent of developing an in-prison AA program. In addition, AA publishes a variety of pamphlets specifically for the incarcerated alcoholic. Additionally, the AA General Service Office provides a pamphlet with guidelines for members working with incarcerated alcoholics.
United States court rulings.
United States courts have ruled that inmates, parolees, and probationers cannot be ordered to attend AA. Though AA itself was not deemed a religion, it was ruled that it contained "enough" religious components (variously described in "Griffin v. Coughlin" below as, inter alia, "religion", "religious activity", "religious exercise") to make coerced attendance at AA meetings a violation of the Establishment Clause of the First Amendment of the constitution. In 2007, the Ninth Circuit of the U.S. Court of Appeals stated that a parolee who was ordered to attend AA had standing to sue his parole office.
United States treatment industry.
In 1939, High Watch Recovery Center in Kent, Connecticut, was founded by Bill Wilson and Marty Mann. Sister Francis, who owned the farm, tried to gift the spiritual retreat for alcoholics to Alcoholics Anonymous; however, citing the sixth tradition, Bill W. turned down the gift but agreed to have the facility run by a separate non-profit board composed of AA members. Bill Wilson and Marty Mann served on the High Watch board of directors for many years. High Watch was the first 12-step-based treatment center in the world and is the oldest still operating today.
In 1949, the Hazelden treatment center was founded and staffed by AA members, and since then many alcoholic rehabilitation clinics have incorporated AA's precepts into their treatment programs. 32% of AA's membership was introduced to it through a treatment facility.
Effectiveness.
There are numerous ways of measuring whether AA is successful, such as looking at abstinence, reduced drinking intensity, reduced alcohol-related consequences, alcohol addiction severity, and healthcare cost.
The effectiveness of AA (compared to other methods and treatments) has been challenged throughout the years, but recent high-quality meta-analyses drawing on quasi-experimental studies show that AA costs less than other treatments and results in increased abstinence. In longitudinal studies, AA appears to be about as effective as other abstinence-based support groups.
Because of the anonymous and voluntary nature of AA meetings, it has been difficult to perform randomized trials with them. Environmental and quasi-experimental studies suggest that AA can help alcoholics make positive changes.
In the past, some medical professionals have criticized 12-step programs as pseudoscientific and "a cult that relies on God as the mechanism of action". Until recently, ethical and operational issues had prevented robust randomized controlled trials from being conducted comparing 12-step programs directly to other approaches. More recent studies employing randomized and blinded trials have shown 12-step programs provide similar benefit compared to motivational enhancement therapy (MET) and cognitive behavioral therapy (CBT), and were more effective in producing continuous abstinence and remission compared to these approaches.
Cochrane 2020 review.
A 2020 Cochrane review concluded that "compared to other well-established treatments, clinical linkage using well-articulated Twelve-Step Facilitation (TSF) manualized interventions intended to increase Alcoholics Anonymous (AA) participation" are more effective than other established treatments, such as motivational enhancement therapy (MET) and cognitive-behavioral therapy (CBT), as measured by abstinence rates. Manualized TSF probably achieves additional desirable outcomes—such as fewer drinks per drinking day and less severe alcohol-related problems—at equivalent rates as other treatments, although evidence for such a conclusion comes from low to moderate certainty evidence "so should be regarded with caution".
In response to a concern expressed by another addiction researcher that "those more strongly committed to total abstinence after receiving AA/TSF were likely to experience more protracted 'slips' if they did for any reason drink", the Cochrane review authors stated that subjects who did not achieve abstinence did not have worse drinking outcomes overall.
Older studies.
A 2006 study by Rudolf H. Moos and Bernice S. Moos found a 67% abstinence rate 16 years later among the 24.9% of alcoholics who, on their own initiative, undertook extensive AA attendance. The study's results may be skewed by self-selection bias.
Project MATCH was a 1990s 8-year, multi-site, $27-million investigation that studied which types of alcoholics respond best to which forms of treatment.
Brandsma 1980 showed that Alcoholics Anonymous is more effective than no treatment whatsoever.
Membership retention.
In 2001–2002, the National Institute on Alcohol Abuse and Alcoholism (NIAAA) conducted the National Epidemiological Survey on Alcoholism and Related Conditions (NESARC). Similarly structured to the NLAES, the survey conducted in-person interviews with 43,093 individuals. Respondents were asked if they had ever attended a twelve-step meeting for an alcohol problem in their lifetime (the question was not AA-specific). 1441 (3.4%) of respondents answered the question affirmatively. Answers were further broken down into three categories: disengaged, those who started attending at some point in the past but had ceased attending at some point in the past year (988); continued engagement, those who started attending at some point in the past and continued to attend during the past year (348); and newcomers, those who started attending during the past year (105). In their discussion of the findings, Kaskutas et al. (2008) state that to study disengagement, only the disengaged and continued-engagement groups should be used (p. 270).
The popular press.
"The Sober Truth".
American psychiatrist Lance Dodes, in "The Sober Truth", says that research indicates that only five to eight percent of the people who go to one or more AA meetings achieve sobriety.
The 5–8% figure put forward by Dodes is controversial; other doctors say that the book uses "three separate, questionable, calculations that arrive at the 5–8% figure." Addiction specialists state that the book's conclusion that "[12-step] approaches are almost completely ineffective and even harmful in treating substance use disorders" is wrong. One review called Dodes' reasoning against AA success a "pseudostatistical polemic."
Dodes has not, as of March 2020, read the 2020 Cochrane review showing AA efficacy, but opposes the idea that a social network is needed to overcome substance abuse.
"The Irrationality of Alcoholics Anonymous".
In a 2015 article for "The Atlantic", Gabrielle Glaser criticized the dominance of AA in the treatment of addiction in the United States. Her article uses Lance Dodes's figures and a 2006 Cochrane report to state AA had a low success rate, but those figures were subsequently criticized by experts as outdated. The Glaser article incorrectly conflates the efficacy of treatment centers with the efficacy of Alcoholics Anonymous. The Glaser article says that "nothing about the 12-step approach draws on modern science", but a large amount of scientific research has been done with AA, showing that AA increases abstinence rates. The Glaser article criticizes 12-step programs for being "faith-based", but 12-step programs allow for a very wide diversity of spiritual beliefs, and there are a growing number of secular 12-step meetings.
Criticism.
Sexual advances ("thirteenth-stepping").
"Thirteenth-stepping" is a pejorative term for AA members approaching new members for dates. A study in the "Journal of Addiction Nursing" sampled 55 women in AA and found that 35% of these women had experienced a "pass" and 29% had felt seduced at least once in AA settings. This has also happened with new male members who received guidance from older female AA members pursuing sexual company. The authors suggest that both men and women must be prepared for this behavior or find male or female-only groups. Women-only meetings are a very prevalent part of AA culture, and AA has become more welcoming for women. AA's pamphlet on sponsorship suggests that men be sponsored by men and women be sponsored by women.
Alcoholics Anonymous World Services has a safety flier which states that "Unwanted sexual advances and predatory behaviors are in conflict with carrying the A.A. message of recovery."
Criticism of culture.
Stanton Peele argued that some AA groups apply the disease model to all problem drinkers, whether or not they are "full-blown" alcoholics. Along with Nancy Shute, Peele has advocated that besides AA, other options should be readily available to those problem drinkers who can manage their drinking with the right treatment. The Big Book says "moderate drinkers" and "a certain type of hard drinker" can stop or moderate their drinking. The Big Book suggests no program for these drinkers, but instead seeks to help drinkers without "power of choice in drink."
In 1983, a review stated that the AA program's focus on admission of having a problem increases deviant stigma and strips members of their previous cultural identity, replacing it with the deviant identity. A 1985 study based on observations of AA meetings warned of detrimental iatrogenic effects of the twelve-step philosophy and concluded that AA uses many methods that are also used by cults. A later review disagreed, stating that AA's program bore little resemblance to religious cult practices. In 2014, Vaillant published a paper making the case that Alcoholics Anonymous is not a cult.
Literature.
Alcoholics Anonymous publishes several books, reports, pamphlets, and other media, including a periodical known as the "AA Grapevine". Two books are used primarily: "Alcoholics Anonymous" (the "Big Book") and "Twelve Steps and Twelve Traditions", the latter explaining AA's fundamental principles in depth. The full text of each of these two books is available on the AA website at no charge.
Alpha compositing

In computer graphics, alpha compositing or alpha blending is the process of combining one image with a background to create the appearance of partial or full transparency. It is often useful to render picture elements (pixels) in separate passes or layers and then combine the resulting 2D images into a single, final image called the composite. Compositing is used extensively in film when combining computer-rendered image elements with live footage. Alpha blending is also used in 2D computer graphics to put rasterized foreground elements over a background.
In order to combine the picture elements of the images correctly, it is necessary to keep an associated "matte" for each element in addition to its color. This matte layer contains the coverage information—the shape of the geometry being drawn—making it possible to distinguish between parts of the image where something was drawn and parts that are empty.
Although the most basic operation of combining two images is to put one over the other, there are many operations, or blend modes, that are used.
History.
The concept of an alpha channel was introduced by Alvy Ray Smith and Ed Catmull in the late 1970s at the New York Institute of Technology Computer Graphics Lab. Bruce A. Wallace derived the same straight over operator based on a physical reflectance/transmittance model in 1981. A 1984 paper by Thomas Porter and Tom Duff introduced premultiplied alpha using a geometrical approach.
The use of the term "alpha" is explained by Smith as follows: "We called it that because of the classic linear interpolation formula αA + (1 − α)B that uses the Greek letter α (alpha) to control the amount of interpolation between, in this case, two images A and B". That is, when compositing image A atop image B, the value of α in the formula is taken directly from A's alpha channel.
Description.
In a 2D image a color combination is stored for each picture element (pixel), often a combination of red, green and blue (RGB). When alpha compositing is in use, each pixel has an additional numeric value stored in its alpha channel, with a value ranging from 0 to 1. A value of 0 means that the pixel is fully transparent and the color in the pixel beneath will show through. A value of 1 means that the pixel is fully opaque.
With the existence of an alpha channel, it is possible to express compositing image operations using a "compositing algebra". For example, given two images "A" and "B", the most common compositing operation is to combine the images so that "A" appears in the foreground and "B" appears in the background. This can be expressed as "A" over "B". In addition to over, Porter and Duff defined the compositing operators in, held out by (the phrase refers to holdout matting and is usually abbreviated out), atop, and xor (and the reverse operators rover, rin, rout, and ratop) from a consideration of choices in blending the colors of two pixels when their coverage is, conceptually, overlaid orthogonally:
As an example, the over operator can be accomplished by applying the following formula to each pixel:

α_o = α_a + α_b · (1 − α_a)
C_o = (C_a · α_a + C_b · α_b · (1 − α_a)) / α_o

Here C_o, C_a and C_b stand for the color components of the pixels in the result, image A and image B respectively, applied to each color channel (red/green/blue) individually, whereas α_o, α_a and α_b are the alpha values of the respective pixels.
The over operator is, in effect, the normal painting operation (see Painter's algorithm). The in and out operators are the alpha compositing equivalent of clipping. The two use only the alpha channel of the second image and ignore the color components. In addition, plus defines additive blending.
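As a concrete illustration, the over operation for straight (non-premultiplied) alpha can be sketched in a few lines of Python. The helper name and the 0-to-1 value convention are choices for this example, not part of any standard API:

```python
def over(color_a, alpha_a, color_b, alpha_b):
    """Composite straight-alpha pixel A over pixel B.

    color_a, color_b: (r, g, b) tuples with components in [0, 1];
    alpha_a, alpha_b: opacities in [0, 1]. Returns (color, alpha).
    """
    alpha_o = alpha_a + alpha_b * (1 - alpha_a)   # resulting coverage
    if alpha_o == 0:                              # fully transparent result
        return (0.0, 0.0, 0.0), 0.0
    color_o = tuple((ca * alpha_a + cb * alpha_b * (1 - alpha_a)) / alpha_o
                    for ca, cb in zip(color_a, color_b))
    return color_o, alpha_o

# Half-opaque red over an opaque blue background:
print(over((1, 0, 0), 0.5, (0, 0, 1), 1.0))  # ((0.5, 0.0, 0.5), 1.0)
```

Note the division by the resulting alpha: in the straight representation the blended color must be re-normalized by the combined coverage, which is one reason the premultiplied form discussed below is computationally simpler.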
Straight versus premultiplied.
If an alpha channel is used in an image, there are two common representations that are available: straight (unassociated) alpha and premultiplied (associated) alpha.
Comparison.
The most significant advantage of premultiplied alpha is that it allows for correct blending, interpolation, and filtering. Ordinary interpolation without premultiplied alpha leads to RGB information leaking out of fully transparent (A=0) regions, even though this RGB information is ideally invisible. When interpolating or filtering images with abrupt borders between transparent and opaque regions, this can result in borders of colors that were not visible in the original image. Errors also occur in areas of semitransparency because the RGB components are not correctly weighted, giving incorrectly high weighting to the color of the more transparent (lower alpha) pixels.
Premultiplied alpha may also be used to allow regions of regular alpha blending (e.g. smoke) and regions with additive blending mode (e.g. flame and glitter effects) to be encoded within the same image. Emission with no occlusion is represented by an RGBA quadruple with zero alpha, such as (0.4, 0.3, 0.2, 0.0).
Another advantage of premultiplied alpha is performance; in certain situations, it can reduce the number of multiplication operations (e.g. if the image is used many times during later compositing). The Porter–Duff operations have a simple form only in premultiplied alpha. Some rendering pipelines expose a "straight alpha" API surface but convert the data into premultiplied alpha internally for performance.
One disadvantage of premultiplied alpha is that it can reduce the available relative precision in the RGB values when using integer or fixed-point representation for the color components. This may cause a noticeable loss of quality if the color information is later brightened or if the alpha channel is removed. In practice, this is not usually noticeable because during typical composition operations, such as over, the influence of the low-precision color information in low-alpha areas on the final output image is correspondingly reduced. This loss of precision also makes premultiplied images easier to compress with certain compression schemes, since they do not record the color variations hidden inside transparent regions and can allocate fewer bits to encode low-alpha areas. The same quantisation limits of low bit depths, such as 8 bits per channel, also affect imagery without alpha, however, so the precision argument against premultiplication is weaker than it first appears.
Examples.
Assuming that the pixel color is expressed using "straight" (non-premultiplied) RGBA tuples, a pixel value of (0, 0.7, 0, 0.5) implies a pixel that has 70% of the maximum green intensity and 50% opacity. If the color were fully green, its RGBA would be (0, 1, 0, 0.5). However, if this pixel uses premultiplied alpha, all of the RGB values (0, 0.7, 0) are multiplied, or scaled for occlusion, by the alpha value 0.5, which is appended to yield (0, 0.35, 0, 0.5). In this case, the 0.35 value for the G channel actually indicates 70% green emission intensity (with 50% occlusion). A pure green emission would be encoded as (0, 0.5, 0, 0.5). Knowing whether a file uses straight or premultiplied alpha is essential to correctly process or composite it, as a different calculation is required.
Emission with no occlusion cannot be represented in straight alpha. No conversion is available in this case.
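The straight-to-premultiplied conversion from the example above can be sketched as a pair of helpers. The function names are invented for this illustration, and values are assumed to lie in the 0-to-1 range:

```python
def straight_to_premultiplied(r, g, b, a):
    """Scale each color channel by alpha (associate coverage with color)."""
    return (r * a, g * a, b * a, a)

def premultiplied_to_straight(r, g, b, a):
    """Undo the multiplication; undefined when a == 0 (mapped to black here)."""
    if a == 0:
        return (0.0, 0.0, 0.0, 0.0)  # emission-only pixels cannot round-trip
    return (r / a, g / a, b / a, a)

print(straight_to_premultiplied(0, 0.7, 0, 0.5))  # (0.0, 0.35, 0.0, 0.5)
```

The zero-alpha branch makes the asymmetry explicit: a premultiplied pixel such as (0.4, 0.3, 0.2, 0.0) carries emission information that the straight representation simply cannot hold.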
Image formats supporting alpha channels.
The most popular image formats that support a full alpha channel are PNG and TIFF. GIF offers only binary transparency (each pixel is either fully transparent or fully opaque) and is considered inefficient when it comes to file size. Support for alpha channels is present in some video codecs, such as Animation and Apple ProRes 4444 of the QuickTime format, or in the Techsmith multi-format codec.
The BMP file format generally does not support an alpha channel. However, variants such as 32-bit (888-8) or 16-bit (444-4) BMP can store one, although not all systems or programs are able to read it. The channel is exploited mainly in some video games and particular applications, and specific programs have been created for producing such BMPs.
Gamma correction.
The RGB values of typical digital images do not directly correspond to the physical light intensities, but are rather compressed by a gamma correction function:

C' = C^(1/γ)

This transformation better utilizes the limited number of bits in the encoded image by choosing an exponent γ that better matches the non-linear human perception of luminance.
Accordingly, computer programs that deal with such images must decode the RGB values into a linear space (by undoing the gamma-compression), blend the linear light intensities, and re-apply the gamma compression to the result:

C_o' = ((C_a'^γ · α_a + C_b'^γ · α_b · (1 − α_a)) / α_o)^(1/γ)
When combined with premultiplied alpha, pre-multiplication is done in linear space, prior to gamma compression. This results in the following formula, where lowercase c denotes a premultiplied color component (c = C · α):

c_o' = (c_a'^γ + c_b'^γ · (1 − α_a))^(1/γ)
Note that only the color components undergo gamma-correction; the alpha channel is always linear.
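The decode, blend, re-encode sequence can be sketched as follows for one straight-alpha color channel. A simple power-law gamma of 2.2 is assumed (real sRGB uses a piecewise curve), and the helper names are invented for this illustration:

```python
GAMMA = 2.2

def to_linear(c):
    """Undo gamma compression: encoded value -> physical intensity."""
    return c ** GAMMA

def to_encoded(c):
    """Apply gamma compression: physical intensity -> encoded value."""
    return c ** (1 / GAMMA)

def blend_gamma_correct(ca, aa, cb, ab):
    """Apply the over operator to one gamma-encoded, straight-alpha channel."""
    ao = aa + ab * (1 - aa)
    if ao == 0:
        return 0.0, 0.0
    linear = (to_linear(ca) * aa + to_linear(cb) * ab * (1 - aa)) / ao
    return to_encoded(linear), ao

# Blending 50%-opaque white over opaque black: the naive encoded-space
# result would be 0.5, but the gamma-correct result is about 0.73.
result, _ = blend_gamma_correct(1.0, 0.5, 0.0, 1.0)
```

The gap between 0.5 and roughly 0.73 is exactly the error produced by blending encoded values directly, which shows up in practice as overly dark fringes around antialiased edges.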
Other transparency methods.
Although used for similar purposes, transparent colors and image masks do not permit the smooth blending of the superimposed image pixels with those of the background (only whole image pixels or whole background pixels allowed).
A similar effect can be achieved with a 1-bit alpha channel, as found in the 16-bit RGBA high color mode of the Truevision TGA image file format and related TARGA and AT-Vista/NU-Vista display adapters' high color graphic mode. This mode devotes 5 bits for every primary RGB color (15-bit RGB) plus a remaining bit as the "alpha channel".
Screendoor transparency can be used to simulate partial occlusion where only 1-bit alpha is available.
For some applications, a single alpha channel is not sufficient: a stained-glass window, for instance, requires a separate transparency channel for each RGB channel to model the red, green and blue transparency separately. More alpha channels can be added for accurate spectral color filtration applications.
Some order-independent transparency methods replace the over operator with a commutative approximation.
Array (data structure)

In computer science, an array is a data structure consisting of a collection of "elements" (values or variables), each of the same memory size and each identified by at least one "array index" or "key". An array is stored such that the position of each element can be computed from its index tuple by a mathematical formula. The simplest type of data structure is a linear array, also called a one-dimensional array.
For example, an array of ten 32-bit (4-byte) integer variables, with indices 0 through 9, may be stored as ten words at memory addresses 2000, 2004, 2008, ..., 2036 (in hexadecimal: 0x7D0, 0x7D4, 0x7D8, ..., 0x7F4), so that the element with index "i" has the address 2000 + ("i" × 4).
The memory address of the first element of an array is called first address, foundation address, or base address.
Because the mathematical concept of a matrix can be represented as a two-dimensional grid, two-dimensional arrays are also sometimes called "matrices". In some cases the term "vector" is used in computing to refer to an array, although tuples rather than vectors are the more mathematically correct equivalent. Tables are often implemented in the form of arrays, especially lookup tables; the word "table" is sometimes used as a synonym of array.
Arrays are among the oldest and most important data structures, and are used by almost every program. They are also used to implement many other data structures, such as lists and strings. They effectively exploit the addressing logic of computers. In most modern computers and many external storage devices, the memory is a one-dimensional array of words, whose indices are their addresses. Processors, especially vector processors, are often optimized for array operations.
Arrays are useful mostly because the element indices can be computed at run time. Among other things, this feature allows a single iterative statement to process arbitrarily many elements of an array. For that reason, the elements of an array data structure are required to have the same size and should use the same data representation. The set of valid index tuples and the addresses of the elements (and hence the element addressing formula) are usually, but not always, fixed while the array is in use.
The term "array" may also refer to an array data type, a kind of data type provided by most high-level programming languages that consists of a collection of values or variables that can be selected by one or more indices computed at run-time. Array types are often implemented by array structures; however, in some languages they may be implemented by hash tables, linked lists, search trees, or other data structures.
The term is also used, especially in the description of algorithms, to mean associative array or "abstract array", a theoretical computer science model (an abstract data type or ADT) intended to capture the essential properties of arrays.
History.
The first digital computers used machine-language programming to set up and access array structures for data tables, vector and matrix computations, and for many other purposes. John von Neumann wrote the first array-sorting program (merge sort) in 1945, during the building of the first stored-program computer. Array indexing was originally done by self-modifying code, and later using index registers and indirect addressing. Some mainframes designed in the 1960s, such as the Burroughs B5000 and its successors, used memory segmentation to perform index-bounds checking in hardware.
Assembly languages generally have no special support for arrays, other than what the machine itself provides. The earliest high-level programming languages, including FORTRAN (1957), Lisp (1958), COBOL (1960), and ALGOL 60 (1960), had support for multi-dimensional arrays, and so has C (1972). In C++ (1983), class templates exist both for multi-dimensional arrays whose dimension is fixed at compile time and for runtime-flexible arrays.
Applications.
Arrays are used to implement mathematical vectors and matrices, as well as other kinds of rectangular tables. Many databases, small and large, consist of (or include) one-dimensional arrays whose elements are records.
Arrays are used to implement other data structures, such as lists, heaps, hash tables, deques, queues, stacks, strings, and VLists. Array-based implementations of other data structures are frequently simple and space-efficient (implicit data structures), requiring little space overhead, but may have poor space complexity, particularly when modified, compared to tree-based data structures (compare a sorted array to a search tree).
One or more large arrays are sometimes used to emulate in-program dynamic memory allocation, particularly memory pool allocation. Historically, this has sometimes been the only way to allocate "dynamic memory" portably.
Arrays can be used to determine partial or complete control flow in programs, as a compact alternative to (otherwise repetitive) multiple "if" statements. They are known in this context as control tables and are used in conjunction with a purpose-built interpreter whose control flow is altered according to values contained in the array. The array may contain subroutine pointers (or relative subroutine numbers that can be acted upon by SWITCH statements) that direct the path of the execution.
Element identifier and addressing formulas.
When data objects are stored in an array, individual objects are selected by an index that is usually a non-negative scalar integer. Indexes are also called subscripts. An index "maps" the array value to a stored object.
There are three ways in which the elements of an array can be indexed: zero-based indexing (the first element has index 0), one-based indexing (the first element has index 1), and "n"-based indexing (the base index can be freely chosen).
Using zero-based indexing is the design choice of many influential programming languages, including C, Java and Lisp. This leads to simpler implementation where the subscript refers to an offset from the starting position of an array, so the first element has an offset of zero.
Arrays can have multiple dimensions, thus it is not uncommon to access an array using multiple indices. For example, a two-dimensional array "A" with three rows and four columns might provide access to the element at the 2nd row and 4th column by the expression "A[1][3]" in the case of a zero-based indexing system. Thus two indices are used for a two-dimensional array, three for a three-dimensional array, and "n" for an "n"-dimensional array.
The number of indices needed to specify an element is called the dimension, dimensionality, or rank of the array.
In standard arrays, each index is restricted to a certain range of consecutive integers (or consecutive values of some enumerated type), and the address of an element is computed by a "linear" formula on the indices.
One-dimensional arrays.
A one-dimensional array (or single dimension array) is a type of linear array. Accessing its elements involves a single subscript, which can represent either a row or a column index.
As an example consider the C declaration "int a[10];", which declares a one-dimensional array of ten integers. Here, the array can store ten elements of type "int". This array has indices starting from zero through nine. For example, the expressions "a[0]" and "a[9]" are the first and last elements respectively.
For a vector with linear addressing, the element with index "i" is located at the address "B" + "c" · "i", where "B" is a fixed "base address" and "c" a fixed constant, sometimes called the "address increment" or "stride".
If the valid element indices begin at 0, the constant "B" is simply the address of the first element of the array. For this reason, the C programming language specifies that array indices always begin at 0; and many programmers will call that element "zeroth" rather than "first".
However, one can choose the index of the first element by an appropriate choice of the base address "B". For example, if the array has five elements, indexed 1 through 5, and the base address "B" is replaced by "B" − 30"c", then the indices of those same elements will be 31 to 35. If the numbering does not start at 0, the constant "B" may not be the address of any element.
Multidimensional arrays.
For a multidimensional array, the element with indices "i","j" would have address "B" + "c" · "i" + "d" · "j", where the coefficients "c" and "d" are the "row" and "column address increments", respectively.
More generally, in a "k"-dimensional array, the address of an element with indices "i"1, "i"2, ..., "i""k" is

"B" + "c"1 · "i"1 + "c"2 · "i"2 + ... + "c""k" · "i""k".
For example, consider the C declaration int a[2][3];
This declares an array "a" with 2 rows and 3 columns, of integer type. Six elements can be stored; they are laid out linearly in memory, starting with the first row and continuing with the second. The above array will be stored as a11, a12, a13, a21, a22, a23.
This formula requires only "k" multiplications and "k" additions, for any array that can fit in memory. Moreover, if any coefficient is a fixed power of 2, the multiplication can be replaced by bit shifting.
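The addressing formula can be made concrete with a small helper. The function name and the example strides are invented for this illustration:

```python
def element_address(base, increments, indices):
    """Address of an element in a k-dimensional array with linear
    addressing: B + c1*i1 + c2*i2 + ... + ck*ik."""
    return base + sum(c * i for c, i in zip(increments, indices))

# A 2 x 3 row-major array of 4-byte integers based at address 2000:
# the row increment is 3 * 4 = 12 bytes, the column increment 4 bytes,
# so element [1][2] lives at 2000 + 12*1 + 4*2 = 2020.
addr = element_address(2000, (12, 4), (1, 2))
```

The same helper covers the one-dimensional case with a single increment, which is exactly the "B" + "c" · "i" formula from the previous section.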
The coefficients "c""k" must be chosen so that every valid index tuple maps to the address of a distinct element.
If the minimum legal value for every index is 0, then "B" is the address of the element whose indices are all zero. As in the one-dimensional case, the element indices may be changed by changing the base address "B". Thus, if a two-dimensional array has rows and columns indexed from 1 to 10 and 1 to 20, respectively, then replacing "B" by "B" + "c"1 − 3 · "c"2 will cause them to be renumbered from 0 through 9 and 4 through 23, respectively. Taking advantage of this feature, some languages (like FORTRAN 77) specify that array indices begin at 1, as in mathematical tradition, while other languages (like Fortran 90, Pascal and Algol) let the user choose the minimum value for each index.
Dope vectors.
The addressing formula is completely defined by the dimension "d", the base address "B", and the increments "c"1, "c"2, ..., "c""k". It is often useful to pack these parameters into a record called the array's "descriptor" or "stride vector" or "dope vector". The size of each element, and the minimum and maximum values allowed for each index may also be included in the dope vector. The dope vector is a complete handle for the array, and is a convenient way to pass arrays as arguments to procedures. Many useful array slicing operations (such as selecting a sub-array, swapping indices, or reversing the direction of the indices) can be performed very efficiently by manipulating the dope vector.
Compact layouts.
Often the coefficients are chosen so that the elements occupy a contiguous area of memory. However, that is not necessary. Even if arrays are always created with contiguous elements, some array slicing operations may create non-contiguous sub-arrays from them.
There are two systematic compact layouts for a two-dimensional array. For example, consider the 3 × 3 matrix

1 2 3
4 5 6
7 8 9

In the row-major order layout (adopted by C for statically declared arrays), the elements in each row are stored in consecutive positions and all of the elements of a row have a lower address than any of the elements of a consecutive row:

1 2 3 4 5 6 7 8 9

In column-major order (traditionally used by Fortran), the elements in each column are consecutive in memory and all of the elements of a column have a lower address than any of the elements of a consecutive column:

1 4 7 2 5 8 3 6 9
For arrays with three or more indices, "row major order" puts in consecutive positions any two elements whose index tuples differ only by one in the "last" index. "Column major order" is analogous with respect to the "first" index.
In systems which use processor cache or virtual memory, scanning an array is much faster if successive elements are stored in consecutive positions in memory, rather than sparsely scattered. This is known as spatial locality, which is a type of locality of reference. Many algorithms that use multidimensional arrays will scan them in a predictable order. A programmer (or a sophisticated compiler) may use this information to choose between row- or column-major layout for each array. For example, when computing the product "A"·"B" of two matrices, it would be best to have "A" stored in row-major order, and "B" in column-major order.
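The two layouts differ only in which index varies fastest through memory. A sketch of the corresponding flattening functions (helper names invented for this illustration):

```python
def row_major_offset(i, j, n_cols):
    """Element offset when rows are stored one after another (C)."""
    return i * n_cols + j

def column_major_offset(i, j, n_rows):
    """Element offset when columns are stored one after another (Fortran)."""
    return j * n_rows + i

# In a 3 x 3 matrix, the element at row 1, column 2 sits at
# offset 1*3 + 2 = 5 in row-major order but 2*3 + 1 = 7 in column-major.
```

A cache-friendly scan is one whose innermost loop walks the fastest-varying index: over j for row-major storage, over i for column-major.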
Resizing.
Static arrays have a size that is fixed when they are created and consequently do not allow elements to be inserted or removed. However, by allocating a new array and copying the contents of the old array to it, it is possible to effectively implement a "dynamic" version of an array; see dynamic array. If this operation is done infrequently, insertions at the end of the array require only amortized constant time.
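The grow-by-copy strategy can be sketched as follows. This is a minimal illustration with invented names, simulating a fixed-size backing block with a Python list; doubling the capacity on overflow is what makes appends amortized constant time:

```python
class DynamicArray:
    """A fixed-capacity backing array that is reallocated on overflow."""

    def __init__(self):
        self._capacity = 1
        self._count = 0
        self._data = [None] * self._capacity   # simulated fixed-size block

    def append(self, value):
        if self._count == self._capacity:      # full: allocate double, copy
            self._capacity *= 2
            new_data = [None] * self._capacity
            new_data[:self._count] = self._data
            self._data = new_data
        self._data[self._count] = value
        self._count += 1

    def __getitem__(self, i):
        if not 0 <= i < self._count:
            raise IndexError(i)
        return self._data[i]

a = DynamicArray()
for x in range(10):
    a.append(x)
# After ten appends the capacity has grown 1 -> 2 -> 4 -> 8 -> 16.
```

Each doubling copies every element, but since capacity doubles each time, the total copying work over n appends is proportional to n, i.e. constant per append on average.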
Some array data structures do not reallocate storage, but do store a count of the number of elements of the array in use, called the count or size. This effectively makes the array a dynamic array with a fixed maximum size or capacity; Pascal strings are examples of this.
Non-linear formulas.
More complicated (non-linear) formulas are occasionally used. For a compact two-dimensional triangular array, for instance, the addressing formula is a polynomial of degree 2.
Efficiency.
Both "store" and "select" take (deterministic worst case) constant time. Arrays take linear (O("n")) space in the number of elements "n" that they hold.
In an array with element size "k" and on a machine with a cache line size of B bytes, iterating through an array of "n" elements requires the minimum of ceiling("nk"/B) cache misses, because its elements occupy contiguous memory locations. This is roughly a factor of B/"k" better than the number of cache misses needed to access "n" elements at random memory locations. As a consequence, sequential iteration over an array is noticeably faster in practice than iteration over many other data structures, a property called locality of reference (this does not mean, however, that a lookup using a perfect or trivial hash within the same array cannot be even faster, and achievable in constant time). Libraries provide low-level optimized facilities for copying ranges of memory (such as memcpy) which can be used to move contiguous blocks of array elements significantly faster than can be achieved through individual element access. The speedup of such optimized routines varies by array element size, architecture, and implementation.
Memory-wise, arrays are compact data structures with no per-element overhead. There may be a per-array overhead (e.g., to store index bounds) but this is language-dependent. It can also happen that elements stored in an array require "less" memory than the same elements stored in individual variables, because several array elements can be stored in a single word; such arrays are often called "packed" arrays. An extreme (but commonly used) case is the bit array, where every bit represents a single element. A single octet can thus hold up to 256 different combinations of up to 8 different conditions, in the most compact form.
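A packed bit array can be sketched on top of a bytearray, storing eight boolean elements per octet. The class and method names are invented for this illustration:

```python
class BitArray:
    """n boolean elements packed eight per byte."""

    def __init__(self, n):
        self._n = n
        self._bytes = bytearray((n + 7) // 8)   # round up to whole octets

    def set(self, i, value):
        byte, bit = divmod(i, 8)
        if value:
            self._bytes[byte] |= 1 << bit       # switch the bit on
        else:
            self._bytes[byte] &= ~(1 << bit)    # switch the bit off

    def get(self, i):
        byte, bit = divmod(i, 8)
        return bool(self._bytes[byte] >> bit & 1)

flags = BitArray(100)        # 100 boolean elements in only 13 bytes
flags.set(42, True)
```

Storing the same 100 flags as individual variables or list entries would cost at least one machine word each, so the packing here trades a little bit-shifting arithmetic for roughly an order of magnitude less memory.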
Array accesses with statically predictable access patterns are a major source of data parallelism.
Comparison with other data structures.
Dynamic arrays or growable arrays are similar to arrays but add the ability to insert and delete elements; adding and deleting at the end is particularly efficient. However, they reserve linear (Θ("n")) additional storage, whereas arrays do not reserve additional storage.
Associative arrays provide a mechanism for array-like functionality without huge storage overheads when the index values are sparse. For example, an array that contains values only at indexes 1 and 2 billion may benefit from using such a structure. Specialized associative arrays with integer keys include Patricia tries, Judy arrays, and van Emde Boas trees.
Balanced trees require O(log "n") time for indexed access, but also permit inserting or deleting elements in O(log "n") time, whereas growable arrays require linear (Θ("n")) time to insert or delete elements at an arbitrary position.
Linked lists allow constant time removal and insertion in the middle but take linear time for indexed access. Their memory use is typically worse than arrays, but is still linear.
An Iliffe vector is an alternative to a multidimensional array structure. It uses a one-dimensional array of references to arrays of one dimension less. For two dimensions, in particular, this alternative structure would be a vector of pointers to vectors, one for each row (a pointer in C or C++). Thus an element in row "i" and column "j" of an array "A" would be accessed by double indexing ("A"["i"]["j"] in typical notation). This alternative structure allows jagged arrays, where each row may have a different size, or, in general, where the valid range of each index depends on the values of all preceding indices. It also saves one multiplication (by the column address increment), replacing it by a bit shift (to index the vector of row pointers) and one extra memory access (fetching the row address), which may be worthwhile in some architectures.
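In a language with reference semantics, this vector-of-rows layout corresponds directly to nested lists. A minimal sketch of a jagged (triangular) array:

```python
# A jagged array: a vector of row vectors, each row a different length.
triangle = [[0] * (row + 1) for row in range(4)]
triangle[3][2] = 7           # double indexing: row 3, column 2

# Row lengths 1, 2, 3, 4 -- a shape that a single linear
# addressing formula over a rectangular block cannot describe.
row_lengths = [len(r) for r in triangle]
```

Each access pays one extra indirection (fetching the row reference) compared with a contiguous two-dimensional block, which is the trade-off described above.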
Dimension.
The "dimension" of an array is the number of indices needed to select an element. Thus, if the array is seen as a function on a set of possible index combinations, it is the dimension of the space of which its domain is a discrete subset. Thus a one-dimensional array is a list of data, a two-dimensional array is a rectangle of data, a three-dimensional array a block of data, etc.
This should not be confused with the dimension of the set of all matrices with a given domain, that is, the number of elements in the array. For example, an array with 5 rows and 4 columns is two-dimensional, but such matrices form a 20-dimensional space. Similarly, a three-dimensional vector can be represented by a one-dimensional array of size three.
Advance Australia Fair

"Advance Australia Fair" is the national anthem of Australia. Written by Scottish-born composer Peter Dodds McCormick, the song was first performed as a patriotic song in Australia in 1878. It replaced "God Save the Queen" as the official national anthem in 1974, following a nationwide opinion survey, only for "God Save the Queen" to be reinstated in January 1976. However, a plebiscite to choose the national song in 1977 preferred "Advance Australia Fair", which was in turn reinstated as the national anthem in 1984. "God Save the King/Queen" became known as the royal anthem, and is used at public engagements attended by the King or members of the monarchy of Australia. The lyrics of the 1984 version of "Advance Australia Fair" were modified from McCormick's original and its verses were trimmed down from four to two. In January 2021, the lyrics were changed once again.
History.
Origin.
"Advance Australia Fair" was published in early December 1878 by Scottish-born composer Peter Dodds McCormick (1833–1916) under the pen-name "Amicus" (which means "friend" in Latin). It was first sung by Andrew Fairfax, accompanied by a concert band conducted by McCormick, at a function of the Highland Society of New South Wales in Sydney on 30 November 1878 (Saint Andrew's Day). The song gained in popularity and an amended version was sung by a choir of around 10,000 at the inauguration of the Commonwealth of Australia on 1 January 1901. In 1907 the Australian Government awarded McCormick £100 for his composition.
In a letter to R.B. Fuller dated 1 August 1913, McCormick described the circumstances that inspired him to write "Advance Australia Fair":
The earliest known sound recording of "Advance Australia Fair" appears in "The Landing of the Australian Troops in Egypt", a short commercial recording dramatising the arrival of Australian troops in Egypt "en route" to Gallipoli.
Before its adoption as Australia's national anthem, "Advance Australia Fair" had considerable use elsewhere. For example, Australia's national broadcaster, the Australian Broadcasting Commission, used it to announce its news bulletins until 1952. It was also frequently played at the start or end of official functions. Towards the end of World War II it was one of three songs played in certain picture theatres, along with "God Save the King" and the US national anthem, The Star-Spangled Banner.
Influence.
Other songs and marches have been influenced by "Advance Australia Fair", such as the Australian vice-regal salute.
Competitions, plebiscite and adoption.
In 1973, Prime Minister Gough Whitlam and his government, desiring to forge a new nationalism separate from the United Kingdom, decided that Australia needed a national anthem that could represent the country with "distinction", and they held a competition to find one to replace the existing anthem, "God Save the Queen". In January of that year, Whitlam dedicated an entire Australia Day speech to the search for a new anthem, referring to it as a "symbolic expression of our national pride and dignity". The Australia Council for the Arts organised the contest, which was dubbed the "Australian National Anthem Quest". The contest was held in two stages, the first seeking lyrics and the second music, each having a large prize of A$5,000 for the winning entry. On the recommendation of the Council for the Arts, none of the new entries was felt worthy enough, so the contest ended with suggestions for "Advance Australia Fair", "Waltzing Matilda" and "The Song of Australia".
In 1974 the Whitlam government performed a nationwide opinion survey to determine the song to be sung on occasions of national significance. Conducted through the Australian Bureau of Statistics, the survey polled 60,000 people nationally. "Advance Australia Fair" was chosen by 51.4% of respondents and, on 9 April of that year, Whitlam announced in parliament that it was the national anthem. It was to be used on all occasions excepting those of a specifically regal nature. A spokesman for Whitlam later stated that the Government regarded the tune, primarily, as the national anthem. During the 1975 election campaign following the dismissal of Whitlam by Sir John Kerr, David Combe proposed that the song be played at the start of the Labor Party's official campaign launch on 24 November 1975 at Festival Hall, Melbourne. Whitlam's speechwriter Graham Freudenberg rejected this idea because, among other reasons, the status of the anthem at that point was still tentative.
On 22 January 1976 the Fraser government reinstated "God Save the Queen" as the national anthem for use at royal, vice-regal, defence and loyal toast occasions. Fraser stated that "Advance Australia Fair", "Song of Australia" or "Waltzing Matilda" could be used for non-regal occasions. His government made plans to conduct a national poll to find a song for use on ceremonial occasions when it was desired to mark a separate Australian identity. This was conducted as a plebiscite to choose the National Song, held as an optional additional question in the 1977 referendum on various issues. On 23 May the government announced the results: "Advance Australia Fair" received 43.29% of the vote, defeating the three alternatives, "Waltzing Matilda" (28.28%), "The Song of Australia" (9.65%) and the existing national anthem, "God Save the Queen" (18.78%).
"Advance Australia Fair", with modified lyrics and reduced to two verses (see development of lyrics), was adopted as the Australian national anthem by the Labor government of Bob Hawke, coming into effect on 19 April 1984. At the same time, "God Save the King/Queen" became known as the royal anthem, and continues to be played alongside the Australian national anthem at public engagements in Australia that are attended by the King or any other members of the Royal Family.
Even though any personal copyright of Peter Dodds McCormick's original lyrics has expired, as he died in 1916, the Commonwealth of Australia claims copyright on the official lyrics and particular arrangements of music. Non-commercial use of the anthem is permitted without case-by-case permission, but the Commonwealth government requires permission for commercial use.
The orchestral arrangement of "Advance Australia Fair" that is now regularly played for Australian victories at international sporting medal ceremonies, and at the openings of major domestic sporting, cultural and community events, is by Tommy Tycho, an immigrant from Hungary. It was commissioned by ABC Music in 1984 and then televised by Channel 10 in 1986 in their Australia Day broadcast, featuring Julie Anthony as the soloist.
Legislative basis.
The national anthem was changed on 1 January 2021 by proclamation of the Governor-General on the advice of the Federal Executive Council. The change prior to that was on 19 April 1984.
Lyrics.
The lyrics of "Advance Australia Fair", as modified by the National Australia Day Council, were officially adopted in April 1984. The lyrics were updated as of 1 January 2021 in an attempt to recognise the legacy of Indigenous Australians, with the word "one" in the second line replacing the previous "young". The lyrics are now as follows:
<poem style="float:left; margin-left: 1em;">I
Australians all let us rejoice,
For we are one and free;
We’ve golden soil and wealth for toil,
Our home is girt by sea;
Our land abounds in Nature’s gifts
Of beauty rich and rare;
In history’s page, let every stage
Advance Australia fair!
In joyful strains then let us sing,
Advance Australia fair!</poem>
<poem style="float:left; margin-left: 1em;">II
Beneath our radiant Southern Cross,
We’ll toil with hearts and hands;
To make this Commonwealth of ours
Renowned of all the lands;
For those who’ve come across the seas
We’ve boundless plains to share;
With courage let us all combine
To advance Australia fair.
In joyful strains then let us sing
Advance Australia fair!</poem>
Development of lyrics.
Since the original lyrics were written in 1878, there have been several changes, in some cases with the intent of altering the anthem's political focus, especially in regard to gender neutrality and Indigenous Australians. Some of these have been minor while others have significantly altered the song. The original song was four verses long. For its 1984 adoption as the national anthem, the song was cut from four verses to two. The first verse was kept largely as the 1878 original, except for the change in the first line from "Australia's sons, let us rejoice" to "Australians all let us rejoice". The second, third and fourth verses of the original were dropped, in favour of a modified version of the new third verse which was sung at Federation in 1901.
The lyrics published in the second edition (1879) were as follows:
<poem style="float:left; margin-left: 1em;">I
Australia's sons, let us rejoice,
For we are young and free;
We've golden soil and wealth for toil,
Our home is girt by sea;
Our land abounds in nature's gifts
Of beauty rich and rare;
In history's page, let every stage
Advance Australia fair.
In joyful strains let us sing,
Advance, Australia fair.</poem>
<poem style="float:left; margin-left: 1em;">II
When gallant Cook from Albion sail'd,
To trace wide oceans o'er,
True British courage bore him on,
Til he landed on our shore.
Then here he raised Old England's flag,
The standard of the brave;
"With all her faults we love her still"
"Britannia rules the wave."
In joyful strains then let us sing,
Advance, Australia fair.</poem>
<poem style="float:left; margin-left: 1em;">III
While other nations of the globe
Behold us from afar,
We'll rise to high renown and shine
Like our glorious southern star;
From England soil and Fatherland,
Scotia and Erin fair,
Let all combine with heart and hand
To advance Australia fair.
In joyful strains then let us sing
Advance, Australia fair.</poem>
<poem style="float:left; margin-left: 1em;">IV
Should foreign foe e'er sight our coast,
Or dare a foot to land,
We'll rouse to arms like sires of yore,
To guard our native strand;
Britannia then shall surely know,
Though oceans roll between,
Her sons in fair Australia's land
Still keep their courage green.
In joyful strains then let us sing
Advance Australia fair.</poem>
The 1901 Federation version of the third verse was originally sung as:
<poem style="float:left; margin-left: 1em;">III
Beneath our radiant Southern Cross,
We'll toil with hearts and hands;
To make our youthful Commonwealth,
Renowned of all the lands;
For loyal sons beyond the seas
We've boundless plains to share;
With courage let us all combine
To advance Australia fair.
In joyful strains then let us sing
Advance Australia fair!</poem>
The lyrics of "Advance Australia Fair", as modified by the National Australia Day Council and officially adopted on 19 April 1984, were as follows:
<poem style="float:left; margin-left: 1em;">I
Australians all let us rejoice,
For we are young and free;
We've golden soil and wealth for toil;
Our home is girt by sea;
Our land abounds in nature's gifts
Of beauty rich and rare;
In history's page, let every stage
Advance Australia Fair.
In joyful strains then let us sing,
Advance Australia Fair.</poem>
<poem style="float:left; margin-left: 1em;">II
Beneath our radiant Southern Cross
We'll toil with hearts and hands;
To make this Commonwealth of ours
Renowned of all the lands;
For those who've come across the seas
We've boundless plains to share;
With courage let us all combine
To Advance Australia Fair.
In joyful strains then let us sing,
Advance Australia Fair.</poem>
These lyrics were updated on 1 January 2021 to the current version, in which "young" in the second line is replaced with "one" to reflect the pre-colonial presence of Indigenous Australians, who have lived in Australia for tens of thousands of years, far longer than Europeans.
Criticism.
General criticism.
In May 1976, after reinstating "God Save the Queen", Fraser advised the Australian Olympic Federation to use "Waltzing Matilda" as the national anthem for the forthcoming Montreal Olympic Games. Responding to criticism that "Waltzing Matilda" compared poorly with "Advance Australia Fair", Fraser countered that in the latter's second verse "we find these words, 'Britannia rules the waves'". Despite the outcome of the 1977 plebiscite to choose the National Song favouring "Advance Australia Fair", successive Fraser Ministries did not implement the change.
The fourth line of the anthem, "our home is girt by sea", has been criticised for using the archaic word "girt". Additionally, the lyrics and melody of the Australian national anthem have been criticised in some quarters as being dull and unendearing to the Australian people. National Party senator Sandy Macdonald said in 2001 that "'Advance Australia Fair' is so boring that the nation risks singing itself to sleep, with boring music and words impossible to understand".
Political sentiment is divided. Craig Emerson of the Australian Labor Party has critiqued the anthem; former MP Peter Slipper has said that Australia should consider another anthem; and in 2011 former Victorian Premier Jeff Kennett suggested "I Am Australian", while former Australian Labor Party leader Kim Beazley defended the existing anthem.
Recognition of Indigenous Australians.
The song has been criticised for failing to represent or acknowledge Australia's Indigenous peoples and aspects of the country's colonial past. The lyrics have been accused of celebrating British colonisation and perpetuating the concept of "terra nullius", with the second line of the anthem ("for we are young and free") criticised in particular for ignoring the long history of Indigenous Australians. It has also been suggested that the word "fair" celebrates the "civilising" mission of British colonists.
Since about 2015, public debate about the anthem has increased. Boxer Anthony Mundine stated in 2013, 2017 and 2018 that he would not stand for the anthem, prompting organisers not to play it before his fights. In September 2018 a 9-year-old Brisbane girl was disciplined by her school after refusing to stand for the national anthem; her actions were applauded by some public commenters, and criticised by others. In 2019, several National Rugby League football players decided not to sing the anthem before the first match of the State of Origin series and before the Indigenous All-Stars series with New Zealand; NRL coach and celebrated former player Mal Meninga supported the protesting players and called for a referendum on the subject.
Several alternative versions of "Advance Australia Fair" have been proposed to address the alleged exclusion of Indigenous Australians. Judith Durham of The Seekers and Mutti Mutti musician Kutcha Edwards released their alternative lyrics in 2009, replacing "for we are young and free" with the opening lines "Australians let us stand as one, upon this sacred land". In 2015, Aboriginal Australian soprano Deborah Cheetham declined an invitation to sing the anthem at the 2015 AFL grand final after the AFL turned down her request to replace the words "for we are young and free" with "in peace and harmony". She has advocated for the lyrics being rewritten and endorsed Durham and Edwards' alternative version.
In 2017 the Recognition in Anthem Project was established and began work on a new version, with lyrics written by poet and former Victorian Supreme Court judge Peter Vickery following consultation with Indigenous communities and others. Vickery's proposed lyrics replaced "we are young and free" with "we are one and free" in the first verse, deleted the official second verse and added two new ones: a second verse acknowledging Indigenous history, immigration and calls for unity and respect, and a third adapting lines from the official second verse. It was debuted at the Desert Song Festival in Alice Springs by an Aboriginal choir. Former prime minister Bob Hawke endorsed Vickery's alternative lyrics in 2018. In 2017, the federal government under then prime minister Malcolm Turnbull granted permission for Vickery's lyrics to be sung on certain occasions as a "patriotic song", but said that before making any official change to the anthem, "The Government would need to be convinced of a sufficient groundswell of support in the wider community".
In November 2020, NSW Premier Gladys Berejiklian proposed changing one word in the opening couplet, from "we are young and free" to "we are one and free", to acknowledge Australia's Indigenous history. The proposal was supported by the federal Minister for Indigenous Australians, Ken Wyatt, and in December 2020 Prime Minister Scott Morrison announced that this change would be adopted from 1 January 2021, having received approval from Governor-General David Hurley.
Dharawal lyrics.
Lyrics for the anthem have been written twice in the Dharug language, an Australian Aboriginal language spoken around Sydney by the Dharawal people.
A first version was performed in July 2010, at a Rugby League State of Origin match in Sydney, though there was some opposition.
In December 2020, another setting, in Dharug, followed by the anthem in English, was sung before a Rugby Union international between Australia and Argentina.
Other unofficial variants.
In 2011, about fifty different Christian schools from different denominations came under criticism for singing an unofficial version of the song written by the Sri Lankan immigrant Ruth Ponniah in 1988. The song replaced the official second verse of "Advance Australia Fair" with lyrics that were Christian in nature.
Minister for School Education, Early Childhood and Youth Peter Garrett and chief executive of the National Australia Day Council Warren Pearson admonished the schools for modifying the lyrics of the anthem, and the Australian Parents Council and the Federation of Parents and Citizens' Association of NSW called for a ban on the modified song. Stephen O'Doherty, chief executive of Christian Schools Australia, defended the use of the lyrics in response.
|
2061 | Automatic number announcement circuit | An automatic number announcement circuit (ANAC) is a component of a central office of a telephone company that provides a service to installation and service technicians to determine the telephone number of a telephone line. The facility has a telephone number that may be called to listen to an automatic announcement that includes the caller's telephone number. The ANAC facility is useful primarily during the installation of landline telephones to quickly identify one of multiple wire pairs in a bundle or at a termination point.
Operation.
By connecting a test telephone set, a technician calls the local telephone number of the automatic number announcement service. This call is connected to equipment at the central office that uses automatic equipment to announce the telephone number of the line calling in. The main purpose of this system is to allow telephone company technicians to identify the telephone line they are connected to.
Automatic number announcement systems are based on automatic number identification. Because they are intended for use by phone company technicians, ANAC systems bypass customer features such as unlisted numbers, caller ID blocking, and outgoing call blocking. Installers of multi-line business services, where outgoing calls from all lines display the company's main number on call display, can use ANAC to identify a specific line in the system, even if CID displays every line as "line one".
Most ANAC systems are provider-specific in each wire center, while others are regional or state-/province- or area-code-wide. No official lists of ANAC numbers are published, as telephone companies guard against abuse that would interfere with availability for installers.
Exchange prefixes for testing.
The North American Numbering Plan reserves the exchange ("central office") prefixes "958" and "959" for plant testing purposes. Code 959 with three or four additional digits is dedicated for access to office test lines in local exchange carrier and interoffice carrier central offices. The specifications define several test features for line conditions, such as quiet line and busy line, and test tones transmitted to callers. Telephone numbers are assigned for ringback (to test the ringer when installing telephone sets), milliwatt tone (a number that simply answers with a continuous test tone) and loop around (which connects a call to another inbound call to the same or another test number).
ANAC services are typically installed in the "958" range, which is intended for communications between central offices. In some area codes, multiple additional prefixes may be reserved for test purposes. Many area codes reserved 999; 320 was also formerly reserved in Bell Canada territory.
Other carrier-specific North American test numbers include 555-XXXX numbers (such as 555-0311 on Rogers Communications in Canada) or vertical service codes, such as *99 on Cablevision/Optimum Voice in the United States.
Telephone numbers.
Plant testing telephone numbers are carrier-specific; there is no comprehensive list of telephone numbers for ANAC services. In some communities, test numbers change relatively often. In others, a major incumbent carrier might assign a single number which provides test functions on its network across an entire numbering plan area, throughout an entire province or state, or system-wide.
Some telecommunication carriers maintain toll-free numbers for ANAC facilities. Some national toll-free numbers provide automatic number identification by speaking the telephone number of the caller, but these are not intended for use in identifying the customer's own phone number. They are used by the agent in a call center to confirm the telephone number a customer is calling from, so that the customer's account information can be displayed as a "screen pop" for the next available customer service representative.
|
2062 | Amerigo Vespucci | Amerigo Vespucci (9 March 1451 – 22 February 1512) was an Italian merchant, explorer, and navigator from the Republic of Florence, from whose name the term "America" is derived.
Between 1497 and 1504, Vespucci participated in at least two voyages of the Age of Discovery, first on behalf of Spain (1499–1500) and then for Portugal (1501–1502). In 1503 and 1505, two booklets were published under his name, containing colourful descriptions of these explorations and other alleged voyages. Both publications were extremely popular and widely read across much of Europe. Although historians still dispute the authorship and veracity of these accounts, at the time they were instrumental in raising awareness of the new discoveries and enhancing the reputation of Vespucci as an explorer and navigator.
Vespucci claimed to have understood, back in 1501 during his Portuguese expedition, that Brazil was part of a continent new to Europeans, which he called the "New World". The claim inspired cartographer Martin Waldseemüller to recognize Vespucci's accomplishments in 1507 by applying the Latinized form "America" for the first time to a map showing the New World. Other cartographers followed suit, and by 1532 the name America was permanently affixed to the newly discovered continents.
It is unknown whether Vespucci was ever aware of these honours. In 1505, he was made a citizen of Castile by royal decree and in 1508, he was appointed to the newly created position of "piloto mayor" (master navigator) for Spain's "Casa de Contratación" (House of Trade) in Seville, a post he held until his death in 1512.
Biography.
Vespucci was born on 9 March 1451, in Florence, a wealthy Italian city-state and a center of Renaissance art and learning.
Family and education.
Amerigo Vespucci was the third son of Nastagio Vespucci, a Florentine notary for the Money-Changers Guild, and Lisa di Giovanni Mini. The family resided in the District of Santa Lucia d'Ognissanti along with other families of the Vespucci clan. Earlier generations of Vespucci had funded a family chapel in the Ognissanti church, and the nearby Hospital of San Giovanni di Dio was founded by Simone di Piero Vespucci in 1380. Vespucci's immediate family was not especially prosperous but they were politically well-connected. Amerigo's grandfather, also named Amerigo Vespucci, served a total of 36 years as the chancellor of the Florentine government, known as the "Signoria"; and Nastagio also served in the "Signoria" and in other guild offices. More importantly, the Vespuccis had good relations with Lorenzo de' Medici, the powerful de facto ruler of Florence.
Amerigo's two older brothers, Antonio and Girolamo, were sent to the University of Pisa for their education; Antonio followed his father to become a notary, while Girolamo entered the Church and joined the Knights Hospitaller in Rhodes. Amerigo's career path seemed less certain; instead of following his brothers to the university, he remained in Florence and was tutored by his uncle, Giorgio Antonio Vespucci, a Dominican friar in the monastery of San Marco. Fortunately for Amerigo, his uncle was one of the most celebrated humanist scholars in Florence at the time and provided him with a broad education in literature, philosophy, rhetoric, and Latin. He was also introduced to geography and astronomy, subjects that played an essential part in his career. Amerigo's later writings demonstrated a familiarity with the work of the classic Greek cosmographers, Ptolemy and Strabo, and the more recent work of Florentine astronomer Paolo dal Pozzo Toscanelli.
Early career.
In 1478, Guido Antonio Vespucci led a Florentine diplomatic mission to Paris and invited his younger cousin, Amerigo Vespucci, to join him. Amerigo's role is not clear, but it was likely as an attaché or private secretary. Along the way they had business in Bologna, Milan, and Lyon. Their objective in Paris was to obtain French support for Florence's war with Naples. Louis XI was noncommittal and the diplomatic mission returned to Florence in 1481 with little to show for their efforts.
After his return from Paris, Amerigo worked for a time with his father and continued his studies in science. In 1482, when his father died, Amerigo went to work for Lorenzo di Pierfrancesco de' Medici, head of a junior branch of the Medici family. Although Amerigo was twelve years older, they had been schoolmates under the tutelage of Giorgio Antonio Vespucci. Amerigo served first as a household manager and then gradually took on increasing responsibilities, handling various business dealings for the family both at home and abroad. Meanwhile, he continued to show an interest in geography, at one point buying an expensive map made by the master cartographer Gabriel de Vallseca.
Seville.
In 1488, Lorenzo di Pierfrancesco became dissatisfied with his Seville business agent, Tomasso Capponi. He dispatched Vespucci to investigate the situation and provide an assessment of a suggested replacement, Florentine merchant Gianotto Berardi. Vespucci's findings have been lost but Capponi returned to Florence around this time and Berardi took over the Medici business in Seville. In addition to managing Medici's trade in Seville, Berardi had his own business in the African slave trade and ship chandlery.
By 1492 Vespucci had settled permanently in Seville. His motivations for leaving Florence are unclear; he continued to transact some business on behalf of his Medici patrons but more and more he became involved with Berardi's other activities, most notably his support of Christopher Columbus's voyages. Berardi invested half a million "maravedis" in Columbus's first voyage, and he won a potentially lucrative contract to provision Columbus's large second fleet. However, profits proved to be elusive. In 1495, Berardi signed a contract with the crown to send 12 resupply ships to Hispaniola but then died unexpectedly in December without completing the terms of the contract.
Vespucci was the executor of Berardi's will, collecting debts and paying outstanding obligations for the firm. Afterwards he was left owing 140,000 "maravedis". He continued to provision ships bound for the West Indies, but his opportunities were diminishing; Columbus's expeditions were not providing the hoped-for profits, and his patron, Lorenzo di Pierfrancesco Medici, was using other Florentine agents for his business in Seville.
Sometime after he settled in Seville, Vespucci married a Spanish woman, Maria Cerezo. Very little is known about her; Vespucci's will refers to her as the daughter of celebrated military leader Gonzalo Fernández de Córdoba. Historian Fernández-Armesto speculates that she may have been Gonzalo's illegitimate offspring, a connection that would have been very useful to Vespucci. She was an active participant in his business and held power of attorney for Vespucci when he was away.
Voyages and alleged voyages.
The evidence for Vespucci's voyages of exploration consists almost entirely of a handful of letters written by him or attributed to him. Historians have differed sharply on the authorship, accuracy and veracity of these documents. Consequently, opinions also vary widely regarding the number of voyages undertaken, their routes, and Vespucci's roles and accomplishments. Starting in the late 1490s Vespucci participated in two voyages to the New World that are relatively well-documented in the historical record. Two others have been alleged but the evidence is more problematical. Traditionally, Vespucci's voyages are referred to as the "first" through "fourth", even by historians who dismiss one or more of the trips.
Alleged voyage of 1497–1498.
A letter, addressed to Florentine official Piero Soderini, dated 1504 and published the following year, purports to be an account by Vespucci of a voyage to the New World, departing from Spain on 10 May 1497, and returning on 15 October 1498. This is perhaps the most controversial of Vespucci's voyages, as this letter is the only known record of its occurrence, and many historians doubt that it took place as described. Some question the authorship and accuracy of the letter and consider it to be a forgery. Others point to the inconsistencies in the narrative of the voyage, particularly the alleged course, starting near Honduras and proceeding northwest for 870 leagues—a course that would have taken them across Mexico to the Pacific Ocean.
Certain earlier historians, including contemporary Bartolomé de las Casas, suspected that Vespucci incorporated observations from a later voyage into a fictitious account of this supposed first one, so as to gain primacy over Columbus and position himself as the first European explorer to encounter the mainland. Others, including scholar Alberto Magnaghi, have suggested that the Soderini letter was not written by Vespucci at all, but rather by an unknown author who had access to the navigator's private letters to Lorenzo de' Medici about his 1499 and 1501 expeditions to the Americas, which make no mention of a 1497 voyage. The Soderini letter is one of two attributed to Vespucci that were edited and widely circulated during his lifetime.
Voyage of 1499–1500.
In 1499, Vespucci joined an expedition licensed by Spain and led by Alonso de Ojeda as fleet commander and Juan de la Cosa as chief navigator. Their intention was to explore the coast of a new landmass found by Columbus on his third voyage and in particular investigate a rich source of pearls that Columbus had reported. Vespucci and his backers financed two of the four ships in the small fleet. His role on the voyage is not clear. Writing later about his experience, Vespucci gave the impression that he had a leadership role, but that is unlikely, due to his inexperience. Instead, he may have served as a commercial representative on behalf of the fleet's investors. Years later, Ojeda recalled that "Morigo Vespuche" was one of his pilots on the expedition.
The vessels left Spain on 18 May 1499 and stopped first in the Canary Islands before reaching South America somewhere near present-day Suriname or French Guiana. From there the fleet split up: Ojeda proceeded northwest toward modern Venezuela with two ships, while the other pair headed south with Vespucci aboard. The only record of the southbound journey comes from Vespucci himself. He assumed they were on the coast of Asia and hoped by heading south they would, according to the Greek geographer Ptolemy, round the unidentified "Cape of Cattigara" and reach the Indian Ocean. They passed two huge rivers (the Amazon and the Para) which poured freshwater out to sea. They continued south for another 40 leagues before encountering a very strong adverse current which they could not overcome. Forced to turn around, the ships headed north, retracing their course to the original landfall. From there Vespucci continued up the South American coast to the Gulf of Paria and along the shore of what is now Venezuela. At some point they may have rejoined Ojeda but the evidence is unclear. In the late summer, they decided to head north for the Spanish colony at Hispaniola in the West Indies to resupply and repair their ships before heading home. After Hispaniola they made a brief slave raid in the Bahamas, capturing 232 natives, and then returned to Spain.
Voyage of 1501–1502.
In 1501, Manuel I of Portugal commissioned an expedition to investigate a landmass far to the west in the Atlantic Ocean encountered unexpectedly by a wayward Pedro Álvares Cabral on his voyage around Africa to India. That land would eventually become present-day Brazil. The king wanted to know the extent of this new discovery and determine where it lay in relation to the line established by the Treaty of Tordesillas. Any land that lay to the east of the line could be claimed by Portugal. Vespucci's reputation as an explorer and presumed navigator had already reached Portugal, and he was hired by the king to serve as pilot under the command of Gonçalo Coelho.
Coelho's fleet of three ships left Lisbon in May 1501. Before crossing the Atlantic they resupplied at Cape Verde, where they encountered Cabral on his way home from his voyage to India. This was the same expedition that had found Brazil on its outward-bound journey the previous year. Coelho left Cape Verde in June, and from this point Vespucci's account is the only surviving record of their explorations. On 17 August 1501 the expedition reached Brazil at a latitude of about 6° south. Upon landing it encountered a hostile band of natives who killed and ate one of its crewmen. Sailing south along the coast they found friendlier natives and were able to engage in some minor trading. At 23° S they found a bay which they named Rio de Janeiro because it was 1 January 1502. On 13 February 1502, they left the coast to return home. Vespucci estimated their latitude at 32° S but experts now estimate they were closer to 25° S. Their homeward journey is unclear since Vespucci left a confusing record of astronomical observations and distances travelled.
Alleged voyage of 1503–1504.
In 1503, Vespucci may have participated in a second expedition for the Portuguese crown, again exploring the east coast of Brazil. There is evidence that a voyage was led by Coelho at about this time but no independent confirmation that Vespucci took part. The only source for this last voyage is the Soderini letter; but several modern scholars dispute Vespucci's authorship of that letter and it is uncertain whether Vespucci undertook this trip. There are also difficulties with the reported dates and details in the account of this voyage.
Return to Seville.
By early 1505, Vespucci was back in Seville. His reputation as an explorer and navigator continued to grow and his recent service in Portugal did not seem to damage his standing with King Ferdinand. On the contrary, the king was likely interested in learning about the possibility of a western passage to India. In February, he was summoned by the king to consult on matters of navigation. During the next few months he received payments from the crown for his services and in April he was declared by royal proclamation a citizen of Castile and León.
From 1505 until his death in 1512, Vespucci remained in service to the Spanish crown. He continued his work as a chandler, supplying ships bound for the Indies. He was also hired to captain a ship as part of a fleet bound for the "spice islands" but the planned voyage never took place. In March 1508, he was named chief pilot for the "Casa de Contratación" or House of Trade, which served as a central trading house for Spain's overseas possessions. He was paid an annual salary of 50,000 "maravedis" with an extra 25,000 for expenses. In his new role, Vespucci was responsible for ensuring that ships' pilots were adequately trained and licensed before sailing to the New World. He was also charged with compiling a "model map" based on input from pilots who were obligated to share what they learned after each voyage.
Vespucci wrote his will in April 1511. He left most of his modest estate, including five household slaves, to his wife. His clothes, books, and navigational equipment were left to his nephew Giovanni Vespucci. He requested to be buried in a Franciscan habit in his wife's family tomb. Vespucci died on 22 February 1512.
Upon his death, Vespucci's wife was awarded an annual pension of 10,000 "maravedis" to be deducted from the salary of the successor chief pilot. His nephew Giovanni was hired into the "Casa de Contratación" where he spent his subsequent years spying on behalf of the Florentine state.
Naming of America.
Vespucci's voyages became widely known in Europe after two accounts attributed to him were published between 1503 and 1505. The Soderini letter (1505) came to the attention of a group of humanist scholars studying geography in Saint-Dié, a small French town in the Duchy of Lorraine. Led by Walter Lud, the academy included Matthias Ringmann and Martin Waldseemüller. In 1506, they obtained a French translation of the Soderini letter as well as a Portuguese maritime map that detailed the coast of lands recently discovered in the western Atlantic. They surmised that this was the "new world" or the "antipodes" hypothesized by classical writers. The Soderini letter gave Vespucci credit for discovery of this new continent and implied that the Portuguese map was based on his explorations.
In April 1507, Ringmann and Waldseemüller published their "Introduction to Cosmography" with an accompanying world map. The "Introduction" was written in Latin and included a Latin translation of the Soderini letter. In a preface to the "Letter", Ringmann wrote
A thousand copies of the world map were printed with the title "Universal Geography According to the Tradition of Ptolemy and the Contributions of Amerigo Vespucci and Others". It was decorated with prominent portraits of Ptolemy and Vespucci and, for the first time, the name America was applied to a map of the New World.
The "Introduction" and map were a great success and four editions were printed in the first year alone. The map was widely used in universities and was influential among cartographers who admired the craftsmanship that went into its creation. In the following years, other maps were printed that often incorporated the name America. In 1538, Gerardus Mercator used America to name both the North and South continents on his influential map. By this point the name had been securely fixed on the New World.
Many supporters of Columbus felt that Vespucci had stolen an honour that rightfully belonged to Columbus. Most historians now believe that he was unaware of Waldseemüller's map before his death in 1512 and many assert that he was not even the author of the Soderini letter.
Vespucci letters.
Knowledge of Vespucci's voyages relies almost entirely on a handful of letters written by him or attributed to him. Two of these letters were published during his lifetime and received widespread attention throughout Europe. Several scholars now believe that Vespucci did not write the two published letters in the form in which they circulated during his lifetime. They suggest that they were fabrications based in part on genuine Vespucci letters.
The remaining documents were unpublished manuscripts: handwritten letters uncovered by researchers more than 250 years after Vespucci's death. After years of controversy, the authenticity of the three complete letters was convincingly demonstrated by Alberto Magnaghi in 1924. Most historians now accept them as the work of Vespucci, but aspects of the accounts are still disputed.
Historiography.
Vespucci has been called "the most enigmatic and controversial figure in early American history". The debate has become known among historians as the "Vespucci question". How many voyages did he make? What was his role on the voyages and what did he learn? The evidence relies almost entirely on a handful of letters attributed to him. Many historians have analysed these documents and have arrived at contradictory conclusions.
In 1515, Sebastian Cabot became one of the first to question Vespucci's accomplishments and express doubts about his 1497 voyage. Later, Bartolomé de las Casas argued that Vespucci was a liar and stole the credit that was due Columbus. By 1600, most regarded Vespucci as an impostor and not worthy of his honours and fame. In 1839, Alexander von Humboldt after careful consideration asserted the 1497 voyage was impossible but accepted the two Portuguese-sponsored voyages. Humboldt also called into question the assertion that Vespucci recognized that he had encountered a new continent. According to Humboldt, Vespucci (and Columbus) died in the belief that they had reached the eastern edge of Asia. Vespucci's reputation was perhaps at its lowest in 1856 when Ralph Waldo Emerson called Vespucci a "thief" and "pickle dealer" from Seville who managed to get "half the world baptized with his dishonest name".
Opinions began to shift somewhat after 1857 when Brazilian historian Francisco Adolfo de Varnhagen wrote that everything in the Soderini letter was true. Other historians followed in support of Vespucci including John Fiske and Henry Harrisse.
In 1924, Alberto Magnaghi published the results of his exhaustive review of Vespucci's writings and relevant cartography. He denied Vespucci's authorship of the 1503 "Mundus Novus" and the 1505 Letter to Soderini, the only two texts published during his lifetime. He suggested that the Soderini letter was not written by Vespucci, but was cobbled together by unscrupulous Florentine publishers who combined several accounts – some from Vespucci, others from elsewhere. Magnaghi determined that the manuscript letters were authentic and, based on them, he was the first to propose that only the second and third voyages were true, and the first and fourth voyages (only found in the Soderini letter) were fabrications. While Magnaghi has been one of the chief proponents of a two-voyage narrative, Roberto Levillier was an influential Argentinian historian who endorsed the authenticity of all Vespucci's letters and proposed the most extensive itinerary for his four voyages.
Other modern historians and popular writers have taken varying positions on Vespucci's letters and voyages, espousing two, three, or four voyages and supporting or denying the authenticity of his two printed letters. Most authors believe that the three manuscript letters are authentic while the first voyage as described in the Soderini letter draws the most criticism and disbelief.
A two-voyage thesis was accepted and popularized by Frederick J. Pohl (1944), and rejected by Germán Arciniegas (1955), who posited that all four voyages were truthful. Luciano Formisano (1992) also rejects the Magnaghi thesis (acknowledging that publishers probably tampered with Vespucci's writings) and declares all four voyages genuine, but differs from Arciniegas in details (particularly the first voyage). Samuel Morison (1974) flatly rejected the first voyage but was noncommittal about the two published letters. Felipe Fernández-Armesto (2007) calls the authenticity question "inconclusive" and hypothesizes that the first voyage was probably another version of the second; the third is unassailable, and the fourth is probably true.
Legacy.
Vespucci's historical importance may rest more with his letters (whether or not he wrote them all) than his discoveries. Burckhardt cites the naming of America after him as an example of the immense role of the Italian literature of the time in determining historical memory. Within a few years of the publication of his two letters, the European public became aware of the newly discovered continents of the Americas.
Aristide Maillol.

Aristide Joseph Bonaventure Maillol (December 8, 1861 – September 27, 1944) was a French sculptor, painter, and printmaker.
Biography.
Maillol was born in Banyuls-sur-Mer, Roussillon. He decided at an early age to become a painter, and moved to Paris in 1881 to study art. After several applications and several years of living in poverty, his enrollment in the École des Beaux-Arts was accepted in 1885, and he studied there under Jean-Léon Gérôme and Alexandre Cabanel. His early paintings show the influence of his contemporaries Pierre Puvis de Chavannes and Paul Gauguin.
Gauguin encouraged his growing interest in decorative art, an interest that led Maillol to take up tapestry design. In 1893 Maillol opened a tapestry workshop in Banyuls, producing works whose high technical and aesthetic quality gained him recognition for renewing this art form in France. He began making small terracotta sculptures in 1895, and within a few years his concentration on sculpture led to the abandonment of his work in tapestry.
In July 1896, Maillol married Clotilde Narcis, one of his employees at his tapestry workshop. Their only son, Lucien, was born that October.
Maillol's first major sculpture, "A Seated Woman", was modeled after his wife. The first version (in the Museum of Modern Art, New York) was completed in 1902, and renamed "La Méditerranée". Maillol, believing that "art does not lie in the copying of nature", produced a second, less naturalistic version in 1905. In 1902, the art dealer Ambroise Vollard provided Maillol with his first exhibition.
The subject of nearly all of Maillol's mature work is the female body, treated with a classical emphasis on stable forms. The figurative style of his large bronzes is perceived as an important precursor to the greater simplifications of Henry Moore, and his serene classicism set a standard for European (and American) figure sculpture until the end of World War II.
Josep Pla said of Maillol, "These archaic ideas, Greek, were the great novelty Maillol brought into the tendency of modern sculpture. What you need to love from the ancients is not the antiquity, it is the sense of permanent, renewed novelty, that is due to the nature and reason."
His important public commissions include a 1912 commission for a monument to Cézanne, as well as numerous war memorials commissioned after World War I.
Maillol served as a juror with Florence Meyer Blumenthal in awarding the Prix Blumenthal (1919–1954), a grant awarded to painters, sculptors, decorators, engravers, writers, and musicians.
He made a series of woodcut illustrations for an edition of Vergil's "Eclogues" published by Harry Graf Kessler in 1926–27. He also illustrated "Daphnis and Chloe" by Longus (1937) and "Chansons pour elle" by Paul Verlaine (1939).
He died in Banyuls at the age of eighty-three, in an automobile accident. While driving home during a thunderstorm, the car in which he was a passenger skidded off the road and rolled over. A large collection of Maillol's work is maintained at the Musée Maillol in Paris, which was established by Dina Vierny, Maillol's model and platonic companion during the last 10 years of his life. His home a few kilometers outside Banyuls, also the site of his final resting place, has been turned into a museum, the Musée Maillol Banyuls-sur-Mer, where a number of his works and sketches are displayed.
Three of his bronzes grace the grand staircase of the Metropolitan Opera House in New York City: "Summer" (1910–11), "Venus Without Arms" (1920), and "Kneeling Woman: Monument to Debussy" (1950–55). The third, the artist's only reference to music, is a copy of an original created for the French city of Saint-Germain-en-Laye, Claude Debussy's birthplace.
Nazi-looted art.
During the German occupation of France, dozens of artworks by Maillol were seized by the Nazi looting organization known as the E.R.R., or Reichsleiter Rosenberg Taskforce. The Database of Art Objects at the Jeu de Paume lists thirty artworks by Maillol. The German Lost Art Foundation database lists 33 entries for Maillol. The German Historical Museum's database for artworks recovered by the Allies at the Munich Central Collecting Point has 13 items related to Maillol. Maillol's sculpture "Head of Flora" was found in the stash of Cornelius Gurlitt, son of Hitler's art dealer Hildebrand Gurlitt, together with lithographs, drawings, and paintings.
A photograph from May 24, 1946 shows "Six men, members of the Monuments, Fine Arts & Archives section of the military, prepare Aristide Maillol's sculpture "Baigneuse à la draperie", looted during World War II for transport to France. Sculpture is labeled with sign: Wiesbaden, no. 31."
Jewish art collectors whose artworks by Maillol were looted by Nazis include Hugo Simon, Alfred Flechtheim and many others.
Antonio Canova.

Antonio Canova (1 November 1757 – 13 October 1822) was an Italian Neoclassical sculptor, famous for his marble sculptures. Often regarded as the greatest of the Neoclassical artists, his sculpture was inspired by the Baroque and the classical revival, and has been characterised as having avoided the melodramatics of the former, and the cold artificiality of the latter.
Life.
Possagno.
In 1757, Antonio Canova was born in the Venetian Republic city of Possagno to Pietro Canova, a stonecutter, and Maria Angela Zardo Fantolini. In 1761, his father died. A year later, his mother remarried. As such, in 1762, he was put into the care of his paternal grandfather Pasino Canova, who was a stonemason, owner of a quarry, and was a "sculptor who specialized in altars with statues and low reliefs in late Baroque style". He led Antonio into the art of sculpting.
Before the age of ten, Canova began making models in clay, and carving marble. Indeed, at the age of nine, he executed two small shrines of Carrara marble, which are still extant. After these works, he appears to have been constantly employed under his grandfather.
Venice.
In 1770, he was an apprentice for two years to Giuseppe Bernardi, who was also known as 'Torretto'. Afterwards, he was under the tutelage of Giovanni Ferrari until he began his studies at the Accademia di Belle Arti di Venezia. At the Academy, he won several prizes. During this time, he was given his first workshop within a monastery by some local monks.
The Senator Giovanni Falier commissioned Canova to produce statues of Orpheus and Eurydice for his garden – the Villa Falier at Asolo. The statues were begun in 1775, and both were completed by 1777. The pieces exemplify the late Rococo style. In the year of their completion, both works were exhibited for the Feast of the Ascension in Piazza San Marco. Widely praised, the works won Canova his first renown among the Venetian elite. Another Venetian who is said to have commissioned early works from Canova was the abate Filippo Farsetti, whose collection at Ca' Farsetti on the Grand Canal he frequented.
In 1779, Canova opened his own studio at Calle Del Traghetto at S. Maurizio. At this time, Procurator Pietro Vettor Pisani commissioned Canova's first marble statue: a depiction of Daedalus and Icarus. The statue inspired great admiration for his work at the annual art fair; Canova was paid 100 gold zecchini for the completed work. At the base of the statue, Daedalus' tools are scattered about; these tools are also an allusion to Sculpture, of which the statue is a personification. With such an intention, there is suggestion that Daedalus is a portrait of Canova's grandfather Pasino.
Rome.
Canova arrived in Rome, on 28 December 1780. Prior to his departure, his friends had applied to the Venetian Senate for a pension. Successful in the application, the stipend allotted amounted to three hundred ducats, limited to three years.
While in Rome, Canova spent time studying and sketching the works of Michelangelo.
In 1781, Girolamo Zulian – the Venetian ambassador to Rome – hired Canova to sculpt "Theseus and the Minotaur". Zulian played a fundamental role in Canova's rise to fame, turning some rooms of his palace into a studio for the artist and placing his trust in him despite Canova's early critics in Rome. The statue depicts the victorious Theseus seated on the lifeless body of a Minotaur. The initial spectators were certain that the work was a copy of a Greek original, and were shocked to learn it was a contemporary work. The highly regarded work is now in the collection of the Victoria & Albert Museum, in London.
Between 1783 and 1785, Canova arranged, composed, and designed a funerary monument dedicated to Clement XIV for the Church of Santi Apostoli. After another two years, the work was completed in 1787. The monument secured Canova's reputation as the pre-eminent living artist.
In 1792, he completed another cenotaph, this time commemorating Clement XIII for St. Peter's Basilica. Canova harmonized its design with the older Baroque funerary monuments in the basilica.
In 1790, he began to work on a funerary monument for Titian, which was eventually abandoned by 1795. During the same year, he increased his activity as a painter. Canova was notoriously disinclined to restore sculptures. However, in 1794 he made an exception for his friend and early patron Zulian, restoring a few sculptures that Zulian had moved from Rome to Venice.
The following decade was extremely productive, beginning works such as "Hercules and Lichas", "Cupid and Psyche", "Hebe", "Tomb of Duchess Maria Christina of Saxony-Teschen", and "The Penitent Magdalene".
In 1797, he went to Vienna, but only a year later, in 1798, he returned to Possagno for a year.
France and England.
By 1800, Canova was the most celebrated artist in Europe. He systematically promoted his reputation by publishing engravings of his works and having marble versions of plaster casts made in his workshop. He became so successful that he acquired patrons from across Europe, including France, England, Russia, Austria and Holland, as well as several members of different royal lineages and other prominent individuals. Among his patrons were Napoleon and his family, for whom Canova produced much work, including several depictions between 1803 and 1809. The most notable representations were "Napoleon as Mars the Peacemaker" and "Venus Victrix", a portrayal of Pauline Bonaparte.
"Napoleon as Mars the Peacemaker" had its inception after Canova was hired to make a bust of Napoleon in 1802. The statue was begun in 1803; Napoleon requested to be shown in a French general's uniform, but Canova rejected this, insisting on an allusion to Mars, the Roman god of war. It was completed in 1806. In 1811, the statue arrived in Paris but was not installed; neither was its bronze copy in the Foro Napoleonico in Milan. In 1815, the original went to the Duke of Wellington, after his victory at Waterloo against Napoleon.
"Venus Victrix" was originally conceived as a robed and recumbent sculpture of Pauline Borghese in the guise of Diana. Instead, Pauline ordered Canova to make the statue a nude Venus. The work was not intended for public viewing.
Other works for the Napoleon family include a bust of Napoleon, a statue of Napoleon's mother, and Marie Louise as Concordia.
In 1802, Canova was assigned the post of 'Inspector-General of Antiquities and Fine Art of the Papal State', a position formerly held by Raphael. One of his activities in this capacity was to pioneer the restoration of the Appian Way by restoring the tomb of Servilius Quartus. In 1808 Canova became an associated member of the Royal Institute of Sciences, Literature and Fine Arts of the Kingdom of Holland.
In 1814, he began his "The Three Graces".
In 1815, he was named 'Minister Plenipotentiary of the Pope,' and was tasked with recovering various works of art that were taken to Paris by Napoleon under the terms of the Treaty of Paris (1815).
Also in 1815, he visited London, and met with Benjamin Haydon. It was after the advice of Canova that the Elgin Marbles were acquired by the British Museum, with plaster copies sent to Florence, according to Canova's request.
Returning to Italy.
In 1816, Canova returned to Rome with some of the art Napoleon had taken. He was rewarded with several marks of distinction: he was appointed President of the Accademia di San Luca, inscribed into the "Golden Book of Roman Nobles" by the Pope's own hands, and given the title of Marquis of Ischia, alongside an annual pension of 3,000 crowns.
In 1819, he commenced and completed his commissioned work "Venus Italica" as a replacement for the Venus de' Medici.
After his 1814 proposal to build a personified statue of Religion for St. Peter's Basilica was rejected, Canova sought to build his own temple to house it. This project came to be the Tempio Canoviano. Canova designed, financed, and partly built the structure himself. The structure was to be a testament to Canova's piety. The building's design was inspired by combining the Parthenon and the Pantheon. On 11 July 1819, Canova laid the foundation stone dressed in red Papal uniform and decorated with all his medals. It first opened in 1830, and was finally completed in 1836. After the foundation stone of this edifice had been laid, Canova returned to Rome; but every succeeding autumn he continued to visit Possagno to direct the workmen and encourage them with rewards.
During the period that intervened between commencing operations at Possagno and his death, he executed or finished some of his most striking works. Among these were the group "Mars and Venus", the colossal figure of Pius VI, the Pietà, the "St John", and a colossal bust of his friend, the Count Leopoldo Cicognara.
In 1820, he made a statue of George Washington for the state of North Carolina. As recommended by Thomas Jefferson, the sculptor used the marble bust of Washington by Giuseppe Ceracchi as a model. It was delivered on 24 December 1821. The statue and the North Carolina State House where it was displayed were later destroyed by fire in 1831. A plaster replica was sent by King Victor Emmanuel III of Italy in 1910, now on view at the North Carolina Museum of History. A marble copy was sculpted by Romano Vio in 1970, now on view in the rotunda of the capitol building.
In 1822, he journeyed to Naples to superintend the construction of wax moulds for an equestrian statue of Ferdinand VII. The journey was disastrous to his health, but he soon recovered enough to return to Rome. From there, he voyaged to Venice; however, on 13 October 1822, he died there at the age of 64. As he never married, the name became extinct except through his stepbrothers' lineage of Sartori-Canova.
On 12 October 1822, Canova instructed his brother to use his entire estate to complete the Tempio in Possagno.
On 25 October 1822, his body was placed in the Tempio Canoviano. His heart was interred at the Basilica di Santa Maria Gloriosa dei Frari in Venice, and his right hand preserved in a vase at the Accademia di Belle Arti di Venezia.
His memorial service was so grand that it rivaled the ceremony that the city of Florence held for Michelangelo in 1564.
In 1826, Giovanni Battista Sartori sold Canova's Roman studio and took every plaster model and sculpture to Possagno, where they were installed in the gypsotheque of the Tempio Canoviano.
Works.
Among Canova's most notable works are:
"Psyche Revived by Cupid's Kiss" (1787).
"Psyche Revived by Cupid's Kiss" was commissioned in 1787 by Colonel John Campbell. It is regarded as a masterpiece of Neoclassical sculpture, but shows the mythological lovers at a moment of great emotion, characteristic of the emerging movement of Romanticism. It represents the god Cupid in the height of love and tenderness, immediately after awakening the lifeless Psyche with a kiss.
"Napoleon as Mars the Peacemaker" (1802–1806).
"Napoleon as Mars the Peacemaker" had its inception after Canova was hired to make a bust of Napoleon in 1802. The statue was begun in 1803; Napoleon requested to be shown in a French general's uniform, but Canova rejected this, insisting on an allusion to Mars, the Roman god of war. It was completed in 1806. In 1811, the statue arrived in Paris but was not installed; neither was its bronze copy in the Foro Napoleonico in Milan. In 1815, the original went to the Duke of Wellington, after his victory at Waterloo against Napoleon, and is on display at Apsley House.
"Perseus Triumphant" (1804–1806).
"Perseus Triumphant", sometimes called "Perseus with the Head of Medusa", was a statue commissioned by tribune Onorato Duveyriez. It depicts the Greek hero Perseus after his victory over the Gorgon Medusa.
The statue was modeled freely on the Apollo Belvedere and the Medusa Rondanini.
Napoleon, after his 1796 Italian Campaign, took the Apollo Belvedere to Paris. In the statue's absence, Pope Pius VII acquired Canova's "Perseus Triumphant" and placed the work upon the "Apollo"'s pedestal. The statue was so successful that when the "Apollo" was returned, "Perseus" remained as a companion piece.
One replica of the statue was commissioned from Canova by the Polish countess Waleria Tarnowska; it is now displayed in the Metropolitan Museum of Art in New York City.
Karl Ludwig Fernow said of the statue that "every eye must rest with pleasure on the beautiful surface, even when the mind finds its hopes of high and pure enjoyment disappointed."
"Venus Victrix" (1805–1808).
"Venus Victrix" ranks among the most famous of Canova's works. Originally, Canova wished the depiction to be of a robed Diana, but Pauline Borghese insisted to appear as a nude Venus. The work was not intended for public viewing.
"The Three Graces" (1814–1817).
John Russell, the 6th Duke of Bedford, commissioned a version of the now famous work. He had previously visited Canova in his studio in Rome in 1814 and had been immensely impressed by a carving of the Graces the sculptor had made for the Empress Joséphine. When the Empress died in May of the same year he immediately offered to purchase the completed piece, but was unsuccessful as Josephine's son Eugène de Beauharnais claimed it (his son Maximilian, Duke of Leuchtenberg brought it to St. Petersburg, where it can now be found in the Hermitage Museum). Undeterred, the Duke commissioned another version for himself.
The sculpting process began in 1814 and was completed in 1817. Finally in 1819 it was installed at the Duke's residence in Woburn Abbey. Canova even made the trip over to England to supervise its installation, choosing for it to be displayed on a pedestal adapted from a marble plinth with a rotating top. This version is now owned jointly by the Victoria and Albert Museum and the National Galleries of Scotland, and is alternately displayed at each.
Artistic process.
Canova had a distinct, signature style in which he combined Greek and Roman art practices with early stirrings of romanticism to delve into a new path of Neoclassicism. Canova's sculptures fall into three categories: Heroic compositions, compositions of grace, and sepulchral monuments. In each of these, Canova's underlying artistic motivations were to challenge, if not compete, with classical statues.
Canova refused to take in pupils and students, but would hire workers to carve the initial figure from the marble. According to art historian Giuseppe Pavanello, "Canova's system of work concentrated on the initial idea, and on the final carving of the marble". He had an elaborate system of comparative pointing so that the workers were able to reproduce the plaster form in the selected block of marble. These workers would leave a thin veil over the entire statue so that Canova could focus on the surface of the statue.
While he worked, he had people read to him select literary and historical texts.
Last touch.
During the last quarter of the eighteenth century, it became fashionable to view art galleries at night by torchlight. Canova was an artist who embraced the fad, displaying his works of art in his studio by candlelight. As such, Canova would finalize the statue with special tools by candlelight, to soften the transitions between the various parts of the nude. After a little recarving, he would rub the statue down with pumice stone, sometimes for weeks or months at a time. If that was not enough, he would use tripoli (rottenstone) and lead.
He then applied a patina of now-unknown chemical composition onto the flesh of the figure to lighten the skin tone. His friends also denied any use of acids in his process.
Criticisms.
Conversations revolving around the justification of art as superfluous usually invoked the name of Canova. Karl Ludwig Fernow believed that Canova was not Kantian enough in his aesthetic, because emphasis seemed to have been placed on agreeableness rather than Beauty. Canova was faulted for creating works that were artificial in complexity.
Legacy.
Although the Romantic period artists buried Canova's name soon after he died, he is slowly being rediscovered. Giuseppe Pavanello wrote in 1996 that "the importance and value of Canova's art is now recognized as holding in balance the last echo of the Ancients and the first symptom of the restless experimentation of the modern age".
Canova spent large parts of his fortune helping young students and sending patrons to struggling sculptors, including Sir Richard Westmacott and John Gibson.
He was introduced into various orders of chivalry.
A number of his works, sketches, and writings are collected in the "Sala Canoviana" of the Museo Civico of Bassano del Grappa. Other works, including plaster casts, are in the Museo Canoviano in Asolo.
In 2018, a crater on Mercury was named in his honor.
Literary inspirations.
Two of Canova's works appear as engravings in "Fisher's Drawing Room Scrap Book", 1834, with poetical illustrations by Letitia Elizabeth Landon. These are of "The Dancing Girl" and "Hebe".
Auguste Rodin.

François Auguste René Rodin (12 November 1840 – 17 November 1917) was a French sculptor, generally considered the founder of modern sculpture. He was schooled traditionally and took a craftsman-like approach to his work. Rodin possessed a unique ability to model a complex, turbulent, and deeply pocketed surface in clay. He is known for such sculptures as "The Thinker", "Monument to Balzac", "The Kiss", "The Burghers of Calais", and "The Gates of Hell".
Many of Rodin's most notable sculptures were criticized, as they clashed with predominant figurative sculpture traditions in which works were decorative, formulaic, or highly thematic. Rodin's most original work departed from traditional themes of mythology and allegory. He modeled the human body with naturalism, and his sculptures celebrate individual character and physicality. Although Rodin was sensitive to the controversy surrounding his work, he refused to change his style, and his continued output brought increasing favor from the government and the artistic community.
From the unexpected naturalism of Rodin's first major figure – inspired by his 1875 trip to Italy – to the unconventional memorials whose commissions he later sought, his reputation grew, and Rodin became the preeminent French sculptor of his time. By 1900, he was a world-renowned artist. Wealthy private clients sought Rodin's work after his World's Fair exhibit, and he kept company with a variety of high-profile intellectuals and artists. His student, Camille Claudel, became his associate, lover, and creative rival. Rodin's other students included Antoine Bourdelle, Constantin Brâncuși, and Charles Despiau. He married his lifelong companion, Rose Beuret, in the last year of both their lives. His sculptures suffered a decline in popularity after his death in 1917, but within a few decades his legacy solidified. Rodin remains one of the few sculptors widely known outside the visual arts community.
Biography.
Formative years.
Rodin was born in 1840 into a working-class family in Paris, the second child of Marie Cheffer and Jean-Baptiste Rodin, who was a police department clerk. He was largely self-educated, and began to draw at age 10. Between ages 14 and 17, he attended the "Petite École", a school specializing in art and mathematics where he studied drawing and painting. His drawing teacher Horace Lecoq de Boisbaudran believed in first developing the personality of his students so that they observed with their own eyes and drew from their recollections, and Rodin expressed appreciation for his teacher much later in life. It was at Petite École that he met Jules Dalou and Alphonse Legros.
In 1857, Rodin submitted a clay model of a companion to the École des Beaux-Arts in an attempt to win entrance; he did not succeed, and two further applications were also denied. Entrance requirements were not particularly high at the "Grande École", so the rejections were considerable setbacks. Rodin's inability to gain entrance may have been due to the judges' Neoclassical tastes, while Rodin had been schooled in light, 18th-century sculpture. He left the "Petite École" in 1857 and earned a living as a craftsman and ornamenter for most of the next two decades, producing decorative objects and architectural embellishments.
Rodin's sister Maria, two years his senior, died of peritonitis in a convent in 1862, and Rodin was anguished with guilt because he had introduced her to an unfaithful suitor. He turned away from art and joined the Catholic order of the Congregation of the Blessed Sacrament. Saint Peter Julian Eymard, founder and head of the congregation, recognized Rodin's talent and sensed his lack of suitability for the order, so he encouraged Rodin to continue with his sculpture. Rodin returned to work as a decorator while taking classes with animal sculptor Antoine-Louis Barye. The teacher's attention to detail and his finely rendered musculature of animals in motion significantly influenced Rodin.
In 1864, Rodin began to live with a young seamstress named Rose Beuret (born in June 1844), with whom he stayed for the rest of his life, with varying commitment. The couple had a son named Auguste-Eugène Beuret (1866–1934). That year, Rodin offered his first sculpture for exhibition and entered the studio of Albert-Ernest Carrier-Belleuse, a successful mass producer of "objets d'art". Rodin worked as Carrier-Belleuse's chief assistant until 1870, designing roof decorations and staircase and doorway embellishments. With the arrival of the Franco-Prussian War, Rodin was called to serve in the French National Guard, but his service was brief due to his near-sightedness. Decorators' work had dwindled because of the war, yet Rodin needed to support his family, as poverty was a continual difficulty for him until about the age of 30. Carrier-Belleuse soon asked him to join him in Belgium, where they worked on ornamentation for the Brussels Stock Exchange.
Rodin planned to stay in Belgium a few months, but he spent the next six years outside of France. It was a pivotal time in his life. He had acquired skill and experience as a craftsman, but no one had yet seen his art, which sat in his workshop since he could not afford castings. His relationship with Carrier-Belleuse had deteriorated, but he found other employment in Brussels, displaying some works at salons, and his companion Rose soon joined him there. Having saved enough money to travel, Rodin visited Italy for two months in 1875, where he was drawn to the work of Donatello and Michelangelo. Their work had a profound effect on his artistic direction. Rodin said, "It is Michelangelo who has freed me from academic sculpture." Returning to Belgium, he began work on "The Age of Bronze", a life-size male figure whose naturalism brought Rodin attention but led to accusations of sculptural cheating: its naturalism and scale were such that critics alleged he had cast the work from a living model. Much of Rodin's later work was explicitly larger or smaller than life, in part to demonstrate the folly of such accusations.
Artistic independence.
Rose Beuret and Rodin returned to Paris in 1877, moving into a small flat on the Left Bank. Misfortune surrounded Rodin: his mother, who had wanted to see her son marry, was dead, and his father was blind and senile, cared for by Rodin's sister-in-law, Aunt Thérèse. Rodin's eleven-year-old son Auguste, possibly developmentally delayed, was also in the ever-helpful Thérèse's care. Rodin had essentially abandoned his son for six years, and would have a very limited relationship with him throughout his life. Father and son joined the couple in their flat, with Rose as caretaker. Charges of fakery surrounding "The Age of Bronze" continued. Rodin increasingly sought soothing female companionship in Paris, and Rose stayed in the background.
Rodin earned his living collaborating with more established sculptors on public commissions, primarily memorials and neo-baroque architectural pieces in the style of Carpeaux. In competitions for commissions he submitted models of Denis Diderot, Jean-Jacques Rousseau, and Lazare Carnot, all to no avail. On his own time, he worked on studies leading to the creation of his next important work, "St. John the Baptist Preaching".
In 1880, Carrier-Belleuse – then art director of the Sèvres national porcelain factory – offered Rodin a part-time position as a designer. The offer was in part a gesture of reconciliation, and Rodin accepted. That part of Rodin which appreciated 18th-century tastes was aroused, and he immersed himself in designs for vases and table ornaments that brought the factory renown across Europe.
The artistic community appreciated his work in this vein, and Rodin was invited to Paris Salons by such friends as writer Léon Cladel. During his early appearances at these social events, Rodin seemed shy; in his later years, as his fame grew, he displayed the loquaciousness and temperament for which he is better known. French statesman Léon Gambetta expressed a desire to meet Rodin, and the sculptor impressed him when they met at a salon. Gambetta spoke of Rodin in turn to several government ministers, likely including Edmond Turquet, the Undersecretary of the Ministry of Fine Arts, whom Rodin eventually met.
Rodin's relationship with Turquet was rewarding: through him, he won the 1880 commission to create a portal for a planned museum of decorative arts. Rodin dedicated much of the next four decades to his elaborate "Gates of Hell", an unfinished portal for a museum that was never built. Many of the portal's figures became sculptures in themselves, including Rodin's most famous, "The Thinker" and "The Kiss". With the museum commission came a free studio, granting Rodin a new level of artistic freedom. He stopped working at the porcelain factory in 1882; thereafter, his income came from private commissions.
In 1883, Rodin agreed to supervise a course for sculptor Alfred Boucher in his absence, where he met the 18-year-old Camille Claudel. The two formed a passionate but stormy relationship and influenced each other artistically. Claudel inspired Rodin as a model for many of his figures, and she was a talented sculptor, assisting him on commissions as well as creating her own works. Her "Bust of Rodin" was displayed to critical acclaim at the 1892 Salon.
Although busy with "The Gates of Hell", Rodin won other commissions. He pursued an opportunity to create a historical monument for the town of Calais. For a monument to French author Honoré de Balzac, Rodin was chosen in 1891. His execution of both sculptures clashed with traditional tastes, and met with varying degrees of disapproval from the organizations that sponsored the commissions. Still, Rodin was gaining support from diverse sources that propelled him toward fame.
In 1889, the Paris Salon invited Rodin to be a judge on its artistic jury. Though Rodin's career was on the rise, Claudel and Beuret were becoming increasingly impatient with Rodin's "double life". Claudel and Rodin shared an atelier at a small old castle (the Château de l'Islette in the Loire), but Rodin refused to relinquish his ties to Beuret, his loyal companion during the lean years, and mother of his son. During one absence, Rodin wrote to Beuret, "I think of how much you must have loved me to put up with my caprices...I remain, in all tenderness, your Rodin."
Claudel and Rodin parted in 1898. Claudel suffered an alleged nervous breakdown several years later and was confined to an institution for 30 years by her family, until her death in 1943, despite numerous attempts by doctors to explain to her mother and brother that she was sane.
In 1904, Rodin was introduced by Hilda Flodin to the Welsh artist Gwen John, who modelled for him and became his lover. John had a fervent attachment to Rodin and wrote to him thousands of times over the next ten years. As their relationship came to a close, despite his genuine feeling for her, Rodin resorted to using concierges and secretaries to keep her at a distance.
Works.
In 1864, Rodin submitted his first sculpture for exhibition, "The Man with the Broken Nose", to the Paris Salon. The subject was an elderly neighborhood street porter. The unconventional bronze piece was not a traditional bust, but instead the head was "broken off" at the neck, the nose was flattened and crooked, and the back of the head was absent, having fallen off the clay model in an accident. The work emphasized texture and the emotional state of the subject; it illustrated the "unfinishedness" that would characterize many of Rodin's later sculptures. The Salon rejected the piece.
Early figures: the inspiration of Italy.
In Brussels, Rodin created his first full-scale work, "The Age of Bronze", having returned from Italy. Modeled after a Belgian soldier, the figure drew inspiration from Michelangelo's "Dying Slave", which Rodin had observed at the Louvre. Attempting to combine Michelangelo's mastery of the human form with his own sense of human nature, Rodin studied his model from all angles, at rest and in motion; he mounted a ladder for additional perspective, and made clay models, which he studied by candlelight. The result was a life-size, well-proportioned nude figure, posed unconventionally with his right hand atop his head, and his left arm held out at his side, forearm parallel to the body.
In 1877, the work debuted in Brussels and then was shown at the Paris Salon. The statue's apparent lack of a theme was troubling to critics – commemorating neither mythology nor a noble historical event – and it is not clear whether Rodin intended a theme. He first titled the work "The Vanquished", in which form the left hand held a spear, but he removed the spear because it obstructed the torso from certain angles. After two more intermediary titles, Rodin settled on "The Age of Bronze", suggesting the Bronze Age, and in Rodin's words, "man arising from nature". Later, however, Rodin said that he had had in mind "just a simple piece of sculpture without reference to subject".
Its mastery of form, light, and shadow made the work look so naturalistic that Rodin was accused of "surmoulage" – having taken a cast from a living model. Rodin vigorously denied the charges, writing to newspapers and having photographs taken of the model to prove how the sculpture differed. He demanded an inquiry and was eventually exonerated by a committee of sculptors. Leaving aside the false charges, the piece polarized critics. It had barely won acceptance for display at the Paris Salon, and criticism likened it to "a statue of a sleepwalker" and called it "an astonishingly accurate copy of a low type". Others rallied to defend the piece and Rodin's integrity. The government minister Turquet admired the piece, and "The Age of Bronze" was purchased by the state for 2,200 francs – what it had cost Rodin to have it cast in bronze.
A second male nude, "St. John the Baptist Preaching", was completed in 1878. Rodin sought to avoid another charge of "surmoulage" by making the statue larger than life. While "The Age of Bronze" is statically posed, "St. John" gestures and seems to move toward the viewer. The effect of walking is achieved despite the figure having both feet firmly on the ground – a technical achievement that was lost on most contemporary critics. Rodin chose this contradictory position to, in his words, "display simultaneously...views of an object which in fact can be seen only successively".
Despite the title, "St. John the Baptist Preaching" did not have an obviously religious theme. The model, an Italian peasant who presented himself at Rodin's studio, possessed an idiosyncratic sense of movement that Rodin felt compelled to capture. Rodin thought of John the Baptist, and carried that association into the title of the work. In 1880, Rodin submitted the sculpture to the Paris Salon. Critics were still mostly dismissive of his work, but the piece finished third in the Salon's sculpture category.
Regardless of the immediate receptions of "St. John" and "The Age of Bronze", Rodin had achieved a new degree of fame. Students sought him at his studio, praising his work and scorning the charges of "surmoulage". The artistic community knew his name.
"The Gates of Hell".
A commission to create a portal for Paris' planned Museum of Decorative Arts was awarded to Rodin in 1880. Although the museum was never built, Rodin worked throughout his life on "The Gates of Hell", a monumental sculptural group depicting scenes from Dante's "Inferno" in high relief. Often lacking a clear conception of his major works, Rodin compensated with hard work and a striving for perfection.
He conceived "The Gates" with the "surmoulage" controversy still in mind: "...I had made the "St. John" to refute [the charges of casting from a model], but it only partially succeeded. To prove completely that I could model from life as well as other sculptors, I determined...to make the sculpture on the door of figures smaller than life." Laws of composition gave way to the "Gates"' disordered and untamed depiction of Hell. The figures and groups in this, Rodin's meditation on the condition of man, are physically and morally isolated in their torment.
"The Gates of Hell" comprised 186 figures in its final form. Many of Rodin's best-known sculptures started as designs of figures for this composition, such as "The Thinker", "The Three Shades", and "The Kiss", and were only later presented as separate and independent works. Other well-known works derived from "The Gates" are "Ugolino", "Fallen Caryatid Carrying her Stone", "Fugit Amor", "She Who Was Once the Helmet-Maker's Beautiful Wife", "The Falling Man", and "The Prodigal Son".
"The Thinker".
"The Thinker" (originally titled "The Poet", after Dante) was to become one of the best-known sculptures in the world. The original was a high bronze piece created between 1879 and 1889, designed for the "Gates"' lintel, from which the figure would gaze down upon Hell. While "The Thinker" most obviously characterizes Dante, aspects of the Biblical Adam, the mythological Prometheus, and Rodin himself have been ascribed to him. Other observers de-emphasize the apparent intellectual theme of "The Thinker", stressing the figure's rough physicality and the emotional tension emanating from it.
"The Burghers of Calais".
The town of Calais had contemplated a historical monument for decades when Rodin learned of the project. He pursued the commission, interested in the medieval motif and patriotic theme. The mayor of Calais was tempted to hire Rodin on the spot upon visiting his studio, and soon the memorial was approved, with Rodin as its architect. It would commemorate the six townspeople of Calais who offered their lives to save their fellow citizens.
During the Hundred Years' War, the army of King Edward III besieged Calais, and Edward ordered that the town's population be killed "en masse". He agreed to spare them if six of the principal citizens would come to him prepared to die, bareheaded and barefooted and with ropes around their necks. When they came, he ordered that they be executed, but pardoned them when his queen, Philippa of Hainault, begged him to spare their lives. "The Burghers of Calais" depicts the men as they are leaving for the king's camp, carrying keys to the town's gates and citadel.
Rodin began the project in 1884, inspired by the chronicles of the siege by Jean Froissart. Though the town envisioned an allegorical, heroic piece centered on Eustache de Saint-Pierre, the eldest of the six men, Rodin conceived the sculpture as a study in the varied and complex emotions under which all six men were laboring. One year into the commission, the Calais committee was not impressed with Rodin's progress. Rodin indicated his willingness to end the project rather than change his design to meet the committee's conservative expectations, but the town urged him to continue.
In 1889, "The Burghers of Calais" was first displayed to general acclaim. It is a bronze sculpture weighing , and its figures are tall. The six men portrayed do not display a united, heroic front; rather, each is isolated from his brothers, individually deliberating and struggling with his expected fate. Rodin soon proposed that the monument's high pedestal be eliminated, wanting to move the sculpture to ground level so that viewers could "penetrate to the heart of the subject". At ground level, the figures' positions lead the viewer around the work, and subtly suggest their common movement forward.
The committee was incensed by the untraditional proposal, but Rodin would not yield. In 1895, Calais succeeded in having "Burghers" displayed in their preferred form: the work was placed in front of a public garden on a high platform, surrounded by a cast-iron railing. Rodin had wanted it located near the town hall, where it would engage the public. Only after damage during the First World War, subsequent storage, and Rodin's death was the sculpture displayed as he had intended. It is one of Rodin's best-known and most acclaimed works.
Commissions and controversy.
Commissioned to create a monument to French writer Victor Hugo in 1889, Rodin dealt extensively with the subject of "artist and muse". Like many of Rodin's public commissions, "Monument to Victor Hugo" was met with resistance because it did not fit conventional expectations. Commenting on the monument in 1909, "The Times" expressed that "there is some show of reason in the complaint that [Rodin's] conceptions are sometimes unsuited to his medium, and that in such cases they overstrain his vast technical powers". The 1897 plaster model was not cast in bronze until 1964.
The "Société des Gens des Lettres", a Parisian organization of writers, planned a monument to French novelist Honoré de Balzac immediately after his death in 1850. The society commissioned Rodin to create the memorial in 1891, and Rodin spent years developing the concept for his sculpture. Challenged in finding an appropriate representation of Balzac given the author's rotund physique, Rodin produced many studies: portraits, full-length figures in the nude, wearing a frock coat, or in a robe – a replica of which Rodin had requested. The realized sculpture displays Balzac cloaked in the drapery, looking forcefully into the distance with deeply gouged features. Rodin's intent had been to show Balzac at the moment of conceiving a work – to express courage, labor, and struggle.
When "Monument to Balzac" was exhibited in 1898, the negative reaction was not surprising. The "Société" rejected the work, and the press ran parodies. Criticizing the work, Morey (1918) reflected, "there may come a time, and doubtless will come a time, when it will not seem "outre" to represent a great novelist as a huge comic mask crowning a bathrobe, but even at the present day this statue impresses one as slang." A modern critic, indeed, claims that "Balzac" is one of Rodin's masterpieces.
The monument had its supporters in Rodin's day; a manifesto defending him was signed by Monet, Debussy, and future Premier Georges Clemenceau, among many others. In the BBC series "Civilisation", art historian Kenneth Clark praised the monument as "the greatest piece of sculpture of the 19th Century, perhaps, indeed, the greatest since Michelangelo." Rather than try to convince skeptics of the merit of the monument, Rodin repaid the "Société" his commission and moved the figure to his garden. After this experience, Rodin did not complete another public commission. Only in 1939 was "Monument to Balzac" cast in bronze and placed on the Boulevard du Montparnasse at the intersection with Boulevard Raspail.
Other works.
The popularity of Rodin's most famous sculptures tends to obscure his total creative output. A prolific artist, he created thousands of busts, figures, and sculptural fragments over more than five decades. He painted in oils (especially in his thirties) and in watercolors. The Musée Rodin holds 7,000 of his drawings and prints, in chalk and charcoal, and thirteen vigorous drypoints. He also produced a single lithograph.
Portraiture was an important component of Rodin's oeuvre, helping him to win acceptance and financial independence. His first sculpture was a bust of his father in 1860, and he produced at least 56 portraits between 1877 and his death in 1917. Early subjects included fellow sculptor Jules Dalou (1883) and companion Camille Claudel (1884).
Later, with his reputation established, Rodin made busts of prominent contemporaries such as English politician George Wyndham (1905), Irish playwright George Bernard Shaw (1906), socialist (and former mistress of the Prince of Wales who became King Edward VII) Countess of Warwick (1908), Austrian composer Gustav Mahler (1909), former Argentine president Domingo Faustino Sarmiento, and French statesman Georges Clemenceau (1911).
His undated drawing "Study of a Woman Nude, Standing, Arms Raised, Hands Crossed Above Head" is one of the works seized in 2012 from the collection of Cornelius Gurlitt.
Aesthetic.
Rodin was a naturalist, less concerned with monumental expression than with character and emotion. Departing from centuries of tradition, he turned away from the idealism of the Greeks, and the decorative beauty of the Baroque and neo-Baroque movements. His sculpture emphasized the individual and the concreteness of flesh, and suggested emotion through detailed, textured surfaces, and the interplay of light and shadow. To a greater degree than his contemporaries, Rodin believed that an individual's character was revealed by his physical features.
Rodin's talent for surface modeling allowed him to let every part of the body speak for the whole. The male's passion in "The Thinker" is suggested by the grip of his toes on the rock, the rigidness of his back, and the differentiation of his hands. Speaking of "The Thinker", Rodin illuminated his aesthetic: "What makes my Thinker think is that he thinks not only with his brain, with his knitted brow, his distended nostrils and compressed lips, but with every muscle of his arms, back, and legs, with his clenched fist and gripping toes."
Sculptural fragments to Rodin were autonomous works, and he considered them the essence of his artistic statement. His fragments – perhaps lacking arms, legs, or a head – took sculpture further from its traditional role of portraying likenesses, and into a realm where forms existed for their own sake. Notable examples are "The Walking Man", "Meditation without Arms", and "Iris, Messenger of the Gods".
Rodin saw suffering and conflict as hallmarks of modern art. "Nothing, really, is more moving than the maddened beast, dying from unfulfilled desire and asking in vain for grace to quell its passion." Charles Baudelaire echoed those themes, and was among Rodin's favorite poets. Rodin enjoyed music, especially the opera composer Gluck, and wrote a book about French cathedrals. He owned a work by the as-yet-unrecognized Van Gogh, and admired the forgotten El Greco.
Method.
Instead of copying traditional academic postures, Rodin preferred his models to move naturally around his studio (despite their nakedness). The sculptor often made quick sketches in clay that were later fine-tuned, cast in plaster, and cast in bronze or carved from marble. Rodin's focus was on the handling of clay.
George Bernard Shaw sat for a portrait and gave an idea of Rodin's technique: "While he worked, he achieved a number of miracles. At the end of the first fifteen minutes, after having given a simple idea of the human form to the block of clay, he produced by the action of his thumb a bust so living that I would have taken it away with me to relieve the sculptor of any further work."
He described the evolution of his bust over a month, passing through "all the stages of art's evolution": first, a "Byzantine masterpiece", then "Bernini intermingled", then an elegant Houdon. "The hand of Rodin worked not as the hand of a sculptor works, but as the work of "Elan Vital". The "Hand of God" is his own hand."
After he completed his work in clay, he employed highly skilled assistants to re-sculpt his compositions at larger sizes (including his large-scale monuments such as "The Thinker"), to cast the clay compositions into plaster or bronze, and to carve his marbles. Rodin's major innovation was to capitalize on the multi-staged processes of 19th-century sculpture and their reliance on plaster casting.
Since clay deteriorates rapidly if not kept wet or fired into a terra-cotta, sculptors used plaster casts as a means of securing the composition they would make from the fugitive material that is clay. This was common practice amongst Rodin's contemporaries, and sculptors would exhibit plaster casts with the hopes that they would be commissioned to have the works made in a more permanent material. Rodin, however, would have multiple plasters made and treat them as the raw material of sculpture, recombining their parts and figures into new compositions, and new names.
As Rodin's practice developed into the 1890s, he became more and more radical in his pursuit of fragmentation, the combination of figures at different scales, and the making of new compositions from his earlier work. A prime example of this is the bold "The Walking Man" (1899–1900), which was exhibited at his major one-person show in 1900. This is composed of two sculptures from the 1870s that Rodin found in his studio – a broken and damaged torso that had fallen into neglect and the lower extremities of a statuette version of his 1878 "St. John the Baptist Preaching" he was having re-sculpted at a reduced scale.
Without finessing the join between upper and lower, between torso and legs, Rodin created a work that many sculptors at the time and subsequently have seen as one of his strongest and most singular works. This is despite the fact that the object conveys two different styles, exhibits two different attitudes toward finish, and lacks any attempt to hide the arbitrary fusion of these two components. It was the freedom and creativity with which Rodin used these practices – along with his activation of sculptural surfaces through traces of his own touch and with his more open attitude toward bodily pose, sensual subject matter, and non-naturalistic surface – that marked Rodin's re-making of traditional 19th-century sculptural techniques into the prototype for modern sculpture.
Later years (1900–1917).
By 1900, Rodin's artistic reputation was established. Gaining exposure from a pavilion of his artwork set up near the 1900 World's Fair ("Exposition Universelle") in Paris, he received requests to make busts of prominent people internationally, while his assistants at the atelier produced duplicates of his works. His income from portrait commissions alone totaled probably 200,000 francs a year. As Rodin's fame grew, he attracted many followers, including the German poet Rainer Maria Rilke, and authors Octave Mirbeau, Joris-Karl Huysmans, and Oscar Wilde.
Rilke stayed with Rodin in 1905 and 1906, and did administrative work for him; he would later write a laudatory monograph on the sculptor. Rodin and Beuret's modest country estate in Meudon, purchased in 1897, was a host to such guests as King Edward, dancer Isadora Duncan, and harpsichordist Wanda Landowska. A British journalist who visited the property noted in 1902 that in its complete isolation, there was "a striking analogy between its situation and the personality of the man who lives in it". Rodin moved to the city in 1908, renting the main floor of the Hôtel Biron, an 18th-century townhouse. He left Beuret in Meudon, and began an affair with the American-born Duchesse de Choiseul. From 1910, he mentored the Russian sculptor, Moissey Kogan.
United States.
While Rodin was beginning to be accepted in France by the time of "The Burghers of Calais", he had not yet conquered the American market. Because of his technique and the frankness of some of his work, he did not have an easy time selling his work to American industrialists. However, he came to know Sarah Tyson Hallowell (1846–1924), a curator from Chicago who visited Paris to arrange exhibitions at the large Interstate Expositions of the 1870s and 1880s. Hallowell was not only a curator but an adviser and a facilitator who was trusted by a number of prominent American collectors to suggest works for their collections, the most prominent of these being the Chicago hotelier Potter Palmer and his wife, Bertha Palmer (1849–1918).
The next opportunity for Rodin in America was the 1893 Chicago World's Fair. Hallowell wanted to help promote Rodin's work and he suggested a solo exhibition, which she wrote him was "beaucoup moins beau que l'original" ("much less beautiful than the original") but impossible, outside the rules. Instead, she suggested he send a number of works for her loan exhibition of French art from American collections, and she told him she would list them as being part of an American collection. Rodin sent Hallowell three works, "Cupid and Psyche", "Sphinx", and "Andromeda". All nudes, these works provoked great controversy and were ultimately hidden behind a drape, with special permission given for viewers to see them.
"Bust of Dalou" and "Burgher of Calais" were on display in the official French pavilion at the fair and so between the works that were on display and those that were not, he was noticed. However, the works he gave Hallowell to sell found no takers, but she soon brought the controversial Quaker-born financier Charles Yerkes (1837–1905) into the fold and he purchased two large marbles for his Chicago manse; Yerkes was likely the first American to own a Rodin sculpture.
Other collectors soon followed including the tastemaking Potter Palmers of Chicago and Isabella Stewart Gardner (1840–1924) of Boston, all arranged by Sarah Hallowell. In appreciation for her efforts at unlocking the American market, Rodin eventually presented Hallowell with a bronze, a marble and a terra cotta. When Hallowell moved to Paris in 1893, she and Rodin continued their warm friendship and correspondence, which lasted to the end of the sculptor's life. After Hallowell's death, her niece, the painter Harriet Hallowell, inherited the Rodins and after her death, the American heirs could not manage to match their value in order to export them, so they became the property of the French state.
Great Britain.
After the start of the 20th century, Rodin was a regular visitor to Great Britain, where he developed a loyal following by the beginning of the First World War. He first visited England in 1881, when his friend, the artist Alphonse Legros, introduced him to the poet William Ernest Henley. With his personal connections and enthusiasm for Rodin's art, Henley was most responsible for Rodin's reception in Britain. (Rodin later returned the favor by sculpting a bust that was used as the frontispiece to Henley's collected works and, after his death, on his monument in London.)
Through Henley, Rodin met Robert Louis Stevenson and Robert Browning, in whom he found further support. Encouraged by the enthusiasm of British artists, students, and high society for his art, Rodin donated a significant selection of his works to the nation in 1914.
After the revitalization of the Société Nationale des Beaux-Arts in 1890, Rodin served as the body's vice-president. In 1903, Rodin was elected president of the International Society of Painters, Sculptors, and Engravers. He replaced its former president, James Abbott McNeill Whistler, upon Whistler's death. His election to the prestigious position was largely due to the efforts of Albert Ludovici, father of English philosopher Anthony Ludovici, who was private secretary to Rodin for several months in 1906, but the two men parted company after Christmas, "to their mutual relief."
During his later creative years, Rodin's work turned increasingly toward the female form, and themes of more overt masculinity and femininity. He concentrated on small dance studies, and produced numerous erotic drawings, sketched in a loose way, without taking his pencil from the paper or his eyes from the model. Rodin met American dancer Isadora Duncan in 1900, attempted to seduce her, and the next year sketched studies of her and her students. In July 1906, Rodin was also enchanted by dancers from the Royal Ballet of Cambodia, and produced some of his most famous drawings from the experience.
Fifty-three years into their relationship, Rodin married Rose Beuret. They married on 29 January 1917, and Beuret died two weeks later, on 16 February. Rodin was ill that year; in January, he suffered weakness from influenza, and on 16 November his physician announced that "congestion of the lungs has caused great weakness. The patient's condition is grave." Rodin died the next day, age 77, at his villa in Meudon, Île-de-France, on the outskirts of Paris.
A cast of "The Thinker" was placed next to his tomb in Meudon; it was Rodin's wish that the figure serve as his headstone and epitaph. In 1923, Marcelle Tirel, Rodin's secretary, published a book alleging that Rodin's death was largely due to cold, as he had no heat at Meudon. Rodin had requested permission to stay in the Hôtel Biron, a museum of his works, but the director of the museum refused to let him stay there.
Legacy.
Rodin willed to the French state his studio and the right to make casts from his plasters. Because he encouraged the edition of his sculpted work, Rodin's sculptures are represented in many public and private collections. The Musée Rodin was founded in 1916 and opened in 1919 at the Hôtel Biron, where Rodin had lived, and it holds the largest Rodin collection, with more than 6,000 sculptures and 7,000 works on paper. The French order "Légion d'honneur" made him a Commander, and he received an honorary doctorate from the University of Oxford.
During his lifetime, Rodin was compared to Michelangelo, and was widely recognized as the greatest artist of the era. In the three decades following his death, his popularity waned with changing aesthetic values. Since the 1950s, Rodin's reputation has re-ascended; he is recognized as the most important sculptor of the modern era, and has been the subject of much scholarly work. The sense of incompletion offered by some of his sculpture, such as "The Walking Man", influenced the increasingly abstract sculptural forms of the 20th century.
Rodin restored an ancient role of sculpture – to capture the physical and intellectual force of the human subject – and he freed sculpture from the repetition of traditional patterns, providing the foundation for greater experimentation in the 20th century. His popularity is ascribed to his emotion-laden representations of ordinary men and women – to his ability to find the beauty and pathos in the human animal. His most popular works, such as "The Kiss" and "The Thinker", are widely used outside the fine arts as symbols of human emotion and character. To honor Rodin's artistic legacy, the Google search engine homepage displayed a Google Doodle featuring "The Thinker" to celebrate his 172nd birthday on 12 November 2012.
Rodin had enormous artistic influence. A whole generation of sculptors studied in his workshop. These include Gutzon Borglum, Antoine Bourdelle, Constantin Brâncuși, Camille Claudel, Charles Despiau, Malvina Hoffman, Carl Milles, François Pompon, Rodo, Gustav Vigeland, Clara Westhoff and Margaret Winser, even though Brancusi later rejected his legacy. Rodin also promoted the work of other sculptors, including Aristide Maillol and Ivan Meštrović whom Rodin once called "the greatest phenomenon amongst sculptors." Other sculptors whose work has been described as owing to Rodin include Joseph Csaky, Alexander Archipenko, Joseph Bernard, Henri Gaudier-Brzeska, Georg Kolbe, Wilhelm Lehmbruck, Jacques Lipchitz, Pablo Picasso, Adolfo Wildt, and Ossip Zadkine. Henry Moore acknowledged Rodin's seminal influence on his work.
Several films have been made featuring Rodin as a prominent character or presence. These include "Camille Claudel", a 1988 film in which Gérard Depardieu portrays Rodin, "Camille Claudel 1915" from 2013, and "Rodin", a 2017 film starring Vincent Lindon as Rodin. Furthermore, the Rodin Studios artists' cooperative housing in New York City, completed in 1917 to designs by Cass Gilbert, was named after Rodin.
Forgeries.
The relative ease of making reproductions has also encouraged many forgeries: a survey of expert opinion placed Rodin in the top ten most-faked artists. Rodin fought against forgeries of his works as early as 1901, and since his death, many cases of organized, large-scale forgeries have been revealed. A massive forgery was discovered by French authorities in the early 1990s and led to the conviction of art dealer Guy Hain.
To deal with the complexity of bronze reproduction, France has promulgated several laws since 1956 which limit reproduction to twelve casts – the maximum number that can be made from an artist's plasters and still be considered his work. As a result of this limit, "The Burghers of Calais", for example, is found in fourteen cities.
In the market for sculpture, plagued by fakes, the value of a piece increases significantly when its provenance can be established. A Rodin work with a verified history sold for US$4.8 million in 1999, and Rodin's bronze "Ève, grand modèle – version sans rocher" sold for $18.9 million at a 2008 Christie's auction in New York. Art critics concerned about authenticity have argued that taking a cast does not equal reproducing a Rodin sculpture – especially given the importance of surface treatment in Rodin's work.
A number of drawings previously attributed to Rodin are now known to have been forged by Ernest Durig.
Ann Arbor, Michigan.
Ann Arbor is a city in the U.S. state of Michigan and the seat of government of Washtenaw County. The 2020 census recorded its population to be 123,851, making it the fifth-largest city in Michigan. It is the principal city of the Ann Arbor Metropolitan Statistical Area, which encompasses all of Washtenaw County. Ann Arbor is also included in the Greater Detroit Combined Statistical Area and the Great Lakes megalopolis, the most populated and largest megalopolis in North America.
Ann Arbor is home to the University of Michigan. The university significantly shapes Ann Arbor's economy as it employs about 30,000 workers, including about 12,000 in the medical center. The city's economy is also centered on high technology, with several companies drawn to the area by the university's research and development infrastructure.
Ann Arbor was founded in 1824, named after the wives of the village's founders, both named Ann, and the stands of bur oak trees. The city's population grew at a rapid rate in the early to mid-20th century.
History.
Before founding as Ann Arbor.
The lands of present-day Ann Arbor were part of Massachusetts's western claim after the French and Indian War (1754–1763), bounded by the latitudes of Massachusetts Bay Colony's original charter, to which it was entitled under its interpretation of the original sea-to-sea grant from the British Crown. Massachusetts ceded the claim to the federal government as part of the Northwest Territory after April 19, 1785.
In about 1774, the Potawatomi founded two villages in the area of what is now Ann Arbor.
19th century.
Ann Arbor was founded in 1824 by land speculators John Allen and Elisha Walker Rumsey. On May 25, 1824, the town plat was registered with Wayne County as the Village of Annarbour, the earliest known use of the town's name. Allen and Rumsey decided to name it for their wives, both named Ann, and for the stands of bur oak in the 640 acres of land they purchased for $800 from the federal government at $1.25 per acre. The local Ojibwa named the settlement "kaw-goosh-kaw-nick", after the sound of Allen's sawmill.
Ann Arbor became the seat of Washtenaw County in 1827, and was incorporated as a village in 1833. The Ann Arbor Land Company, a group of speculators, set aside a parcel of undeveloped land and offered it to the state of Michigan as the site of the state capitol, but lost the bid to Lansing. In 1837, the property was accepted instead as the site of the University of Michigan.
Since the university's establishment in the city in 1837, the histories of the University of Michigan and Ann Arbor have been closely linked. The town became a regional transportation hub in 1839 with the arrival of the Michigan Central Railroad, and a north–south railway connecting Ann Arbor to Toledo and other markets to the south was established in 1878. Throughout the 1840s and the 1850s settlers continued to come to Ann Arbor. While the earlier settlers were primarily of British ancestry, the newer settlers also consisted of Germans, Irish, and Black people. In 1851, Ann Arbor was chartered as a city, though the city showed a drop in population during the Depression of 1873. It was not until the early 1880s that Ann Arbor again saw robust growth, with new immigrants from Greece, Italy, Russia, and Poland.
20th century.
Ann Arbor saw increased growth in manufacturing, particularly in milling. Ann Arbor's Jewish community also grew after the turn of the 20th century, and its first and oldest synagogue, Beth Israel Congregation, was established in 1916.
In 1960, Ann Arbor voters approved a $2.3 million bond issue to build the current city hall, which was designed by architect Alden B. Dow. The City Hall opened in 1963. In 1995, the building was renamed the Guy C. Larcom, Jr. Municipal Building in honor of the longtime city administrator who championed the building's construction.
During the 1960s and 1970s, the city gained a reputation as an important center for liberal politics. Ann Arbor also became a locus for left-wing activism, the anti-Vietnam War movement, and the student movement. The first major meetings of the national left-wing campus group Students for a Democratic Society took place in Ann Arbor in 1960; in 1965, the city was home to the first U.S. teach-in against the Vietnam War. During the ensuing 15 years, many countercultural and New Left enterprises sprang up and developed large constituencies within the city. These influences washed into municipal politics during the early and mid-1970s when three members of the Human Rights Party (HRP) won city council seats on the strength of the student vote. During their time on the council, HRP representatives fought for measures including pioneering antidiscrimination ordinances, measures decriminalizing marijuana possession, and a rent-control ordinance; many of these measures remain in effect today in modified form.
Two religious-conservative institutions were created in Ann Arbor: the Word of God (established in 1967), a charismatic inter-denominational movement, and the Thomas More Law Center (established in 1999).
Following a 1956 vote, the city of East Ann Arbor merged with Ann Arbor to encompass the eastern sections of the city.
21st century.
In the past several decades, Ann Arbor has grappled with the effects of sharply rising land values, gentrification, and urban sprawl stretching into outlying countryside. On November 4, 2003, voters approved a greenbelt plan under which the city government bought development rights on agricultural parcels of land adjacent to Ann Arbor to preserve them from sprawling development. Since then, a vociferous local debate has hinged on how and whether to accommodate and guide development within city limits. Ann Arbor consistently ranks in the "top places to live" lists published by various mainstream media outlets every year. In 2008, CNNMoney.com ranked it 27th out of 100 of "America's best small cities", and in 2010, "Forbes" listed Ann Arbor as one of the most livable cities in the United States.
In 2016, the city changed mayoral terms from two years to four. Until 2017, the city council held annual elections in which half of the seats (one from each ward) were filled for two-year terms. These elections were staggered, with each ward having one of its seats up for election in odd years and the other in even years. Since 2018, the city council has held staggered elections to four-year terms in even years: half of the members (one from each ward) are elected in presidential election years, while the other half are elected in midterm election years. To facilitate this change in scheduling, the 2017 election filled seats for three-year terms.
Geography.
Ann Arbor is located along the Huron River, which flows southeast through the city on its way to Lake Erie. It is the central core of the Ann Arbor, MI Metropolitan Statistical Area, which comprises the whole of Washtenaw County, and it is also part of the Metro Detroit Combined Statistical Area designated by the U.S. Census Bureau. While the city borders only townships, the built-up sections of Pittsfield and Ypsilanti townships between Ann Arbor and the city of Ypsilanti make the two effectively a single urban area.
Landscape.
The landscape of Ann Arbor consists of hills and valleys, with the terrain becoming steeper near the Huron River. The city's elevation is lowest along the Huron River and highest on the west side, near the intersection of Maple Road and Pauline Blvd. Ann Arbor Municipal Airport lies south of the city.
Ann Arbor is nicknamed "Tree Town," both due to its name and to the dense forestation of its parks and residential areas. The city contains more than 50,000 trees along its streets and an equal number in parks. In recent years, the emerald ash borer has destroyed many of the city's approximately 10,500 ash trees. The city contains 157 municipal parks ranging from small neighborhood green spots to large recreation areas. Several large city parks and a university park border sections of the Huron River. Fuller Recreation Area, near the University Hospital complex, contains sports fields, pedestrian and bike paths, and swimming pools. The Nichols Arboretum, owned by the University of Michigan, is an arboretum that contains hundreds of plant and tree species. It is on the city's east side, near the university's Central Campus. Located across the Huron River just beyond the university's North Campus is the university's Matthaei Botanical Gardens, which contains 300 acres of gardens and a large tropical conservatory as well as a wildflower garden specializing in the vegetation of the southern Great Lakes Region.
Cityscape.
The cityscape of Ann Arbor is heavily influenced by the University of Michigan, with 22% of downtown and 9.4% of the total land owned by the university. The downtown Central Campus contains some of the oldest extant structures in the city — including the President's House, built in 1840 — and separates the South University District from the other three downtown commercial districts. These other three districts, Kerrytown, State Street, and Main Street, are contiguous near the northwestern corner of the university.
Three commercial areas south of downtown include the areas near I-94 and Ann Arbor-Saline Road, Briarwood Mall, and the South Industrial area. Other commercial areas include the Arborland/Washtenaw Avenue and Packard Road merchants on the east side, the Plymouth Road area in the northeast, and the Westgate/West Stadium areas on the west side. Downtown contains a mix of 19th- and early-20th-century structures and modern-style buildings, as well as a farmers' market in the Kerrytown district. The city's commercial districts are composed mostly of two- to four-story structures, although downtown and the area near Briarwood Mall contain a small number of high-rise buildings.
Ann Arbor's residential neighborhoods contain architectural styles ranging from classic 19th- and early 20th-century designs to ranch-style houses. Among these homes are a number of kit houses built in the early 20th century. Contemporary-style houses are farther from the downtown district. Surrounding the University of Michigan campus are houses and apartment complexes occupied primarily by student renters. Tower Plaza, a 26-story condominium building located between the University of Michigan campus and downtown, is the tallest building in Ann Arbor. The 19th-century buildings and streetscape of the Old West Side neighborhood have been preserved virtually intact; in 1972, the district was listed on the National Register of Historic Places, and it is further protected by city ordinances and a nonprofit preservation group.
Climate.
Ann Arbor has a typically Midwestern humid continental climate (Köppen "Dfa"), which is influenced by the Great Lakes. There are four distinct seasons: winters are cold and snowy, summers are warm to hot and humid with slightly more precipitation, and spring and autumn are transitional between the two. The area experiences lake effect weather, primarily in the form of increased cloudiness during late fall and early winter. July is the warmest month and January the coldest. Precipitation tends to be heaviest during the summer months, but most frequent during winter. Snowfall normally occurs from November to April but occasionally starts in October. The record low temperature was set on February 11, 1885, and the record high on July 24, 1934.
Demographics.
As of the 2020 U.S. Census, there were 123,851 people and 49,948 households residing in the city. Ann Arbor is less densely populated than Detroit proper and inner-ring suburbs like Oak Park and Ferndale, but more densely populated than outer-ring suburbs like Livonia and Troy. The racial makeup of the city was 67.6% White, 6.8% Black, 0.2% Native American, 15.7% Asian, 0.1% Native Hawaiian or Pacific Islander, 1.8% from other races, and 7.9% from two or more races. Hispanic or Latino residents of any race made up 5.5% of the population. Ann Arbor has a small population of Arab Americans, including students as well as local Lebanese and Palestinians.
As of the 2010 U.S. Census, there were 113,934 people, 20,502 families, and 47,060 households residing in the city. The racial makeup of the city was 73.0% White (70.4% non-Hispanic White), 7.7% Black, 0.3% Native American, 14.4% Asian, 0.0% Native Hawaiian or Pacific Islander, 1.0% from other races, and 3.6% from two or more races. Hispanic or Latino residents of any race made up 4.1% of the population.
In 2013, Ann Arbor had the second-largest community of Japanese citizens in the state of Michigan, at 1,541; this figure trailed only that of Novi, which had 2,666 Japanese nationals.
In 2010, out of 47,060 households, 43.6% were family households, 20.1% had individuals under the age of 18 living in them, and 17.0% had individuals over age 65 living in them. Of the 20,502 family households, 19.2% included children under age 18, 34.2% were husband-wife families (estimates did not include same-sex married couples), and 7.1% had a female householder with no husband present. The average household size was 2.17 people, and the average family size was 2.85 people. The median age was 27.8; 14.4% of the population was under age 18, and 9.3% was age 65 or older.
According to the 2012–2016 American Community Survey estimates, the median household income was $57,697, and the median family income was $95,528. Males over age 25 and with earnings had a median income of $51,682, versus $39,203 for females. The per capita income for the city was $37,158. Nearly a quarter (23.4%) of people and 6.7% of families had incomes below the poverty level.
Economy.
The University of Michigan shapes Ann Arbor's economy significantly. It employs about 30,000 workers, including about 12,000 in the medical center. Other employers are drawn to the area by the university's research and development money, and by its graduates. High tech, health services and biotechnology are other major components of the city's economy; numerous medical offices, laboratories, and associated companies are located in the city. Automobile manufacturers, such as General Motors and Visteon, also employ residents.
High tech companies have located in the area since the 1930s, when International Radio Corporation introduced the first mass-produced AC/DC radio (the Kadette, in 1931) as well as the first pocket radio (the Kadette Jr., in 1933). The Argus camera company, originally a subsidiary of International Radio, manufactured cameras in Ann Arbor from 1936 to the 1960s. Current firms include Arbor Networks (provider of Internet traffic engineering and security systems), Arbortext (provider of XML-based publishing software), JSTOR (the digital scholarly journal archive), MediaSpan (provider of software and online services for the media industries), Truven Health Analytics, and ProQuest, which includes UMI. Ann Arbor Terminals manufactured a video-display terminal called the Ann Arbor Ambassador during the 1980s. Barracuda Networks, which provides networking, security, and storage products based on network appliances and cloud services, opened an engineering office in Ann Arbor in 2008 on Depot St. and currently occupies the building previously used as the Borders headquarters on Maynard Street. Duo Security, a cloud-based access security provider protecting thousands of organizations worldwide through two-factor authentication, is headquartered in Ann Arbor. It was formerly a unicorn and continues to be headquartered in Ann Arbor after its acquisition by Cisco Systems. In November 2021, semiconductor test equipment company KLA Corporation opened a new North American headquarters in Ann Arbor.
Websites and online media companies in or near the city include All Media Guide, the Weather Underground, and Zattoo. Ann Arbor is the home to Internet2 and the Merit Network, a not-for-profit research and education computer network. Both are located in the South State Commons 2 building on South State Street, which once housed the Michigan Information Technology Center Foundation. The city is also home to a secondary office of Google's AdWords program—the company's primary revenue stream. The recent surge in companies operating in Ann Arbor has led to a decrease in its office and flex space vacancy rates. As of December 31, 2012, the total market vacancy rate for office and flex space is 11.80%, a 1.40% decrease in vacancy from one year previous, and the lowest overall vacancy level since 2003. The office vacancy rate decreased to 10.65% in 2012 from 12.08% in 2011, while the flex vacancy rate decreased slightly more, with a drop from 16.50% to 15.02%.
As of 2022, Ann Arbor is home to more than twenty video game and XR studios of varying sizes. The city plays host to a regional chapter of the International Game Developers Association (IGDA) which hosts monthly meetups, presentations, and educational events.
Pfizer, once the city's second-largest employer, operated a large pharmaceutical research facility on the northeast side of Ann Arbor. On January 22, 2007, Pfizer announced it would close operations in Ann Arbor by the end of 2008. The facility was previously operated by Warner-Lambert and, before that, Parke-Davis. In December 2008, the University of Michigan Board of Regents approved the purchase of the facilities, and the university anticipates hiring 2,000 researchers and staff during the next 10 years. It is now known as North Campus Research Complex. The city is the home of other research and engineering centers, including those of Lotus Engineering, General Dynamics and the National Oceanic and Atmospheric Administration (NOAA). Other research centers sited in the city are the United States Environmental Protection Agency's National Vehicle and Fuel Emissions Laboratory and the Toyota Technical Center. The city is also home to National Sanitation Foundation International (NSF International), the nonprofit non-governmental organization that develops generally accepted standards for a variety of public health related industries and subject areas.
Borders Books, started in Ann Arbor, was opened by brothers Tom and Louis Borders in 1971 with a stock of used books. The Borders chain was based in the city, as was its flagship store until it closed in September 2011. Domino's Pizza's headquarters is near Ann Arbor on Domino's Farms, a Frank Lloyd Wright-inspired complex just northeast of the city. Another Ann Arbor-based company is Zingerman's Delicatessen, which serves sandwiches and has developed businesses under a variety of brand names. Zingerman's has grown into a family of companies which offers a variety of products (bake shop, mail order, creamery, coffee) and services (business education). Flint Ink Corp., another Ann Arbor-based company, was the world's largest privately held ink manufacturer until it was acquired by Stuttgart-based XSYS Print Solutions in October 2005. Avfuel, a global supplier of aviation fuels and services, is also headquartered in Ann Arbor.
The controversial detective and private security firm Pinkerton is headquartered in Ann Arbor, at 101 N. Main St.
Many cooperative enterprises were founded in the city; among those that remain are the People's Food Co-op and the Inter-Cooperative Council at the University of Michigan, a student housing cooperative founded in 1937. There are also three cohousing communities—Sunward, Great Oak, and Touchstone—located immediately to the west of the city limits.
Culture.
Several performing arts groups and facilities are on the University of Michigan's campus, as are museums dedicated to art, archaeology, and natural history and sciences. Founded in 1879, the University Musical Society is an independent performing arts organization that presents over 60 events each year, bringing international artists in music, dance, and theater. Since 2001 Shakespeare in the Arb has presented one play by Shakespeare each June, in a large park near downtown. Regional and local performing arts groups not associated with the university include the Ann Arbor Civic Theatre, the Arbor Opera Theater, the Ann Arbor Symphony Orchestra, the Ann Arbor Ballet Theater, the Ann Arbor Civic Ballet (established in 1954 as Michigan's first chartered ballet company), The Ark, and Performance Network Theatre. Another unique piece of artistic expression in Ann Arbor is the fairy doors. These small portals are examples of installation art and can be found throughout the downtown area.
The Ann Arbor Hands-On Museum is located in a renovated and expanded historic downtown fire station. Multiple art galleries exist in the city, notably in the downtown area and around the University of Michigan campus. Aside from a large restaurant scene in the Main Street, South State Street, and South University Avenue areas, Ann Arbor ranks first among U.S. cities in the number of booksellers and books sold per capita. The Ann Arbor District Library maintains four branch outlets in addition to its main downtown building. The city is also home to the Gerald R. Ford Presidential Library.
Several annual events—many of them centered on performing and visual arts—draw visitors to Ann Arbor. One such event is the Ann Arbor Art Fairs, a set of four concurrent juried fairs held on downtown streets. Scheduled on Thursday through Sunday of the third week of July, the fairs draw upward of half a million visitors. Another is the Ann Arbor Film Festival, held during the third week of March, which receives more than 2,500 submissions annually from more than 40 countries and serves as one of a handful of Academy Award–qualifying festivals in the United States.
Ann Arbor has a long history of openness to marijuana, reflected in the city's decriminalization of cannabis, the large number of medical marijuana dispensaries in the city (one dispensary, called People's Co-op, was directly across the street from Michigan Stadium until zoning forced it to move one mile to the west), the large number of pro-marijuana residents, and the annual Hash Bash, an event held on the first Saturday of April. At least until the passage of Michigan's medical marijuana law, the event had arguably strayed from its initial intent; for years, a number of attendees have received serious legal responses due to marijuana use on University of Michigan property, which does not fall under the city's progressive and compassionate ticketing program.
Ann Arbor is a major center for college sports, most notably at the University of Michigan. Several well-known college sports facilities exist in the city, including Michigan Stadium, the largest American football stadium in the world and the third-largest stadium of any kind in the world. Michigan Stadium has a capacity of 107,601, with the final "extra" seat said to be reserved for and in honor of former athletic director and Hall of Fame football coach Fritz Crisler. The stadium was completed in 1927 and cost more than $950,000 to build. The stadium is colloquially known as "The Big House" due to its status as the largest American football stadium. Crisler Center and Yost Ice Arena play host to the school's basketball (both men's and women's) and ice hockey teams, respectively. Concordia University, a member of the NAIA, also fields sports teams.
Ann Arbor is represented in the NPSL by the semi-pro soccer team AFC Ann Arbor, a club founded in 2014 that calls itself The Mighty Oak.
A person from Ann Arbor is called an "Ann Arborite", and many long-time residents call themselves "townies". The city itself is often called "A²" ("A-squared"), "A2" ("A two"), "AA", "The Deuce" (mainly by Chicagoans), and "Tree Town". With tongue-in-cheek reference to the city's liberal political leanings, some occasionally refer to Ann Arbor as "The People's Republic of Ann Arbor" or "25 square miles surrounded by reality", the latter phrase being adapted from Wisconsin Governor Lee Dreyfus's description of Madison, Wisconsin. In "A Prairie Home Companion" broadcast from Ann Arbor, Garrison Keillor described Ann Arbor as "a city where people discuss socialism, but only in the fanciest restaurants." Ann Arbor sometimes appears on citation indexes as an author, instead of a location, often with the academic degree "MI", a misunderstanding of the abbreviation for Michigan.
Government and politics.
As the county seat of Washtenaw County, the Washtenaw County Trial Court (22nd Circuit Court) is located in Ann Arbor at the Washtenaw County Courthouse on Main Street. Seven judges serve on the court. The 15th Michigan district court, which serves only the city itself, is located within the Ann Arbor Justice Center, immediately next to city hall. The U.S. District Court for the Eastern District of Michigan and Court of Appeals for the Sixth Circuit are also located in downtown Ann Arbor, at the federal building on Liberty Street.
Government.
Ann Arbor has a council-manager form of government, with 11 voting members: the mayor and 10 city council members. Each of the city's five wards is represented by two council members, with the mayor elected at-large during midterm years. Half of the council members are elected in midterm years, and the other half in presidential election years. The mayor is the presiding officer of the city council and has the power to appoint all council committee members as well as board and commission members, with the approval of the city council. The current mayor of Ann Arbor is Christopher Taylor, a Democrat who was elected as mayor in 2014. Day-to-day city operations are managed by a city administrator chosen by the city council.
Politics.
Progressive politics have been particularly strong in municipal government since the 1960s. Voters approved charter amendments that have lessened the penalties for possession of marijuana (1974), and that aim to protect access to abortion in the city should it ever become illegal in the State of Michigan (1990). In 1974, Kathy Kozachenko's victory in an Ann Arbor city-council race made her the country's first openly homosexual candidate to win public office. In 1975, Ann Arbor became the first U.S. city to use instant-runoff voting for a mayoral race. Adopted through a ballot initiative sponsored by the local Human Rights Party, which feared a splintering of the liberal vote, the process was repealed in 1976 after use in only one election. As of April 2021, Democrats hold the mayorship and all ten council seats.
Anti-abortion protesters were outnumbered ten-to-one by abortion-rights counterprotesters in 2017. In 2019, the Diag hosted a Stop the Bans rally. In 2022, in the shadow of the Dobbs decision, the Diag once again became a rallying point for abortion-rights protests, drawing thousands of protesters, including US Rep. Debbie Dingell, Senator Debbie Stabenow, and Michigan Lt. Gov. Garlin Gilchrist.
Local politics.
Ann Arbor has two major political factions. In 2020, after the city council voted 7–4 to fire city administrator Howard Lazarus, several of the council members who voted to fire him lost their elections. In April 2021, the city council voted to strip Jeff Hayner of his committee assignments in response to his use of homophobic and racist slurs, followed in June by a vote to ask him to resign. Hayner's ally on council, Elizabeth Nelson, defended Hayner, saying he "spoke the phonetic sounds without euphemism." Hayner did not run for re-election in 2022, and Nelson lost her primary to Dharma Akmon, in a series of elections that gave the mayor's faction 11–0 control of the city council.
A major source of this local divide is differences in views on the city's growth. In 2018, two council members sued the city over a council decision to sell a city-owned property downtown to a housing developer. Later that year, the city narrowly passed a proposal to keep that space as city-owned property in perpetuity. In 2020, the city council enacted a resolution sponsored by then-council members Anne Bannister and Jeff Hayner to form an advisory body for developing the roof of the parking structure into a city park. By late 2022, this advisory board had sent council a request to direct staff to evaluate the site for use for food truck rallies and other events. In April 2023, city staff responded to this request with a memorandum stating in part that "this site is not well-suited for use as a food truck rally or food truck installation and that it will require significant capital investment to bring the site up to a standard that would be safe, convenient, and attractive as a community event space."
The following city council meeting included public comments deriding the lack of progress from this advisory commission.
Education.
Primary and secondary education.
Public schools are part of the Ann Arbor Public Schools (AAPS) district. AAPS has one of the country's leading music programs. In September 2008, 16,539 students were enrolled in the Ann Arbor Public Schools. Notable schools include Pioneer, Huron, Skyline, and Community high schools, and Ann Arbor Open School. The district has a preschool center with both free and tuition-based programs for preschoolers in the district. University High School, a "demonstration school" with teachers drawn from the University of Michigan's education program, was part of the school system from 1924 to 1968.
Ann Arbor is home to several private schools, including Emerson School, the Father Gabriel Richard High School, Rudolf Steiner School of Ann Arbor, Clonlara School, Michigan Islamic Academy, and Greenhills School, a prep school. The city is also home to several charter schools such as Central Academy (PreK–12) of the Global Educational Excellence (GEE) charter school company, Washtenaw Technical Middle College, and Honey Creek Community School.
Higher education.
The University of Michigan dominates the city of Ann Arbor, providing the city with its distinctive college-town character. University buildings are located in the center of the city and the campus is directly adjacent to the State Street and South University downtown areas.
Other local colleges and universities include Concordia University Ann Arbor, a Lutheran liberal-arts institution, and Cleary University, a private business school. Washtenaw Community College is located in neighboring Ann Arbor Township. In 2000, the Ave Maria School of Law, a Roman Catholic law school established by Domino's Pizza founder Tom Monaghan, opened in northeastern Ann Arbor, but the school moved to Ave Maria, Florida in 2009, and the Thomas M. Cooley Law School acquired the former Ave Maria buildings for use as a branch campus.
Media.
"The Ann Arbor News", owned by the Michigan-based Booth Newspapers chain, was the major newspaper serving Ann Arbor and the rest of Washtenaw County. The newspaper ended its 174-year daily print run in 2009 due to economic difficulties and began producing two printed editions a week under the name AnnArbor.com; it resumed using its former name in 2013. It also produces a daily digital edition named Mlive.com. Another Ann Arbor-based publication that ceased production was the "Ann Arbor Paper", a free monthly. Ann Arbor has been said to be the first significant city to lose its only daily paper. The "Ann Arbor Chronicle", an online newspaper, covered local news, including meetings of the library board, county commission, and DDA, until September 3, 2014.
Current publications in the city include the "Ann Arbor Journal" ("A2 Journal"), a weekly community newspaper; the "Ann Arbor Observer", a free monthly local magazine; and "Current", a free entertainment-focused alt-weekly. The "Ann Arbor Business Review" covers local business in the area. "Car and Driver" magazine and "Automobile Magazine" are also based in Ann Arbor. The University of Michigan is served by many student publications, including the independent "Michigan Daily" student newspaper, which reports on local, state, and regional issues in addition to campus news.
Four major AM radio stations based in or near Ann Arbor are WAAM 1600, a conservative news and talk station; WLBY 1290, a business news and talk station; WDEO 990, Catholic radio; and WTKA 1050, which is primarily a sports station. The city's FM stations include NPR affiliate WUOM 91.7; country station WWWW 102.9; and adult-alternative station WQKL 107.1. Freeform station WCBN-FM 88.3 is a local community radio/college radio station operated by the students of the University of Michigan featuring noncommercial, eclectic music and public-affairs programming. The city is also served by public and commercial radio broadcasters in Ypsilanti, the Lansing/Jackson area, Detroit, Windsor, and Toledo.
Ann Arbor is part of the Detroit television market. WPXD channel 31, the owned-and-operated Detroit outlet of the ION Television network, is licensed to the city. Until its sign-off on August 31, 2017, WHTV channel 18, a MyNetworkTV-affiliated station for the Lansing market, was broadcast from a transmitter in Lyndon Township, west of Ann Arbor. Community Television Network (CTN) is a city-provided cable television channel with production facilities open to city residents and nonprofit organizations. Detroit and Toledo-area radio and television stations also serve Ann Arbor, and stations from Lansing and Windsor, Ontario, can be seen in parts of the area.
Environment and services.
The University of Michigan Medical Center, the only teaching hospital in the city, was ranked the best hospital in the state of Michigan by "U.S. News & World Report" as of 2015. The University of Michigan Health System (UMHS) includes University Hospital, C.S. Mott Children's Hospital and Women's Hospital in its core complex. UMHS also operates out-patient clinics and facilities throughout the city. The area's other major medical centers include a large facility operated by the Department of Veterans Affairs in Ann Arbor, and Saint Joseph Mercy Hospital in nearby Superior Township.
The city provides sewage disposal and water supply services, with water coming from the Huron River and groundwater sources. There are two water-treatment plants, one main and three outlying reservoirs, four pump stations, and two water towers. These facilities serve the city, which is divided into five water districts. The city's water department also operates four dams along the Huron River—Argo, Barton, Geddes, and Superior—of which Barton and Superior provide hydroelectric power. The city also offers waste management services, with Recycle Ann Arbor handling recycling service. Other utilities are provided by private entities. Electrical power and gas are provided by DTE Energy. AT&T Inc. is the primary wired telephone service provider for the area. Cable TV service is primarily provided by Comcast.
A plume of the industrial solvent dioxane is migrating under the city from the contaminated Gelman Sciences, Inc. property on the west side of Ann Arbor. It is currently detected at 0.039 ppb. The Gelman plume is a potential threat to one of the City of Ann Arbor's drinking water sources, the Huron River, which flows through downtown Ann Arbor.
Crime.
In 2015, Ann Arbor was ranked 11th safest among cities in Michigan with a population of over 50,000. It ranked safer than cities such as Royal Oak, Livonia, Canton and Clinton Township. The level of most crimes in Ann Arbor has fallen significantly in the past 20 years. In 1995 there were 294 aggravated assaults, 132 robberies and 43 rapes while in 2015 there were 128 aggravated assaults, 42 robberies and 58 rapes (under the revised definition).
Ann Arbor's crime rate was below the national average in 2000. The violent crime rate was further below the national average than the property crime rate; the two rates were 48% and 11% lower than the U.S. average, respectively.
Transportation.
Ann Arbor is considered one of the most walkable cities in the US, with one-sixth of Ann Arborites walking to work, according to the 2020 census.
Non-motorized transportation.
Ann Arbor has made efforts to reverse the trend of car-dependent development. In 2020, the city introduced a Healthy Streets program to encourage non-motorized transportation.
The Washtenaw County Border-to-Border Trail connects Ann Arbor to Ypsilanti, mostly along the Huron River, for pedestrians, bicycles, and other non-motorized transportation. In 2017, Spin started providing a scooter-share program in Ann Arbor, expanding it to include dockless e-bikes in 2023.
Walkability.
Ann Arbor has a gold designation from the Walk Friendly Communities program. Since 2011, the city's property taxes have included a provision for sidewalk maintenance and expansion: extending the sidewalk network, filling sidewalk gaps, and repairing existing sidewalks. The city has created a sidewalk gap dashboard, which showed 143 miles of sidewalk gaps in May 2022. The downtown was ranked in 2016 as the most walkable neighborhood among mid-sized cities in the Midwest. However, the outlying parts of the city and the township districts between Ann Arbor and Ypsilanti still contain markedly unwalkable areas.
Bicycle.
Between 2019 and 2022, Ann Arbor's Downtown Development Authority built four two-way protected bikeways downtown. Early studies have shown a significant increase in bicycle use downtown since the construction of these bikeways. In 2023, the city reported over 900 bicycle parking spaces downtown, though this is still small compared to the more than 8,000 parking spots for cars.
Public transit.
The Ann Arbor Area Transportation Authority (AAATA), which brands itself as "TheRide", operates public bus services throughout the city and nearby Ypsilanti. The AAATA operates the Blake Transit Center on Fourth Ave. in downtown Ann Arbor and the Ypsilanti Transit Center. A separate zero-fare bus service operates within and between the University of Michigan campuses. Since April 2012, route 98 (the "AirRide") connects to Detroit Metro Airport a dozen times a day. There are also limited-stop bus services between Ann Arbor and Chelsea as well as Canton. These two routes, 91 and 92 respectively, are known as the "ExpressRide".
Intercity buses.
Greyhound Lines provides intercity bus service. The Michigan Flyer, a service operated by Indian Trails, cooperates with AAATA for their AirRide and additionally offers bus service to East Lansing. Megabus has direct service to Chicago, Illinois, while a bus service is provided by Amtrak for rail passengers making connections to services in East Lansing and Toledo, Ohio.
Railroads.
The city was a major rail hub, notably for freight traffic between Toledo and ports north of Chicago, Illinois, from 1878 to 1982; however, the Ann Arbor Railroad also provided passenger service from 1878 to 1950, going northwest to Frankfort and Elberta on Lake Michigan and southeast to Toledo. (In Elberta connections to ferries across the Lake could be made.) The city was served by the Michigan Central Railroad starting in 1837. The Ann Arbor and Ypsilanti Street Railway, Michigan's first interurban, served the city from 1891 to 1929.
Amtrak, which provides service to the city at the Ann Arbor Train Station, operates the "Wolverine" train between Chicago and Pontiac, via Detroit. The present-day train station neighbors the city's old Michigan Central Depot, which was renovated as a restaurant in 1970.
Airports.
Ann Arbor Municipal Airport is a small, city-run general aviation airport located south of I-94. Detroit Metropolitan Airport, the area's large international airport, is about east of the city, in Romulus. Willow Run Airport east of the city near Ypsilanti serves freight, corporate, and general aviation clients.
Roads and highways.
The streets in downtown Ann Arbor conform to a grid pattern, though this pattern is less common in the surrounding areas. Major roads branch out from the downtown district like spokes on a wheel to the highways surrounding the city. The city is belted by three freeways: I-94, which runs along the southern and western portion of the city; U.S. Highway 23 (US 23), which primarily runs along the eastern edge of Ann Arbor; and M-14, which runs along the northern edge of the city. Other nearby highways include US 12 (Michigan Ave.), M-17 (Washtenaw Ave.), and M-153 (Ford Rd.). Several of the major surface arteries lead to the I-94/M-14 interchange in the west, US 23 in the east, and the city's southern areas.
Sister cities.
Ann Arbor has seven sister cities.
Act of Settlement 1701.
The Act of Settlement is an Act of the Parliament of England, passed in 1701, that settled the succession to the English and Irish crowns on Protestants only. More specifically, anyone who became a Roman Catholic, or who married one, was disqualified from inheriting the throne. This had the effect of deposing the remaining descendants of Charles I, other than his Protestant granddaughter Anne, as the next Protestant in line to the throne was Sophia of Hanover. Born into the House of Wittelsbach, she was a granddaughter of James VI and I from his most junior surviving line, with the crowns descending only to her non-Catholic heirs. Sophia died shortly before the death of Queen Anne, and Sophia's son succeeded to the throne as King George I, starting the Hanoverian dynasty in Britain.
The Act of Supremacy 1558 had confirmed the independence of the Church of England from Roman Catholicism under the English monarch. One of the principal factors which contributed to the Glorious Revolution was the perceived assaults made on the Church by King James II, a Roman Catholic, who was deposed in favour of his Protestant daughter Mary II and her husband William III. The need for this Act of Settlement was prompted by the inability of William and Mary, as well as of Mary's Protestant sister (the future Queen Anne), to produce any surviving children, and by the perceived threat posed by the pretensions to the throne by remaining Roman Catholic members of the House of Stuart.
The Act played a key role in the formation of the Kingdom of Great Britain as, though England and Scotland had shared a monarch since 1603, they had remained separately governed countries, with the Act catalysing the Union of England and Scotland. However, the Parliament of Scotland was more reluctant to abandon the House of Stuart, members of which had been Scottish monarchs long before they became English. Moreover, the Act also placed limits on both the role of foreigners in the British government and the power of the monarch with respect to the Parliament of England, though some of those provisions have been altered by subsequent legislation.
Along with the Bill of Rights 1689, the Act of Settlement remains today one of the main constitutional laws governing the succession not only to the throne of the United Kingdom, but to those of the other Commonwealth realms, whether by assumption or by patriation. The Act of Settlement cannot be altered in any realm except by that realm's own parliament and, by convention, only with the consent of all the other realms, as it touches on the succession to the shared crown. On 26 March 2015, following the Perth Agreement, legislation amending the Act came into effect across the Commonwealth realms that removed the disqualification arising from marriage to a Roman Catholic and instituted absolute primogeniture.
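The change instituted by the Perth Agreement can be illustrated with a small sketch comparing male-preference primogeniture (the rule before 2015) with absolute primogeniture. The `succession_order` function and the sample siblings are hypothetical, for illustration only; they are not drawn from the Act's text.

```python
def succession_order(children, absolute=False):
    """Order a set of siblings for succession.

    Male-preference primogeniture places sons (eldest first) ahead of
    daughters; absolute primogeniture, instituted by the Perth Agreement
    legislation in force from 26 March 2015 (for persons born after
    28 October 2011), orders siblings purely by age.
    """
    by_age = sorted(children, key=lambda c: c["birth_order"])
    if absolute:
        return [c["name"] for c in by_age]
    sons = [c["name"] for c in by_age if c["sex"] == "M"]
    daughters = [c["name"] for c in by_age if c["sex"] == "F"]
    return sons + daughters

# Hypothetical siblings: an elder daughter and a younger son.
kids = [
    {"name": "Elder daughter", "sex": "F", "birth_order": 1},
    {"name": "Younger son", "sex": "M", "birth_order": 2},
]

old_rule = succession_order(kids)                 # son displaces elder sister
new_rule = succession_order(kids, absolute=True)  # elder sister comes first
```

Under the old rule the younger son precedes his elder sister; under absolute primogeniture the elder daughter comes first.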
Background.
Following the Glorious Revolution, the line of succession to the English throne was governed by the Bill of Rights 1689, which declared that the flight of James II from England to France during the revolution amounted to an abdication of the throne and that James's daughter Mary II and her husband/cousin, William III (William of Orange, who was also James's nephew), were James's successors. The Bill of Rights also provided that the line of succession would go through Mary's Protestant descendants by William and any possible future husband should she outlive him, then through Mary's sister Anne and her Protestant descendants, and then to the Protestant descendants of William III by a possible later marriage should he outlive Mary. During the debate, the House of Lords had attempted to append Sophia and her descendants to the line of succession, but the amendment failed in the Commons.
Mary II died childless in 1694, after which William III did not remarry. In 1700, Prince William, Duke of Gloucester, who was Anne's only child to survive infancy, died of what may have been smallpox at the age of 11. Thus, Anne was left as the only person in line to the throne. The Bill of Rights excluded Catholics from the throne, which ruled out James II and his children (as well as their descendants) sired after he converted to Catholicism in 1668. However, it did not provide for the further succession after Anne. Parliament thus saw the need to settle the succession on Sophia and her descendants, and thereby guarantee the continuity of the Crown in the Protestant line.
With religion and lineage initially decided, the ascendancy of William of Orange in 1689 also brought with it his partiality to the foreign favourites who followed him to England. By 1701 English jealousy of foreigners was rampant, and action was considered necessary.
The Act.
The Act of Settlement provided that the throne would pass to the Electress Sophia of Hanover – a granddaughter of James VI and I and a niece of King Charles I – and her descendants, but it excluded "for ever" "all and every Person and Persons who ... is are or shall be reconciled to or shall hold Communion with the See or Church of Rome or shall profess the Popish Religion or shall marry a Papist". Thus, those who were Roman Catholics, and those who married Roman Catholics, were barred from ascending the throne.
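The two-part disqualification, and its later partial repeal by the Succession to the Crown Act 2013, can be sketched as a simple rule check. The `Claimant` type and `eligible` function are hypothetical, for illustration only; the Act itself defines no such procedure.

```python
from dataclasses import dataclass

@dataclass
class Claimant:
    name: str
    is_catholic: bool = False
    married_catholic: bool = False

def eligible(person, post_2015=False):
    """Apply the Act's religious disqualifications.

    Under the 1701 Act, being a Roman Catholic or marrying one both
    disqualify a claimant. The Succession to the Crown Act 2013 (in
    force 26 March 2015) removed the marriage disqualification only;
    being Catholic oneself still disqualifies.
    """
    if person.is_catholic:
        return False
    if person.married_catholic and not post_2015:
        return False
    return True

# Hypothetical line of succession, in order of seniority.
line = [
    Claimant("A", is_catholic=True),
    Claimant("B", married_catholic=True),
    Claimant("C"),
]

pre = next(p.name for p in line if eligible(p))                    # 1701 rule
post = next(p.name for p in line if eligible(p, post_2015=True))   # 2015 rule
```

Under the original rule the first eligible claimant is C; once the marriage disqualification is lifted, B (married to a Catholic but not Catholic) becomes eligible, while A remains barred.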
Conditional provisions.
The Act contained eight additional provisions that were to only come into effect upon the death of both William and Anne:
Firstly, the monarch "shall join in communion with the Church of England". This was intended to ensure the exclusion of a Roman Catholic monarch. Along with James II's perceived despotism, his religion was the main cause of the Glorious Revolution, and of the previous linked religious and succession problems which had been resolved by the joint monarchy of William III and Mary II.
Second, if a person not native to England comes to the throne, England will not wage war for "any dominions or territories which do not belong to the Crown of England, without the consent of Parliament". This would become relevant when a member of the House of Hanover ascended the British throne, as he would retain the territories of the Electorate of Hanover in what is now Lower Saxony (Germany), then part of the Holy Roman Empire. This provision has been dormant since Queen Victoria ascended the throne, because she did not inherit Hanover under the Salic Laws of the German-speaking states.
Third, no monarch may leave "the dominions of England, Scotland, or Ireland" without the consent of Parliament. This provision was repealed in 1716 at the request of George I, who was also the Elector of Hanover and Duke of Brunswick-Lüneburg within the Holy Roman Empire; because of this, and also for personal reasons, he wished to visit Hanover from time to time.
Fourth, all government matters within the jurisdiction of the Privy Council were to be transacted there, and all council resolutions were to be signed by those who advised and consented to them. This was because Parliament wanted to know who was deciding policies, as sometimes councillors' signatures normally attached to resolutions were absent. This provision was repealed early in Queen Anne's reign, as many councillors ceased to offer advice and some stopped attending meetings altogether.
Fifth, no foreigner ("no Person born out of the Kingdoms of England Scotland or Ireland or the Dominions thereunto belonging"), even if naturalised or made a denizen (unless born of English parents), can be a Privy Councillor or a member of either House of Parliament, or hold "any Office or Place of Trust, either Civill or Military, or to [sic] have any Grant of Lands, Tenements or Hereditaments from the Crown, to himself or to any other or others in Trust for him". Subsequent nationality laws (today primarily the British Nationality Act 1981) made naturalised citizens the equal of those native born, and excluded Commonwealth and Irish citizens from the definition of foreigners, but otherwise this provision still applies. It has however been disapplied in particular cases by a number of other statutes.
Sixth, no person who has an office under the monarch, or receives a pension from the Crown, was to be a Member of Parliament. This provision was inserted to avoid unwelcome royal influence over the House of Commons. It remains in force, but with several exceptions; ministers of the Crown were exempted early on before Anne's death in order to continue some degree of royal patronage, but had to stand for a by-election to re-enter the House upon such appointment until 1926. As a side effect, this provision means that members of the Commons seeking to resign from parliament can get around the prohibition on resignation by obtaining a sinecure in the control of the Crown; while several offices have historically been used for this purpose, two are currently in use: appointments generally alternate between the stewardships of the Chiltern Hundreds and of the Manor of Northstead.
Seventh, judges' commissions are valid "quamdiu se bene gesserint" (during good behaviour), and if judges do not behave themselves they can be removed only by both Houses of Parliament (or, in other Commonwealth realms, by the one House of Parliament, depending on the legislature's structure). This provision was the result of various monarchs influencing judges' decisions, and its purpose was to assure judicial independence. This form of commission had been used prior to 1701, but it did not prevent Charles I from removing Sir John Walter as Chief Baron of the Exchequer.
Eighth, that "no Pardon under the Great Seal of England be pleadable to an Impeachment by the Commons in Parliament". This meant in effect that no pardon by the monarch was to save someone from being impeached by the House of Commons.
Opposition.
The Tory administration that replaced the Whig Junto in 1699 took responsibility for steering the Act through Parliament. As a result, it passed with little opposition, although five peers voted against it in the House of Lords, including the Earl of Huntingdon, his brother-in-law the Earl of Scarsdale and three others. While many shared their opposition to a 'foreign' king, the general feeling was summed up as 'better a German prince than a French one.'
Legacy.
For different reasons, various constitutionalists have praised the Act of Settlement: Henry Hallam called the Act "the seal of our constitutional laws" and David Lindsay Keir placed its importance above the Bill of Rights of 1689. Naamani Tarkow wrote: "If one is to make sweeping statements, one may say that, save Magna Carta (more truly, its implications), the Act of Settlement is probably the most significant statute in English history".
Union of Scotland with England and Wales.
The Act of Settlement was, in many ways, the major cause of the union of Scotland with England and Wales to form the Kingdom of Great Britain. The Parliament of Scotland was not happy with the Act of Settlement and, in response, passed the Act of Security in 1704, through which Scotland reserved the right to choose its own successor to Queen Anne. Stemming from this, the Parliament of England decided that, to ensure the stability and future prosperity of Great Britain, full union of the two parliaments and nations was essential before Anne's death.
It used a combination of exclusionary legislation (the Alien Act 1705), politics, and bribery to achieve this within three years under the Act of Union 1707. This success was in marked contrast to the four attempts at political union between 1606 and 1689, which all failed owing to a lack of political will in both kingdoms. By virtue of Article II of the Treaty of Union, which defined the succession to the throne of Great Britain, the Act of Settlement became part of Scots law as well.
Succession to the Crown.
In addition to excluding James II, who died a few months after the Act received royal assent, and his Roman Catholic children, Prince James ("The Old Pretender") and the Princess Royal, the Act also excluded the descendants of Princess Henrietta, the youngest sister of James II. Henrietta's daughter was Anne, Queen of Sardinia, a Roman Catholic, from whom descend all Jacobite pretenders after 1807.
With the legitimate descendants of Charles I either childless (in the case of his two grand-daughters the late Queen Mary II and her successor Queen Anne) or Roman Catholic, Parliament's choice was limited to Sophia of Hanover, the Protestant daughter of the late Elizabeth of Bohemia, the only other child of King James I to have survived childhood. Elizabeth had borne nine children who reached adulthood, of whom Sophia was the youngest daughter. However, in 1701 Sophia was the most senior of them who was Protestant, and therefore had a legitimate claim to the English throne; Parliament passed over her Roman Catholic siblings, namely her sister Louise Hollandine of the Palatinate, and their descendants, who included Elizabeth Charlotte, Duchess of Orléans; Louis Otto, Prince of Salm, and his aunts; Anne Henriette, Princess of Condé, and Benedicta Henrietta, Duchess of Brunswick-Lüneburg.
Removal from the succession due to Catholicism.
Since the Act's passing the most senior living member of the royal family to have married a Roman Catholic, and thereby to have been removed from the line of succession, is Prince Michael of Kent, who married Baroness Marie-Christine von Reibnitz in 1978; he was fifteenth in the line of succession at the time. He was restored to the line of succession in 2015 when the Succession to the Crown Act 2013 came into force, and became 34th in line.
The next most senior living descendant of the Electress Sophia who had been ineligible to succeed on this ground is George Windsor, Earl of St Andrews, the elder son of Prince Edward, Duke of Kent, who married the Roman Catholic Sylvana Palma Tomaselli in 1988. His son, Lord Downpatrick, converted to Roman Catholicism in 2003 and is the most senior descendant of Sophia to be barred as a result of his religion. In 2008 his daughter, Lady Marina Windsor, also converted to Catholicism and was removed from the line of succession. More recently, Peter Phillips, the son of Anne, Princess Royal, and eleventh in line to the throne, married Autumn Kelly; Kelly had been brought up as a Roman Catholic, but she converted to Anglicanism prior to the wedding. Had she not done so, Phillips would have forfeited his place in the succession upon their marriage, only to have it restored in 2015.
Excluding those princesses who have married into Roman Catholic royal families, such as Marie of Edinburgh, Victoria Eugenie of Battenberg and Princess Beatrice of Edinburgh, one member of the Royal Family (that is, with the style of "Royal Highness") has converted to Roman Catholicism since the passage of the Act: the Duchess of Kent, wife of Prince Edward, Duke of Kent, who converted on 14 January 1994, but her husband did not lose his place in the succession because she was an Anglican at the time of their marriage.
Present status.
As well as being part of the law of the United Kingdom, the Act of Settlement was received into the laws of all the countries and territories over which the British monarch reigned. It remains part of the laws of the 15 Commonwealth realms and the relevant jurisdictions within those realms. In accordance with established convention, the Statute of Westminster 1931 and later laws, the Act of Settlement (along with the other laws governing the succession of the Commonwealth realms) may only be changed with the agreement of all the realms (and, in some federal realms, the constituent members of those federations). The Succession to the Crown Act 2013 changed many provisions of this Act.
Amendment proposals.
Challenges have been made against the Act of Settlement, especially its provisions regarding Roman Catholics and preference for males. However, changing the Act is a complex process, since the Act governs the shared succession of all the Commonwealth realms. The Statute of Westminster 1931 acknowledges by established convention that any changes to the rules of succession may be made only with the agreement of all of the states involved, with concurrent amendments to be made by each state's parliament or parliaments. Further, as the current monarch's eldest child and, in turn, his eldest child, are Anglican males, any change to the succession laws would have no immediate implications. Consequently, there was little public concern with the issues and debate had been confined largely to academic circles until the November 2010 announcement that Prince William was to marry. This raised the question of what would happen if he were to produce first a daughter and then a son.
"The Times" reported on 6 November 1995 that Prince Charles had said on that day to Tony Blair and Paddy Ashdown that "Catholics should be able to ascend to the British throne". Ashdown claimed the Prince said: "I really can't think why we can't have Catholics on the throne". In 1998, during debate on a Succession to the Crown Bill, Junior Home Office Minister Lord Williams of Mostyn informed the House of Lords that the Queen had "no objection to the Government's view that in determining the line of succession to the throne, daughters and sons should be treated in the same way."
Australia.
In October 2011 the Australian federal government was reported to have reached an agreement with all of the states on potential changes to their laws in the wake of amendments to the Act of Settlement. The practice of the Australian states—for example, New South Wales and Victoria—has been, when legislating to repeal some imperial statutes so far as they still applied in Australia, to provide that imperial statutes concerning the royal succession remain in force.
The legal process required at the federal level remains, theoretically, unclear. The Australian constitution, as was noted during the crisis of 1936, contains no power for the federal parliament to legislate with respect to the monarchy. Everything thus turns upon the status and meaning of clause 2 in the Commonwealth of Australia Constitution Act 1900, which provides: "The provisions of this Act referring to the Queen shall extend to Her Majesty's heirs and successors in the sovereignty of the United Kingdom."
Anne Twomey reviews three possible interpretations of the clause.
However, Twomey expresses confidence that, if the High Court of Australia were to be faced with the problems of covering clause 2, it would find some way to conclude that, with regard to Australia, the clause is subject solely to Australian law. Canadian scholar Richard Toporoski theorised in 1998 that "if, let us say, an alteration were to be made in the United Kingdom to the Act of Settlement 1701, providing for the succession of the Crown... [i]t is my opinion that the domestic constitutional law of Australia or Papua New Guinea, for example, would provide for the succession in those countries of the same person who became Sovereign of the United Kingdom."
In practice, when legislating for the Perth Agreement (see below), the Australian governments took the approach of the states requesting, and referring power to, the federal parliament (under paragraph 51(xxxviii) of the Australian Constitution) to enact the legislation on behalf of both the states and the Commonwealth of Australia.
Canada.
In Canada, where the Act of Settlement is now a part of Canadian constitutional law, Tony O'Donohue, a Canadian civic politician, took issue with the provisions that exclude Roman Catholics from the throne and that make the monarch of Canada the Supreme Governor of the Church of England, requiring him or her to be an Anglican. This, he claimed, discriminated against non-Anglicans, including Catholics, who are the largest faith group in Canada. In 2002, O'Donohue launched a court action arguing that the Act of Settlement violated the "Canadian Charter of Rights and Freedoms", but the case was dismissed by the court. It found that, as the Act of Settlement is part of the Canadian constitution, the Charter of Rights and Freedoms, as another part of the same constitution, does not have supremacy over it. The court also noted that, while Canada has the power to amend the line of succession to the Canadian throne, the Statute of Westminster stipulates that the agreement of the governments of the fifteen other Commonwealth realms that share the Crown would first have to be sought if Canada wished to continue its relationship with these countries. An appeal of the decision was dismissed on 16 March 2005. Some commentators state that, as a result, any single provincial legislature could hinder any attempt to change this Act and, by extension, the line of succession for the shared crown of all 16 Commonwealth realms. Others contend that this is not the case and that changes to the succession could be instituted by an Act of the Parliament of Canada "[in accord] with the convention of symmetry that preserves the personal unity of the British and Dominion Crowns".
With the announcement in 2007 of the engagement of Peter Phillips to Autumn Kelly, a Roman Catholic and a Canadian, discussion about the Act of Settlement was revived. Norman Spector called in "The Globe and Mail" for Prime Minister Stephen Harper to address the issue of the Act's bar on Catholics, saying Phillips' marriage to Kelly would be the first time the provisions of the Act would bear directly on Canada—Phillips would be barred from acceding to the Canadian throne because he married a Roman Catholic Canadian. (In fact, Lord St Andrews had already lost his place in the line of succession when he married the Roman Catholic Canadian Sylvana Palma Tomaselli in 1988. But St Andrews' place in the line of succession was significantly lower than Phillips'.) Criticism of the Act of Settlement due to the Phillips–Kelly marriage was muted when Autumn Kelly converted to Anglicanism shortly before her marriage, thus preserving her husband's place in the line of succession.
United Kingdom.
From time to time there has been debate over repealing the clause that prevents Roman Catholics, or those who marry one, from ascending to the British throne. Proponents of repeal argue that the clause is a bigoted anachronism; Cardinal Winning, who was leader of the Roman Catholic Church in Scotland, called the act an "insult" to Catholics. Cardinal Murphy-O'Connor, the leader of the Roman Catholic Church in England, pointed out that Prince William (later the Duke of Cambridge) "can marry by law a Hindu, a Buddhist, anyone, but not a Roman Catholic." Opponents of repeal, such as Enoch Powell and Adrian Hilton, believe that it would lead to the disestablishment of the Church of England as the state religion if a Roman Catholic were to come to the throne. They also note that the monarch must swear to defend the faith and be a member of the Anglican Communion, but that a Roman Catholic monarch would, like all Roman Catholics, owe allegiance to the Pope. This would, according to opponents of repeal, amount to a loss of sovereignty for the Anglican Church.
When in December 1978 there was media speculation that Prince Charles might marry a Roman Catholic, Powell defended the provision that excludes Roman Catholics from ascending the throne, stating his objection was not rooted in religious bigotry but in political considerations. He said a Roman Catholic monarch would mean the acceptance of a source of authority external to the realm and "in the literal sense, foreign to the Crown-in-Parliament ... Between Roman Catholicism and royal supremacy there is, as Saint Thomas More concluded, no reconciliation." Powell concluded that a Roman Catholic crown would be the destruction of the Church of England because "it would contradict the essential character of that church."
He continued:
When Thomas Hobbes wrote that "the Papacy is no other than the ghost of the deceased Roman Empire sitting crowned upon the grave thereof", he was promulgating an enormously important truth. Authority in the Roman Church is the exertion of that "imperium" from which England in the 16th century finally and decisively declared its national independence as the "alter imperium", the "other empire", of which Henry VIII declared "This realm of England is an empire" ... It would signal the beginning of the end of the British monarchy. It would portend the eventual surrender of everything that has made us, and keeps us still, a nation.
The Scottish Parliament unanimously passed a motion in 1999 calling for the complete removal of any discrimination linked to the monarchy and the repeal of the Act of Settlement. The following year, "The Guardian" challenged the succession law in court, claiming that it violated the European Convention on Human Rights, which provides,
The enjoyment of the rights and freedoms set forth in this Convention shall be secured without discrimination on any ground such as sex, race, colour, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth, or other status.
As the Convention nowhere lists the right to succeed to the Crown as a human right, the challenge was rejected.
Adrian Hilton, writing in "The Spectator" in 2003, defended the Act of Settlement as not "irrational prejudice or blind bigotry", but claimed that it was passed because "the nation had learnt that when a Roman Catholic monarch is upon the throne, religious and civil liberty is lost." Pointing to the Pope's claim of universal jurisdiction, Hilton argued that "it would be intolerable to have, as the sovereign of a Protestant and free country, one who owes any allegiance to the head of any other state" and contended that, if such a situation came about, "we will have undone centuries of common law." He said that because the Roman Catholic Church does not recognise the Church of England as an apostolic church, a Roman Catholic monarch who abided by their faith's doctrine would be obliged to view Anglican and Church of Scotland archbishops, bishops, and clergy as part of the laity and therefore "lacking the ordained authority to preach and celebrate the sacraments." (Hilton noted that the Church of Scotland's Presbyterian polity includes no bishops or archbishops.) Hilton said a Roman Catholic monarch would be unable to be crowned by the Archbishop of Canterbury, and he noted that other European states have similar religious provisions for their monarchs: Denmark, Norway, and Sweden, whose constitutions compel their monarchs to be Lutherans; the Netherlands, whose constitution requires its monarchs to be members of the Protestant House of Orange; and Belgium, whose constitution provides for the succession through Roman Catholic houses.
In December 2004, a private member's bill, the Succession to the Crown Bill, was introduced in the House of Lords. The government, headed by Tony Blair, blocked all attempts to revise the succession laws, claiming that doing so would raise too many constitutional issues and was unnecessary at the time. In the British general election the following year, Michael Howard promised to work towards having the prohibition removed if the Conservative Party gained a majority of seats in the House of Commons, but the election was won by Blair's Labour Party. Four years later, plans drawn up by Chris Bryant were revealed that would end the exclusion of Catholics from the throne and end the doctrine of male-preference primogeniture in favour of absolute primogeniture, which governs succession solely by birth order and not by sex. The issue was raised again in January 2009, when a private member's bill to amend the succession laws was introduced in parliament.
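The practical difference between the two succession rules can be illustrated with a short sketch. This is an illustrative model only, under the assumption of a simple family tree; the names and tree are hypothetical, not the actual royal line:

```python
def succession(person, male_preference):
    """Return the line of succession below `person`, depth-first:
    each child is followed immediately by that child's own descendants."""
    children = person["children"]
    if male_preference:
        # Sons (in birth order) come before daughters (in birth order).
        children = ([c for c in children if c["sex"] == "M"]
                    + [c for c in children if c["sex"] == "F"])
    order = []
    for child in children:
        order.append(child["name"])
        order.extend(succession(child, male_preference))
    return order

# Hypothetical monarch with an elder daughter (who has a child) and a younger son.
monarch = {
    "name": "Monarch", "sex": "M",
    "children": [
        {"name": "Daughter", "sex": "F",
         "children": [{"name": "Grandchild", "sex": "M", "children": []}]},
        {"name": "Son", "sex": "M", "children": []},
    ],
}

print(succession(monarch, male_preference=True))   # ['Son', 'Daughter', 'Grandchild']
print(succession(monarch, male_preference=False))  # ['Daughter', 'Grandchild', 'Son']
```

Under male preference the younger son displaces his elder sister (and her line); under the absolute rule only birth order matters, so the daughter and her descendants come first.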
Across the realms.
In early 2011 Keith Vaz, a Labour Member of Parliament, introduced to the House of Commons at Westminster a private member's bill which proposed that the Act of Settlement be amended to remove the provisions relating to Roman Catholicism and to change the primogeniture governing the line of succession to the British throne from male-preference to absolute. Vaz sought support for his project from the Canadian Cabinet and Prime Minister Stephen Harper, but the Office of the Prime Minister of Canada responded that the issue was "not a priority for the government or for Canadians", without elaborating on the merits or drawbacks of the proposed reforms. Stephenson King, Prime Minister of Saint Lucia, said he supported the idea, and it was reported that the government of New Zealand did as well. The Monarchist League of Canada told the media at the time that it "supports amending the Act of Settlement in order to modernize the succession rules."
Later the same year, the Deputy Prime Minister of the United Kingdom, Nick Clegg, announced that the government was considering a change in the law. At approximately the same time, it was reported that British Prime Minister David Cameron had written to each of the prime ministers of the other fifteen Commonwealth realms, asking for their support in changing the succession to absolute primogeniture and notifying them he would raise his proposals at that year's Commonwealth Heads of Government Meeting (CHOGM) in Perth, Australia. Cameron reportedly also proposed removing the restriction on successors being or marrying Roman Catholics; however, potential Roman Catholic successors would be required to convert to Anglicanism prior to acceding to the throne. In reaction to the letter and media coverage, Harper stated that, this time, he was "supportive" of what he saw as "reasonable modernizations".
At the 2011 Commonwealth Heads of Government Meeting on 28 October 2011, the prime ministers of the other Commonwealth realms agreed to support Cameron's proposed changes to the Act. The bill put before the Parliament of the United Kingdom would act as a model for the legislation required to be passed in at least some of the other realms, and any changes would only first take effect if the Duke of Cambridge were to have a daughter before a son.
The British group Republic asserted that succession reform would not make the monarchy any less discriminatory. While it welcomed the gender-equality reforms, the British newspaper "The Guardian" criticized the lack of a proposal to remove the ban on Catholics sitting on the throne, as did Alex Salmond, First Minister of Scotland, who pointed out: "It is deeply disappointing that the reform [of the Act of Settlement of 1701] has stopped short of removing the unjustifiable barrier on a Catholic becoming monarch." On the subject, Cameron asserted: "Let me be clear, the monarch must be in communion with the Church of England because he or she is the head of that Church."
The disqualification arising from marriage to a Roman Catholic was removed by the Succession to the Crown Act 2013.
Aircraft hijacking.
Aircraft hijacking (also known as airplane hijacking, skyjacking, plane hijacking, plane jacking, air robbery, air piracy, or aircraft piracy, with the last term used within the special aircraft jurisdiction of the United States) is the unlawful seizure of an aircraft by an individual or a group. Dating from the earliest hijackings, most cases have involved the pilot being forced to fly according to the hijacker's demands. There have also been incidents where hijackers have overpowered the flight crew, made unauthorized entry into the cockpit and flown the aircraft into buildings, most notably in the September 11 attacks, and in several cases planes have been hijacked by the official pilot or co-pilot; e.g., Ethiopian Airlines Flight 702.
Unlike carjacking or sea piracy, an aircraft hijacking is not usually committed for robbery or theft. Hijackers driven by personal gain often divert planes to destinations where they themselves are not planning to go. Some hijackers intend to use passengers or crew as hostages, either for monetary ransom or for some political or administrative concession by authorities. Various motives have driven such occurrences, such as demanding the release of certain high-profile individuals or the right of political asylum (notably Flight ET 961), but a hijacking may also be motivated by a failed private life or financial distress, as in the case of Aarno Lamminparras in the Oulu aircraft hijacking. Hijackings involving hostages have produced violent confrontations between hijackers and the authorities during negotiation and settlement. In the cases of Lufthansa Flight 181 and Air France Flight 139, the hijackers were not satisfied and showed no inclination to surrender, resulting in attempts by special forces to rescue the passengers.
In most jurisdictions of the world, aircraft hijacking is punishable by life imprisonment or a long prison sentence. In most jurisdictions where the death penalty is a legal punishment, aircraft hijacking is a capital crime, including in China, India, Liberia and the U.S. states of Georgia and Mississippi.
History.
Airplane hijackings have occurred since the early days of flight. They can be classified into the following eras: 1929–1957, 1958–1979, 1980–2000 and 2001–present. Early incidents involved light planes, but hijackers later targeted passenger aircraft as commercial aviation became widespread.
1929–1957.
Between 1929 and 1957, there were fewer than 20 incidents of reported hijackings worldwide; several occurred in Eastern Europe.
One of the first unconfirmed hijackings occurred in December 1929. J. Howard "Doc" DeCelles was flying a postal route for a Mexican firm, Transportes Aeras Transcontinentales, ferrying mail from San Luis Potosí to Torreón and then on to Guadalajara. Saturnino Cedillo, the governor of the state of San Luis Potosí, ordered him to divert. Several other men were also involved and, through an interpreter, DeCelles had no choice but to comply. He was allegedly held captive for several hours under armed guard before being released.
The first recorded aircraft hijack took place on February 21, 1931, in Arequipa, Peru. Byron Richards, flying a Ford Tri-Motor, was approached on the ground by armed revolutionaries. He refused to fly them anywhere during a 10-day standoff. Richards was informed that the revolution was successful and he could be freed in return for flying one of the men to Lima.
The following year, in September 1932, a Sikorsky S-38 with registration P-BDAD, still bearing the titles of Nyrba do Brasil was seized in the company's hangar by three men, who took a fourth as a hostage. Despite having no flying experience, they managed to take off. However, the aircraft crashed in São João de Meriti, killing the four men. Apparently, the hijack was related to the events of the Constitutionalist Revolution in São Paulo; it is considered to be the first hijack that took place in Brazil.
On October 28, 1939, the first murder on a plane took place in Brookfield, Missouri, US. The victim was Carl Bivens, a flight instructor, who was teaching a man named Earnest P. "Larry" Pletch. While airborne in a Taylor Cub monoplane, Pletch shot Bivens twice in the back of the head. Pletch later told prosecutors, "Carl was telling me I had a natural ability and I should follow that line", adding, "I had a revolver in my pocket and without saying a word to him, I took it out of my overalls and I fired a bullet into the back of his head. He never knew what struck him." The "Chicago Daily Tribune" stated it was one of the most spectacular crimes of the 20th century. Pletch pleaded guilty and was sentenced to life in prison. However, he was released on March 1, 1957, after serving 17 years, and lived until June 2001.
In 1942 near Malta, two New Zealanders, a South African and an Englishman achieved the first confirmed in-air hijack when they overpowered their captors aboard an Italian seaplane that was flying them to a prisoner-of-war camp. As they approached an Allied base, they were strafed by Supermarine Spitfires whose pilots were unaware of the aircraft's true occupants, and were forced to land on the water. All on board survived and were picked up by a British boat.
Philip Baum, an aviation security expert, suggests that in the years following World War II the development of a rebellious youth, "piggybacking on to any cause which challenged the status quo or acted in support of those deemed oppressed", may have contributed to attacks against aviation. The first hijacking of a commercial flight occurred on the Cathay Pacific "Miss Macao" on July 16, 1948. After this incident and others in the 1950s, airlines recommended that flight crews comply with hijackers' demands rather than risk a violent confrontation. There were also various hijacking incidents and assaults on planes in China and the Middle East.
On 23 July 1956, in the Hungarian People's Republic, seven passengers hijacked a domestic flight of Malév Hungarian Airlines, a Lisunov Li-2 (registration HA-LIG), to escape from behind the Iron Curtain, and flew it to West Germany. The aircraft landed safely at Ingolstadt Air Base without injuries.
The first hijacking of a flight for political reasons happened in Bolivia, affecting the airline Lloyd Aereo Boliviano on September 26, 1956. The DC-4 was carrying 47 prisoners who were being transported from Santa Cruz, Bolivia, to El Alto, in La Paz, where a political group was waiting to take them to a concentration camp at Carahuara de Carangas, Oruro. The prisoners overpowered the crew, gained control of the aircraft while airborne and diverted it to Tartagal, Argentina, and then on to Salta, Argentina, as the airfield in Tartagal was not big enough. Upon landing, they told the authorities of the injustice they had been subjected to and received political asylum.
On October 22, 1956, French forces hijacked a Moroccan airplane carrying leaders of the Algerian National Liberation Front (FLN) during the ongoing Algerian War. The plane, which was carrying Ahmed Ben Bella, Hocine Aït Ahmed and Mohamed Boudiaf, was due to fly from Palma de Mallorca to Tunis, where the FLN leaders were to confer with Prime Minister Habib Bourguiba, but French forces redirected the flight to occupied Algiers, where the FLN leaders were arrested.
1958–1979.
Between 1958 and 1967, there were approximately 40 hijackings worldwide. Beginning in 1958, hijackings from Cuba to other destinations started to occur; in 1961, hijackings from other destinations to Cuba became prevalent. The first happened on May 1, 1961, on a flight from Miami to Key West. The perpetrator, armed with a knife and gun, forced the captain to land in Cuba.
Australia was relatively untouched by the threat of hijackings until July 19, 1960. On that evening, a 22-year-old Russian man attempted to divert Trans Australia Airlines Flight 408 to Darwin or Singapore. The crew were able to subdue the man after a brief struggle.
According to the FAA, in the 1960s there were 100 attempted hijackings involving U.S. aircraft: 77 successful and 23 unsuccessful. Recognizing the danger early, the FAA issued a directive on July 28, 1961, prohibiting unauthorized persons from carrying concealed firearms and from interfering with crew members' duties. The Federal Aviation Act of 1958 was amended to impose severe penalties on those seizing control of a commercial aircraft. Airlines could also refuse to transport passengers who were likely to cause danger. That same year, the FAA and the Department of Justice created the Peace Officers Program, which put trained marshals on flights. A few years later, on May 7, 1964, the FAA adopted a rule requiring that cockpit doors on commercial aircraft be kept locked at all times.
In a five-year period (1968–1972) the world experienced 326 hijack attempts, or one every 5.6 days. The incidents were frequent and often just an inconvenience, which resulted in television shows creating parodies. "Time" magazine even ran a lighthearted comedy piece called "What to Do When the Hijacker Comes". Most incidents occurred in the United States. There were two distinct types: hijackings for transportation elsewhere and hijackings for extortion with the threat of harm.
Between 1968 and 1972, there were 90 recorded transport attempts to Cuba. In contrast, there were 26 extortion attempts (see table on the right). The longest and first transcontinental (Los Angeles, Denver, New York, Bangor, Shannon and Rome) hijacking from the US started on 31 October 1969.
The Eastern Air Lines Shuttle flight 1320 on May 17, 1970, witnessed the first fatality in the course of a U.S. hijacking.
Incidents also became problematic outside of the U.S. For instance, in 1968, El Al Flight 426 was seized by Popular Front for the Liberation of Palestine (PFLP) militants on 23 July, an incident which lasted 40 days, making it one of the longest. This record was later beaten in 1999.
As a result of the evolving threat, President Nixon issued a directive in 1970 to promote security at airports, electronic surveillance and multilateral agreements for tackling the problem.
The International Civil Aviation Organization (ICAO) issued a report on aircraft hijacking in July 1970. Beginning in 1969 until the end of June 1970, there were 118 incidents of unlawful seizure of aircraft and 14 incidents of sabotage and armed attacks against civil aviation. This involved airlines of 47 countries and more than 7,000 passengers. In this period, 96 people were killed and 57 were injured as a result of hijacking, sabotage and armed attacks.
The ICAO stated that this was not a problem isolated to one nation or one region, but a worldwide threat to the safe growth of international civil aviation. Incidents also became notorious: in 1971, a man known as D. B. Cooper hijacked a plane and extorted US$200,000 in ransom before parachuting over Oregon. He was never identified.
On August 20, 1971, a Pakistan Air Force T-33 military plane was hijacked in Karachi prior to the Indo-Pakistani war of 1971. Lieutenant Matiur Rahman attacked Officer Rashid Minhas and attempted to divert the plane to India. Minhas deliberately crashed the plane into the ground near Thatta to prevent the diversion.
Countries around the world continued their efforts to tackle crimes committed on-board planes. The Tokyo Convention, drafted in 1958, established an agreement between signatories that the "state in which the aircraft is registered is competent to exercise jurisdiction over crimes committed on board that aircraft while it is in flight". While the Convention does not make hijacking an international crime, it does contain provisions which obligate the country in which a hijacked aircraft lands to restore the aircraft to its responsible owner, and allow the passengers and crew to continue their journey. The Convention came into force in December 1969.
A year later, in December 1970, the Hague Convention was drafted, which punishes hijackers by enabling each state to prosecute a hijacker if that state does not extradite them, and by depriving hijackers of asylum from prosecution.
On December 5, 1972, the FAA issued emergency rules requiring all passengers and their carry-on baggage to be screened. Airports slowly implemented walk-through metal detectors, hand-searches and X-ray machines, to prohibit weapons and explosive devices. These rules came into effect on January 5, 1973, and were welcomed by most of the public. In 1974, Congress enacted a statute which provided for the death penalty for acts of aircraft piracy resulting in death. Between 1968 and 1977, there were approximately 41 hijackings per year.
In the 1970s, in pursuit of their demands for Croatia's independence from the Socialist Republic of Yugoslavia, Croatian nationalists hijacked several civilian airliners, such as Scandinavian Airlines System Flight 130 and TWA Flight 355.
1980–2000.
By 1980, airport screening and greater cooperation from the international community led to fewer successful hijackings; the number of events had significantly dropped below the 1968 level. Between 1978 and 1988, there were roughly 26 incidents of hijackings a year. A new threat emerged in the 1980s: organised terrorists destroying aircraft to draw attention. For instance, terrorist groups were responsible for the bombing of Air India Flight 182 over the Irish coast. In 1988, Pan Am Flight 103 was bombed flying over Scotland. Terrorist activity which included hijack attempts in the Middle East were also a cause of concern.
During the 1990s, there was relative peace in the United States airspace as the threat of domestic hijacking was seen as a distant memory. Globally, however, hijackings still persisted. Between 1993 and 2003, the highest number of hijackings occurred in 1993 (see table below). This number can be attributed to events in China where hijackers were trying to gain political asylum in Taiwan. Europe and the rest of East Asia were not immune either. On December 26, 1994, Air France Flight 8969 with 172 passengers and crew was hijacked after leaving Algiers. Authorities believed that the goal was to crash the plane into the Eiffel Tower. On June 21, 1995, All Nippon Airways Flight 857 was hijacked by a man claiming to be a member of the Aum Shinrikyo religious cult, demanding the release of its imprisoned leader Shoko Asahara. The incident was resolved when the police stormed the plane.
On October 17, 1996, the first hijacking to be brought to an end while the aircraft was airborne was carried out by four operatives of the Austrian special law enforcement unit Cobra, on a Russian Aeroflot flight from Malta to Lagos, Nigeria, aboard a Tupolev Tu-154. The operatives, who were escorting inmates detained for deportation to their homelands, were equipped with weapons and gloves. On 12 April 1999, six ELN members hijacked a Fokker 50 operating Avianca Flight 9463 from Bucaramanga to Bogotá. Many hostages were held for more than a year, and the last was finally freed 19 months after the hijacking.
2001–present.
On September 11, 2001, four airliners were hijacked by 19 Al-Qaeda extremists: American Airlines Flight 11, United Airlines Flight 175, American Airlines Flight 77 and United Airlines Flight 93. The first two planes were deliberately crashed into the Twin Towers of the World Trade Center in New York City and the third was crashed into The Pentagon in Arlington County, Virginia. The fourth crashed in a field in Stonycreek Township near Shanksville, Pennsylvania after crew and passengers attempted to overpower the hijackers. Authorities believe that the intended target was the U.S. Capitol or the White House in Washington DC. In total, 2,996 people perished and more than 6,000 were injured in the attacks, making the hijackings the deadliest in modern history.
Following the attacks, the U.S. government formed the Transportation Security Administration (TSA) to handle airport screening at U.S. airports. Government agencies around the world tightened their airport security, procedures and intelligence gathering. Until the September 11 attacks, there had never been an incident in which a passenger aircraft was used as a weapon of mass destruction. The 9/11 Commission report stated that it had always been assumed that a "hijacking would take the traditional form"; airline crews therefore had no contingency plan for a suicide hijacking, a point summarized by the airline pilot Patrick Smith.
Throughout the mid-2000s, hijackings still occurred but there were much fewer incidents and casualties. The number of incidents had been declining, even before the September 11 attacks. One notable incident in 2006 was the hijacking of Turkish Airlines Flight 1476, flying from Tirana to Istanbul, which was seized by a man named Hakan Ekinci. The aircraft, with 107 passengers and 6 crew, made distress calls to air traffic control and the plane was escorted by military aircraft before landing safely at Brindisi, Italy. In 2007, several incidents occurred in the Middle East and Northern Africa; hijackers in one of these incidents claimed to be affiliated with Al-Qaeda. Towards the end of the decade, AeroMexico experienced its first terror incident when Flight 576 was hijacked by a man demanding to speak with President Calderón.
Since 2010, the Aviation Safety Network estimates there have been 15 hijackings worldwide with three fatalities. This is a considerably lower figure than in previous decades, which can be attributed to greater security enhancements and awareness of September 11–style attacks. On June 29, 2012, an attempt was made to hijack Tianjin Airlines Flight GS7554 from Hotan to Ürümqi in China. More recent was the 2016 hijacking of EgyptAir Flight MS181, in which an Egyptian man claimed to have a bomb and ordered the plane to land in Cyprus. He surrendered several hours later, after freeing the passengers and crew.
Countermeasures.
As a result of the large number of U.S.–Cuba hijackings from the late 1960s to the early 1970s, international airports introduced screening technology such as metal detectors, X-ray machines and explosive detection tools. In the U.S., these rules were enforced from January 1973 and were eventually copied around the world. These security measures made hijacking a "higher-risk proposition" and deterred criminals in later decades. Until September 2001, the FAA set and enforced a "layered" system of defense: hijacking intelligence, passenger pre-screening, checkpoint screening and on-board security. The idea was that if one layer failed, another would be able to stop a hijacker from boarding a plane. However, the 9/11 Commission found that this layered approach was flawed and unsuitable for preventing the September 11 attacks. The U.S. Transportation Security Administration has since strengthened the approach, with a greater emphasis on intelligence sharing.
On-board security.
In the history of hijackings, most incidents involved planes being forced to land at a certain destination with demands. As a result, commercial airliners adopted a "total compliance" rule which taught pilots and cabin crew to comply with the hijackers' demands. Crews advise passengers to sit quietly to increase their chances of survival. The ultimate goal is to land the plane safely and let the security forces handle the situation. The FAA suggested that the longer a hijacking persisted, the more likely it would end peacefully with the hijackers reaching their goal. Although total compliance is still relevant, the events of September 11 changed this paradigm as this technique cannot prevent a murder-suicide hijacking.
After the September 11 attacks, it became evident that each hijacking situation needs to be evaluated on a case-by-case basis. Cabin crew, now aware of the severe consequences, have a greater responsibility for maintaining control of their aircraft. Most airlines also give crew members training in self-defense tactics. Since the 1970s, crew have been taught to be vigilant for suspicious behaviour, such as passengers who have no carry-on luggage or who stand next to the cockpit door making fidgety movements. There have been various incidents in which crew and passengers intervened to prevent attacks: on December 22, 2001, Richard Reid attempted to ignite explosives on American Airlines Flight 63. In 2009, on Northwest Flight 253, Umar Farouk Abdulmutallab attempted to detonate explosives sewn into his underwear. In 2012, the attempted hijacking of Tianjin Airlines Flight 7554 was stopped when cabin crew placed a trolley in front of the cockpit door and asked passengers for help.
American Airlines Flight 11.
In the September 11 attacks, crew on one of the hijacked planes went beyond their scope of training by informing the airline ground crew about the events on board. In separate phone calls, Amy Sweeney and Betty Ong provided information on seat numbers of the attackers and passenger injuries. This helped authorities identify them.
Cockpit security.
As early as 1964, the FAA required that cockpit doors on commercial aircraft be kept locked during flight. In 2002, the U.S. Congress passed the Arming Pilots Against Terrorism Act, allowing pilots at U.S. airlines to carry guns in the cockpit. Since 2003, these pilots have been known as Federal Flight Deck Officers. It is estimated that one in 10 of the 125,000 commercial pilots is trained and armed. Also in 2002, aircraft manufacturers such as Airbus introduced a reinforced cockpit door which is resistant to gunfire and forced entry. Shortly afterwards, the FAA required operators of more than 6,000 aircraft to install tougher cockpit doors by April 9, 2003. Rules were also tightened to restrict cockpit access and make it easier for pilots to lock the doors. In 2015, Germanwings Flight 9525 was seized by the co-pilot and deliberately crashed while the captain was out of the cockpit; the captain was unable to re-enter because the airline had already reinforced the cockpit door. The European Aviation Safety Agency issued a recommendation for airlines to ensure that at least two people, one pilot and a member of cabin crew, occupy the cockpit during flight. The FAA in the United States enforces a similar rule.
Air marshal service.
Some countries operate a marshal service, which puts members of law enforcement on high-risk flights based on intelligence. Their role is to keep passengers safe, by preventing hijackings and other criminal acts committed on a plane. Federal marshals in the U.S. are required to identify themselves before boarding a plane; marshals of other countries often are not. According to the Congressional Research Service, the budget for the U.S. Federal Air Marshal Service was US$719 million in 2007. Marshals often sit as regular passengers, at the front of the plane to allow observation of the cockpit. Despite the expansion of the marshal service, they cannot be on every plane, and they rarely face a real threat on a flight. Critics have questioned the need for them.
Air traffic control.
There is no generic procedure or set of rules for handling a hijacking situation. Air traffic controllers are expected to exercise their best judgement and expertise when dealing with the apparent consequences of an unlawful interference or hijack. Depending on the jurisdiction, the controller will inform authorities, such as the military, who will escort the hijacked plane. Controllers are expected to keep communications to a minimum and clear the runway for a possible landing.
Legislation for downing hijacked aircraft.
Germany.
In January 2005, a federal law came into force in Germany, the Luftsicherheitsgesetz (Aviation Security Act), which allows "direct action by armed force" against a hijacked aircraft to prevent a September 11–style attack. However, in February 2006 the Federal Constitutional Court struck down these provisions of the law, stating such preventive measures were unconstitutional and would essentially be state-sponsored murder, even if such an act would save many more lives on the ground. The main reason behind this decision was that the state would effectively be killing innocent hostages in order to avoid a terrorist attack. The Court also ruled that the Minister of Defense is constitutionally not entitled to act in terrorism matters, as this is the duty of the state and federal police forces. President of Germany Horst Köhler had urged judicial review of the constitutionality of the Luftsicherheitsgesetz after he signed it into law in 2005.
India.
India published its new anti-hijacking policy in August 2005. The policy came into force after approval from the Cabinet Committee on Security (CCS). The main points of the policy are:
United States.
Prior to the September 11 attacks, countermeasures were focused on "traditional" hijackings. As such, there were no specific rules for handling suicide hijackings, where aircraft would be used as a weapon. Moreover, military response at the time consisted of multiple uncoordinated units, each with its own set of rules of engagement with no unified command structure. Soon after the attacks, however, new rules of engagement were introduced, authorizing the North American Aerospace Defense Command (NORAD), the Air Force command tasked with protecting U.S. airspace, to shoot down hijacked commercial airliners if the plane is deemed a threat to strategic targets. In 2003, the military stated that fighter pilots exercise this scenario several times a week.
Other countries.
Poland and Russia are among other countries that have had laws or directives for shooting down hijacked planes. However, in September 2008 the Polish Constitutional Court ruled that the Polish rules were unconstitutional, and voided them.
International law.
Tokyo Convention.
The Convention on Offences and Certain Other Acts Committed on Board Aircraft, known as the Tokyo Convention, is an international treaty which entered into force on December 4, 1969. It has been ratified by 186 parties. Article 11 of the Tokyo Convention states the following:
The signatories agree that if there is an unlawful takeover of an aircraft, or a threat of one, on their territory, then they will take all necessary measures to regain or keep control over the aircraft. The captain may also disembark a suspected person on the territory of any country where the aircraft lands, and that country must agree to it, as stated in Articles 8 and 12 of the convention.
Hague Convention.
The Convention for the Suppression of Unlawful Seizure of Aircraft (known as the Hague Convention) went into effect on October 14, 1971. The convention has 185 signatories.
Montreal Convention.
The Montreal Convention is a multilateral treaty adopted by a diplomatic meeting of ICAO member states in 1999. It amended important provisions of the Warsaw Convention's regime concerning compensation for the victims of air disasters.
|
2076 | Acropolis of Athens | The Acropolis of Athens is an ancient citadel located on a rocky outcrop above the city of Athens, Greece, and contains the remains of several ancient buildings of great architectural and historical significance, the most famous being the Parthenon. The word "acropolis" is from the Greek words ("akron", "highest point, extremity") and ("polis", "city"). The term acropolis is generic, and there are many other acropoleis in Greece. During ancient times the Acropolis of Athens was also known more properly as Cecropia, after the legendary serpent-man Cecrops, the supposed first Athenian king.
While there is evidence that the hill was inhabited as early as the fourth millennium BC, it was Pericles (–429 BC) in the fifth century BC who coordinated the construction of the buildings whose present remains are the site's most important ones, including the Parthenon, the Propylaea, the Erechtheion and the Temple of Athena Nike. The Parthenon and the other buildings were seriously damaged during the 1687 siege by the Venetians during the Morean War when gunpowder being stored by the then Turkish rulers in the Parthenon was hit by a Venetian bombardment and exploded.
History.
Early settlement.
The Acropolis is located on a flattish-topped rock that rises above sea level in the city of Athens, with a surface area of about . While the earliest artifacts date to the Middle Neolithic era, there have been documented habitations in Attica from the Early Neolithic period (6th millennium BC).
There is little doubt that a Mycenaean megaron palace stood upon the hill during the late Bronze Age. Nothing of this megaron survives except, probably, a single limestone column base and pieces of several sandstone steps. Soon after the palace was constructed, a massive Cyclopean circuit wall was built, 760 meters long, up to 10 meters high, and ranging from 3.5 to 6 meters thick. From the end of the Helladic IIIB period (1300–1200 BC) on, this wall would serve as the main defense for the acropolis until the 5th century. The wall consisted of two parapets built with large stone blocks and cemented with an earth mortar called "emplekton" (Greek: ἔμπλεκτον). The wall followed typical Mycenaean conventions in that it traced the natural contour of the terrain, and its gate, which was towards the south, was arranged obliquely, with a parapet and tower overhanging the incomers' right-hand side, thus facilitating defense. There were two lesser approaches up the hill on its north side, consisting of steep, narrow flights of steps cut into the rock. Homer is assumed to refer to this fortification when he mentions the "strong-built house of Erechtheus" ("Odyssey" 7.81). At some time before the 13th century BC, an earthquake caused a fissure near the northeastern edge of the Acropolis. This fissure extended some 35 meters to a bed of soft marl in which a well was dug. An elaborate set of stairs was built, and the well served as an invaluable, protected source of drinking water during times of siege for some portion of the Mycenaean period.
Archaic Acropolis.
Not much is known about the architectural appearance of the Acropolis until the Archaic era. During the 7th and 6th centuries BC, the site was controlled by Kylon during the failed Kylonian revolt, and twice by Peisistratos; each of these was an attempt to seize political power by "coup d'état". Apart from the Hekatompedon mentioned later, Peisistratos also built an entry gate or propylaea. Nevertheless, it seems that a nine-gate wall, the Enneapylon, had been built around the acropolis hill and incorporated the biggest water spring, the Clepsydra, at the northwestern foot.
A temple to Athena Polias, the tutelary deity of the city, was erected between 570 and 550 BC. This Doric limestone building, from which many relics survive, is referred to as the Hekatompedon (Greek for "hundred–footed"), Ur-Parthenon (German for "original Parthenon" or "primitive Parthenon"), H–Architecture or Bluebeard temple, after the pedimental three-bodied man-serpent sculpture, whose beards were painted dark blue. Whether this temple replaced an older one or just a sacred precinct or altar is not known. Probably, the Hekatompedon was built where the Parthenon now stands.
Between 529 and 520 BC yet another temple was built by the Pisistratids, the Old Temple of Athena, usually referred to as the Arkhaios Neōs (ἀρχαῖος νεώς, "ancient temple"). This temple of Athena Polias was built upon the Dörpfeld foundations, between the Erechtheion and the still-standing Parthenon. The Arkhaios Neōs was destroyed as part of the Achaemenid destruction of Athens during the Second Persian invasion of Greece during 480–479 BC; however, the temple was probably reconstructed during 454 BC, since the treasury of the Delian League was transferred in its opisthodomos. The temple may have been burnt down during 406/405 BC as Xenophon mentions that the old temple of Athena was set afire. Pausanias does not mention it in his 2nd century AD "Description of Greece".
Around 500 BC the Hekatompedon was dismantled to make place for a new, grander building, the Older Parthenon (often referred to as the Pre-Parthenon or Early Parthenon). For this reason, the Athenians decided to stop the construction of the Olympieion temple, which was associated with the tyrant Peisistratos and his sons, and instead used the Piraeus limestone destined for the Olympieion to build the Older Parthenon. To accommodate the new temple, the south part of the summit was cleared, made level by adding some 8,000 two-ton blocks of limestone, a foundation deep at some points, and the rest was filled with soil kept in place by the retaining wall. However, after the victorious Battle of Marathon in 490 BC, the plan was revised and marble was used instead. The limestone phase of the building is referred to as Pre-Parthenon I and the marble phase as Pre-Parthenon II. In 485 BC, construction stalled to save resources as Xerxes became king of Persia and war seemed imminent.
The Older Parthenon was still under construction when the Persians invaded and sacked the city in 480 BC. The building was burned and looted, along with the Ancient Temple and practically everything else on the rock. After the Persian crisis had subsided, the Athenians incorporated many architectural parts of the unfinished temple (unfluted column drums, triglyphs, metopes, etc.) into the newly built northern curtain wall of the Acropolis, where they served as a prominent "war memorial" and can still be seen today. The devastated site was cleared of debris. Statuary, cult objects, religious offerings, and unsalvageable architectural members were buried ceremoniously in several deeply dug pits on the hill, serving conveniently as a fill for the artificial plateau created around the Classical Parthenon. This "Persian debris" was the richest archaeological deposit excavated on the Acropolis by 1890.
The Periclean building program.
After winning at Eurymedon during 468 BC, Cimon and Themistocles ordered the reconstruction of the southern and northern walls of the Acropolis. Most of the major temples, including the Parthenon, were rebuilt by order of Pericles during the so-called Golden Age of Athens (460–430 BC). Phidias, an Athenian sculptor, and Ictinus and Callicrates, two famous architects, were responsible for the reconstruction.
During 437 BC, Mnesicles started building the Propylaea, a monumental gate at the western end of the Acropolis with Doric columns of Pentelic marble, built partly upon the old Propylaea of Peisistratos. These colonnades were almost finished during 432 BC and had two wings, the northern one decorated with paintings by Polygnotus. About the same time, south of the Propylaea, building started on the small Ionic Temple of Athena Nike in Pentelic marble with tetrastyle porches, preserving the essentials of Greek temple design. After an interruption caused by the Peloponnesian War, the temple was finished during the time of Nicias' peace, between 421 BC and 409 BC.
Construction of the elegant temple of Erechtheion in Pentelic marble (421–406 BC) followed a complex plan which took account of the extremely uneven ground and the need to circumvent several shrines in the area. The entrance, facing east, is lined with six Ionic columns. Unusually, the temple has two porches, one on the northwest corner borne by Ionic columns, the other, to the southwest, supported by huge female figures or caryatids. The eastern part of the temple was dedicated to Athena Polias, while the western part, serving the cult of the archaic king Poseidon-Erechtheus, housed the altars of Hephaestus and Voutos, brother of Erechtheus. Little is known about the original plan of the interior, which was destroyed by fire during the first century BC and has been rebuilt several times.
During the same period, a combination of sacred precincts including the temples of Athena Polias, Poseidon, Erechtheus, Cecrops, Herse, Pandrosos and Aglauros, with its Kore Porch (Porch of the Maidens) or Caryatids' Balcony was begun. Between the temple of Athena Nike and the Parthenon, there was the Sanctuary of Artemis Brauronia (or the Brauroneion), the goddess represented as a bear and worshipped in the deme of Brauron. According to Pausanias, a wooden statue or "xoanon" of the goddess and a statue of Artemis made by Praxiteles during the 4th century BC were both in the sanctuary.
Behind the Propylaea, Phidias' gigantic bronze statue of Athena Promachos ("Athena who fights in the front line"), built between 450 BC and 448 BC, dominated. The base was high, while the total height of the statue was . The goddess held a lance, the gilt tip of which could be seen as a reflection by crews on ships rounding Cape Sounion, and a giant shield on the left side, decorated by Mys with images of the fight between the Centaurs and the Lapiths. Other monuments that have left almost nothing visible to the present day are the Chalkotheke, the Pandroseion, Pandion's sanctuary, Athena's altar, Zeus Polieus's sanctuary and, from Roman times, the circular Temple of Roma and Augustus.
Hellenistic and Roman Period.
During the Hellenistic and Roman periods, many of the existing buildings in the area of the Acropolis were repaired to remedy damage from age and occasionally war. Monuments to foreign kings were erected, notably those of the Attalid kings of Pergamon: Attalos II (in front of the NW corner of the Parthenon) and Eumenes II (in front of the Propylaea). These were rededicated during the early Roman Empire to Augustus or Claudius (uncertain) and Agrippa, respectively. Eumenes was also responsible for constructing a stoa on the south slope, similar to that of Attalos in the agora below.
During the Julio-Claudian period, the Temple of Roma and Augustus, a small, round edifice about 23 meters from the Parthenon, was to be the last significant ancient construction on the summit of the rock. Around the same time, on the north slope, in a cave next to the one dedicated to Pan since the Classical period, a sanctuary was founded where the archons made dedications to Apollo upon assuming office. During 161 AD, on the south slope, the Roman Herodes Atticus built his grand amphitheater or odeon. It was destroyed by the invading Herulians a century later but was reconstructed during the 1950s.
During the 3rd century, under threat from a Herulian invasion, repairs were made to the Acropolis walls, and the Beulé Gate was constructed to restrict entrance in front of the Propylaea, thus returning the Acropolis to use as a fortress.
Byzantine, Latin, and Ottoman Period.
During the Byzantine period, the Parthenon was used as a church dedicated to the Virgin Mary. During the Latin Duchy of Athens, the Acropolis functioned as the city's administrative center, with the Parthenon as its cathedral, and the Propylaea as part of the ducal palace. A large tower was added, the Frankopyrgos, demolished during the 19th century.
After the Ottoman conquest of Greece, the Parthenon was used as the garrison headquarters of the Turkish army, and the Erechtheum was turned into the governor's private harem. The buildings of the Acropolis suffered significant damage during the 1687 siege by the Venetians in the Morean War. The Parthenon, which was being used as a gunpowder magazine, was hit by artillery shot and damaged severely.
During subsequent years, the Acropolis was a site of bustling human activity with many Byzantine, Frankish, and Ottoman structures. The dominant feature during the Ottoman period was a mosque inside the Parthenon, complete with a minaret.
The Acropolis was besieged thrice during the Greek War of Independence — two sieges from the Greeks in 1821–1822 and one from the Ottomans in 1826–1827. A new bulwark named after Odysseas Androutsos was built by the Greeks between 1822 and 1825 to protect the recently rediscovered Klepsydra spring, which became the sole fresh water supply of the fortress.
After independence, most features that dated from the Byzantine, Frankish, and Ottoman periods were cleared from the site in an attempt to restore the monument to its original form, "cleansed" of all later additions.
German Neoclassicist architect Leo von Klenze was responsible for the restoration of the Acropolis in the 19th century, according to German historian Wolf Seidl, as described in his book "Bavarians in Greece".
Second World War.
At the beginning of the Axis occupation of Greece in 1941, German soldiers raised the Nazi German War Flag over the Acropolis. It would be taken down by Manolis Glezos and Apostolos Santas in one of the first acts of resistance. In 1944 Greek Prime Minister Georgios Papandreou arrived on the Acropolis to celebrate liberation from the Nazis.
Archaeological remains.
The entrance to the Acropolis was a monumental gateway termed the Propylaea. To the south of the entrance is the tiny Temple of Athena Nike. At the centre of the Acropolis is the Parthenon or Temple of Athena Parthenos (Athena the Virgin). East of the entrance and north of the Parthenon is the temple known as the Erechtheum. South of the platform that forms the top of the Acropolis there are also the remains of the ancient, though often remodelled, Theatre of Dionysus. A few hundred metres away, there is the now partially reconstructed Odeon of Herodes Atticus.
All the valuable ancient artifacts are situated in the Acropolis Museum, which resides on the southern slope of the same rock, 280 metres from the Parthenon.
Site plan.
Site plan of the Acropolis at Athens showing the major archaeological remains.
The Acropolis Restoration Project.
The Acropolis Restoration Project began in 1975 to reverse the decay of centuries of attrition, pollution, destruction from military actions, and misguided past restorations. The project included the collection and identification of all stone fragments, even small ones, from the Acropolis and its slopes, and the attempt was made to restore as much as possible using reassembled original material (anastylosis), with new marble from Mount Pentelicus used sparingly. All restoration was made using titanium dowels and is designed to be completely reversible, in case future experts decide to change things. A combination of cutting-edge modern technology and extensive research and reinvention of ancient techniques were used.
The Parthenon colonnades, largely destroyed by Venetian bombardment during the 17th century, were restored, with many wrongly assembled columns now properly placed. The roof and floor of the Propylaea were partly restored, with sections of the roof made of new marble and decorated with blue and gold inserts, as in the original. Restoration of the Temple of Athena Nike was completed in 2010.
A total of 2,675 tons of architectural members were restored, with 686 stones reassembled from fragments of the originals, 905 patched with new marble, and 186 parts made entirely of new marble. A total of 530 cubic meters of new Pentelic marble were used.
In 2021, the addition of new reinforced concrete paths to the site to improve accessibility caused controversy among archaeologists.
Cultural significance.
Every four years, the Athenians had a festival called the Great Panathenaea that rivaled the Olympic Games in popularity. During the festival, a procession (believed to be depicted on the Parthenon frieze) traveled through the city via the Panathenaic Way and culminated on the Acropolis. There, a new robe of woven wool ("peplos") was placed on either the statue of Athena Polias in the Erechtheum (during the annual Lesser Panathenaea) or the statue of Athena Parthenos in the Parthenon (during the Great Panathenaea, held every four years).
Within the later tradition of Western civilization and Classical revival, the Acropolis, from at least the mid-18th century on, has often been invoked as a critical symbol of the Greek legacy and of the glories of Classical Greece.
Most of the artifacts from the temple are housed today in the Acropolis Museum at the foot of the ancient rock.
Geology.
The Acropolis is a klippe consisting of two lithostratigraphic units: the Athens schist and the overlying Acropolis limestone. The Athens schist is a soft reddish rock dating from the late Cretaceous period. The original sediments were deposited in a river delta approximately 72 million years ago. The Acropolis limestone dates from the late Jurassic period, predating the underlying Athens schist by about 30 million years. The Acropolis limestone was thrust over the Athens schist by compressional tectonic forces, forming a nappe or overthrust sheet. Erosion of the limestone nappe led to the eventual detachment of the Acropolis, forming the present-day feature. Where the Athens schist and the limestone meet there are springs and karstic caves.
Many of the hills in the Athens region were formed by the erosion of the same nappe as the Acropolis. These include the hills of Lykabettos, Areopagus, and Mouseion.
The marble used for the buildings of the Acropolis was sourced from the quarries of Mount Pentelicus, a mountain to the northeast of the city.
Geological instability.
The limestone that the Acropolis is built upon is unstable because of the erosion and tectonic shifts to which the region is prone. This instability can cause rock slides that damage the historic site. Various protective measures have been implemented, including retaining walls, drainage systems, and rock bolts, to counter the natural processes that threaten the site.
References.
Notes
Bibliography
External links.
Videos
|
2077 | Adam Weishaupt | Johann Adam Weishaupt (6 February 1748 – 18 November 1830) was a German philosopher, professor of civil law and later canon law, and founder of the Illuminati.
Early life.
Adam Weishaupt was born on 6 February 1748 in Ingolstadt in the Electorate of Bavaria. Weishaupt's father Johann Georg Weishaupt (1717–1753) died when Adam was five years old. After his father's death he came under the tutelage of his godfather Johann Adam Freiherr von Ickstatt who, like his father, was a professor of law at the University of Ingolstadt. Ickstatt was a proponent of the philosophy of Christian Wolff and of the Enlightenment, and he influenced the young Weishaupt with his rationalism. Weishaupt began his formal education at age seven at a Jesuit school. He later enrolled at the University of Ingolstadt and graduated in 1768 at age 20 with a doctorate of law. In 1772 he became a professor of law after conversion to Protestantism. The following year he married Afra Sausenhofer of Eichstätt.
After Pope Clement XIV's suppression of the Society of Jesus in 1773, Weishaupt became a professor of canon law, a position that was held exclusively by the Jesuits until that time. In 1775 Weishaupt was introduced to the empirical philosophy of Johann Georg Heinrich Feder of the University of Göttingen. Both Feder and Weishaupt would later become opponents of Kantian idealism.
Foundation of the Illuminati.
On 1 May 1776 Johann Adam Weishaupt founded the "Illuminati" in the Electorate of Bavaria. Initially, the designation was reserved for a group of outstanding and enlightened individuals in society. Indeed, the word was adapted from the Latin root "illuminatus", which directly translates to "enlightened". He also adopted the name "Brother Spartacus" within the order. Even encyclopedia references vary on the goal of the order: the "Catholic Encyclopedia" (1910) says the Order was not egalitarian or democratic internally but sought to promote the doctrines of equality and freedom throughout society, while others such as "Collier's" have said the aim was to combat religion and foster rationalism in its place. The Illuminati was formed with the vision of liberating humans from religious bondage and undermining corrupted governments.
The actual character of the society was an elaborate network of spies and counter-spies. Each isolated cell of initiates reported to a superior, whom they did not know: a party structure that was effectively adopted by some later groups.
Weishaupt was initiated into the Masonic lodge "Theodor zum guten Rath", at Munich in 1777. His project of "illumination, enlightening the understanding by the sun of reason, which will dispel the clouds of superstition and of prejudice" was an unwelcome reform. He used Freemasonry to recruit for his own quasi-masonic society, with the goal of "perfecting human nature" through re-education to achieve a communal state with nature, freed of government and organized religion. Presenting their own system as pure masonry, Weishaupt and Adolph Freiherr Knigge, who organised his ritual structure, greatly expanded the secret organisation.
Contrary to Immanuel Kant's definition of Enlightenment (and Weishaupt's Order was in some respects an expression of the Enlightenment movement) as the passage by man out of his 'self-imposed immaturity' through daring to 'make use of his own reason, without the guidance of another', Weishaupt's Order of Illuminati prescribed in great detail everything which the members had obediently to read and think. Dr. Wolfgang Riedel has therefore commented that this approach to illumination or enlightenment constituted a degradation and twisting of the Kantian principle of Enlightenment. Riedel writes:
Weishaupt's radical rationalism and vocabulary were not likely to succeed. Writings that were intercepted in 1784 were interpreted as seditious, and the Society was banned by the government of Karl Theodor, Elector of Bavaria, in 1784. Weishaupt lost his position at the University of Ingolstadt and fled Bavaria.
Activities in exile.
He received the assistance of Duke Ernest II of Saxe-Gotha-Altenburg (1745–1804), and lived in Gotha writing a series of works on illuminism, including "A Complete History of the Persecutions of the Illuminati in Bavaria" (1785), "A Picture of Illuminism" (1786), "An Apology for the Illuminati" (1786), and "An Improved System of Illuminism" (1787). Adam Weishaupt died in Gotha on 18 November 1830. He was survived by his second wife, Anna Maria (née Sausenhofer), and his children Nanette, Charlotte, Ernst, Karl, Eduard, and Alfred. His body was buried next to that of his son Wilhelm, who preceded him in death (in 1802), at Friedhof II der Sophiengemeinde Berlin, a Protestant cemetery.
After Weishaupt's Order of Illuminati was banned and its members dispersed, it left behind no enduring traces of influence, not even on its own erstwhile members, who went on to develop in quite different directions.
Assessment of character and intentions.
Weishaupt's character and intentions have been variously assessed. Some took a negative view, such as Augustin Barruel, who despite writing that Weishaupt's goals were that "Equality and Liberty, together with the most absolute independence, are to be the substitutes for all rights and all property" saw this as more dangerous than beneficial, and John Robison, who regarded Weishaupt as a 'human devil' and saw his mission as one of malevolent destructiveness. Others took a more positive view, including Thomas Jefferson, who wrote in a letter to James Madison that "Barruel’s own parts of the book are perfectly the ravings of a Bedlamite" and considered Weishaupt to be an "enthusiastic Philanthropist" who believed in the indefinite perfectibility of man, and believed that the intention of Jesus Christ was simply to "reinstate natural religion, and by diffusing the light of his morality, to teach us to govern ourselves".
In his defence, Weishaupt wrote a "Kurze Rechtfertigung meiner Absichten" (A Brief Justification of my Intentions) in 1787. Author Tony Page comments:
Works.
Works relating to the Illuminati.
2078 | Acorn Electron | The Acorn Electron (nicknamed the Elk inside Acorn and beyond) was a lower-cost alternative to the BBC Micro educational/home computer, also developed by Acorn Computers Ltd, to provide many of the features of that more expensive machine at a price more competitive with that of the ZX Spectrum. It had 32 kilobytes of RAM, and its ROM included BBC BASIC II together with the operating system. Announced in 1982 for a possible release the same year, it was eventually introduced on 25 August 1983 priced at £199.
The Electron was able to save and load programs onto audio cassette via a supplied cable that connected it to any standard tape recorder that had the correct sockets. It was capable of bitmapped graphics, and could use either a television set, a colour (RGB) monitor or a monochrome monitor as its display. Several expansions were made available to provide many of the capabilities omitted from the BBC Micro. Acorn introduced a general-purpose expansion unit, the Plus 1, offering analogue joystick and parallel ports, together with cartridge slots into which ROM cartridges, providing software, or other kinds of hardware expansions, such as disc interfaces, could be inserted. Acorn also produced a dedicated disc expansion, the Plus 3, featuring a disc controller and 3.5-inch floppy drive.
For a short period, the Electron was reportedly the best selling micro in the United Kingdom, with an estimated 200,000 to 250,000 machines sold over its entire commercial lifespan. With production effectively discontinued by Acorn as early as 1985, and with the machine offered in bundles with games and expansions, later being substantially discounted by retailers, a revival in demand for the Electron supported a market for software and expansions without Acorn's involvement, with its market for games also helping to sustain the continued viability of games production for the BBC Micro.
History.
After Acorn Computers released the BBC Micro, executives believed that the company needed a less-expensive computer for the mass market. In May 1982, when asked about the recently announced Sinclair ZX Spectrum's potential to hurt sales of the BBC Micro, priced at £125 for the 16K model compared to around twice that price for the 16K BBC Model A, Acorn co-founder Hermann Hauser responded that in the third quarter of that year Acorn would release a new £120–150 computer which "will probably be called the Electron", a form of "miniaturised BBC Micro", having 32 KB of RAM and 32 KB of ROM, with "higher resolution graphics than those offered by the Spectrum".
Acorn co-founder Chris Curry also emphasised the Electron's role as being "designed to compete with the Spectrum... to get the starting price very low, but not preclude expansion in the long term." In order to reduce component costs, and to prevent cloning, the company reduced the number of chips in the Electron from the 102 on the BBC Micro's motherboard to "something like 12 to 14 chips" with most functionality on a single 2,400-gate Uncommitted Logic Array (ULA). Reports during the second half of 1982 indicated a potential December release, with Curry providing qualified confirmation of such plans, together with an accurate depiction of the machine's form and capabilities, noting that the "massive ULA" would be the "dominant factor" in any pre-Christmas release. However, as the end of the year approached, with the ULA not ready for "main production", the launch of the Electron was to be delayed until the spring.
By June 1983, with the planned March release having passed, the launch of the Electron had been rescheduled for the "Acorn User" Exhibition in August 1983, and the machine was indeed launched at the event. The company expected to ship the Electron before Christmas, and sell 100,000 by February 1984. The price at launch, £199, remained unchanged from that stated in an announcement earlier in the year, and the machine's nickname within Acorn, the "Elk", was also reported publicly for perhaps the first time.
Reviews were generally favourable, starting with positive impressions based on the physical design of the computer, with one reviewer noting, "The Electron is beautifully designed and built — quite a shock compared to the BBC. Its designer case will look great on the coffee table." Praise was also forthcoming for the Electron's keyboard, which was regarded as being better than most of its low-cost peers, with only the VIC-20 being comparable. In one review, the keyboard was even regarded as better than the one in the BBC Micro. The provision of rapid BASIC keyword entry through the combination of the FUNC key with various letter keys was also welcomed as a helpful aid to prevent typing errors by "most users", while "touch typists" were still able to type out the keywords in full.
Reviewers also welcomed the machine's excellent graphics compared to its rivals, noting that "the graphics are much more flexible and the maximum resolution is many times that of the Spectrum's". The provision of screen modes supporting 80 columns of readable text and graphics resolutions of up to 640×256 was described as "unrivalled by every machine up to the BBC Model B itself", although the absence of a teletext mode was considered regrettable. Although valued for its low memory usage characteristics in the BBC Micro, one reviewer considered the absence of a "software simulation of a teletext screen" to be a "lazy omission" even if it would have to be "awfully slow and take up piles of memory".
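The memory cost of these bitmapped modes can be estimated with simple arithmetic. The figures below are a sketch assuming the standard BBC Micro mode geometries that the Electron shares (e.g. Mode 0 at 640×256 in two colours); they are illustrative, not taken from the article above.

```python
# Rough framebuffer-size arithmetic for some of the shared BBC/Electron
# graphics modes. Mode geometries are assumed from the standard BBC set.

def framebuffer_bytes(width: int, height: int, colours: int) -> int:
    """Bytes of screen RAM for a packed-pixel mode with the given palette size."""
    bits_per_pixel = (colours - 1).bit_length()  # 2 -> 1bpp, 4 -> 2bpp, 16 -> 4bpp
    return width * height * bits_per_pixel // 8

assert framebuffer_bytes(640, 256, 2) == 20480   # Mode 0: 20 KB
assert framebuffer_bytes(320, 256, 4) == 20480   # Mode 1: 20 KB
assert framebuffer_bytes(320, 256, 2) == 10240   # Mode 4: 10 KB
```

A 20 KB framebuffer is a substantial fraction of the machine's 32 KB of RAM, which helps explain the reviewer's point that a teletext simulation would "take up piles of memory" only if done as a bitmap.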
While its speed was acceptable compared to its immediate competition, the Electron was, however, rather slower than the BBC Micro with one review noting that games designed for the BBC Micro ran "at less than half the speed, with very significant effects on their appeal". The reduced performance can be attributed to the use of a 4-bit wide memory system instead of the 8-bit wide memory system of the BBC Micro to reduce cost. Due to needing two accesses to the memory instead of one to fetch each byte, along with contention with the video hardware also needing access, reading or writing RAM was much slower than on the BBC Micro. Reviewers were also disappointed by the single-channel sound, noting that "BBC-style music" and its "imitations of various musical instruments" would not be possible, the latter due to the inability of the sound system to vary the amplitude of sounds.
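The bandwidth penalty described above can be sketched with a toy model (not cycle-accurate): a 4-bit data path needs two accesses to move one byte where an 8-bit path needs one, before video contention makes matters worse.

```python
# Illustrative model of why the Electron's 4-bit RAM path is slower than
# the BBC Micro's 8-bit path: each byte needs ceil(8 / bus_width) accesses.

def fetches_per_byte(bus_width_bits: int, byte_bits: int = 8) -> int:
    """Memory accesses needed to transfer one byte over a narrow bus."""
    return -(-byte_bits // bus_width_bits)  # ceiling division

bbc_accesses = fetches_per_byte(8)       # BBC Micro: 1 access per byte
electron_accesses = fetches_per_byte(4)  # Electron:  2 accesses per byte

assert electron_accesses == 2 * bbc_accesses
```

This factor of two in raw accesses, compounded by the video hardware also contending for the same RAM, is consistent with the review's observation that BBC games ran "at less than half the speed".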
Despite some uncertainty about Acorn's target demographic for the Electron, some noted the potential for the machine in education given its robustness, but also given its price, noting that the high price of BBC Model B machines seemed "rarely justified by their actual practical applications in schools". The introduction of the Electron was seen as potentially leading to competition between Acorn's different models within the schools market rather than creating a broader audience for them, although the potential for more computers in schools, giving more "hands-on" experience for students, was welcomed. Nevertheless, reviewers anticipated that the Electron would sell well at the lower end of the market, with projected sales of 100,000 units by Christmas 1983, helped by the Electron's software compatibility with the BBC Micro and the already established reputation of its predecessor. With parents potentially being convinced of the Electron's educational value, some reviewers foresaw a conflict between parents and "discerning children", the latter merely wanting to play games and preferring models with sound and graphics capabilities more appropriate for gaming. Although Acorn had based its expansion into the United States on the BBC Micro, the company did have plans to introduce the Electron at a later time, with Chris Curry having indicated "a very heavy push overseas" involving both the BBC Micro and Electron. A model for the US market was described in an official book, "The Acorn Guide to the Electron", but this model was never produced.
Production difficulties at Astec in Malaysia delayed the machine's introduction, forcing Acorn to look to other manufacturers such as AB Electronics in Wales and Wongs in Hong Kong (an original equipment manufacturer making over 30 million circuit boards a year, along with power supplies and plastic housings, for companies such as IBM, Xerox, Atari, and Apple, including units made for Acorn for the BBC Micro). By October 1983, Acorn had received orders for more than 150,000 units, but had production targets of only 25,000 a month before Christmas, meaning that the existing backlog would take more than six months to fulfil. Demand for the Electron was high but only two of WH Smith's London branches had inventory. Ultimately, manufacturing in Malaysia ceased with the anticipated but unspecified number of units having been produced, this having been originally reported as 100,000 units. Acorn's marketing manager, Tom Hohenberg, admitted in early 1984 that "a lot of the trouble stemmed from the ULA" in getting production to the desired levels, but that such difficulties had been resolved, although Acorn faced an order backlog of almost a quarter of a million units.
As the company increased production during 1984, however, the British home computer market greatly weakened. Hohenberg later noted that after the 1983 Christmas season, Electron deliveries had increased to meet a demand that was no longer there, with the market having "completely dried up". Acorn's Christmas 1984 sales were greatly below expectations and by March 1985 the company had reduced the Electron's price to £129. With the company's unsuccessful expansion into the United States abandoned, Acorn's financial situation had deteriorated sufficiently to prompt Olivetti to rescue the company by taking a 49.30% ownership stake. Renewed efforts were made to sell the machine, bundling it with Acorn's own expansions and software, such as one package adding the Plus 1 expansion, joysticks and a ROM cartridge game to the base machine for a total price of £219. Acorn committed to supporting the machine "until the end of 1986", continuing to supply it (as the Merlin M2105) to British Telecom as part of the Healthnet communications system, with small-scale manufacturing continuing while existing stocks were being run down.
By autumn, retailers appeared eager to discount the computer, with prices in stores as low as £100, reportedly less than the distributor prices of the summer months. As the Christmas season approached, Dixons Retail acquired the remaining Electron inventory to sell, bundled with a cassette recorder and software, at a retail price of £99.95. This deal, from the perspective of a year later, apparently played a significant part in helping to reduce Acorn's unsold inventory from a value of £18 million to around £7.9 million, and in combination with "streamlining corporate activities and reducing overheads", had helped to reduce Acorn's losses from over £20 million to less than £3 million.
The deal effectively brought to an end Acorn's interest in the Electron and the lower-cost end of the home computing market, but empowered third-party suppliers whose "inventiveness and initiative" was noted as being in contrast with Acorn's lack of interest in the product and the "false promises" made to its users. However, Acorn subsequently released the Master Compact, a model in the Master series of microcomputers with fewer BBC Micro-style ports and a similar expansion connector to that used by the Electron, with the home audience specifically in mind. Indeed, prior to its release, the Master Compact had been perceived as the successor to the Electron. Superficial similarities between the Compact and Acorn Communicator, together with technical similarities between the Electron (particularly when expanded in the form of the Merlin M2105) and the Communicator, may also have driven rumours of an updated Electron model. A more substantial emphasis on the "home, music and hobby sectors" came with the appointment of a dedicated marketing manager in 1989 following the launch of the BBC A3000 in the Acorn Archimedes range.
Although the Electron presented challenges to developers in terms of the amount of memory available to programs and, particularly for those writing or porting games to the machine, a reduction in hardware features useful for controlling or presenting content on the screen, developers often discovered creative workarounds to deliver commercially successful products, making the business of writing conversions a viable one for some developers.
Several features that would later be associated with the BBC Master and Archimedes first appeared as features of Electron expansion units, including ROM cartridge slots and the Advanced Disc Filing System, a hierarchical improvement to the BBC's original Disc Filing System. Having been envisaged as the basis of a portable computer with "a very strong emphasis on communications" during its development, supporting both modem and Econet interfaces, the BT Merlin M2105 product subsequently combined the Electron with communications functionality, and the Acorn Communicator developed such concepts further, introducing networking support.
The availability of the Electron at discounted prices from 1985 onwards led to increased demand for third-party software and expansions for the machine. While it may not have been as popular as the Spectrum, Commodore 64 or Amstrad CPC, it did sell in sufficient numbers to ensure that new software titles from established producers were being produced right up until the early 1990s, with mainstream publications dedicated to the machine having effectively supported it for five years beyond the point at which Acorn's own support had ceased.
Hardware expansions.
Since the Electron provided only a selection of video output ports, a cassette port and the expansion connector, a range of additional expansions were produced to offer ports and connections to various peripherals. The first expansions were largely joystick and printer interfaces or sideways ROM boards. For instance, First Byte Computers developed an interface and software which allowed a "switched" joystick to be used with the majority of software titles. This interface became very popular and was sold by W.H. Smiths, Boots, Comet and hundreds of independent computer dealers, selling as many as 23,000 units over a two-year period, helped by a bundling agreement with Dixons.
Acorn's own expansion strategy was led by the Plus 1 which offered a combination of ports and cartridge connectors, followed by the Plus 3 disc drive unit, but by early 1986 the more general range of expansions had broadened to include floppy drive and RS423 interfaces, Teletext adapters, and other fundamental enhancements to the base machine.
Multi-function expansions.
Since the Electron's expansion connector was the basis of practically all external hardware expansions for the machine, unless an expansion propagated this connector to others, as was done by the Acorn Plus 3, the capabilities of any given expansion would limit the capabilities of the expanded machine. Thus, expansions offering a single function, such as joystick ports or a printer port, would need to be disconnected if other capabilities were needed, and then reconnected later. Consequently, multi-function expansions offering a combination of different capabilities offered a significant degree of convenience as well as avoiding wear on the expansion connector.
Alongside announcements of Acorn's then-unreleased Plus 1, Solidisk previewed a General Purpose Interface for the Electron in early 1984 offering a Centronics printer port, switched joystick port, user port, sideways ROM sockets, and mini-cartridge sockets supported by the 6522 versatile interface adapter (VIA) chip. The Plus 1 itself was released in mid-1984, introducing the influential cartridge format for expansions ultimately used by several other companies.
Acorn Plus 1.
The Acorn Plus 1 added two ROM cartridge slots, an analogue interface (supporting four channels) and a Centronics parallel port, priced at £59.90. The analogue interface was normally used for joysticks, although trackball and graphics tablet devices were available, and the parallel interface was typically used to connect a printer. Game ROM cartridges would boot automatically. Languages in paged ROM cartridges would take precedence over BASIC. (The slot at the front of the interface took priority if both were populated.)
Access to ROM occurred at 2 MHz until RAM access was required, so theoretically programs released on ROM could run up to twice as fast as those released on tape or disc. Despite this, all of the games released on ROM were packaged as ROM filing system cartridges, from which the micro would load programs into main memory in exactly the same way as if it were loading from tape. This meant that programs did not need to be modified for their new memory location and could be written in BASIC but gave no execution speed benefits. Six ROM cartridge titles were announced for the launch of the Plus 1: three arcade games, one adventure game, one educational title, and the "Lisp" language implementation, the latter being a genuine language ROM that "takes the place of the BASIC ROM" and is instantly available when switching on.
The cartridge slots provided additional control lines (compared to the lines available via the edge connector on the rear of the Electron) to ease implementation of ROM cartridges. Acorn described the hardware extension possibilities in promotional literature, giving an RS423 cartridge as an example of this capability of the Plus 1.
Additional peripheral cartridges were produced by companies such as Advanced Computer Products (and subsequently PRES) whose Advanced ROM Adaptor (ARA) and Advanced Sideways RAM (ASR) products provided sideways ROM and RAM capabilities, allowing ROM- or EPROM-based software to be accessed to provide languages, utilities and applications. ROM or EPROM devices containing the software could be physically inserted into empty ZIF sockets, or the software would be loaded from ROM image files (typically provided on disk) into RAM devices fitted in such sockets. Such RAM could potentially be powered by a battery and thus be able to retain its contents when the computer itself was powered off. Both such arrangements exposed the software in the same sideways memory region.
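The sideways ROM/RAM scheme these cartridges exposed can be sketched as follows: several 16 KB banks share the same address window, and a bank-select latch decides which one the CPU sees at any moment. The window address (&8000–&BFFF) and 16-bank convention below follow the usual Acorn arrangement and are assumed here for illustration.

```python
# Minimal sketch of a sideways ROM/RAM system: multiple 16 KB banks share
# the &8000-&BFFF window; writing the bank-select latch switches which
# bank is visible to the CPU.

BANK_SIZE = 0x4000  # 16 KB window

class SidewaysMemory:
    def __init__(self, n_banks: int = 16):
        self.banks = [bytearray(BANK_SIZE) for _ in range(n_banks)]
        self.selected = 0

    def select(self, bank: int) -> None:
        self.selected = bank  # models a write to the bank-select latch

    def read(self, addr: int) -> int:
        assert 0x8000 <= addr < 0xC000, "outside the sideways window"
        return self.banks[self.selected][addr - 0x8000]

mem = SidewaysMemory()
mem.banks[4][0] = 0xEE   # e.g. a ROM image loaded into sideways RAM bank 4
mem.select(4)
assert mem.read(0x8000) == 0xEE
```

This is why a ROM image loaded from disc into a sideways RAM cartridge "exposed the software in the same sideways memory region" as a physically fitted ROM: the CPU cannot tell which kind of device backs the selected bank.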
Such cartridge support enabled the Electron to provide the same functionality as that offered by the expansion ROM slots under the keyboard and on the bottom-left of the BBC Micro B keyboard. However, the need to use cartridge sockets for other peripherals encouraged PRES to develop the Advanced Plus 6 (AP6): an internal RAM and ROM board for the Plus 1 providing six sockets that could be freely used for ROM, EPROM and RAM devices. Installation of the AP6 unit required some modifications to the Plus 1, undertaken either by the user or by PRES, and the product could also be enhanced with the Advanced Plus 7 offering battery-backed RAM support for two 16 KB banks.
The addition of the Plus 1 added a number of new *FX or OSBYTE calls that allowed the OS to read values from the analogue interface and write to the parallel interface.
The Plus 1 needed memory page &D for its workspace, and the unit added some processing overhead when enabled, both of these things causing issues with the loading and running of software, particularly cassette-based games. To disable the Plus 1, after pressing BREAK, the following commands could be issued:
*FX163,128,1
?&212=&D6
?&213=&F1
?&2AC=0
An official application note described a similar set of commands to "remove the Plus One completely from the address map disabling the Centronics and A/D ports (additionally disabling the RS423 cartridge if fitted)".
Further developments.
After Acorn's change of focus away from the Electron, and with a shortage of Plus 1 units available to purchase, Advanced Computer Products secured the rights to manufacture the unit under licence from Acorn, obtaining the injection moulds and tooling, thus restarting production in 1987 after Acorn's own production of the unit had ceased in November 1985. The Advanced Plus 2 (AP2) ROM was later sold by PRES as a replacement ROM for the Plus 1, of whose 8 KB utility ROM only 3 KB had been used, thus providing an opportunity for a more comprehensive ROM to be developed. The AP2 added ROM management commands familiar from the BBC Master series, support for various sideways RAM products from PRES, disc formatting and verification utilities for different ADFS versions, a command to disable the Plus 1 entirely, and improvements that made tape loading more reliable in "high memory" screen modes.
Slogger Rombox Plus.
Following on from Slogger's earlier Rombox product (an expansion similar in profile to the Plus 1 but offering eight ROM sockets and propagating the expansion connector to other units), the Rombox Plus was positioned more directly as a competitor to the Plus 1 in that it offered two cartridge slots and a Centronics printer port alongside four ROM sockets. Priced at £49.95, the unit was mostly compatible with cartridges designed for the Plus 1, although one reviewer reported physical issues with some expansion cartridges, suggesting some manufacturing inconsistencies given other users' more positive experiences, but indicated that it was still "worth considering as an alternative to the Plus 1". One review reported that the Cumana Electron Filing System cartridge had an edge connector that would not physically fit inside the slot in the Rombox Plus unit; this, along with a perceived lack of robustness of the case, was their only major reservation about the product. The product's support for utilising 8 KB RAM devices as a printer buffer, with buffer management provided by the built-in EPROM, was noted as a particular advantage over the Plus 1.
Slogger Remote Expansion and Plus 2.
In early 1989, Slogger announced its "remote expansion" (RX) system for the Electron, providing a separate case with power supply to house expansions and disc drives, able to support the weight of a monitor or television. Intended for the RX system, the Plus 2 offered two ROM cartridge slots compatible with the Plus 1, three ROM sockets, and RS423 and user port capabilities. One application of the user port was to connect a mouse, utilised by Slogger's version of the Stop Press desktop publishing package by Advanced Memory Systems.
Software Bargains Plus 1.
In mid-1989, Software Bargains announced an expansion providing different levels of Plus 1 functionality, offered as a bare printed circuit board without casing and in three different variants: the basic model offered one cartridge port and was bundled with View and Viewsheet cartridges for £29.95; an extended model offered one cartridge port and a printer port with the two bundled cartridges for £36.95; the full model offered two cartridge ports, printer port and the bundled cartridges for £39.95. Various board upgrade options were also offered between the variants, with the product being described mainly as a vehicle to expose the bundled software packages to as many as 150,000 owners of the estimated 200,000 Electrons in the UK who "have not yet been able to acquire or use View or Viewsheet". The lack of casing was considered the most significant disadvantage, with the absence of the analogue port deemed less critical due to a general lack of support for joysticks in many games.
Communications and networking.
To support connectivity, Acorn announced a Plus 2 network interface with availability scheduled for early 1985, together with an RS423 cartridge for the Plus 1. Neither of these products was delivered as announced.
Acorn Plus 2.
The Acorn Plus 2 interface was due to provide Econet capability. This interface did not make it to market. However, an Econet interface was produced by Barson Computers in Australia and possibly other individuals and businesses.
Acorn Plus 4.
The Acorn Plus 4 interface was due to provide a modem communications capability.
Andyk RS423 cartridge.
Andyk announced an RS423 cartridge for the Plus 1 providing a serial port, alongside other products, in late 1985. It was priced at £34.99.
Pace Tellstar/Nightingale.
Originally reported in mid-1985 as a collaboration between Acorn and Pace Micro Technology, but launched in early 1986, Pace offered a communications product consisting of a RS423 cartridge, bundled with a Nightingale modem and Tellstar communications software, offered at a discounted price of £145.
Jafa Systems RS423 cartridge.
Jafa Systems announced an RS423 cartridge in late 1989 to "fill a two year gap in the market", offering a serial connector compatible with the BBC Micro together with an on-board socket for 8 KB or 16 KB EPROM devices or for 32 KB RAM, the latter being configured to present two sideways RAM banks to the system. Write protect functionality was supported to prevent certain ROM software from attempting to overwrite itself if stored in RAM. The cartridge board itself was priced at around £30, with a case costing £5 extra, and an optional 32 KB RAM adding another £20. Support for the E00 ADFS offered by PRES for that company's AP3 disc system was indicated as an application for the sideways RAM.
Slogger Plus 2 RS423 interface.
Slogger provided an RS423 interface as an option for its own Plus 2 expansion, announced in early 1989.
Disc interfaces.
The first disc interface to be announced for the Electron was Pace's Le Box in 1984, offering a single-sided 100 KB floppy drive controlled by the 8271 controller and accessed using the Amcom Disc Filing System, with pricing at £299 plus VAT including the drive or around £199 without. The unit also provided eight sideways ROM sockets and was intended to sit under the Electron itself. The unit was connected via cabling to the expansion edge connector and included its own power supply, and other drives including switchable 40/80 track drives offering up to 400 KB capacity were dealer-supplied options. Although the product was meant to be on sale at the Acorn User Show in August 1984, and had been advertised, it was "discontinued" in early 1985 before getting to market, with a Pace representative indicating that prohibitive pricing of the 8271 chips (each at "over £80 at times") had left the company considering a re-launch of the product should the pricing situation become more favourable.
Following on from Acorn's Plus 3 interface, Cumana, Solidisk, Advanced Computer Products and Slogger all offered disc interfaces for the Electron. Unlike disc systems on the BBC Micro and the Acorn Plus 3, many of the systems released for the Electron did not claim RAM workspace (and raise the PAGE variable affecting applications above the default of &E00), making it easier to use cassette-based software transferred to disc and to run larger programs from disc.
Low-cost alternatives to disc systems, briefly made fashionable by press coverage of the Sinclair Microdrive, were reportedly under development by expansion suppliers such as Solidisk, and finished products such as the Phloopy looped tape system were offered for the Electron. Reliability issues were described with the Phloopy, and the product was apparently short-lived.
Acorn Plus 3.
Launched in late 1984 for a price of £229, the Acorn Plus 3 was a hardware module that connected independently of the Plus 1 and provided a "self contained disc interface and 3.5 inch single sided disc drive" offering over 300 KB of storage per disc using the newly introduced Advanced Disc Filing System (ADFS). The Plus 3 was also reportedly produced with a double-sided drive fitted.
An expansion connector for a second 3.5- or 5.25-inch drive was also provided by the unit, with such drives needing to provide a Shugart-compatible connector and their own power supply. The original Electron edge connector was repeated on the back of the Plus 3, allowing the Plus 1 or other compatible expansion to be connected in conjunction with the Plus 3.
The double-density drive of the Plus 3 was driven using a WD1770 drive controller by the ADFS. (The Plus 3 had been rumoured to offer Acorn's DFS and to feature an 8272 double-density disc controller before its launch.) Because the WD1770 is capable of single-density mode and uses the same IBM360-derived floppy disc format as the Intel 8271 found in the BBC Micro, it was also possible to use the Disc Filing System with an alternate ROM, such as the ACP 1770 DFS.
The Plus 3 reset PAGE to &1D00, reducing the amount of free RAM available to the user. The ADFS system could be temporarily disabled (and PAGE reset to &E00) via the command. Later products such as the PRES E00 ADFS remedied the memory demands of the ADFS, along with other issues suffered by the software as delivered with the Plus 3. If using the Plus 3 in screen modes 0–3, the TIME pseudo-variable would be thrown off, as interrupts were disabled during disc access in these modes. The screen would also blank during disc accesses.
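The cost of the raised PAGE is straightforward to quantify: the gap between the default &E00 and the Plus 3's &1D00 is RAM claimed as ADFS workspace and lost to BASIC programs.

```python
# How much user RAM the Plus 3's ADFS workspace claims, from the PAGE
# values given in the text: default &E00 versus &1D00 with ADFS active.

DEFAULT_PAGE = 0x0E00   # PAGE with no disc filing system workspace
PLUS3_PAGE = 0x1D00     # PAGE with the Plus 3's ADFS enabled

lost = PLUS3_PAGE - DEFAULT_PAGE
assert lost == 0x0F00   # &F00 = 3840 bytes (3.75 KB) claimed as workspace
```

On a 32 KB machine already giving up as much as 20 KB to a high-resolution screen mode, losing a further 3.75 KB was significant, which is why later products such as the PRES E00 ADFS were valued for keeping PAGE at &E00.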
Disks had to be manually mounted and dismounted using the *MOUNT and *DISMOUNT commands, or using the ++ key combination. Disks could also be booted from via the standard Shift+Break key combination, if the !BOOT file was present on the disk. This behaviour was the same as on the BBC Micro.
The Plus 3 included an uprated square black power supply unit with mains cord, manufactured by STC and designed and manufactured in England, that powered the Plus 3 in addition to the Electron and the Plus 1 interface. This replaced the original cream-coloured "wall wart" style power supply, which was manufactured in Hong Kong.
Repair note: If the internal power-supply connector, used to power the existing internal 3.5-inch drive is damaged, and requires replacement, then the original AMP 800-930 4-pin connector, which was already in short supply during the original production run, may be replaced with a Molex 5264 50-37-5043 "Mini-SPOX" connector as an alternative.
Advanced Plus 3.
Designed and produced by Baildon Electronics and sold by PRES, the Advanced Plus 3 (AP3) was a Plus 1 cartridge interface using the WD1770 controller, supplied with Acorn's ADFS and a single-sided 3½-inch disc drive for £99 plus VAT, offering equivalent functionality to the Acorn Plus 3. Announced in late 1987, the product was made possible by an agreement between ACP and Acorn to license the ADFS software. As with many disc interfaces for the Electron, since the interface provided a connector for the drive, this made it possible to connect a 5¼" floppy disc drive (more common amongst BBC Micro owners) or the more typical 3½-inch drive.
PRES later released a version of ADFS with support for PAGE at &E00, this being achieved by using RAM provided by the Advanced Battery Backed RAM (ABR) cartridge. This version also fixed two notable bugs in Acorn's ADFS, eliminating unreliability when accessing the first tracks on a disc which had previously necessitated the writing of a file (ZYSYSHELP) as a workaround, and switching off the text cursor during disc compaction which had previously caused disc corruption (since the disc data would be processed using screen memory during this operation, and the cursor would modify that data when blinking). The ROM image was supplied on disc for £17.19, whereas a bundle of the ROM and ABR cartridge was £50.95.
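The disc-compaction bug described above has a simple mechanism: the compaction routine staged disc data in screen memory, and the blinking text cursor rewrote part of that same memory. The toy reconstruction below is illustrative only; the cursor-blink behaviour (inverting the cell it sits on) is an assumption for demonstration.

```python
# Toy reconstruction of the ADFS compaction bug: disc data staged in
# screen memory is corrupted when the still-enabled cursor blinks,
# because the blink rewrites the very bytes being used as a buffer.

screen = bytearray(b"DISC DATA BEING COMPACTED")  # screen RAM doubling as buffer

def blink_cursor(mem: bytearray, pos: int) -> None:
    """Model a cursor blink as inverting the cell it sits on."""
    mem[pos] ^= 0x80

original = bytes(screen)
blink_cursor(screen, 5)             # blink fires mid-compaction
assert bytes(screen) != original    # the staged disc data is now corrupted
```

PRES's fix of simply switching the cursor off for the duration of compaction removes the concurrent writer, leaving the staged data untouched.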
In 1989, the Advanced Plus 3 Mark 2 was launched, offering a double-sided drive in place of the single-sided drive previously offered. This meant that the storage capacity of each disc was increased from the 320 KB of the original Plus 3 to 640 KB (this being supported by ADFS on the Master Compact).
Cumana Floppy Disc System.
Early in 1985, Cumana released a cartridge-based interface providing support for double-density storage, a real-time clock and calendar for timestamping of files, and a spare ROM socket for user-fitted sideways ROMs. The filing system used was Cumana's own QFS, supporting 89 files per disc, PAGE at &E00, a non-hierarchical catalogue, ten-character filenames, with a format not directly compatible with either of Acorn's DFS or ADFS. The interface itself cost £149.95 when originally announced, but settled at around £115.95 including VAT, also being offered in a promotional bundle with a 5¼-inch drive for £224.15 including VAT. Later pricing put the interface at £74.95 including VAT.
Solidisk EFS.
In mid-1985, Solidisk released a cartridge-based interface with support for single and double density storage and providing Acorn DFS and ADFS compatibility, 16 KB of on-board sideways RAM, and a connector for a Winchester hard drive. The cartridge itself cost £59, with a bundle including a double-sided, double-density, 3½-inch drive costing £200. A 20 MB hard drive was offered at a price of £805.
Advanced Plus 4.
Announced in early 1986, the Advanced Plus 4 (AP4) from Advanced Computer Products was a cartridge-based interface employing the WD1770 controller and featuring ACP's 1770 DFS product, providing compatibility with Acorn's DFS from the BBC Micro and thereby supporting seven-character filenames and up to 31 files per disc. However, 8 KB of on-board static RAM was used as workspace for the filing system, keeping PAGE at &E00. An extra ROM socket was provided for a user-fitted sideways ROM, and being a 1770-based interface, it was reported that Acorn's ADFS could be used instead, although since it was not aware of the additional RAM, PAGE would be raised to &1D00 as it would be when using Acorn's Plus 3. The interface was priced at £69.55 plus VAT.
Slogger Electron Disc System and Pegasus 400.
Slogger, an established producer of expansions and a reseller of other disc systems, introduced the Electron Disc System in early 1987, priced at £74.95. It featured the Cumana Floppy Disc System interface combined with an Acorn-compatible DFS, SEDFS, which could read 40-track discs on 80-track drives, supported Slogger's tape-to-disc conversion products, and was reported as offering "virtual 100 per cent 8271 emulation" for compatibility with traditional DFS software. The SEDFS ROM was also available separately for existing Cumana interface owners, priced at £24.95.
The SEDFS was later bundled with Slogger's own cartridge-based interface and a 40/80-track switchable drive offering up to 400 KB storage per disc, with the bundle taking the Pegasus 400 name, introduced as part of a sales tour towards the end of 1987. This package of interface and drive cost £130. The precise DFS variant used by the Pegasus 400 system kept PAGE at &E00 and introduced "typeahead" support, permitting keystroke buffering during disc activity on systems with the Turbo-Driver or Master RAM Board fitted and enabled.
Slogger/Elektuur Turbo boards.
Announced in early 1986, the Slogger Turbo-Driver was a professionally fitted upgrade priced at £42. The board itself plugged into the CPU and BASIC ROM sockets on the main circuit board of the Electron, which merely involved removing socketed components on very early Electron models, but required desoldering work and therefore benefited from a fitting service for later units. The performance benefit of fitting the board was to make some programs, particularly those running in the high bandwidth modes (0 to 3), run up to three times faster.
The direct origins of the Slogger product appear to be a board designed by Andyk Limited, announced as the Fast Electron Board in late 1985 with a price of £29.99, whereas the Elektuur modification was described in an article in Dutch Electronics magazine Elektuur and intended for users to perform at home.
The Slogger and Elektuur Turbo boards were born out of a hack initially devised at Acorn. By shadowing the lowest 8 KB of RAM with a static RAM chip out of the reach of the ULA, the CPU could always access that region at 2 MHz. The tradeoff was that the screen could not be located in that 8 KB. In practice the operating system ROMs always put the screen into the top 20 KB, so this probably only broke compatibility with around 2% of software. Speeding up the low portion of memory is particularly useful on 6502-derived machines because the processor has faster addressing modes for the first 256 bytes (the zero page), so it is common for software to put any variables involved in time-critical sections of a program into that region.
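To illustrate why keeping time-critical variables in that region pays off, the sketch below compares standard NMOS 6502 cycle counts for zero-page versus absolute addressing; the toy loop and its instruction mix are invented for illustration rather than taken from any of these products.

```python
# Standard NMOS 6502 cycle counts: zero-page addressing saves a cycle
# (and an operand byte) per access versus absolute addressing.
CYCLES = {
    ("LDA", "zero_page"): 3, ("LDA", "absolute"): 4,
    ("STA", "zero_page"): 3, ("STA", "absolute"): 4,
    ("INC", "zero_page"): 5, ("INC", "absolute"): 6,
}

def loop_cycles(addressing: str, iterations: int) -> int:
    """Cycles for a toy loop body doing one load, one increment and one store."""
    body = (CYCLES[("LDA", addressing)]
            + CYCLES[("INC", addressing)]
            + CYCLES[("STA", addressing)])
    return body * iterations

# 1000 iterations: 11000 cycles in zero page vs 14000 with absolute addressing.
print(loop_cycles("zero_page", 1000), loop_cycles("absolute", 1000))
```

Roughly a 27% saving on this instruction mix, before the further gain of the zero page running at full 2 MHz on a Turbo-equipped Electron.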
The cost of the 64 Kbit SRAM chip would have been greater than that of doubling the four 64 Kbit DRAM chips to eight, which would have given full 8-bit RAM access and fixed both the modest memory and the poor performance of the Electron.
Slogger Master RAM Board.
Introduced at around the start of 1987 and priced at £64.95 fitted or £54.95 as a kit, the Master RAM Board offered the familiar turbo mode from the Slogger Turbo-Driver alongside a shadow mode providing 32 KB of static shadow RAM in addition to the existing 32 KB, thus giving 64 KB in total. So-called "legally written software", this being software using the operating system calls and not writing directly to the screen, could function without significant modification, making substantially more memory available for BASIC, View, Viewsheet, language ROMs and many other applications. By providing extra storage this modification also allowed some games and applications intended for the BBC Micro to function on the Electron despite the lack of a native Mode 7.
Applications could not directly address video memory in shadow mode without modification, so it was incompatible with most games, although there is no inherent reason why a game could not be written to function in shadow mode. A switch mounted through the case switched between normal, turbo and shadow modes.
Towards the end of the Electron's commercial lifetime, the Turbo-Driver and Master RAM Boards were offered already fitted to new Electrons in an attempt to increase sales. For a time, Jafa Systems manufactured their own equivalent of the Master RAM Board in order to support their own product range.
Mode 7 display expansions.
One of the features of the BBC Micro that was absent in the Electron was the Teletext-style "Mode 7" display. The omission of this display mode was remarkable because it had a very low memory requirement (just 1000 bytes) and many BBC programs used it to maximise available memory for program code and data while also providing a colourful 40-column textual display with simple low-resolution graphical decorations.
Such display capabilities, desirable in their own right on low-memory computers, were also desirable for delivering content through low-bandwidth communications channels such as those used by Teletext and Viewdata services. However, access to such services can be considered a separate capability, and the BBC Micro needed to be upgraded with the Teletext or Prestel adapters to complement its display capability and receive such over-the-air or online content.
Jafa Systems provided a number of solutions to remedy the absence of a Teletext display capability. Morley Electronics instead chose to offer an expansion combining the display and reception capabilities.
Sir Computers.
In late 1984, Sir Computers announced a Mode 7 adapter unit that plugged into the Electron's expansion connector. Unfortunately, Sir Computers ceased trading before the product was brought to market.
Jafa Systems.
Released in 1987 at a price of £89, the Mode 7 Mark 1 Display Unit was a separate unit "about the size, shape and colour of the Plus One or a Slogger ROMbox" that connected to the Electron's expansion connector and featured a Motorola 6845 display controller and Mullard SAA5050 character generator to replicate the main elements of the BBC Micro's Teletext display solution. This only used 1 KB of memory for the display, with the expansion listening to display memory write accesses and buffering the data in its own memory. A ROM was included to extend the operating system, allowing activation of Mode 7 as a genuine screen mode, providing extra commands, and supporting keyboard shortcuts used on the BBC Micro to emit Teletext control sequences. To support the output of both the Mode 7 display and the existing video output, a lead connected the Electron's RGB output to the expansion, with the expansion providing only RF (television) output.
Conscious of the relatively high price of the Mark 1 unit, John Wike of Jafa devised and, at the end of 1988, introduced a software-based "Mode 7 Simulator", priced at £25, supplied on a ROM cartridge that rendered the Mode 7 display in a low-resolution, 8-colour graphics mode. Although cheap and effective in enabling use of some software that only used official operating system routines for text output, this solution proved very slow because the Electron had to be placed into the high-bandwidth Mode 2 display to be able to show eight colours at once. In doing so, the CPU spent a lot of time drawing representations of Mode 7 characters and graphics that in a hardware solution would be achieved without any demand on the CPU. It also used up 20 KB of RAM for the graphics display rather than the 1000 bytes of a hardware Mode 7.
A conceptually similar predecessor to the software-based simulator was published by Electron User in early 1987, offering a monochrome Mode 4 simulation of the Teletext display, using the lower 25 character lines of the screen to show the Teletext output, reserving several lines at the top of the screen for a representation of Mode 7 used to prepare the eventual visual output. However, the program did not support direct access to Mode 7 memory locations. The author noted that a Mode 2 version would have been possible but would have required a redesigned character set and "too much memory".
A further refinement of the hardware solution was introduced in 1989 with the Mode 7 Mark 2 Display Unit, which retained the SAA5050 character generator but omitted the 6845 display controller, and was fitted internally in the Electron itself instead of being housed in an external unit, although some kind of ROM expansion unit was needed to hold the driver/utilities ROM. It used software to ensure that the SAA5050 was fed with the correct character data. A software ROM would put the machine into a two-colour, 40-column graphics mode (thus providing one byte per character), and as the ULA read display data from memory in the usual fashion, the SAA5050 would listen to the data it was reading and produce a Mode 7 interpretation of the same information, this being achieved by fitting a board on top of the ULA connecting to its pins. When necessary the hardware would switch between the conventional Electron graphics output and the Mode 7 output being produced by the add-on, feeding it to the Electron's built-in video output sockets via the red, green and blue lines on the motherboard.
The disadvantage to this system is that while the SAA5050 would expect to be repeatedly fed the same 40 bytes of data for every display scanline of each character row, the ULA would read a different set of 40 bytes for every display scanline in order to produce a full graphics display. A software ROM worked around this by duplicating the data intended for a Mode 7 display in memory. Although this produced a Mode 7 that had less of an impact upon CPU performance than a software solution, gave the same visual quality as the BBC Micro, and supported direct access to Mode 7 screen addresses as well as accesses via operating system routines, it still used 10 KB of memory for the display and reduced the amount of readily-usable application memory (as indicated by HIMEM) by another 6 KB.
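The duplication performed by the driver ROM can be sketched as follows; this is a simplified model assuming 25 character rows of 40 bytes and ten scanlines per row, a geometry consistent with the 10 KB display figure cited above.

```python
ROWS, COLS, SCANLINES_PER_ROW = 25, 40, 10  # Teletext display geometry

def expand_for_ula(text_rows):
    """Duplicate each 40-byte character row once per scanline so that the
    ULA, which reads a fresh 40 bytes for every scanline, still presents
    the SAA5050 with the same character codes throughout a character row."""
    assert len(text_rows) == ROWS and all(len(r) == COLS for r in text_rows)
    framebuffer = bytearray()
    for row in text_rows:
        framebuffer += bytes(row) * SCANLINES_PER_ROW
    return framebuffer

# A dummy screen of printable character codes:
rows = [[32 + (r + c) % 64 for c in range(COLS)] for r in range(ROWS)]
fb = expand_for_ula(rows)
print(len(fb))  # 10000 bytes, matching the roughly 10 KB cited for the display
```

The tenfold duplication is exactly why this approach consumed 10 KB where a true hardware Mode 7 needed only 1000 bytes.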
However, with users increasingly able to rely on expansions such as the Slogger Master RAM board to provide more memory, and with this combination of expansions acknowledged throughout the user manual, the emphasis of the Mode 7 Simulator and Mark 2 Display Unit was arguably to deliver the actual display capabilities for those applications that needed them, instead of using Mode 7 as a way of economising with regard to memory usage, and to do so at a reasonable price. In this latter regard, the Mark 2 model was available as a kit costing £25 or as an assembled product (requiring some soldering) costing £49, with a fitting service available for £10.
The Jafa interfaces did not provide a Teletext or Viewdata reception capability, but the Mark 2 was explicitly stated to work in conjunction with the Morley Electronics Teletext Adapter. Meanwhile, the manual for the Mark 2 noted that the product would provide the functionality of a Viewdata terminal if combined with Jafa's RS423 cartridge.
Morley Electronics.
Morley Electronics produced a Teletext Adaptor expansion for the BBC Micro and the Electron. Since the BBC Micro has the Mode 7 display capability, the model aimed at the BBC Micro merely provided the content reception capability needed to receive and decode Teletext signals, connecting to the user port and power supply. However, the Electron models provided both display and reception capabilities, doing so by routing either the RGB or UHF signals (depending on the model) through the unit in order to introduce the Mode 7 output produced by the unit, also connecting via a cartridge. The Teletext display capabilities in the Electron models exceeded those of the BBC Micro, with one reviewer noting that the enhanced capabilities permitted "black text on a coloured background, something I've always wanted to do on my Beeb". The UHF model of the Electron adapter also supported overlaying of Teletext onto video and framing of video.
Second Processor expansions.
Acorn did demonstrate a prototype "Tube" interface for the Electron alongside the Plus 3 interface at the Compec exhibition in November 1984, although this was never brought to market directly by Acorn.
Advanced Plus 5.
Despite Acorn's withdrawal from the Electron peripheral market, Baildon Electronics developed the Advanced Plus 5 (AP5) expansion, featuring Tube, 1 MHz bus and user port interfaces, which plugged into a Plus 1 cartridge socket. This provided a sufficient level of compatibility that both the 6502 and Z80 second processor products from Acorn were shown to work, providing a Tube implementation that was "as faithful as you can get", with it also being noted that the Electron being available for as little as £50 at that point in its commercial lifespan was a "very cheap way of getting a CP/M machine". Some differences in the memory map of the Electron meant that BBC Micro software would need modifications to work on the Electron with AP5. The price of the unit in late 1986 was £66.70.
The additional facilities of the AP5 alongside the Tube interface permitted various expansions for the BBC Micro to be made available for the Electron. These included the Hybrid Music 5000 and the AMX Mouse.
PMS Electron Second Processor.
In 1986, Permanent Memory Systems (PMS) announced a second processor product for the Electron, the PMS-E2P, as a self-contained cartridge for use with the Plus 1 containing a second 2 MHz 6502A processor plus 64 KB of RAM, priced at £89. This was based on a product originally developed by John Wike of Jafa Systems. Available as a kit or in assembled form, it could even be adapted to connect directly to the Electron's expansion connector, thus avoiding the need to even have a Plus 1 expansion, although this would require the user to find other solutions for attaching peripherals. The implementation of the interface between the Electron and second processor was said to adhere closely to Acorn's recommendations, noting that any hardware or software compatibility issues were likely to be the fault of other vendors not similarly adhering to Acorn's guidelines. PMS supplied Acorn's Hi-BASIC with the E2P, permitting the use of as much as 44 KB of the second processor's RAM with BASIC programs. The company also made a version of Computer Concepts' Wordwise Plus available for the E2P, priced at £39.95.
Sound system expansions.
Despite the Electron having only limited sound generation capabilities, few expansions were offered to overcome the machine's limitations.
Millsgrade Voxbox.
Advertised in late 1985, the Voxbox by Millsgrade Limited was an expansion connecting to the Electron's expansion connector that provided allophone-based speech synthesis, with driver software provided on cassette. The supplied software supported the definition of spoken words built up from allophones (these allophones or sounds being stored in the expansion's own ROM) and allowed catalogues of words to be created and saved. A program was supplied that extended BASIC to allow the use of the synthesiser in user programs. The expansion used the General Instrument SP0256A-AL2 speech synthesis chip.
Sound Expansion cartridge.
Originally announced in 1987 by Project Expansions to be priced at around £40, the Sound Expansion cartridge could be fitted in a Plus 1 (or compatible) slot and provide sound output equivalent to that of the BBC Micro, with Superior Software's Speech bundled as a "limited offer". A product of the same name and with similar functionality was subsequently sold by Complex Software for around £55, employing its own adjustable speaker in the cartridge unit. Superior Software had announced a version of Speech for the unexpanded Acorn Electron in 1986, but this was never released.
Hybrid Music 5000.
Hybrid Technology's Music 5000 was adapted and released by PRES for use with the 1 MHz bus of the Advanced Plus 5 expansion, with the Music 5000 itself priced at £113.85. The only functional differences between the Electron adaptation and the original BBC Micro unit involved the use of Mode 6 for the display and the reduced performance of the Electron imposing some limitations on processing in programs written for the system, although this was not thought to prevent most programs for the system from working on the Electron version.
Merlin M2105.
An unusual variant of the Electron was sold by British Telecom Business Systems as the BT Merlin M2105 Communications Terminal, being previewed by British Telecom at the Communications '84 show. This consisted of a rebadged Electron plus a large expansion unit containing 32 KB of battery-backed RAM (making up 64 KB of RAM in total), up to 64 KB of ROM resident in four sockets (making up to 96 KB of ROM in total), a Centronics printer port, an RS423 serial port, a modem, and the speech generator previously offered for the BBC Micro. The ROM firmware provided dial-up communications facilities, text editing and text messaging functions. The complete product included a monitor and dot-matrix printer.
Initially trialled in a six-month pilot at 50 florists, with the intention of rolling out to all 2,500 members of the UK network, these terminals were used by the Interflora florists network in the UK for over a decade. Although the product supported other applications, it was used mostly for sending messages, and its limited availability led Interflora to look for alternatives after five years, even though users appeared to be happy with the product as it was.
This generic product combination of the Electron and accompanying expansion was apparently known as the Chain during development, itself having a different board layout, with British Telecom having intended the M2105 to be a product supporting access to an online service known as Healthnet. This service aimed to improve and speed up communications within hospitals so that patients could be treated and discharged more quickly, and to facilitate transfers of information to doctors and health workers outside hospitals, with communications taking place over conventional telephone lines. The service was to be introduced in the Hammersmith and Fulham district health authority, with installation starting at Charing Cross Hospital. The Electron was said to be particularly suitable for deployment in this application in that it had a "large expansion bus", ostensibly making the machine amenable to the necessary adaptations required for the role, together with its "price, and the fact it has a real keyboard". As a Healthnet terminal, the M2105 was intended to support the exchange of forms, letters and memos.
The adoption of an Acorn product in this role was perhaps also unusual in that much of BT's Merlin range of this era had been supplied by ICL, notably the M2226 small business computer and M3300 "communicating word processor". Nevertheless, the M2105 offered interoperability with other BT products such as the QWERTYphone which was able to receive messages from the M2105 and the Merlin Tonto.
The hardware specifications of the M2105, observed from manufactured units, include the 6502 CPU (SY6502 or R6502), ULA and 32 KB of dynamic RAM fitted in the Electron main unit, plus 32 KB of static RAM, two 6522 VIA devices for interfacing, AM2910PC modem, SCN2681A UART, and TMS5220 plus TMS6100 for speech synthesis. The speech synthesis was used for the "voice response" function which answered incoming voice calls by playing a synthesised message to the caller. The components chosen and the capabilities provided (excluding speech synthesis) are broadly similar to those featured by the Acorn Communicator which was another product of Acorn's custom systems division.
The product documentation indicates a specification with 48 KB of RAM plus 16 KB of "non volatile CMOS RAM" and 96 KB of ROM, although this particular composition of RAM is apparently contradicted by the RAM devices present on surviving M2105 machines. However, the earlier Chain variant of the board does appear to provide only 16 KB of static RAM using two HM6264LP-15 chips, also providing an extra 16 KB of dynamic RAM using eight MK4516-15 chips, suggesting that the product evolved during development.
Technical information.
Much of the core functionality of the BBC Micro (the video and memory controller, cassette input/output, timers and sound generation) was replicated using a single customised ULA chip designed by Acorn in conjunction with Ferranti, albeit with only one sound channel instead of three (and one noise channel), and without the character-based Teletext Mode 7.
The edge connector on the rear of the Electron exposes all address and data bus lines from the CPU, including the upper eight bits of the address bus, in contrast to the limited selection available via the BBC Micro's expansion ports, with the One Megahertz Bus as the principal mechanism for general purpose expansion on the BBC Micro only providing the lower eight bits of the address bus. In addition, various control signals provided by the CPU and ULA are exposed via the Electron's expansion connector.
For Issue 1–4 motherboards, the ULA had an issue similar to those experienced by other socketed CPUs. Over time, thermal heating and cooling could cause the ULA to rise slightly out of its socket, just enough to cause the machine to start exhibiting 'hanging' or other startup-failure issues, such as a continuous 'startup beep'. This was despite a metal cover and locking-bar mechanism designed to prevent this from occurring. Pushing down on the metal cover to reseat the ULA was normally sufficient to rectify these issues. Issue 5 and 6 boards used a different ULA type, known as the Aberdeen ULA (as opposed to the earlier Ferranti ULA), which was mounted on a board directly soldered to the main board, with the chip covered by epoxy resin "insulating material". This arrangement dispensed with the 68-pin socket, and the new type of ULA was expected to be "less prone to failure". This type of ULA was also used on the German release of the Electron mainboard, which is designated by the marking "GERMAN ELECTRON Issue 1" on the mainboard rather than just "ELECTRON" as for the UK model.
The keyboard includes a form of quick keyword input, similar to that used on the Sinclair ZX Spectrum, through use of the Func key in combination with other keys labelled with BASIC keywords. However, unlike on the Spectrum, this method of rapid keyword entry is optional, and keywords can be entered manually if preferred.
The ULA mediates access to 32 KB of addressable RAM using four 64-kilobit RAM chips (4164), sharing the RAM between the CPU and the video signal generation (or screen refresh) performed by the ULA itself. Since the four one-bit-wide chips yield only four bits per access, two accesses have to be made to the RAM to get each byte (albeit with a single RAS), delivering a maximum transfer rate to or from RAM of one byte per 2 MHz cycle. In generating the video signal, the ULA is able to take advantage of this 2 MHz bandwidth when producing the picture for the high-bandwidth screen modes. Due to signalling constraints, the CPU can only access RAM at 1 MHz, even when it is not competing with the video system.
When the ULA is consuming all of the RAM bandwidth during the active portion of a display line, the CPU is unable to access the RAM. (The Electron uses the Synertek variant of the 6502 processor as that allowed the clock to be stopped for this 40 microsecond period.) In other modes the CPU and video accesses are interleaved with each accessor acquiring bytes at 1 MHz.
In contrast, the BBC Micro employs one or two sets of eight 16-kilobit devices, with the RAM running at twice the speed (4 MHz) of the CPU (2 MHz), allowing the video system (screen refresh) and CPU memory accesses to be interleaved, with each accessor able to transfer bytes at 2 MHz. The RAM access limitations imposed by the Electron's ULA therefore reduce the effective CPU speed by as much as a factor of four relative to the BBC Micro in the more demanding display modes, and as much as a factor of two otherwise. Byte transfers from ROM occur at 2 MHz, however.
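The relative CPU speeds described above can be captured in a rough model; the 0.5 active-display fraction below is an assumed figure chosen to reproduce the cited worst-case factor of four, not a measured value, and the model ignores ROM execution and per-instruction detail.

```python
# Byte-access rates in MHz, per the description above: the BBC Micro CPU
# always transfers at 2 MHz; the Electron CPU accesses RAM at 1 MHz and,
# in the high-bandwidth modes, is stopped while the ULA draws each line.
BBC_RAM_RATE = 2.0  # MHz, CPU interleaved with video on 4 MHz RAM

def electron_ram_rate(high_bandwidth_mode: bool,
                      active_display_fraction: float = 0.5) -> float:
    """Average RAM access rate (MHz) seen by the Electron CPU.

    active_display_fraction is an assumed illustrative share of time during
    which the ULA claims all RAM bandwidth; 0.5 reproduces the worst case
    cited in the text."""
    if high_bandwidth_mode:
        return 1.0 * (1.0 - active_display_fraction)  # CPU stopped while ULA draws
    return 1.0  # CPU and video interleaved at 1 MHz each

print(BBC_RAM_RATE / electron_ram_rate(True))   # 4.0: up to four times slower
print(BBC_RAM_RATE / electron_ram_rate(False))  # 2.0: twice as slow otherwise
```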
Hardware.
The hardware specification, according to official documentation combined with more technical documentation and analysis, is as follows:
The composite video output provides a greyscale image on the standard machine, but an internal modification allows a colour image to be produced, albeit with a degradation in picture quality. Acorn ostensibly intended the composite output to be a high-quality output for monochrome monitors, with the RGB output being the preferred high-quality output for colour images.
Quirks.
Like the BBC Micro, the Electron was constrained by limited memory resources. Of the 32 KB RAM, 3½ KB was allocated to the OS at startup and at least 10 KB was taken up by the display buffer in contiguous display modes.
On the BBC Micro, programs could use the machine's 6522 chip to trigger interrupts at certain points in the update of each display frame, using these events to change the palette and potentially switch all colours to black, thus blanking regions of the screen and hiding non-graphical data that had been stored in screen memory. The Electron lacked such hardware capabilities as standard. However, it was possible to take advantage of the characteristics of the interrupts that were provided, permitting palette changes after the top 100 lines of each display frame and thus facilitating the blanking of either the top 100 or bottom 156 lines of the display. Many games took advantage of this, gaining storage by leaving non-graphical data in the disabled area.
Other games would simply load non-graphical data into the display and leave it visible as regions of apparently randomly coloured pixels. One notable example is Superior Software's Citadel.
Although page flipping was a hardware possibility, the limited memory forced most applications to do all their drawing directly to the visible screen, often resulting in graphical flicker or visible redraw. A notable exception is Players' "Joe Blade" series.
Tricks.
Firetrack: smooth vertical scrolling.
Although programs can alter the position of the screen in memory, the non-linear format of the display means that vertical scrolling can only be done in blocks of 8 pixels without further work.
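The character-cell layout behind this limitation can be sketched with a BBC-style address mapping; the mode geometry (1 bit per pixel, 320 pixels wide) and the base address below are illustrative assumptions rather than values from this article.

```python
def pixel_address(base: int, x: int, y: int) -> tuple[int, int]:
    """Map pixel (x, y) to (byte address, bit number) in a BBC-style
    1-bit-per-pixel, 320-pixel-wide mode.

    Bytes within a character cell run down its 8 scanlines, and cells run
    left to right, so pixel rows are NOT contiguous in memory: moving the
    screen start address shifts the picture vertically in 8-line steps."""
    bytes_per_char_row = 40 * 8               # 40 cells of 8 bytes each
    addr = (base
            + (y // 8) * bytes_per_char_row   # which character row
            + (x // 8) * 8                    # which cell within the row
            + (y % 8))                        # which scanline within the cell
    return addr, 7 - (x % 8)                  # leftmost pixel in the top bit

# One scanline down is 1 byte away, but one character row down is 320 bytes:
print(pixel_address(0x5800, 0, 1)[0] - pixel_address(0x5800, 0, 0)[0])  # 1
print(pixel_address(0x5800, 0, 8)[0] - pixel_address(0x5800, 0, 0)[0])  # 320
```

Because adjusting the hardware screen start moves the whole mapping by multiples of a character row, smooth per-scanline scrolling needs a trick such as the one "Firetrack" uses below.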
"Firetrack", released on a compilation by Superior Software, exploits a division in the way the Electron handles its display: of the seven available graphics modes, two are configured so that the final two of every ten scanlines are blank and are not based on the contents of RAM. If 16 scanlines of continuous graphical data are written to a character-block-aligned portion of the screen then they will appear as a continuous block in most modes, but in the two non-continuous modes they will be displayed as two blocks of eight scanlines, separated in the middle by two blank scanlines.
In order to keep track of its position within the display, the Electron maintains an internal display address counter. The same counter is used in both the continuous and non-continuous graphics modes and switching modes mid-frame does not cause any adjustment to the counter.
"Firetrack" switches from a non-continuous to a continuous graphics mode part way down the display. By using the palette to mask the top area of the display and taking care about when it changes mode it can shift the continuous graphics at the bottom of the display down in two pixel increments because the internal display counter is not incremented on blank scanlines during non-continuous graphics modes.
Exile: sampled speech.
"Exile" turns the Electron's single-channel sound output into a 1-bit digital speaker for PCM output.
The speaker can be programmatically switched on or off at any time but is permanently attached to a hardware counter, so it is normally only able to output a square wave. However, if the counter is set to a frequency outside the human audible range, the ear cannot perceive the square wave itself, only the difference between the speaker being switched on and off. This gives the effect of a simple toggle speaker similar to that seen in the 48 KB Sinclair ZX Spectrum. "Exile" uses this to output 1-bit audio samples.
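A naive sketch of reducing sampled audio to speaker on/off states by thresholding; real 1-bit playback engines typically use pulse-width or delta techniques, and none of this code is taken from "Exile" itself.

```python
import math

def to_one_bit(samples, threshold=0.0):
    """Convert PCM samples (-1.0..1.0) into speaker on/off states."""
    return [1 if s > threshold else 0 for s in samples]

# A 440 Hz sine sampled at 8 kHz becomes an on/off pattern that, replayed
# by toggling the speaker at the sample rate, approximates the original tone.
rate, freq = 8000, 440
samples = [math.sin(2 * math.pi * freq * n / rate) for n in range(rate // 10)]
bits = to_one_bit(samples)
print(len(bits), bits[:8])
```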
Frak! and Zalaga: Polyphonic music.
As part of their copy protection, illegal copies of Aardvark Software's "Frak!" and "Zalaga" would cause a pseudo-polyphonic rendition of Trumpet Hornpipe, the Captain Pugwash theme tune, to play endlessly rather than loading the game properly (Pugwash being a pirate). On the Electron version of Frak!, the tune was the main theme from "Benny Hill" (Boots Randolph's "Yakety Sax"). The polyphony was achieved via fast note-switching to achieve the necessary chords.
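Pseudo-polyphony on a single channel works by cycling through the notes of a chord faster than the ear can separate them; the sketch below uses illustrative note values and switching rates, not ones taken from the games.

```python
def arpeggiate(chord_hz, switches_per_second, duration_s):
    """Schedule (frequency, start time) pairs for one tone channel, cycling
    through a chord's notes fast enough to suggest several voices at once."""
    n = int(duration_s * switches_per_second)  # number of switch slots
    step = 1.0 / switches_per_second
    return [(chord_hz[i % len(chord_hz)], i * step) for i in range(n)]

# A C major triad cycled 50 times a second for a tenth of a second:
sched = arpeggiate([261.63, 329.63, 392.00], 50, 0.1)
print(sched)
```

At 50 switches per second the ear hears a buzzy chord rather than three distinct sequential notes, which is the basis of the "fast note-switching" described above.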
Software.
A range of titles were made available on cassette at the launch of the Electron through the Acornsoft publishing arm of Acorn, including a number of games, the Forth and Lisp languages, and a handful of other educational and productivity titles. Acorn's decision to provide the Electron with a degree of compatibility with the BBC Micro meant that a number of titles already available for the older machine could be expected to run on its new machine, with only minor cosmetic issues occurring when running some titles. Of the Acornsoft languages, the existing Forth and Lisp language releases worked on the Electron (these being re-released specially for the machine), together with BCPL and Microtext (which remained BBC-only releases). Games such as Chess and Snooker, plus a number of other titles were also established as being compatible prior to launch. Various applications in Acornsoft's View suite, together with the languages Comal, Logo and ISO Pascal, were reported as being compatible with the Electron, as were some titles from BBC Soft and other developers.
Languages.
A significant selling point for the Electron was its built-in BBC BASIC interpreter, providing a degree of familiarity from the BBC Micro along with a level of compatibility with the earlier machine. However, as had been the case with the BBC Micro, support for other languages was quickly forthcoming, facilitated by the common heritage of the two systems.
In addition to the early releases, Forth and Lisp, Acornsoft released the Pascal subset, S-Pascal, on cassette and followed up with an ISO Pascal implementation on ROM cartridge, the latter providing two 16 KB ROMs containing a program editor and a Pascal compiler producing intermediate code that required Pascal run-time routines to be loaded. As a more minimal implementation, S-Pascal made use of the machine's built-in BASIC program editing facilities and provided a compiler generating assembly language that would then be assembled, generating machine code for direct execution. ISO Pascal had Oxford Pascal as a direct competitor offering a range of features differentiating it from Acornsoft's product, notably a compiler that could produce a stand-alone "relocatable 6502 machine-code file". Acornsoft later released the ISO Pascal Stand Alone Generator product for the BBC Micro and Master series, permitting the generation of executable programs embedding "sections of the interpreter" required by each program, with such executables being subject to various licensing restrictions.
Acornsoft Forth, aiming for compliance with the Forth-79 standard, was regarded as "an excellent implementation of the language". It saw competition from Skywave Software's Multi-Forth 83 which was delivered on a ROM chip, supported the Forth-83 standard, and provided a multitasking environment. Future availability of Multi-Forth 83 on ROM cartridge was advertised.
With the launch of the Plus 1, Acornsoft Lisp was also made available on cartridge. This Lisp implementation provided only the "bare essentials" of a Lisp system that "a small micro such as the Electron" could hope to be able to support. However, with the interpreter and initialised workspace being loaded from cassette into RAM in the earlier release, one stated advantage of the ROM version was the availability of more memory for use by programs, with the immediacy of a Lisp system provided as a language ROM being an implicit benefit.
Acornsoft provided two products offering different degrees of support for the Logo programming language. Turtle Graphics was a cassette-based product, available alongside Forth, Lisp and S-Pascal amongst the first titles released for the Electron, featuring a subset of Logo focused on the interactive aspects of the language. Acornsoft Logo was provided on ROM cartridge and offered a vocabulary of over 200 commands as part of a more comprehensive implementation of the language, exposing its list processing foundations. Turtle Graphics was substantially cheaper than Logo: by 1987, the former had been reportedly discounted to under £3 whereas the latter cost "less than £30". Unlike other Acornsoft language products, however, Logo was supplied with "two thick manuals".
Applications.
Acornsoft made a number of applications available for the Electron. In early 1985, the "View" word processor and "ViewSheet" spreadsheet applications, familiar from the BBC Micro, were released on ROM cartridge for use with the Electron expanded with a Plus 1, priced at £49.50 each. By running directly from ROM, these applications were able to dedicate all of the machine's available RAM to their documents, and using general filing system mechanisms, documents could be loaded from and saved to cassette or disc, although disc users could also use commands that took advantage of that faster, random-access medium. Cassette-based operation was still regarded as "perfectly feasible" since the software itself did not need to be loaded, with loading and saving operations in View achieving about 800 words per minute and in ViewSheet achieving around 200 cells per minute.
When using View in Mode 6, providing a 40-column, 25-line display occupying 8 KB of memory, around 20 KB of RAM was available to cassette-based systems or to disc-based systems using products such as the Cumana Floppy Disc System that also maintained PAGE at &E00, this corresponding to about 10 or 11 A4 pages of text. In Mode 3, providing an 80-column, 25-line display occupying 16 KB, around 6 or 7 A4 pages of text could be retained in memory. Acorn's Plus 3 disc system reduced this workspace by a further 4 KB. However, documents could be broken up into sections to be processed individually by View. Operation in the 80-column Mode 0 and Mode 3 was reported as being "sometimes slow" due to the Electron's hardware architecture, but View supported horizontal scrolling across documents, permitting the use of a 40-column mode to edit wider documents.
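The workspace figures above are simple subtraction; a minimal sketch of the arithmetic (the 32 KB total RAM and the further 4 KB Plus 3 overhead come from the text; treating everything below PAGE at &E00, i.e. 3.5 KB, as reserved memory is an assumption of this illustration):

```python
# Sketch of the View document-workspace arithmetic described above.
TOTAL_RAM = 32 * 1024   # Electron RAM
PAGE = 0x0E00           # memory below PAGE (&E00) treated as OS workspace

def view_workspace(screen_bytes: int, plus3: bool = False) -> int:
    """Bytes left for document text when View itself runs from ROM."""
    free = TOTAL_RAM - PAGE - screen_bytes
    if plus3:
        free -= 4 * 1024   # Acorn's Plus 3 disc system reserves a further 4 KB
    return free

mode6 = view_workspace(8 * 1024)    # ~20.5 KB, matching "around 20 KB" in Mode 6
mode3 = view_workspace(16 * 1024)   # ~12.5 KB in Mode 3
```

The Mode 3 result being roughly 60 per cent of the Mode 6 result matches the drop from 10 or 11 A4 pages to 6 or 7.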
ViewSheet could also operate in different display modes, with spreadsheets of approximately 1600 cells being editable in Mode 6 and around 800 cells in Mode 3. A windowing system was provided that permitted ten different views of a spreadsheet to be displayed on screen at once, and recalculation operations were reported to be "around ten seconds for quite a large model". Reviewers considered the View and ViewSheet applications to be "professional" and to "compare well with similar software sold for much more expensive machines" such as the IBM PC, with WordStar being noted as a broadly similar package to View. Compatibility with the same programs on the BBC Micro made a complete Electron-based system an attractive, low-cost, entry-level word processing and spreadsheet system. However, View's printing support was criticised as inadequate without the use of a companion printer driver program.
Acornsoft did not release its "ViewStore" database program specifically for the Electron, but the software was reported as being compatible, albeit with function key combinations different to those documented for the BBC Micro. However, Acornsoft did release a product, "Database", on 3.5-inch diskette for use with the Electron upgraded with the Plus 3 expansion. The product provided a suite of programs for the creation, maintenance and analysis of structured data files, visualising records using a card index user interface metaphor, and supporting sorting and searching operations on the stored data.
Slogger, an established provider of expansions, also produced productivity applications such as "Starword", a word processor, and "Starstore", a database. Starword provided separate command and editing modes familiar from Acornsoft's View, also supporting 132-column documents and horizontal scrolling for the editing of such wider documents. Along with other operations familiar from View, such as search and replace functions, block-based editing, and control over text justification, it had built-in support for customising documents for output using a mail merge function. Available on ROM for fitting to a ROM expansion such as Slogger's Rombox or inside a separately purchased ROM cartridge, and reportedly developed specifically for the Electron, Starword was considered "comprehensive and powerful".
Starstore, also available on ROM, provided a database management suite primarily aimed at users of cassette storage, with databases being entirely resident in RAM. It supported database definition, data editing, searching, sorting and printing activities. Various features complemented Starword, such as mail merge integration. "Starstore II" followed on as an alternative to, as opposed to a direct successor of, the earlier Starstore product by requiring a disc-based system and permitting databases to be as large as the amount of free space on any given disc. Its user interface was improved over the earlier product, offering pop-up menus and cursor-based navigation.
Computer Concepts' "Wordwise Plus", developed from the company's earlier Wordwise product for the BBC Micro and launched in early 1985, was made available for use with the Electron expanded with the E2P-6502 second processor cartridge. The original Wordwise product was incompatible with the Electron due to its use of Mode 7 (the BBC Micro's 40-column Teletext display mode), and being supplied on a ROM chip, it could also not be readily added to the Electron without appropriate expansions. Available from Permanent Memory Systems, producers of the E2P-6502 cartridge, the Electron version of the software was the Hi-Wordwise Plus variant, supplied on disc instead of ROM, and designed to run on the second processor and to use the expanded memory provided in that environment. The program used the Electron's 40-column Mode 6 display.
Expansion manufacturers Advanced Computer Products and Slogger both made solutions available based on products from Advanced Memory Systems. ACP released a bundle of the "AMX Mouse" and "AMX Art" software for use with its Advanced Plus 5 expansion, also requiring a DFS-compatible disc system. Slogger produced a version of the desktop publishing package "Stop Press" for the Electron, requiring a DFS-compatible disc system, two spare ROM sockets, a mouse, and a suitable user port expansion, with Slogger producing its own user port expansion cartridge. Competing with these products but requiring only a disc system, AVP's "Pixel Perfect" offered a rudimentary desktop publishing solution, utilising the computer's high-resolution Mode 0 display.
Games.
Of the twelve software titles announced by Acornsoft for the Electron at the machine's launch, six were games: "Snapper", "Monsters" (a clone of Space Panic), "Meteors" (a clone of Asteroids), "Starship Command", "Chess", and the combined title "Draughts and Reversi". When the Plus 1 expansion was launched in 1984, "Hopper", "Snapper" and "Starship Command" were among the six ROM cartridge titles available at launch, together with the adventure "Countdown to Doom". Acornsoft continued to release games, including titles based on existing arcade games such as "Arcadians" (based on Galaxian) and "Hopper" (based on Frogger), as well as original titles such as "Free Fall" and "Elite".
Micro Power, already an established BBC Micro games publisher, also entered the Electron market at a relatively early stage, offering ten initial titles either converted from the BBC Micro, in the case of "Escape from Moonbase Alpha" and "Killer Gorilla", or "completely re-written", in the case of "Moonraider" (due to differences in the screen handling between the machines). Superior Software, also a significant publisher for the BBC Micro, routinely released games for both machines, notably a licensed version of Atari's Tempest in 1985, but also successful original titles such as the "Repton" series of games, "Citadel", "Thrust" and "Galaforce". Superior's role in games publishing for the Acorn machines expanded in 1986 when the company acquired the right to use the Acornsoft brand, leading to the co-branding of games and compilations released by the company and the re-release of existing Acornsoft titles with this branding, Elite among them. The company would subsequently release another "masterpiece" with bundled novella - the 1988 game "Exile" - as well as numerous conversions and compilations.
By 1988, the "big three" full-price games publishers for the Acorn 8-bit market were identified as Superior Software, Audiogenic (ASL) and Tynesoft, with Top Ten and Alternative Software being the significant budget publishers, and other "strong contenders" being Godax, Mandarin and Bug Byte, this assessment made from the perspective of an established games author evaluating trustworthy publishers for aspiring authors. Commercial considerations motivated authors to make their games available for the Electron due to its importance in sales terms, representing "around half of the Acorn market", with it being regarded as "almost compulsory for any mainstream game" to have an Electron version "unless your game is a state-of-the-art masterpiece", with "Revs", "Cholo" and "Sentinel" cited as such BBC Micro exclusives. Although the Electron imposed additional technical constraints on authors accustomed to the BBC Micro, some authors were able to use this to their creative advantage. For instance, of "Frak!" it was noted that the "Electron version is more popular, and considered better than the BBC version because it has a screen designer included".
Although not as well supported by the biggest software publishers as rivals like the Commodore 64 and Sinclair ZX Spectrum, a good range of games were available for the Electron including popular multi-format games such as "Chuckie Egg". There were also many popular games officially converted to the Electron from arcade machines (including "Crystal Castles", "Tempest", "Commando", "Paperboy" and "Yie Ar Kung-Fu") and other home computer systems (including "Impossible Mission", "Jet Set Willy", "The Way of the Exploding Fist", "Tetris", "The Last Ninja", "Barbarian", "Ballistix", "Predator", "Hostages" and "SimCity").
Despite Acorn themselves effectively shelving the Electron in 1985, games continued to be developed and released by professional software houses until the early 1990s. Around 1,400 games were released for the Acorn Electron, and several thousand more public domain titles were released on disc through public domain libraries. Notable enterprises which produced discs of such software include BBC PD, EUG (Electron User Group) and HeadFirst PD.
Emulation.
Several emulators of the machine exist: ElectrEm for Windows/Linux/macOS, Elkulator for Windows/Linux/DOS, and ElkJS, a browser-based (JavaScript/HTML5) emulator; the multi-system emulators MESS and Clock Signal also support the Electron. Electron software is predominantly archived in the UEF file format.
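A minimal walker over the UEF tape-image format mentioned above can be sketched as follows. This is based on the published UEF layout (an optionally gzipped stream beginning with the magic "UEF File!", a two-byte version, then chunks of 16-bit ID plus 32-bit length, little-endian); treat the details as assumptions rather than a definitive parser:

```python
import gzip
import struct

def read_uef_chunks(path: str):
    """Yield (chunk_id, data) pairs from a UEF tape image.

    Assumes the common UEF layout: optional gzip wrapper, 10-byte magic
    "UEF File!\\x00", 2-byte version, then chunks of little-endian
    16-bit id and 32-bit length followed by the chunk payload.
    """
    with open(path, "rb") as f:
        raw = f.read()
    if raw[:2] == b"\x1f\x8b":            # gzip magic: many UEFs are compressed
        raw = gzip.decompress(raw)
    if raw[:10] != b"UEF File!\x00":
        raise ValueError("not a UEF image")
    pos = 12                               # skip magic (10 bytes) + version (2)
    while pos + 6 <= len(raw):
        chunk_id, length = struct.unpack_from("<HI", raw, pos)
        pos += 6
        yield chunk_id, raw[pos:pos + length]
        pos += length
```

A cataloguing tool would then dispatch on the chunk IDs (tape data, carrier tones, metadata and so on) as defined by the UEF specification.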
There are also two known FPGA-based recreations of the Acorn Electron hardware: ElectronFPGA for the Papilio Duo hardware and the Acorn-Electron core for the FPGA Arcade "Replay" board.
Design team.
The operating system ROM locations 0xFC00-0xFFFF contain the details of some members of the Electron's design team, these differing somewhat from those listed in the corresponding message in the BBC Model B ROM.
Additionally, the last bytes of both the BASIC ROM and the Plus 3 interface's ADFS v1.0 ROM include the word "Roger", thought to be a reference to Roger Wilson.
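Such embedded credits are typically found by scanning a ROM image for runs of printable text. A small sketch of that technique (the file name is hypothetical, and the assumption that the 16 KB OS ROM is mapped at &C000, putting address &FC00 at offset &3C00 in the image, is this illustration's, not the text's):

```python
def printable_runs(data: bytes, min_len: int = 4):
    """Return runs of printable ASCII of at least min_len characters."""
    runs, current = [], []
    for b in data:
        if 32 <= b < 127:
            current.append(chr(b))
        else:
            if len(current) >= min_len:
                runs.append("".join(current))
            current = []
    if len(current) >= min_len:
        runs.append("".join(current))
    return runs

# Hypothetical usage against a 16 KB OS ROM dump:
# with open("electron_os.rom", "rb") as f:
#     rom = f.read()
# print(printable_runs(rom[0x3C00:]))   # bytes from address &FC00 upwards
```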
The case was designed by industrial designer Allen Boothroyd of Cambridge Product Design Ltd.
A Fire Upon the Deep is a 1992 science fiction novel by American writer Vernor Vinge. It is a space opera involving superhuman intelligences, aliens, variable physics, space battles, love, betrayal, genocide, and a communication medium resembling Usenet. "A Fire Upon the Deep" won the Hugo Award in 1993, sharing it with "Doomsday Book" by Connie Willis.
Besides the normal print book editions, the novel was also included on a CD-ROM sold by ClariNet Communications along with the other nominees for the 1993 Hugo awards. The CD-ROM edition included numerous annotations by Vinge on his thoughts and intentions about different parts of the book, and was later released as a standalone e-book.
Setting.
The novel is set in various locations in the Milky Way. The galaxy is divided into four concentric volumes called the "Zones of Thought"; it is not clear to the novel's characters whether this is a natural phenomenon or an artificially produced one, but it seems to roughly correspond with galactic-scale stellar density, and a Beyond region is mentioned in the Sculptor Galaxy as well. The Zones reflect fundamental differences in basic physical laws, and one of the main consequences is their effect on intelligence, both biological and artificial. Artificial intelligence and automation are most directly affected, in that advanced hardware and software from the Beyond or the Transcend will work less and less well as a ship "descends" towards the Unthinking Depths. But even biological intelligence is affected to a lesser degree. The four zones are spoken of in terms of "low" to "high" as follows:
Ravna uses an analogy to explain the relation between the zones.
Plot.
An expedition from Straumli Realm, an ambitious young human civilization in the high Beyond, investigates a five-billion-year-old data archive in the low Transcend that offers the possibility of unimaginable riches. The expedition's facility, High Lab, is gradually compromised by a dormant superintelligence within the archive later known as the Blight. However, shortly before the Blight's final "flowering", two self-aware entities created similarly to the Blight plot to aid the humans before the Blight can escape.
Recognizing the danger of what they have awakened, the researchers at High Lab attempt to flee in two ships, one carrying all the adults and the second carrying all the children in "coldsleep boxes". Suspicious, the Blight discovers that the first ship contains a data storage device in its cargo manifest; assuming it contains information that could harm it, the Blight destroys the ship. The second ship escapes. The Blight assumes that it is no threat, but later realizes that it is actually carrying away a "countermeasure" against it.
The ship lands on a distant planet with a medieval-level civilization of dog-like creatures, dubbed "Tines", who live in packs as group minds. Upon landing, however, the two surviving adults are ambushed and killed by Tine fanatics known as Flenserists, in whose realm they have landed. The Flenserists capture a young boy named Jefri Olsndot and his wounded sister, Johanna. While Jefri is taken deeper into Flenserist territory, Johanna is rescued by a Tine pilgrim who witnessed the ambush and delivers her to a neighboring kingdom ruled by a Tine named Woodcarver. The Flenserists tell Jefri that Johanna has been killed by Woodcarver and exploit him in order to develop advanced technology (such as cannon and radio communication), while Johanna and the knowledge stored in her "dataset" device help Woodcarver rapidly develop in turn.
A distress signal from the sleeper ship eventually reaches "Relay", a major node in the galactic communications network. A benign transcendent entity named "the Old One" contacts Relay, seeking information about the Blight and the humans who released it, and reconstitutes a human man named Pham Nuwen from an old wreck to act as its agent, using his doubt of his own memory's veracity to bend him to the Old One's will. Ravna Bergsndot, the only human Relay employee, traces the sleeper ship's signal to the Tines' world and persuades her employer to investigate what the human ship took from High Lab, contracting the merchant vessel "Out of Band II", owned by two sentient plant "Skroderiders", Blueshell and Greenstalk, to transport them.
Before the mission is launched, the Blight attacks Relay and concurrently kills Old One. As Old One dies, it downloads what information it can into Pham to defeat the Blight, and Pham, Ravna and the Skroderiders barely escape Relay's destruction in the "Out of Band II".
The Blight expands, taking over races and "rewriting" their people to become its agents, murdering several other Powers, and seizing other archives in the Beyond, looking for what was taken. It finally realizes where the danger truly lies and sends a hastily assembled fleet in pursuit of the "Out of Band II".
The humans arrive at the Tines' homeworld and ally with Woodcarver to defeat the Flenserists. Pham initiates Countermeasure, which extends the Slow Zone by thousands of light years, enveloping the Blight at the cost of wrecking thousands of uninvolved civilizations and causing trillions of deaths. The humans are stranded on the Tines world, now in the depths of the "Slow Zone". Activating the countermeasure costs Pham his life, but just before Pham dies, he realizes that, although his body is a reconstruction, his memories are real. Vinge expands on Pham's background story in "A Deepness in the Sky".
Intelligent species.
Aprahanti.
A race of humanoids with colorful butterfly-like wings who attempt to use the chaos wrought by the Blight to reestablish their waning hegemony. Despite their attractive, delicate appearance, the Aprahanti are an extremely fearsome and vicious species.
Blight.
An ancient, malevolent super-intelligent entity which strives to constantly expand and can easily manipulate electronics and even organic beings.
Dirokimes.
An older race which originally inhabited Sjandra Kei before the arrival of humanity.
Humans.
All humans in the novel (except Pham) are descended from Nyjoran stock. Their ancestors were "Tuvo-Norsk" asteroid miners from Old Earth's solar system, which is noted as being on the other side of the galaxy in the Slow Zone. ("Nyjora" sounds similar to New Norwegian "New Earth".) One of the major human habitations is Sjandra Kei, three systems comprising roughly 28 billion individuals. Their main language is Samnorsk, the Norwegian term for a hypothetical unification of the Bokmål and Nynorsk forms of the language. (Vinge indicates in the book's dedication that several key ideas in it came to him while at a conference in Tromsø, Norway.)
Skroders/Riders/Skroderiders.
A race of plantlike beings with fronds that are used for expression. The riders have no native capacity for short-term memory. Five billion years ago, someone gave the species wheeled mechanical constructs ("skrodes") to move around and to provide short-term memory. It is later revealed that their "benefactor" was the Blight, and it is able to corrupt and remotely operate the Riders via their skrodes.
Tines.
A race of group minds: each person is a "pack" of 4–8 doglike members, which communicate within the pack using very short-range ultrasonic waves from drumlike organs called "tympana".
Each "soul" can survive and evolve by adding members to replace those who die, potentially for hundreds of years, as Woodcarver does.
Kalir.
A race of butterfly-like insectoids, authoritarian and warlike, who constitute one of the "majority races" of the Vrinimi organization.
Related works.
Vinge first used the concepts of "Zones of Thought" in a 1988 novella "The Blabber", which occurs after "Fire". Vinge's novel "A Deepness in the Sky" (1999) is a prequel to "A Fire Upon the Deep" set 20,000 years earlier and featuring Pham Nuwen. Vinge's "The Children of the Sky", a near-term sequel to "A Fire Upon the Deep" set ten years later, was released in October 2011.
Vinge's former wife, Joan D. Vinge, has also written stories in the Zones of Thought universe, based on his notes. These include "The Outcasts of Heaven Belt", "Legacy", and (as of 2008) a planned novel featuring Pham Nuwen.
Title.
Vinge's original title for the novel was "Among the Tines"; its final title was suggested by his editors.
Awards and nominations.
"A Fire Upon the Deep" shared the 1993 Hugo Award for Best Novel with "Doomsday Book". The book was nominated for the Nebula Award for Best Novel of 1992, the 1993 John W. Campbell Memorial Award for Best Science Fiction Novel, and the 1993 Locus Award for Best Science Fiction Novel.
Critical reactions.
Jo Walton wrote: "Any one of the ideas in "A Fire Upon the Deep" would have kept an ordinary writer going for years. For me it's the book that does everything right, the example of what science fiction does when it works. ... "A Fire Upon the Deep" remains a favourite and a delight to re-read, absorbing even when I know exactly what’s coming."
Aeronautics is the science or art involved with the study, design, and manufacturing of air flight–capable machines, and the techniques of operating aircraft and rockets within the atmosphere. The British Royal Aeronautical Society identifies the aspects of "aeronautical Art, Science and Engineering" and "The profession of Aeronautics (which expression includes Astronautics)."
While the term originally referred solely to "operating" the aircraft, it has since been expanded to include technology, business, and other aspects related to aircraft.
The term "aviation" is sometimes used interchangeably with aeronautics, although "aeronautics" includes lighter-than-air craft such as airships, and includes ballistic vehicles while "aviation" technically does not.
A significant part of aeronautical science is a branch of dynamics called aerodynamics, which deals with the motion of air and the way that it interacts with objects in motion, such as an aircraft.
History.
Early ideas.
Attempts to fly without any real aeronautical understanding have been made from the earliest times, typically by constructing wings and jumping from a tower with crippling or lethal results.
Wiser investigators sought to gain some rational understanding through the study of bird flight. Medieval Islamic Golden Age scientists such as Abbas ibn Firnas also made such studies. The founders of modern aeronautics, Leonardo da Vinci in the Renaissance and Cayley in 1799, both began their investigations with studies of bird flight.
Man-carrying kites are believed to have been used extensively in ancient China. In 1282 the Italian explorer Marco Polo described the Chinese techniques then current. The Chinese also constructed small hot air balloons, or lanterns, and rotary-wing toys.
An early European to provide any scientific discussion of flight was Roger Bacon, who described principles of operation for the lighter-than-air balloon and the flapping-wing ornithopter, which he envisaged would be constructed in the future. The lifting medium for his balloon would be an "aether" whose composition he did not know.
In the late fifteenth century, Leonardo da Vinci followed up his study of birds with designs for some of the earliest flying machines, including the flapping-wing ornithopter and the rotating-wing helicopter. Although his designs were rational, they were not based on particularly good science. Many of his designs, such as a four-person screw-type helicopter, have severe flaws. He did at least understand that "An object offers as much resistance to the air as the air does to the object." (Newton would not publish the Third law of motion until 1687.) His analysis led to the realisation that manpower alone was not sufficient for sustained flight, and his later designs included a mechanical power source such as a spring. Da Vinci's work was lost after his death and did not reappear until it had been overtaken by the work of George Cayley.
Balloon flight.
The modern era of lighter-than-air flight began early in the 17th century with Galileo's experiments in which he showed that air has weight. Around 1650 Cyrano de Bergerac wrote some fantasy novels in which he described the principle of ascent using a substance (dew) he supposed to be lighter than air, and descending by releasing a controlled amount of the substance. Francesco Lana de Terzi measured the pressure of air at sea level and in 1670 proposed the first scientifically credible lifting medium in the form of hollow metal spheres from which all the air had been pumped out. These would be lighter than the displaced air and able to lift an airship. His proposed methods of controlling height are still in use today: by carrying ballast which may be dropped overboard to gain height, and by venting the lifting containers to lose height. In practice de Terzi's spheres would have collapsed under air pressure, and further developments had to wait for more practicable lifting gases.
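De Terzi's reasoning is Archimedes' principle applied to air: lift equals the mass of displaced air minus the mass of the lifting gas and the structure. A rough sketch with assumed approximate sea-level densities (the numbers are this illustration's, not the text's):

```python
AIR = 1.225        # kg/m^3, approximate sea-level air density (assumed)
HYDROGEN = 0.090   # kg/m^3, approximate density of hydrogen (assumed)

def net_lift_kg(volume_m3: float, gas_density: float,
                structure_kg: float = 0.0) -> float:
    """Mass an envelope can lift: displaced air minus the lifting gas
    inside it and the structure itself (Archimedes' principle)."""
    return volume_m3 * (AIR - gas_density) - structure_kg

vacuum = net_lift_kg(100, 0.0)       # de Terzi's ideal: full displaced mass
h2 = net_lift_kg(100, HYDROGEN)      # hydrogen comes close to the ideal
```

The near-ideal figure for hydrogen illustrates why practical lifting gases superseded the structurally impossible vacuum sphere.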
From the mid-18th century the Montgolfier brothers in France began experimenting with balloons. Their balloons were made of paper, and early experiments using steam as the lifting gas were short-lived due to its effect on the paper as it condensed. Mistaking smoke for a kind of steam, they began filling their balloons with hot smoky air which they called "electric smoke" and, despite not fully understanding the principles at work, made some successful launches and in 1783 were invited to give a demonstration to the French "Académie des Sciences".
Meanwhile, the discovery of hydrogen led Joseph Black to propose its use as a lifting gas, though practical demonstration awaited a gas-tight balloon material. On hearing of the Montgolfier Brothers' invitation, the French Academy member Jacques Charles offered a similar demonstration of a hydrogen balloon. Charles and two craftsmen, the Robert brothers, developed a gas-tight material of rubberised silk for the envelope. The hydrogen gas was to be generated by chemical reaction during the filling process.
The Montgolfier designs had several shortcomings, not least the need for dry weather and a tendency for sparks from the fire to set light to the paper balloon. The manned design had a gallery around the base of the balloon rather than the hanging basket of the first, unmanned design, which brought the paper closer to the fire. On their free flight, De Rozier and d'Arlandes took buckets of water and sponges to douse these fires as they arose. On the other hand, the manned design of Charles was essentially modern. As a result of these exploits, the hot-air balloon became known as the "Montgolfière" type and the hydrogen balloon the "Charlière".
Charles and the Robert brothers' next balloon, "", was a Charlière that followed Jean Baptiste Meusnier's proposals for an elongated dirigible balloon, and was notable for having an outer envelope with the gas contained in a second, inner ballonet. On 19 September 1784, it completed the first flight of over 100 km, between Paris and Beuvry, despite the man-powered propulsive devices proving useless.
In an attempt the next year to provide both endurance and controllability, de Rozier developed a balloon having both hot air and hydrogen gas bags, a design which was soon named after him as the "Rozière." The principle was to use the hydrogen section for constant lift and to navigate vertically by heating and allowing to cool the hot air section, in order to catch the most favourable wind at whatever altitude it was blowing. The balloon envelope was made of goldbeater's skin. The first flight ended in disaster and the approach has seldom been used since.
Cayley and the foundation of modern aeronautics.
Sir George Cayley (1773–1857) is widely acknowledged as the founder of modern aeronautics. He was first called the "father of the aeroplane" in 1846 and Henson called him the "father of aerial navigation." He was the first true scientific aerial investigator to publish his work, which included for the first time the underlying principles and forces of flight.
In 1809 he began the publication of a landmark three-part treatise titled "On Aerial Navigation" (1809–1810). In it he wrote the first scientific statement of the problem, "The whole problem is confined within these limits, viz. to make a surface support a given weight by the application of power to the resistance of air." He identified the four vector forces that influence an aircraft: "thrust", "lift", "drag" and "weight" and distinguished stability and control in his designs.
He developed the modern conventional form of the fixed-wing aeroplane having a stabilising tail with both horizontal and vertical surfaces, flying gliders both unmanned and manned.
He introduced the use of the whirling arm test rig to investigate the aerodynamics of flight, using it to discover the benefits of the curved or cambered aerofoil over the flat wing he had used for his first glider. He also identified and described the importance of dihedral, diagonal bracing and drag reduction, and contributed to the understanding and design of ornithopters and parachutes.
Another significant invention was the tension-spoked wheel, which he devised in order to create a light, strong wheel for aircraft undercarriage.
The 19th century: Otto Lilienthal and the first human flights.
During the 19th century Cayley's ideas were refined, proved and expanded on, culminating in the works of Otto Lilienthal.
Lilienthal was a German engineer and businessman who became known as the "flying man". He was the first person to make well-documented, repeated, successful flights with gliders, making heavier-than-air flight a reality. Newspapers and magazines published photographs of Lilienthal gliding, favourably influencing public and scientific opinion about the possibility of flying machines becoming practical.
His work led him to develop the concept of the modern wing. His flight attempts in Berlin in 1891 are seen as the beginning of human flight, and the "Lilienthal Normalsegelapparat" is considered to be the first airplane in series production, making the "Maschinenfabrik Otto Lilienthal" in Berlin the first airplane production company in the world.
Otto Lilienthal is often referred to as either the "father of aviation" or "father of flight".
Other important investigators included Horatio Phillips.
Branches.
Aeronautics may be divided into three main branches, Aviation, Aeronautical science and Aeronautical engineering.
Aviation.
Aviation is the art or practice of aeronautics. Historically aviation meant only heavier-than-air flight, but nowadays it includes flying in balloons and airships.
Aeronautical engineering.
Aeronautical engineering covers the design and construction of aircraft, including how they are powered, how they are used and how they are controlled for safe operation.
A major part of aeronautical engineering is aerodynamics, the science of movement through the air.
With the increasing activity in space flight, nowadays aeronautics and astronautics are often combined as aerospace engineering.
Aerodynamics.
The science of aerodynamics deals with the motion of air and the way that it interacts with objects in motion, such as an aircraft.
The study of aerodynamics falls broadly into three areas:
"Incompressible flow" occurs where the air simply moves to avoid objects, typically at subsonic speeds below that of sound (Mach 1).
"Compressible flow" occurs where shock waves appear at points where the air becomes compressed, typically at speeds above Mach 1.
"Transonic flow" occurs in the intermediate speed range around Mach 1, where the airflow over an object may be locally subsonic at one point and locally supersonic at another.
Rocketry.
A rocket or rocket vehicle is a missile, spacecraft, aircraft or other vehicle which obtains thrust from a rocket engine. In all rockets, the exhaust is formed entirely from propellants carried within the rocket before use. Rocket engines work by action and reaction. Rocket engines push rockets forwards simply by throwing their exhaust backwards extremely fast.
Rockets for military and recreational uses date back to at least 13th-century China. Significant scientific, interplanetary and industrial use did not occur until the 20th century, when rocketry was the enabling technology of the Space Age, including setting foot on the moon.
Rockets are used for fireworks, weaponry, ejection seats, launch vehicles for artificial satellites, human spaceflight and exploration of other planets. While comparatively inefficient for low speed use, they are very lightweight and powerful, capable of generating large accelerations and of attaining extremely high speeds with reasonable efficiency.
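How a vehicle carrying all its propellant can attain such high speeds is quantified by the ideal (Tsiolkovsky) rocket equation, a standard result not stated in the text. The figures below are illustrative assumptions, not data from this article.

```python
import math

def delta_v(exhaust_velocity: float, initial_mass: float, final_mass: float) -> float:
    """Ideal rocket equation: delta-v = v_e * ln(m0 / mf).
    Ignores gravity losses and atmospheric drag."""
    return exhaust_velocity * math.log(initial_mass / final_mass)

# Assumed figures: ~4.4 km/s exhaust velocity (typical of efficient
# chemical engines) and a 10:1 ratio of fuelled mass to empty mass.
dv = delta_v(4400.0, 10_000.0, 1_000.0)
print(f"{dv:.0f} m/s")  # ~10131 m/s, above Earth's orbital speed
```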
Chemical rockets are the most common type of rocket and they typically create their exhaust by the combustion of rocket propellant. Chemical rockets store a large amount of energy in an easily released form, and can be very dangerous. However, careful design, testing, construction and use minimize the risks.
|
2083 | Auguste and Louis Lumière | The Lumière brothers, Auguste Marie Louis Nicolas Lumière (19 October 1862 – 10 April 1954) and Louis Jean Lumière (5 October 1864 – 6 June 1948), were French manufacturers of photography equipment, best known for their "Cinématographe" motion picture system and the short films they produced between 1895 and 1905, which place them among the earliest filmmakers.
Their screening of a single film on 22 March 1895 for around 200 members of the Society for the Development of the National Industry in Paris was probably the first presentation of projected film. Their first commercial public screening on 28 December 1895 for around 40 paying visitors and invited relations has traditionally been regarded as the birth of cinema. The techniques and business models of earlier filmmakers proved less viable than the breakthrough presentations of the Lumières.
History.
The Lumière brothers were born in Besançon, France, to Charles-Antoine Lumière (1840–1911) and Jeanne Joséphine Costille Lumière, who were married in 1861 and moved to Besançon, setting up a small photographic portrait studio. Here were born Auguste, Louis and their daughter Jeanne. They moved to Lyon in 1870, where their two other daughters were born: Mélina and Francine. Auguste and Louis both attended La Martinière, the largest technical school in Lyon.
They patented several significant processes leading up to their film camera, most notably film perforations (originally implemented by Émile Reynaud) as a means of advancing the film through the camera and projector. The original Cinématographe had been patented by Léon Guillaume Bouly on 12 February 1892. The Cinématographe — a three-in-one device that could record, develop, and project motion pictures — was further developed by the Lumières. The brothers patented their own version on 13 February 1895.
The date of the recording of their first film is in dispute. In an interview with Georges Sadoul given in 1948, Louis claimed that he shot the film in August 1894, before the arrival of the kinetoscope in France. This is questioned by historians, who consider that a functional Lumière camera did not exist before the beginning of 1895.
The Lumière brothers saw film as a novelty and had withdrawn from the film business by 1905. They went on to develop the first practical photographic colour process, the Lumière Autochrome.
Louis died on 6 June 1948 and Auguste on 10 April 1954. They are buried in a family tomb in the New Guillotière Cemetery in Lyon.
First film screenings.
On 22 March 1895 in Paris, at the Society for the Development of the National Industry, in front of a small audience, one of whom was said to be Léon Gaumont, then director of a photography company, the Lumières privately screened a single film, "Workers Leaving the Lumière Factory". The main focus of Louis's conference concerned the recent developments in the photographic industry, mainly the research on polychromy (colour photography). Much to Lumière's surprise, the moving black-and-white images attracted more attention than the coloured stills.
The Lumières gave their first paid public screening on 28 December 1895, at the Salon Indien du Grand Café in Paris. This presentation consisted of ten short films.
Each film, when hand cranked through a projector, ran for approximately 50 seconds.
The Lumières went on tour with the Cinématographe in 1896, visiting cities including Brussels, Bombay, London, Montreal, New York City, and Buenos Aires.
In 1896, only a few months after the initial screenings in Europe, films by the Lumière brothers were shown in Egypt, first in the Tousson stock exchange in Alexandria on 5 November 1896 and then in the Hamam Schneider (Schneider Bath) in Cairo.
Early colour photography.
The brothers stated that "the cinema is an invention without any future" and declined to sell their camera to other filmmakers such as Georges Méliès, which upset many filmmakers. Consequently, their role in the history of film was exceedingly brief. In parallel with their cinema work they experimented with colour photography. They worked on colour photographic processes in the 1890s, including the Lippmann process (interference heliochromy) and their own "bichromated glue" process, a subtractive colour process, examples of which were exhibited at the Exposition Universelle in Paris in 1900. This last process was commercialised by the Lumières, but commercial success had to wait for their next colour process. In 1903 they patented a colour photographic process, the "Autochrome Lumière", which was launched on the market in 1907. Throughout much of the 20th century, the Lumière company was a major producer of photographic products in Europe, but the brand name, Lumière, disappeared from the marketplace following its merger with Ilford.
Film systems that preceded the Cinématographe Lumière.
Earlier moving images, for instance those of the phantasmagoria shows, the phénakisticope, the zoetrope and Émile Reynaud's Théâtre Optique consisted of hand-drawn images. A system that could record photographic reality in motion, in a fashion much like it is seen by the eyes, had a greater impact on people.
Eadweard Muybridge's Zoopraxiscope projected moving painted silhouettes based on his chronophotographic work. The only Zoopraxiscope disc with actual photographs was made as an early form of stop motion.
Less-known predecessors, such as Jules Duboscq's Bioscope (patented in 1852) were not developed to project the moving images.
Le Prince went missing in 1890, before he could give public demonstrations of the patented cameras and projectors he had been developing during the previous years. His short film known as "Roundhay Garden Scene" (1888) has since come to be regarded as the oldest surviving film.
William Friese-Greene patented a "machine camera" in 1889, which embodied many aspects of later film cameras. He displayed the results at photographic societies in 1890 and developed further cameras but did not publicly project the results.
Ottomar Anschütz's Electrotachyscope projected very short loops of high photographic quality.
Thomas Edison's Kinetoscope (developed by William Kennedy Dickson) premiered publicly in 1894. Edison believed that projecting films was a less viable business model than offering them in the "peepshow" Kinetoscope device, but audiences turned out to much prefer watching the images on a screen.
Kazimierz Prószyński allegedly built his camera and projecting device, called Pleograph, in 1894.
Lauste and Latham's Eidoloscope was demonstrated for members of the press on 21 April 1895, and opened to the paying public on Broadway on 20 May. They shot films up to twenty minutes long at speeds over thirty frames per second and showed them in many US cities. The Eidoloscope Company was dissolved in 1896 after various internal disputes.
Max and Emil Skladanowsky, inventors of the Bioscop, had offered projected moving images to a paying public in Berlin from 1 November 1895 until the end of the month. Their machinery was relatively cumbersome and their films much shorter than those of the Lumière brothers. The Skladanowskys' booked screenings in Paris were cancelled after the news of the Lumière show. Nonetheless, they toured their films to other countries.
|
2084 | Acts of the Apostles | The Acts of the Apostles ("Práxeis Apostólōn") is the fifth book of the New Testament; it tells of the founding of the Christian Church and the spread of its message to the Roman Empire.
Acts and the Gospel of Luke make up a two-part work, Luke–Acts, by the same anonymous author. It is usually dated to around 80–90 AD, although some scholars suggest 110–120 AD. The first part, the Gospel of Luke, tells how God fulfilled his plan for the world's salvation through the life, death, and resurrection of Jesus of Nazareth. Acts continues the story of Christianity in the 1st century, beginning with the ascension of Jesus to Heaven. The early chapters, set in Jerusalem, describe the Day of Pentecost (the coming of the Holy Spirit), the expulsion of Christians from Jerusalem and the establishment of the church at Antioch. The later chapters narrate the continuation of the message under Paul the Apostle and conclude with his imprisonment in Rome, where he awaits trial.
Luke–Acts is an attempt to answer a theological problem, namely how the Messiah of the Jews came to have an overwhelmingly non-Jewish church; the answer it provides is that the message of Christ was sent to the Gentiles because the Jews as a whole rejected it. Luke–Acts can also be seen as a defense of the Jesus movement addressed to the Jews: the bulk of the speeches and sermons in Acts are addressed to Jewish audiences, with the Romans serving as external arbiters on disputes concerning Jewish customs and law. On the one hand, Luke portrays the followers of Jesus as a sect of the Jews, and therefore entitled to legal protection as a recognised religion; on the other, Luke seems unclear as to the future that God intends for Jews and Christians, celebrating the Jewishness of Jesus and his immediate followers, while also stressing how the Jews had rejected the Messiah.
Composition and setting.
Title, unity of Luke – Acts, authorship and date.
The name "Acts of the Apostles" was first used by Irenaeus in the late 2nd century. It is not known whether this was an existing name for the book or one invented by Irenaeus; it does seem clear that it was not given by the author, as the word "práxeis" (deeds, acts) only appears once in the text (Acts 19:18) and there it refers not to the apostles but to deeds confessed by their followers.
The Gospel of Luke and Acts make up a two-volume work which scholars call Luke–Acts. Together they account for 27.5% of the New Testament, the largest contribution attributed to a single author, providing the framework for both the Church's liturgical calendar and the historical outline into which later generations have fitted their idea of the story of Jesus and the early church. The author is not named in either volume. According to Church tradition dating from the 2nd century, the author was Luke, named as a companion of the apostle Paul in three of the letters attributed to Paul himself; this view is still sometimes advanced, but "a critical consensus emphasizes the countless contradictions between the account in Acts and the authentic Pauline letters." (An example can be seen by comparing Acts's accounts of Paul's conversion (Acts 9:1–31, 22:6–21, and 26:9–23) with Paul's own statement that he remained unknown to Christians in Judea after that event (Galatians 1:17–24).) The author "is an admirer of Paul, but does not share Paul's own view of himself as an apostle; his own theology is considerably different from Paul's on key points and does not represent Paul's own views accurately." He was educated, a man of means, probably urban, and someone who respected manual work, although not a worker himself; this is significant, because more high-brow writers of the time looked down on the artisans and small business people who made up the early church of Paul and were presumably Luke's audience.
The earliest possible date for Luke-Acts is around 62 AD, the time of Paul's imprisonment in Rome, but most scholars date the work to 80–90 AD on the grounds that it uses Mark as a source, looks back on the destruction of Jerusalem, and does not show any awareness of the letters of Paul (which began circulating late in the first century); if it does show awareness of the Pauline epistles, and also of the work of the Jewish historian Josephus, as some believe, then a date in the early 2nd century is possible.
Manuscripts.
There are two major textual variants of Acts, the Western text-type and the Alexandrian. The oldest complete Alexandrian manuscripts date from the 4th century and the oldest Western ones from the 6th, with fragments and citations going back to the 3rd. Western texts of Acts are 6.2–8.4% longer than Alexandrian texts, the additions tending to enhance the Jewish rejection of the Messiah and the role of the Holy Spirit, in ways that are stylistically different from the rest of Acts. The majority of scholars prefer the Alexandrian (shorter) text-type over the Western as the more authentic, but this same argument would favour the Western over the Alexandrian for the Gospel of Luke, as in that case the Western version is the shorter.
Genre, sources and historicity of Acts.
The title "Acts of the Apostles" ("Praxeis Apostolon") would seem to identify it with the genre telling of the deeds and achievements of great men ("praxeis"), but it was not the title given by the author. The anonymous author aligned Luke–Acts to the "narratives" (διήγησις, "diēgēsis") which many others had written, and described his own work as an "orderly account" (ἀκριβῶς καθεξῆς). It lacks exact analogies in Hellenistic or Jewish literature. The author may have taken as his model the works of Dionysius of Halicarnassus, who wrote a well-known history of Rome, or the Jewish historian Josephus, author of a history of the Jews. Like them, he anchors his history by dating the birth of the founder (Romulus for Dionysius, Moses for Josephus, Jesus for Luke) and like them he tells how the founder is born from God, taught authoritatively, and appeared to witnesses after death before ascending to heaven. By and large the sources for Acts can only be guessed at, but the author would have had access to the Septuagint (a Greek translation of the Jewish scriptures), the Gospel of Mark, and either the hypothetical collection of "sayings of Jesus" called the Q source or the Gospel of Matthew. He transposed a few incidents from Mark's gospel to the time of the Apostles—for example, the material about "clean" and "unclean" foods in Mark 7 is used in Acts 10, and Mark's account of the accusation that Jesus has attacked the Temple (Mark 14:58) is used in a story about Stephen (Acts 6:14). There are also points of contacts (meaning suggestive parallels but something less than clear evidence) with 1 Peter, the Letter to the Hebrews, and 1 Clement. Other sources can only be inferred from internal evidence—the traditional explanation of the three "we" passages, for example, is that they represent eyewitness accounts. The search for such inferred sources was popular in the 19th century, but by the mid-20th it had largely been abandoned.
Acts was read as a reliable history of the early church well into the post-Reformation era, but by the 17th century biblical scholars began to notice that it was incomplete and tendentious—its picture of a harmonious church is quite at odds with that given by Paul's letters, and it omits important events such as the deaths of both Peter and Paul. The mid-19th-century scholar Ferdinand Baur suggested that the author had re-written history to present a united Peter and Paul and advance a single orthodoxy against the Marcionites (Marcion was a 2nd-century heretic who wished to cut Christianity off entirely from the Jews); Baur continues to have enormous influence, but today there is less interest in determining the historical accuracy of Acts (although this has never died out) than in understanding the author's theological program.
Audience and authorial intent.
Luke was written to be read aloud to a group of Jesus-followers gathered in a house to share the Lord's supper. The author assumes an educated Greek-speaking audience, but directs his attention to specifically Christian concerns rather than to the Greco-Roman world at large. He begins his gospel with a preface addressed to Theophilus, informing him of his intention to provide an "ordered account" of events which will lead his reader to "certainty". He did not write in order to provide Theophilus with historical justification—"did it happen?"—but to encourage faith—"what happened, and what does it all mean?"
Acts (or Luke–Acts) is intended as a work of "edification," meaning "the empirical demonstration that virtue is superior to vice." The work also engages with the question of a Christian's proper relationship with the Roman Empire, the civil power of the day: could a Christian obey God and also Caesar? The answer is ambiguous. The Romans never move against Jesus or his followers unless provoked by the Jews, in the trial scenes the Christian missionaries are always cleared of charges of violating Roman laws, and Acts ends with Paul in Rome proclaiming the Christian message under Roman protection; at the same time, Luke makes clear that the Romans, like all earthly rulers, receive their authority from Satan, while Christ is ruler of the kingdom of God.
Structure and content.
Structure.
Acts has two key structural principles. The first is the geographic movement from Jerusalem, centre of God's Covenantal people, the Jews, to Rome, centre of the Gentile world. This structure reaches back to the author's preceding work, the Gospel of Luke, and is signaled by parallel scenes such as Paul's utterance in Acts 19:21, which echoes Jesus's words in Luke 9:51: Paul has Rome as his destination, as Jesus had Jerusalem. The second key element is the roles of Peter and Paul, the first representing the Jewish Christian church, the second the mission to the Gentiles.
Content.
The Gospel of Luke began with a prologue addressed to Theophilus; Acts likewise opens with an address to Theophilus and refers to "my earlier book", almost certainly the gospel.
The apostles and other followers of Jesus meet and elect Matthias to replace Judas Iscariot as a member of The Twelve. On Pentecost, the Holy Spirit descends and confers God's power on them, and Peter and John preach to many in Jerusalem, performing healings, casting out evil spirits, and raising the dead. The first believers share all property in common, eat in each other's homes, and worship together. At first many Jews follow Christ and are baptized, but the followers of Jesus begin to be increasingly persecuted by other Jews. Stephen is accused of blasphemy and stoned. Stephen's death marks a major turning point: the Jews have rejected the message, and henceforth it will be taken to the Gentiles.
The death of Stephen initiates persecution, and many followers of Jesus leave Jerusalem. The message is taken to the Samaritans, a people rejected by Jews, and to the Gentiles. Saul of Tarsus, one of the Jews who persecuted the followers of Jesus, is converted by a vision to become a follower of Christ (an event which Luke regards as so important that he relates it three times). Peter, directed by a series of visions, preaches to Cornelius the Centurion, a Gentile God-fearer, who becomes a follower of Christ. The Holy Spirit descends on Cornelius and his guests, thus confirming that the message of eternal life in Christ is for all mankind. The Gentile church is established in Antioch (north-western Syria, the third-largest city of the empire), and here Christ's followers are first called Christians.
The mission to the Gentiles is promoted from Antioch and confirmed at a meeting in Jerusalem between Paul and the leadership of the Jerusalem church. Paul spends the next few years traveling through western Asia Minor and the Aegean, preaching, converting, and founding new churches. On a visit to Jerusalem he is set on by a Jewish mob. Saved by the Roman commander, he is accused by the Jews of being a revolutionary, the "ringleader of the sect of the Nazarenes", and imprisoned. Later, Paul asserts his right as a Roman citizen to be tried in Rome and is sent by sea to Rome, where he spends another two years under house arrest, proclaiming the Kingdom of God and teaching freely about "the Lord Jesus Christ". Acts ends abruptly without recording the outcome of Paul's legal troubles.
Theology.
Prior to the 1950s, Luke–Acts was seen as a historical work, written to defend Christianity before the Romans or Paul against his detractors; since then the tendency has been to see the work as primarily theological. Luke's theology is expressed primarily through his overarching plot, the way scenes, themes and characters combine to construct his specific worldview. His "salvation history" stretches from the Creation to the present time of his readers, in three ages: first, the time of "the Law and the Prophets" (Luke 16:16), the period beginning with Genesis and ending with the appearance of John the Baptist (Luke 1:5–3:1); second, the epoch of Jesus, in which the Kingdom of God was preached (Luke 3:2–24:51); and finally the period of the Church, which began when the risen Christ was taken into Heaven, and would end with his second coming.
Luke–Acts is an attempt to answer a theological problem, namely how the Messiah, promised to the Jews, came to have an overwhelmingly non-Jewish church; the answer it provides, and its central theme, is that the message of Christ was sent to the Gentiles because the Jews rejected it. This theme is introduced in Chapter 4 of the Gospel of Luke, when Jesus, rejected in Nazareth, recalls that the prophets were rejected by Israel and accepted by Gentiles; at the end of the gospel he commands his disciples to preach his message to all nations, "beginning from Jerusalem." He repeats the command in Acts, telling them to preach "in Jerusalem, in all Judea and Samaria, and to the end of the Earth." They then proceed to do so, in the order outlined: first Jerusalem, then Judea and Samaria, then the entire (Roman) world.
For Luke, the Holy Spirit is the driving force behind the spread of the Christian message, and he places more emphasis on it than do any of the other evangelists. The Spirit is "poured out" at Pentecost on the first Samaritan and Gentile believers and on disciples who had been baptised only by John the Baptist, each time as a sign of God's approval. The Holy Spirit represents God's power (at his ascension, Jesus tells his followers, "You shall receive power when the Holy Spirit has come upon you"): through it the disciples are given speech to convert thousands in Jerusalem, forming the first church (the term is used for the first time in Acts 5).
One issue debated by scholars is Luke's political vision regarding the relationship between the early church and the Roman Empire. On the one hand, Luke generally does not portray this interaction as one of direct conflict. Rather, there are ways in which each may have considered having a relationship with the other rather advantageous to its own cause. For example, early Christians may have appreciated hearing about the protection Paul received from Roman officials against Gentile rioters in Philippi (Acts 16:16–40) and Ephesus (Acts 19:23–41), and against Jewish rioters on two occasions (Acts 17:1–17; Acts 18:12–17). Meanwhile, Roman readers may have approved of Paul's censure of the illegal practice of magic (Acts 19:17–19) as well as the amicability of his rapport with Roman officials such as Sergius Paulus (Acts 13:6–12) and Festus (Acts 26:30–32). Furthermore, Acts does not include any account of a struggle between Christians and the Roman government as a result of the latter's imperial cult. Thus Paul is depicted as a moderating presence between the church and the Roman Empire.
On the other hand, events such as the imprisonment of Paul at the hands of the empire (Acts 22–28) as well as several encounters that reflect negatively on Roman officials (for instance, Felix's desire for a bribe from Paul in Acts 24:26) function as concrete points of conflict between Rome and the early church. Perhaps the most significant point of tension between Roman imperial ideology and Luke's political vision is reflected in Peter's speech to the Roman centurion, Cornelius (Acts 10:36). Peter states that "this one" [οὗτος], i.e. Jesus, "is lord [κύριος] of all." The title, κύριος, was often ascribed to the Roman emperor in antiquity, rendering its use by Luke as an appellation for Jesus an unsubtle challenge to the emperor's authority.
Comparison with other writings.
Gospel of Luke.
As the second part of the two-part work Luke–Acts, Acts has significant links to the Gospel of Luke. Major turning points in the structure of Acts, for example, find parallels in Luke: the presentation of the child Jesus in the Temple parallels the opening of Acts in the Temple, Jesus's forty days of testing in the wilderness prior to his mission parallel the forty days prior to his Ascension in Acts, the mission of Jesus in Samaria and the Decapolis (the lands of the Samaritans and Gentiles) parallels the missions of the Apostles in Samaria and the Gentile lands, and so on (see Gospel of Luke). These parallels continue through both books. There are also differences between Luke and Acts, amounting at times to outright contradiction. For example, the gospel seems to place the Ascension on Easter Sunday, shortly after the Resurrection, while Acts 1 puts it forty days later. There are similar conflicts over the theology, and while not seriously questioning the single authorship of Luke–Acts, these differences do suggest the need for caution in seeking too much consistency in books written in essence as popular literature.
Pauline epistles.
Acts agrees with Paul's letters on the major outline of Paul's career: he is converted and becomes a Christian missionary and apostle, establishing new churches in Asia Minor and the Aegean and struggling to free Gentile Christians from the Jewish Law. There are also agreements on many incidents, such as Paul's escape from Damascus, where he is lowered down the walls in a basket. But details of these same incidents are frequently contradictory: for example, according to Paul it was a pagan king who was trying to arrest him in Damascus, but according to Luke it was the Jews (2 Corinthians 11:33 and Acts 9:24). Acts speaks of "Christians" and "disciples", but Paul never uses either term, and it is striking that Acts never brings Paul into conflict with the Jerusalem church and places Paul under the authority of the Jerusalem church and its leaders, especially James and Peter (Acts 15 vs. Galatians 2). Acts omits much from the letters, notably Paul's problems with his congregations (internal difficulties are said to be the fault of the Jews instead), and his apparent final rejection by the church leaders in Jerusalem (Acts has Paul and Barnabas deliver an offering that is accepted, a trip that has no mention in the letters). There are also major differences between Acts and Paul on Christology (the understanding of Christ's nature), eschatology (the understanding of the "last things"), and apostleship.
|
2085 | Assyria | Assyria (Neo-Assyrian cuneiform "māt Aššur") was a major ancient Mesopotamian civilization which existed as a city-state from the 21st century BC to the 14th century BC, then as a territorial state, and eventually as an empire from the 14th century BC to the 7th century BC.
Spanning from the early Bronze Age to the late Iron Age, modern historians typically divide ancient Assyrian history into the Early Assyrian ( 2600–2025 BC), Old Assyrian ( 2025–1364 BC), Middle Assyrian ( 1363–912 BC), Neo-Assyrian (911–609 BC) and post-imperial (609 BC– AD 240) periods, based on political events and gradual changes in language. Assur, the first Assyrian capital, was founded 2600 BC but there is no evidence that the city was independent until the collapse of the Third Dynasty of Ur in the 21st century BC, when a line of independent kings beginning with Puzur-Ashur I began ruling the city. Centered in the Assyrian heartland in northern Mesopotamia, Assyrian power fluctuated over time. The city underwent several periods of foreign rule or domination before Assyria rose under Ashur-uballit I in the early 14th century BC as the Middle Assyrian Empire. In the Middle and Neo-Assyrian periods Assyria was one of the two major Mesopotamian kingdoms, alongside Babylonia in the south, and at times became the dominant power in the ancient Near East. Assyria was at its strongest in the Neo-Assyrian period, when the Assyrian army was the strongest military power in the world and the Assyrians ruled the largest empire then yet assembled in world history, spanning from parts of modern-day Iran in the east to Egypt in the west.
The Neo-Assyrian Empire fell in the late 7th century BC, conquered by a coalition of the Babylonians, who had lived under Assyrian rule for about a century, and the Medes. Though the core urban territory of Assyria was extensively devastated in the Medo-Babylonian conquest of the Assyrian Empire and the succeeding Neo-Babylonian Empire invested few resources in rebuilding it, ancient Assyrian culture and traditions continued to survive for centuries throughout the post-imperial period. Assyria experienced a recovery under the Seleucid and Parthian empires, though declined again under the Sasanian Empire, which sacked numerous cities and semi-independent Assyrian territories in the region, including Assur itself. The remaining Assyrian people, who have survived in northern Mesopotamia to modern times, were gradually Christianized from the 1st century AD onward. Ancient Mesopotamian religion persisted at Assur until its final sack in the 3rd century AD, and at certain other holdouts for centuries thereafter.
The success of ancient Assyria did not derive solely from its energetic warrior-kings, but also from its ability to efficiently incorporate and govern conquered lands through innovative and sophisticated administrative systems. Innovations in warfare and administration pioneered in ancient Assyria were used under later empires and states for millennia thereafter. Ancient Assyria also left a legacy of great cultural significance, particularly through the Neo-Assyrian Empire making a prominent impression in later Assyrian, Greco-Roman and Hebrew literary and religious tradition.
Nomenclature.
In the Old Assyrian period, when Assyria was merely a city-state centered around the city of Assur, the state was typically referred to as "ālu Aššur" ("city of Ashur"). From the time of its rise as a territorial state in the 14th century BC and onward, Assyria was referred to in official documentation as "māt Aššur" ("land of Ashur"), marking the shift to being a regional polity. The term "māt Aššur" is first attested as being used in the reign of Ashur-uballit I (c. 1363–1328 BC), the first king of the Middle Assyrian Empire. Both "ālu Aššur" and "māt Aššur" derive from the Assyrian national deity Ashur. Ashur probably originated in the Early Assyrian period as a deified personification of Assur itself. In the Old Assyrian period the deity was considered the formal king of Assur, with the actual rulers only using the style "Išši'ak" ("governor"). From the time of Assyria's rise as a territorial state, Ashur began to be regarded as an embodiment of the entire land ruled by the Assyrian kings.
The modern name "Assyria" is of Greek origin, derived from Ασσυρία ("Assuría"). The term is first attested in the time of the ancient Greek historian Herodotus (5th century BC). The Greeks designated the Levant as "Syria" and Mesopotamia as "Assyria", even though the local population at the time, and well into the later Christian period, used both terms interchangeably for the entire region. Whether the Greeks began referring to Mesopotamia as "Assyria" because they equated the region with the Assyrian Empire, long fallen by the time the term is first attested, or because they named the region after the people who lived there (the Assyrians) is not known. Because the term is so similar to "Syria", the question of whether the two are connected has been examined by scholars since the 17th century. Since the shortened form "Syria" is attested in sources predating the Greek ones as a synonym for Assyria, notably in Luwian and Aramaic texts from the time of the Neo-Assyrian Empire, modern scholars overwhelmingly agree that the names are connected.
Both "Assyria" and the contracted "Syria" are ultimately derived from the Akkadian "Aššur". The numerous later empires that ruled over Assyria after the fall of the Neo-Assyrian Empire used their own names for the region, many of which were also derived from "Aššur". The Achaemenid Empire referred to Assyria as "Aθūrā" ("Athura"). The Sasanian Empire confusingly referred to Lower Mesopotamia as Asoristan ("land of the Assyrians"), though the northern province of Nōdšīragān, which included much of the old Assyrian heartland, was also sometimes called "Atūria" or "Āthōr". In Syriac, Assyria was and is referred to as "ʾĀthor".
History.
Early history.
Agricultural villages in the region that would later become Assyria are known to have existed by the time of the Hassuna culture, c. 6300–5800 BC. Though the sites of some nearby cities that would later be incorporated into the Assyrian heartland, such as Nineveh, are known to have been inhabited since the Neolithic, the earliest archaeological evidence from Assur dates to the Early Dynastic Period, c. 2600 BC. During this time, the surrounding region was already relatively urbanized. There is no evidence that early Assur was an independent settlement, and it might not have been called Assur at all initially, but rather Baltil or Baltila, used in later times to refer to the city's oldest portion. The name "Assur" is first attested for the site in documents of the Akkadian period in the 24th century BC. Through most of the Early Assyrian period (c. 2600–2025 BC), Assur was dominated by states and polities from southern Mesopotamia. Early on, Assur for a time fell under the loose hegemony of the Sumerian city of Kish and it was later occupied by both the Akkadian Empire and then the Third Dynasty of Ur. In c. 2025 BC, due to the collapse of the Third Dynasty of Ur, Assur became an independent city-state under Puzur-Ashur I.
Under the Puzur-Ashur dynasty, Assur was home to fewer than 10,000 people and likely held very limited military power; no military institutions at all are known from this time and no political influence was exerted on neighboring cities. The city was still influential in other ways; under Erishum I (c. 1974–1934 BC), Assur experimented with free trade, the earliest known such experiment in world history, which left the initiative for trade and large-scale foreign transactions entirely to the populace rather than the state. Royal encouragement of trade led to Assur quickly establishing itself as a prominent trading city in northern Mesopotamia and soon thereafter establishing an extensive long-distance trade network, the first notable impression Assyria left in the historical record. Among the evidence left from this trade network are large collections of Old Assyrian cuneiform tablets from Assyrian trade colonies, the most notable of which is a set of 22,000 clay tablets found at Kültepe, near the modern city of Kayseri in Turkey. As trade declined, perhaps due to increased warfare and conflict between the growing states of the Near East, Assur was frequently threatened by larger foreign states and kingdoms. The original Assur city-state, and the Puzur-Ashur dynasty, came to an end c. 1808 BC when the city was conquered by the Amorite ruler of Ekallatum, Shamshi-Adad I. Shamshi-Adad's extensive conquests in northern Mesopotamia eventually made him the ruler of the entire region, founding what some scholars have termed the "Kingdom of Upper Mesopotamia". The survival of this realm relied chiefly on Shamshi-Adad's own strength and charisma and it thus collapsed shortly after his death c. 1776 BC.
After Shamshi-Adad's death, the political situation in northern Mesopotamia was highly volatile, with Assur at times coming under the brief control of Eshnunna, Elam and the Old Babylonian Empire. At some point, the city returned to being an independent city-state, though the politics of Assur itself were volatile as well, with fighting between members of Shamshi-Adad's dynasty, native Assyrians and Hurrians for control. The infighting came to an end after the rise of Bel-bani as king c. 1700 BC. Bel-bani founded the Adaside dynasty, which after his reign ruled Assyria for about a thousand years. Assyria's rise as a territorial state in later times was in large part facilitated by two separate invasions of Mesopotamia by the Hittites. An invasion by the Hittite king Mursili I in c. 1595 BC destroyed the dominant Old Babylonian Empire, allowing the smaller kingdoms of Mitanni and Kassite Babylonia to rise in the north and south, respectively. Around 1430 BC, Assur was subjugated by Mitanni, an arrangement that lasted for about 70 years, until c. 1360 BC. Another Hittite invasion by Šuppiluliuma I in the 14th century BC effectively crippled the Mitanni kingdom. After his invasion, Assyria succeeded in freeing itself from its suzerain, achieving independence once more under Ashur-uballit I (c. 1363–1328 BC), whose rise to power, independence, and conquests of neighboring territory traditionally mark the rise of the Middle Assyrian Empire (c. 1363–912 BC).
Assyrian Empire.
Ashur-uballit I was the first native Assyrian ruler to claim the royal title "šar" ("king"). Shortly after achieving independence, he further claimed the dignity of a great king on the level of the Egyptian pharaohs and the Hittite kings. Assyria's rise was intertwined with the decline and fall of the Mitanni kingdom, its former suzerain, which allowed the early Middle Assyrian kings to expand and consolidate territories in northern Mesopotamia. Under the warrior-kings Adad-nirari I (c. 1305–1274 BC), Shalmaneser I (c. 1273–1244 BC) and Tukulti-Ninurta I (c. 1243–1207 BC), Assyria began to realize its aspirations of becoming a significant regional power. These kings campaigned in all directions and incorporated a significant amount of territory into the growing Assyrian Empire. Under Shalmaneser I, the last remnants of the Mitanni kingdom were formally annexed into Assyria. The most successful of the Middle Assyrian kings was Tukulti-Ninurta I, who brought the Middle Assyrian Empire to its greatest extent. His most notable military achievements were his victory at the Battle of Nihriya c. 1237 BC, which marked the beginning of the end of Hittite influence in northern Mesopotamia, and his temporary conquest of Babylonia, which became an Assyrian vassal c. 1225–1216 BC. Tukulti-Ninurta was also the first Assyrian king to try to move the capital away from Assur, inaugurating the new city Kar-Tukulti-Ninurta as capital c. 1233 BC. The capital was returned to Assur after his death.
Tukulti-Ninurta I's assassination c. 1207 BC was followed by inter-dynastic conflict and a significant drop in Assyrian power. Tukulti-Ninurta I's successors were unable to maintain Assyrian power and Assyria became increasingly restricted to just the Assyrian heartland, a period of decline broadly coinciding with the Late Bronze Age collapse. Though some kings in this period of decline, such as Ashur-dan I (c. 1178–1133 BC), Ashur-resh-ishi I (1132–1115 BC) and Tiglath-Pileser I (1114–1076 BC), worked to reverse the decline and made significant conquests, their conquests were ephemeral and were quickly lost again. From the time of Eriba-Adad II (1056–1054 BC) onward, Assyrian decline intensified. The Assyrian heartland remained safe since it was protected by its geographical remoteness. Since Assyria was not the only state to undergo decline during these centuries, and the lands surrounding the Assyrian heartland were also significantly fragmented, it would ultimately be relatively easy for the reinvigorated Assyrian army to reconquer large parts of the empire. Under Ashur-dan II (934–912 BC), who campaigned in the northeast and northwest, Assyrian decline was at last reversed, paving the way for grander efforts under his successors. The end of his reign conventionally marks the beginning of the Neo-Assyrian Empire (911–609 BC).
Through decades of conquests, the early Neo-Assyrian kings worked to retake the lands of the Middle Assyrian Empire. Since this "reconquista" had to begin nearly from scratch, its eventual success was an extraordinary achievement. Under Ashurnasirpal II (883–859 BC), the Neo-Assyrian Empire became the dominant political power in the Near East. In his ninth campaign, Ashurnasirpal II marched to the coast of the Mediterranean Sea, collecting tribute from various kingdoms on the way. A significant development during Ashurnasirpal II's reign was the second attempt to transfer the Assyrian capital away from Assur. Ashurnasirpal restored the ancient and ruined town of Nimrud, also located in the Assyrian heartland, and in 879 BC designated that city as the new capital of the empire. Though no longer the political capital, Assur remained the ceremonial and religious center of Assyria. Ashurnasirpal II's son Shalmaneser III (859–824 BC) also went on wide-ranging wars of conquest, expanding the empire in all directions. After Shalmaneser III's death, the Neo-Assyrian Empire entered into a period of stagnation dubbed the "age of the magnates", when powerful officials and generals were the principal wielders of political power rather than the king. This time of stagnation came to an end with the rise of Tiglath-Pileser III (745–727 BC), who reduced the power of the magnates, consolidated and centralized the holdings of the empire, and through his military campaigns and conquests more than doubled the extent of Assyrian territory. The most significant conquests were the vassalization of the Levant all the way to the Egyptian border and the 729 BC conquest of Babylonia.
The Neo-Assyrian Empire reached the height of its extent and power under the Sargonid dynasty, founded by Sargon II (722–705 BC). Under Sargon II and his son Sennacherib (705–681 BC), the empire was further expanded and the gains were consolidated. Both kings founded new capitals; Sargon II moved the capital to the new city of Dur-Sharrukin in 706 BC and the year after, Sennacherib transferred the capital to Nineveh, which he ambitiously expanded and renovated, and might even have built the hanging gardens there, one of the seven wonders of the ancient world. The 671 BC conquest of Egypt under Esarhaddon (681–669 BC) brought Assyria to its greatest ever extent. After the death of Ashurbanipal (669–631 BC), the Neo-Assyrian Empire swiftly collapsed. One of the primary reasons was the inability of the Neo-Assyrian kings to resolve the "Babylonian problem"; despite many attempts to appease Babylonia in the south, revolts were frequent all throughout the Sargonid period. The revolt of Babylon under Nabopolassar in 626 BC, in combination with an invasion by the Medes under Cyaxares in 615/614 BC, led to the Medo-Babylonian conquest of the Assyrian Empire. Assur was sacked in 614 BC and Nineveh fell in 612 BC. The last Assyrian ruler, Ashur-uballit II, tried to rally the Assyrian army at Harran in the west but he was defeated in 609 BC, marking the end of the ancient line of Assyrian kings and of Assyria as a state.
Later history.
Despite the violent downfall of the Assyrian Empire, Assyrian culture continued to survive through the subsequent post-imperial period (609 BC – AD 240) and beyond. The Assyrian heartland experienced a dramatic decrease in the size and number of inhabited settlements during the rule of the Neo-Babylonian Empire founded by Nabopolassar; the former Assyrian capital cities Assur, Nimrud and Nineveh were nearly completely abandoned. Throughout the time of the Neo-Babylonian and later Achaemenid Empire, Assyria remained a marginal and sparsely populated region. Toward the end of the 6th century BC, the Assyrian dialect of the Akkadian language went extinct, having already been largely replaced by Aramaic as a vernacular language toward the end of the Neo-Assyrian Empire. Under the empires succeeding the Neo-Babylonians, from the late 6th century BC onward, Assyria began to experience a recovery. Under the Achaemenids, most of the territory was organized into the province Athura ("Aθūrā"). The organization into a single large province, the lack of interference by the Achaemenid rulers in local affairs, and the return of the cult statue of Ashur to Assur soon after the Achaemenids conquered Babylon facilitated the survival of Assyrian culture. Under the Seleucid Empire, which controlled Mesopotamia from the late 4th to mid-2nd century BC, Assyrian sites such as Assur, Nimrud and Nineveh were resettled and a large number of villages were rebuilt and expanded.
After the Parthian Empire conquered the region in the 2nd century BC, the recovery of Assyria continued, culminating in an unprecedented return to prosperity and revival in the 1st to 3rd centuries AD. The region was resettled and restored so intensely that the population and settlement density reached heights not seen since the Neo-Assyrian Empire. The region was under the Parthians primarily ruled by a group of vassal kingdoms, including Osroene, Adiabene and Hatra. Though in some aspects influenced by Assyrian culture, these states were for the most part not ruled by Assyrian rulers. Assur itself flourished under Parthian rule. From around or shortly after the end of the 2nd century BC, the city may have become the capital of its own small semi-autonomous Assyrian realm, either under the suzerainty of Hatra, or under direct Parthian suzerainty. On account of the resemblance between the stelae erected by the local rulers and those of the ancient Assyrian kings, they may have seen themselves as the restorers and continuators of the old royal line. The ancient Ashur temple was restored in the 2nd century AD. This last cultural golden age came to an end with the sack of Assur by the Sasanian Empire c. AD 240. During the sack, the Ashur temple was destroyed again and the city's population was dispersed.
From the 1st century AD onward, many of the Assyrians became Christianized, though holdouts of the old ancient Mesopotamian religion continued to survive for centuries. Despite the loss of political power, the Assyrians continued to constitute a significant portion of the population in northern Mesopotamia until religiously motivated suppression and massacres under the Ilkhanate and the Timurid Empire in the 14th century, which relegated them to a local ethnic and religious minority. The Assyrians lived largely in peace under the rule of the Ottoman Empire, which gained control of Assyria in the 16th century. In the late 19th and early 20th century, when the Ottomans grew increasingly nationalistic, further persecutions and massacres were enacted against the Assyrians, most notably the "Sayfo" (Assyrian genocide), which resulted in the deaths of as many as 250,000 Assyrians. Throughout the 20th century, the Assyrians made many unsuccessful proposals for autonomy or independence. Further massacres and persecutions, enacted both by governments and by terrorist groups such as the Islamic State, have resulted in most of the Assyrian people living in diaspora.
Government and military.
Kingship.
In the Assur city-state of the Old Assyrian period, the government was in many respects an oligarchy, where the king was a permanent, albeit not the only prominent, actor. The Old Assyrian kings were not autocrats, with sole power, but rather acted as stewards on behalf of the god Ashur and presided over the meetings of the city assembly, the main Assyrian administrative body during this time. The composition of the city assembly is not known, but it is generally believed to have been made up of members of the most powerful families of the city, many of whom were merchants. The king acted as the main executive officer and chairman of this group of influential individuals and also contributed legal knowledge and expertise. The Old Assyrian kings were styled as "iššiak Aššur" ("governor [on behalf] of Ashur"), with Ashur being considered the city's formal king. That the populace of Assur in the Old Assyrian period often referred to the king as "rubā’um" ("great one") clearly indicates that the kings, despite their limited executive power, were seen as royal figures and as being "primus inter pares" (first among equals) among the powerful individuals of the city.
Assur first experienced a more autocratic form of kingship under the Amorite conqueror Shamshi-Adad I, the earliest ruler of Assur to use the style "šarrum" ("king") and the title "king of the Universe". Shamshi-Adad I appears to have based his more absolute form of kingship on the rulers of the Old Babylonian Empire. Under Shamshi-Adad I, Assyrians also swore their oaths by the king, not just by the god. This practice did not survive beyond his death. The influence of the city assembly had disappeared by the beginning of the Middle Assyrian period. Though the traditional "iššiak Aššur" continued to be used at times, the Middle Assyrian kings were autocrats, in terms of power having little in common with the rulers of the Old Assyrian period. As the Assyrian Empire grew, the kings began to employ an increasingly sophisticated array of royal titles. Ashur-uballit I was the first to assume the style "šar māt Aššur" ("king of the land of Ashur") and his grandson Arik-den-ili (c. 1317–1306 BC) introduced the style "šarru dannu" ("strong king"). Adad-nirari I's inscriptions required 32 lines to be devoted just to his titles. This development peaked under Tukulti-Ninurta I, who assumed, among other titles, the styles "king of Assyria and Karduniash", "king of Sumer and Akkad", "king of the Upper and the Lower Seas" and "king of all peoples". Royal titles and epithets were often highly reflective of current political developments and the achievements of individual kings; during periods of decline, the royal titles used typically grew simpler again, only to grow grander once more as Assyrian power experienced resurgences.
The kings of the Middle and Neo-Assyrian periods continued to present themselves, and be viewed by their subjects, as the intermediaries between Ashur and mankind. This position and role was used to justify imperial expansion: the Assyrians saw their empire as being the part of the world overseen and administered by Ashur through his human agents. In their ideology, the outer realm outside of Assyria was characterized by chaos and the people there were uncivilized, with unfamiliar cultural practices and strange languages. The mere existence of the "outer realm" was regarded as a threat to the cosmic order within Assyria and as such, it was the king's duty to expand the realm of Ashur and incorporate these strange lands, converting chaos to civilization. Texts describing the coronation of Middle and Neo-Assyrian kings at times include Ashur commanding the king to "broaden the land of Ashur" or "extend the land at his feet". As such, expansion was cast as a moral and necessary duty. Because the rule and actions of the Assyrian king were seen as divinely sanctioned, resistance to Assyrian sovereignty in times of war was regarded as resistance against divine will, which deserved punishment. Peoples and polities who revolted against Assyria were seen as criminals against the divine world order. Since Ashur was the king of the gods, all other gods were subjected to him and thus the people who followed those gods should be subjected to the representative of Ashur, the Assyrian king.
The kings also had religious and judicial duties. Kings were responsible for performing various rituals in support of the cult of Ashur and the Assyrian priesthood. They were expected, together with the Assyrian people, to provide offerings to not only Ashur but also all the other gods. From the time of Ashur-resh-ishi I onward, the religious and cultic duties of the king were pushed somewhat into the background, though they were still prominently mentioned in accounts of building and restoring temples. Assyrian titles and epithets in inscriptions from then on generally emphasized the kings as powerful warriors. Developing from their role in the Old Assyrian period, the Middle and Neo-Assyrian kings were the supreme judicial authority in the empire, though they generally appear to have been less concerned with their role as judges than their predecessors in the Old Assyrian period were. The kings were expected to ensure the welfare and prosperity of Assyria and its people, indicated by multiple inscriptions referring to the kings as "shepherds" ("re’û").
Capital cities.
No word for the idea of a capital city existed in Akkadian, the nearest being the idea of a "city of kingship", i.e. an administrative center used by the king, but there are several examples of kingdoms having multiple "cities of kingship". Due to Assyria growing out of the Assur city-state of the Old Assyrian period, and due to the city's religious importance, Assur was the administrative center of Assyria through most of its history. Though the royal administration at times moved elsewhere, the ideological status of Assur was never fully superseded and it remained a ceremonial center in the empire even when it was governed from elsewhere. The transfer of the royal seat of power to other cities was ideologically possible since the king was Ashur's representative on Earth. The king, like the deity, embodied Assyria itself, and so the capital of Assyria was in a sense wherever the king happened to have his residence.
The first transfer of administrative power away from Assur occurred under Tukulti-Ninurta I, who inaugurated Kar-Tukulti-Ninurta as capital c. 1233 BC. Tukulti-Ninurta I's foundation of a new capital was perhaps inspired by developments in Babylonia in the south, where the Kassite dynasty had transferred the administration from the long-established city of Babylon to the newly constructed city of Dur-Kurigalzu, also named after a king. It seems that Tukulti-Ninurta I intended to go further than the Kassites and also establish Kar-Tukulti-Ninurta as the new Assyrian cult center. The city was, however, not maintained as capital after Tukulti-Ninurta I's death, with subsequent kings once more ruling from Assur.
The Neo-Assyrian Empire had several successive capitals. There is some evidence that Tukulti-Ninurta II (890–884 BC), perhaps inspired by his predecessor of the same name, made unfulfilled plans to transfer the capital to a city called Nemid Tukulti-Ninurta, either a completely new city or a new name applied to Nineveh, which by this point already rivalled Assur in scale and political importance. The capital was transferred under Tukulti-Ninurta II's son Ashurnasirpal II to Nimrud in 879 BC. An architectural detail separating Nimrud and the other Neo-Assyrian capitals from Assur is that they were designed in a way that emphasized royal power: the royal palaces in Assur were smaller than the temples but the situation was reversed in the new capitals. Sargon II transferred the capital in 706 BC to the city Dur-Sharrukin, which he built himself. Since the location of Dur-Sharrukin had no obvious practical or political merit, this move was probably an ideological statement. Immediately after Sargon II's death in 705 BC, his son Sennacherib transferred the capital to Nineveh, a far more natural seat of power. Though it was not meant as a permanent royal residence, Ashur-uballit II chose Harran as his seat of power after the fall of Nineveh in 612 BC. Harran is typically seen as the short-lived final Assyrian capital. No building projects were conducted during this time, but Harran had been long-established as a major religious center, dedicated to the god Sîn.
Aristocracy and elite.
Because of the nature of source preservation, more information about the upper classes of ancient Assyria survives than about the lower ones. At the top of Middle and Neo-Assyrian society were members of long-established and large families called "houses". Members of this aristocracy tended to occupy the most important offices within the government and they were likely descendants of the most prominent families of the Old Assyrian period. One of the most influential offices in the Assyrian administration was the position of vizier ("sukkallu"). From at least the time of Shalmaneser I onward, there were grand viziers ("sukkallu rabi’u"), superior to the ordinary viziers, who at times governed their own lands as appointees of the kings. At least in the Middle Assyrian period, the grand viziers were typically members of the royal family and the position was at this time, as were many other offices, hereditary.
The elite of the Neo-Assyrian Empire was expanded and included several different offices. The Neo-Assyrian inner elite is typically divided by modern scholars into the "magnates", a set of high-ranking offices, and the "scholars" ("ummânī"), tasked with advising and guiding the kings through interpreting omens. The magnates included the offices "masennu" (treasurer), "nāgir ekalli" (palace herald), "rab šāqê" (chief cupbearer), "rab ša-rēši" (chief officer/eunuch), "sartinnu" (chief judge), "sukkallu" (grand vizier) and "turtanu" (commander-in-chief), which at times continued to be occupied by royal family members. Some of the magnates also acted as governors of important provinces and all of them were deeply involved with the Assyrian military, controlling significant forces. They also owned large tax-free estates, scattered throughout the empire. In the late Neo-Assyrian Empire, there was a growing disconnect between the traditional Assyrian elite and the kings due to eunuchs growing unprecedentedly powerful. The highest offices both in the civil administration and the army began to be occupied by eunuchs with deliberately obscure and lowly origins since this ensured that they would be loyal to the king. Eunuchs were trusted since they were believed to be unable to have any dynastic aspirations of their own.
From the time of Erishum I in the early Old Assyrian period onward, a yearly office-holder, a "limmu" official, was elected from the influential men of Assyria. The "limmu" official gave their name to the year, meaning that their name appeared in all administrative documents signed that year. Kings were typically the "limmu" officials in their first regnal years. In the Old Assyrian period, the "limmu" officials also held substantial executive power, though this aspect of the office had disappeared by the time of the rise of the Middle Assyrian Empire.
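As an illustration of how this eponym dating worked in practice, a document naming its "limmu" official can be assigned an absolute year by looking that name up in a reconstructed eponym list; the real list's astronomical anchor is the eponym year of Bur-Sagale, fixed to 763 BC by a recorded solar eclipse. The sketch below is purely illustrative (the second entry is a hypothetical name, not a genuine eponym):

```python
# Illustrative sketch of eponym ("limmu") dating: each Assyrian year was
# named after that year's limmu official, so a tablet is dated by looking
# its eponym up in a reconstructed eponym list.
eponym_list = {
    "Bur-Sagale": "763 BC",      # real anchor: fixed by a solar eclipse
    "Example-eponym": "762 BC",  # hypothetical entry for illustration
}

def year_of(eponym: str) -> str:
    """Resolve a document's year from the eponym it was signed under."""
    return eponym_list.get(eponym, "not in reconstructed list")

print(year_of("Bur-Sagale"))  # → 763 BC
```

Gaps in the surviving lists translate directly into undatable documents, which is why overlapping copies of the eponym canon matter so much for Assyrian chronology.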
Administration.
The success of Assyria was not only due to energetic kings who expanded its borders but more importantly due to its ability to efficiently incorporate and govern conquered lands. From the rise of Assyria as a territorial state at the beginning of the Middle Assyrian period onward, Assyrian territory was divided into a set of provinces or districts ("pāḫutu"). The total number and size of these provinces varied and changed as Assyria expanded and contracted. Every province was headed by a provincial governor ("bel pāḫete", "bēl pīhāti" or "šaknu") who was responsible for handling local order, public safety and economy. Governors also stored and distributed the goods produced in their province, which were inspected and collected by royal representatives once a year. Through these inspections, the central government could keep track of current stocks and production throughout the country. Governors had to pay taxes and offer gifts to the god Ashur, though such gifts were usually small and mainly symbolic. The channeling of taxes and gifts was not only a method of collecting profit but also served to connect the elite of the entire empire to the Assyrian heartland. In the Neo-Assyrian period, an extensive hierarchy within the provincial administration is attested. At the bottom of this hierarchy were lower officials, such as village managers ("rab ālāni") who oversaw one or more villages, collecting taxes in the form of labor and goods and keeping the administration informed of the conditions of their settlements, and corvée officers ("ša bēt-kūdini") who kept tallies on the labor performed by forced laborers and the remaining time owed. Individual cities had their own administrations, headed by mayors ("ḫazi’ānu"), responsible for the local economy and production.
Some regions of the Assyrian Empire were not incorporated into the provincial system but were still subjected to the rule of the Assyrian kings. Such vassal states could be ruled indirectly through allowing established local lines of kings to continue ruling in exchange for tribute or through the Assyrian kings appointing their own vassal rulers. Through the "ilku" system, the Assyrian kings could also grant arable lands to individuals in exchange for goods and military service.
To overcome the challenges of governing a large empire, the Neo-Assyrian Empire developed a sophisticated state communication system, which included various innovative techniques and relay stations. Per estimates by Karen Radner, an official message sent in the Neo-Assyrian period from the western border province Quwê to the Assyrian heartland, a distance of 700 kilometers (430 miles) over a stretch of lands featuring many rivers without any bridges, could take less than five days to arrive. Such communication speed was unprecedented before the rise of the Neo-Assyrian Empire and was not surpassed in the Middle East until the telegraph was introduced by the Ottoman Empire in 1865, nearly two and a half thousand years after the Neo-Assyrian Empire's fall.
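Radner's figure implies a remarkable sustained pace, which can be sanity-checked with simple arithmetic using only the numbers quoted above (the five-day figure being an upper bound):

```python
# Minimum average pace implied by the quoted figures: a relayed message
# covering 700 km from Quwê to the Assyrian heartland in under five days.
distance_km = 700
max_days = 5

min_avg_pace = distance_km / max_days  # km per day, sustained
print(min_avg_pace)  # → 140.0
```

A sustained 140 km per day, across unbridged rivers, is far beyond what a single rider could manage, which is why the relay-station system (fresh mounts and couriers at each stage) was the decisive innovation.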
Military.
The Assyrian army was throughout its history mostly composed of levies, mobilized only when needed (such as during campaigns). Through regulations, obligations and sophisticated government systems, large numbers of soldiers could be recruited and mobilized even in the early Middle Assyrian period. A small central standing army unit was established in the Neo-Assyrian Empire, dubbed the "kiṣir šarri" ("king's unit"). Some professional (though not standing) troops are also attested in the Middle Assyrian period, dubbed "ḫurādu" or "ṣābū ḫurādātu", though what their role was is not clear due to the scarcity of sources. Perhaps this category included archers and charioteers, who needed more extensive training than normal foot soldiers.
The Assyrian army developed and evolved over time. In the Middle Assyrian period, foot soldiers were divided into the "ṣābū ša kakkē" ("weapon troops") and the "ṣābū ša arâtē" ("shield-bearing troops"), but surviving records are not detailed enough to determine what the differences were. It is possible that the "ṣābū ša kakkē" included ranged troops, such as slingers ("ṣābū ša ušpe") and archers ("ṣābū ša qalte"). The chariots in the army formed a unit of their own. Based on surviving depictions, chariots were crewed by two soldiers: an archer who commanded the chariot ("māru damqu") and a driver ("ša mugerre"). Chariots first entered extensive military use under Tiglath-Pileser I in the 12th–11th centuries BC and were in the later Neo-Assyrian period gradually phased out in favor of cavalry ("ša petḫalle"). In the Middle Assyrian period, cavalry was mainly used for escort duty and message delivery.
Under the Neo-Assyrian Empire, important new developments in the military were the large-scale introduction of cavalry, the adoption of iron for armor and weapons, and the development of new and innovative siege warfare techniques. At the height of the Neo-Assyrian Empire, the Assyrian army was the strongest army yet assembled in world history. The number of soldiers in the Neo-Assyrian army was likely several hundred thousand. The Neo-Assyrian army was subdivided into "kiṣru", composed of perhaps 1,000 soldiers, most of whom would have been infantry soldiers ("zūk", "zukkû" or "raksūte"). The infantry was divided into three types: light, medium and heavy, with varying weapons, levels of armor and responsibilities. While on campaign, the Assyrian army made heavy use of both interpreters/translators ("targumannu") and guides ("rādi kibsi"), both probably drawn from foreigners resettled in Assyria.
Population and society.
Population and social standing.
Populace.
The majority of the population of ancient Assyria were farmers who worked land owned by their families. Old Assyrian society was divided into two main groups: slaves ("subrum") and free citizens, referred to as "awīlum" ("men") or "Aššur" ("sons of Ashur"). Among the free citizens there was also a division into "rabi" ("big") and "ṣaher" ("small") members of the city assembly. Assyrian society grew more complex and hierarchical over time. In the Middle Assyrian Empire, there were several groups among the lower classes, the highest of which were the free men ("aʾīlū"), who like the upper classes could receive land in exchange for performing duties for the government, but who could not live off these lands since they were comparatively small. Below the free men were the unfree men ("šiluḫlu"). The unfree men had given up their freedom and entered the service of others of their own accord, and were in turn provided with clothes and rations. Many of them probably originated as foreigners. Though similar to slavery, it was possible for an unfree person to regain their freedom by providing a replacement, and during their service they were considered the property of the government rather than of their employers. Other lower classes of the Middle Assyrian period included the "ālāyû" ("village residents"), "ālik ilke" (people recruited through the "ilku" system) and the "hupšu", though what these designations meant in terms of social standing and living standards is not known.
The Middle Assyrian structure of society by and large endured through the subsequent Neo-Assyrian period. Below the higher classes of Neo-Assyrian society were free citizens, semi-free laborers and slaves. It was possible through steady service to the Assyrian state bureaucracy for a family to move up the social ladder; in some cases stellar work conducted by a single individual enhanced the status of their family for generations to come. In many cases, Assyrian family groups, or "clans", formed large population groups within the empire referred to as tribes. Such tribes lived together in villages and other settlements near or adjacent to their agricultural lands.
Slavery was an intrinsic part of nearly every society in the ancient Near East. There were two main types of slaves in ancient Assyria: chattel slaves, primarily foreigners who were kidnapped or who were spoils of war, and debt slaves, formerly free men and women who had been unable to pay off their debts. In some cases, Assyrian children were seized by authorities due to the debts of their parents and sold off into slavery when their parents were unable to pay. Children born to slave women automatically became slaves themselves, unless some other arrangement had been agreed to. Though Old Babylonian texts frequently mention the geographical and ethnic origin of slaves, there is only a single known such reference in Old Assyrian texts (whereas there are many describing slaves in a general sense), a slave girl explicitly being referred to as Subaraean, indicating that ethnicity was not seen as very important in terms of slavery. The surviving evidence suggests that the number of slaves in Assyria never reached a large share of the population. In the Akkadian language, several terms were used for slaves, commonly "wardum", though this term could confusingly also be used for (free) official servants, retainers and followers, soldiers and subjects of the king. Because many individuals designated as "wardum" in Assyrian texts are described as handling property and carrying out administrative tasks on behalf of their masters, many may have in actuality been free servants and not slaves in the common meaning of the term. A number of "wardum" are however also recorded as being bought and sold.
Status of women.
The main evidence concerning the lives of ordinary women in ancient Assyria is in administrative documents and law codes. There was no legal distinction between men and women in the Old Assyrian period and they had more or less the same rights in society. Since several letters written by women are known from the Old Assyrian period, it is evident that women were free to learn how to read and write. Both men and women paid the same fines, could inherit property, participated in trade, bought, owned, and sold houses and slaves, made their own last wills, and were allowed to divorce their partners. Records of Old Assyrian marriages confirm that the dowry of the bride belonged to her, not the husband, and it was inherited by her children after her death. Although they were legally equal, men and women in the Old Assyrian period were raised and socialized differently and had different social expectations and obligations. Typically, girls were raised by their mothers and taught to spin, weave, and help with daily tasks, while boys were taught trades by masters, later often following their fathers on trade expeditions. Sometimes the eldest daughter of a family was consecrated as a priestess. She was not allowed to marry and became economically independent.
Wives were expected to provide their husbands with garments and food. Although marriages were typically monogamous, a husband was allowed to buy a female slave in order to produce an heir if his wife was infertile. The wife was allowed to choose that slave, and the slave never gained the status of a second wife. Husbands who were away on long trading journeys were allowed to take a second wife in one of the trading colonies, although with strict rules that had to be followed: the second wife was not allowed to accompany him back to Assur, and both wives had to be provided with a home to live in, food, and wood.
The status of women decreased in the Middle Assyrian period, as can be gathered from the provisions concerning them in the Middle Assyrian Laws. Among these laws were punishments for various crimes, often sexual or marital ones. Although they did not deprive women of all their rights, and were not significantly different from other ancient Near Eastern laws of their time, the Middle Assyrian Laws effectively made women second-class citizens. However, it is not clear how strongly these laws were enforced. These laws gave men the right to punish their wives as they wished. Among the harshest punishments written into these laws, for a crime not even committed by the woman, was that a raped woman would be forcibly married to her rapist. These laws also specified that certain women were obliged to wear veils while out on the street, with marital status being the determining factor. Some women, such as slave women and "ḫarīmtū" women, were prohibited from wearing veils, and others, such as certain priestesses, were only allowed to wear veils if they were married.
Not all laws were suppressive toward women; women whose husbands died or were taken prisoner in war, and who did not have any sons or relatives to support them, were guaranteed support from the government. The "ḫarīmtū" women have historically been believed to have been prostitutes, but are today interpreted as women with an independent social existence, i.e. not tied to a husband, father, or institution. Although most "ḫarīmtū" appear to have been poor, there were noteworthy exceptions. The term appears with negative connotations in several texts. Their mere existence makes it clear that it was possible for women to live independent lives, despite their lesser social standing during that period.
During the Neo-Assyrian period that followed, royal and upper-class women experienced increased influence. Women attached to the Neo-Assyrian royal court sent and received letters, were independently wealthy, and could buy and own lands of their own. The queens of the Neo-Assyrian Empire are better attested historically than the queens of preceding periods. Under the Sargonid dynasty, they were granted their own military units, which are known to have sometimes taken part in military campaigns alongside other units.
Among the most influential women of the Neo-Assyrian period was Shammuramat, queen of Shamshi-Adad V (824–811 BC), who in the reign of her son Adad-nirari III (811–783 BC) might have served as regent and participated in military campaigns. Another was Naqi'a, who influenced politics in the reigns of Sennacherib, Esarhaddon, and Ashurbanipal.
Economy.
In the Old Assyrian period, a major portion of Assur's population was involved in the city's international trade. As can be gathered from hiring contracts and other records, the trade involved people of many different occupations, including porters, guides, donkey drivers, agents, traders, bakers and bankers. Because of the extensive cuneiform records known from the period, details of the trade are relatively well-known. It has been estimated that just in the period 1950–1836 BC, twenty-five tons of Anatolian silver was transported to Assur, and that approximately one hundred tons of tin and 100,000 textiles were transported to Anatolia in return. The Assyrians also sold livestock, processed goods and reed products. In many cases, the materials sold by Assyrian colonists came from far-away places; the textiles sold by Assyrians in Anatolia were imported from southern Mesopotamia and the tin came from the east in the Zagros Mountains.
After international trade declined in the 19th century BC, the Assyrian economy became increasingly oriented toward the state. In the Neo-Assyrian period, the wealth generated through private investments was dwarfed by the wealth of the state, which was by far the largest employer in the empire and had a monopoly on agriculture, manufacturing and the exploitation of minerals. The imperial economy mainly advantaged the elite, since it was structured so that surplus wealth flowed to the government and was then used for the maintenance of the state throughout the empire. Though the most important means of production were owned by the state, a vibrant private economic sector also continued to exist within the empire, with the property rights of individuals ensured by the government.
Personal identity and continuity.
Ethnicity and culture are largely based in self-perception and self-designation. A distinct Assyrian identity seems to have formed already in the Old Assyrian period, when distinctly Assyrian burial practices, foods and dress codes are attested and Assyrian documents appear to consider the inhabitants of Assur to be a distinct cultural group. A wider Assyrian identity appears to have spread across northern Mesopotamia under the Middle Assyrian Empire, since later writings concerning the reconquests of the early Neo-Assyrian kings refer to some of their wars as liberating the Assyrian people of the cities they reconquered.
Surviving evidence suggests that the ancient Assyrians had a relatively open definition of what it meant to be Assyrian. Modern ideas such as a person's ethnic background, or the Roman idea of legal citizenship, do not appear to have been reflected in ancient Assyria. Although Assyrian accounts and artwork of warfare frequently describe and depict foreign enemies, they are not depicted with different physical features, but rather with different clothing and equipment. Assyrian accounts describe enemies as barbaric only in terms of their behavior, as lacking correct religious practices, and as committing wrongdoings against Assyria. All things considered, there does not appear to have been any well-developed concept of ethnicity or race in ancient Assyria. What mattered for a person to be seen by others as Assyrian was mainly fulfillment of obligations (such as military service), being affiliated with the Assyrian Empire politically and maintaining loyalty to the Assyrian king. One of the inscriptions that attest to this view, as well as to royal Assyrian policies enacted to encourage assimilation and cultural mixture, is Sargon II's account of the construction of Dur-Sharrukin. One of the passages of the inscription reads:
Although the text clearly differentiates the new settlers from those that had been "born Assyrians", the aim of Sargon's policy was also clearly to transform the new settlers into Assyrians through appointing supervisors and guides to teach them. Though the expansion of the Assyrian Empire, in combination with resettlements and deportations, changed the ethno-cultural make-up of the Assyrian heartland, there is no evidence to suggest that the more ancient Assyrian inhabitants of the land ever disappeared or became restricted to a small elite, nor that the ethnic and cultural identity of the new settlers was anything other than "Assyrian" after one or two generations.
Although the use of the term "Assyrian" by the modern Assyrian people has historically been the target of misunderstanding and controversy, both politically and academically, Assyrian continuity is generally accepted by scholars on the basis of both historical and genetic evidence, in the sense that the modern Assyrians are regarded as descendants of the population of the ancient Assyrian Empire. Though the ancient Akkadian language and cuneiform script did not survive for long in Assyria after the empire was destroyed in 609 BC, Assyrian culture clearly did; the old Assyrian religion continued to be practised at Assur until the 3rd century AD, and at other sites for centuries thereafter, gradually losing ground to Christianity. At Mardin, believers in the old religion are known from as late as the 18th century. Individuals with names harkening back to ancient Mesopotamia are also attested at Assur until it was sacked for the last time in AD 240, and at other sites as late as the 13th century. Though many foreign states ruled over Assyria in the millennia following the empire's fall, there is no evidence of any large-scale influx of immigrants that replaced the original population, which instead continued to make up a significant portion of the region's people until the Mongol and Timurid massacres in the late 14th century.
In pre-modern Syriac-language sources (Syriac being the type of Aramaic used in Christian Mesopotamian writings), the typical self-designations used are "ʾārāmāyā" ("Aramean") and "suryāyā", with the term "ʾāthorāyā" ("Assyrian") rarely being used as a self-designation. The terms Assyria ("ʾāthor") and Assyrian ("ʾāthorāyā") were however used in several senses in pre-modern times; most notably for the ancient Assyrians and for the land surrounding Nineveh (and for the city of Mosul, built next to Nineveh's ruins). In Syriac translations of the Bible, the term "ʾāthor" is also used to refer to the ancient Assyrian Empire. In the sense of a citizen of Mosul, the designation "ʾāthorāyā" was used for some individuals in the pre-modern period. The reluctance of Christians to use "ʾāthorāyā" as a self-designation could perhaps be explained by the Assyrians described in the Bible being prominent enemies of Israel; the term "ʾāthorāyā" was sometimes employed in Syriac writings as a term for enemies of Christians. In this context, the term was sometimes applied to the Persians of the Sasanian Empire; the 4th-century Syriac writer Ephrem the Syrian for instance referred to the Sasanian Empire as "filthy "ʾāthor", mother of corruption". In a similar fashion, the term was also sometimes applied to the later Muslim rulers.
The self-designation "suryāyā", "suryāyē" or "sūrōyē", sometimes translated as "Syrian", is believed to be derived from the Akkadian term "assūrāyu" ("Assyrian"), which was sometimes even in ancient times rendered in the shorter form "sūrāyu". Some medieval Syriac Christian documents used "āsūrāyē" and "sūrāyē", rather than "āthōrāyē", also for the ancient Assyrians. Medieval and modern Armenian sources also connected "assūrāyu" and "suryāyā", consistently referring to the Aramaic-speaking Christians of Mesopotamia and Syria as "Asori".
Despite the complex issue of self-designations, pre-modern Syriac-language sources at times identified positively with the ancient Assyrians and drew connections between the ancient empire and themselves. Most prominently, ancient Assyrian kings and figures long appeared in local folklore and literary tradition and claims of descent from ancient Assyrian royalty were forwarded both for figures in folklore and by actual living high-ranking members of society in northern Mesopotamia. Visits by missionaries from various western churches to the Assyrian heartland in the 18th century likely contributed to the Assyrian people more strongly relating their self-designation and identity to ancient Assyria; in the context of interactions with westerners who connected them to the ancient Assyrians, and due to an increasing number of atrocities and massacres directed against them, the Assyrian people experienced a cultural "awakening" or "renaissance" toward the end of the 19th century, which led to the development of a national ideology more strongly rooted in their descent from ancient Assyria and a re-adoption of self-designations such as "ʾāthorāyā" and "ʾāsurāyā". Today, "sūryōyō" or "sūrāyā" are the predominant self-designations used by Assyrians in their native language, though they are typically translated as "Assyrian" rather than "Syrian".
Culture.
Languages.
Akkadian.
The ancient Assyrians primarily spoke and wrote the Assyrian language, a Semitic language (i.e. related to modern Hebrew and Arabic) closely related to Babylonian, spoken in southern Mesopotamia. Both Assyrian and Babylonian are generally regarded by modern scholars to be dialects of the Akkadian language. This is a modern convention since contemporary ancient authors considered Assyrian and Babylonian to be two separate languages; only Babylonian was referred to as "akkadûm", with Assyrian being referred to as "aššurû" or "aššurāyu". Though both were written with cuneiform script, the signs look quite different and can be distinguished relatively easily. Given the vast timespan covered by ancient Assyria, the Assyrian language developed and evolved over time. Modern scholars broadly categorize it into three different periods, roughly (though far from precisely) corresponding to the periods used to divide Assyrian history: the Old Assyrian language (2000–1500 BC), Middle Assyrian language (1500–1000 BC) and Neo-Assyrian language (1000–500 BC). Because the record of Assyrian tablets and documents is still somewhat spotty, many of the stages of the language remain poorly known and documented.
The signs used in Old Assyrian texts are for the most part less complex than those used during the succeeding Middle and Neo-Assyrian periods, and they were fewer in number, amounting to no more than 150–200 unique signs, most of which were syllabic signs (representing syllables). Due to the limited number of signs used, Old Assyrian is relatively easier to decipher for modern researchers than later forms of the language, though the limited number of signs also means that there are in some cases several possible alternative phonetic values and readings. Thus, while the signs themselves are easy to decipher, many researchers remain uncomfortable with the language itself. Though it was a more archaic variant of the later Assyrian language, Old Assyrian also contains several words that are not attested in later periods, some being peculiar early forms of words and others being names for commercial terms or various textile and food products from Anatolia.
In the Middle and Neo-Assyrian empires, the later versions of the Assyrian language were not the only versions of Akkadian used. Though Assyrian was typically used in letters, legal documents, administrative documents, and as a vernacular, Standard Babylonian was also used in an official capacity. Standard Babylonian was a highly codified version of ancient Babylonian, as used around 1500 BC, and was used as a language of high culture, for nearly all scholarly documents, literature, poetry and royal inscriptions. The culture of the Assyrian elite was strongly influenced by Babylonia in the south; in a vein similar to how Greek civilization was respected in, and influenced, ancient Rome, the Assyrians had much respect for Babylon and its ancient culture.
Because of the multilingual nature of the vast empire, many loan words are attested as entering the Assyrian language during the Neo-Assyrian period. The number of surviving documents written in cuneiform grows considerably smaller from the late reign of Ashurbanipal onward, a decline probably attributable to an increased use of Aramaic, often written on perishable materials such as leather scrolls or papyrus. The ancient Assyrian language did not disappear completely until around the end of the 6th century BC, well into the subsequent post-imperial period.
Aramaic and other languages.
Because the Assyrians never imposed their language on foreign peoples whose lands they conquered outside of the Assyrian heartland, there were no mechanisms in place to stop the spread of languages other than Akkadian. Beginning with the migrations of Arameans into Assyrian territory during the Middle Assyrian period, this lack of linguistic policies facilitated the spread of the Aramaic language. As the most widely spoken and mutually understandable of the Semitic languages (the language group containing many of the languages spoken throughout the empire), Aramaic grew in importance throughout the Neo-Assyrian period and increasingly replaced the Neo-Assyrian language even within the Assyrian heartland itself. From the 9th century BC onward, Aramaic became the "de facto" lingua franca of the Neo-Assyrian Empire, with Neo-Assyrian and other forms of Akkadian becoming relegated to languages of the political elite.
From the time of Shalmaneser III, in the 9th century BC, Aramaic was used in state-related contexts alongside Akkadian and by the time of Tiglath-Pileser III, the kings employed both Akkadian and Aramaic-language royal scribes, confirming the rise of Aramaic to a position of an official language used by the imperial administration. During the time after the fall of the Neo-Assyrian Empire, the old Assyrian language was completely abandoned in Mesopotamia in favor of Aramaic. By 500 BC, Akkadian was probably no longer a spoken language.
Modern Assyrian people refer to their language as "Assyrian" ("Sūrayt" or "Sūreth"). Though it has little in common with the Assyrian dialect of the Akkadian language, it is a modern version of the ancient Mesopotamian Aramaic. The language retains some influence of ancient Akkadian, particularly in the form of loanwords. Modern Assyrian varieties of Aramaic are often referred to by scholars as Neo-Aramaic or Neo-Syriac. As a liturgical language, many Assyrians also speak Syriac, a codified version of classical Aramaic as spoken at Edessa during the Christianization of Assyria.
Another language sometimes used in ancient Assyria as a language of scholarship and culture, though only in written form, was the ancient Sumerian language. At the height of the Neo-Assyrian Empire, various other local languages were also spoken within the imperial borders, though none achieved the same level of official recognition as Aramaic.
Architecture.
There are three surviving forms of primary evidence for the architecture of ancient Assyria. The most important form is the surviving buildings themselves, found through archaeological excavations, but important evidence can also be gathered from both contemporary documentation, such as letters and administrative documents that describe buildings that might not have been preserved, as well as documentation by later kings concerning the building works of previous kings. Assyrian buildings and construction works were almost always constructed out of mudbrick. Limestone was also used, though primarily only in works such as aqueducts and river walls, exposed to running water, and defensive fortifications.
To support their weight, large buildings were often constructed on top of foundation platforms or on mudbrick foundations. Floors were typically made of rammed earth, covered in important rooms with carpets or reed mats. Floors in locations exposed to the elements, such as outside on terraces or in courtyards, were paved with stone slabs or baked bricks. Roofs, particularly in larger rooms, were supported by wooden beams.
The ancient Assyrians accomplished several technologically complex construction projects, including constructions of whole new capital cities, which indicates sophisticated technical knowledge. Though in large part following previous Mesopotamian architecture, there are several characteristic features of ancient Assyrian architecture. Some examples of features of ancient Assyrian architecture include stepped merlons, vaulted roofs, and palaces to a large degree often being made up of sets of self-contained suites.
Art.
A relatively large number of statues and figurines have been recovered from the ruins of temples in Assur dating to the Early Assyrian period. Most of the surviving artwork from this time was clearly influenced by the artwork of foreign powers. For instance, a set of 87 alabaster figures of male and female worshippers from Assur before the rise of the Akkadian Empire greatly resembles Early Dynastic Sumerian figures. Because of variation in artwork elsewhere, the artwork of early Assur was also highly variable depending on the time period, ranging from highly stylized to highly naturalistic.
Among the most remarkable finds from the Early period is the head of a woman whose eyes, eyebrows, and elaborate hair covering were originally inlaid. This head is typical of the art style of the Akkadian period, with an overall naturalistic style, smooth and soft curves and a full mouth. Another unique art piece from the early period is an ivory figurine of a nude woman, alongside fragments of at least five additional similar figurines. The ivory used might have come from Indian elephants, which would indicate trade between early Assur and the early tribes and states of Iran. Among other artwork known from the early period are a handful of large stone statues of rulers (governors and foreign kings), figures of animals, and stone statues of naked women.
The artwork known from the Old Assyrian period, other than a few objects such as a partial stone statue perhaps depicting Erishum I, is largely limited to seals and impressions of seals on cuneiform documents. Royal seals from the Puzur-Ashur dynasty of kings, prior to the rise of Shamshi-Adad I, are very similar to the seals of the kings of the Third Dynasty of Ur. In the Middle Assyrian period, from Ashur-uballit I onward, seals looked quite different and appear to emphasize royal power, rather than the theological and cosmic sources of the king's right to rule. Among non-royal seals of the Middle Assyrian period a wide assortment of different motifs are known, including both religious scenes and peaceful scenes of animals and trees. From the time of Tukulti-Ninurta I onward, seals also sometimes featured contests and struggles between humans, various animals, and mythological creatures.
Several other new artistic innovations were also made in the Middle Assyrian period. In the temple dedicated to Ishtar in Assur, four cult pedestals (or "altars") from the time of Tukulti-Ninurta I have been discovered. These altars were decorated with various motifs, common inclusions being the king (sometimes depicted multiple times) and protective divine figures and standards. One of the pedestals preserves along the lower step of its base a relief image which is the earliest known narrative image in Assyrian art history. This relief, which is not very well preserved, appears to depict rows of prisoners before the Assyrian king. The earliest known Assyrian wall paintings are also from the time of Tukulti-Ninurta I, from his palace in Kar-Tukulti-Ninurta. Motifs included plant-based patterns (rosettes and palmettes), trees and bird-headed genies. The colors used to paint the walls included black, red, blue, and white. An unusual limestone statue of a nude woman is known from Nineveh from the time of Ashur-bel-kala (1074–1056 BC). An entirely new type of monument introduced in the 11th century BC was the obelisk: a four-sided stone stela decorated on all sides with both images and text. Obelisks saw continued use until at least the 9th century BC.
Compared to other periods, a larger amount of artwork survives from the Neo-Assyrian period, particularly monumental art made under the patronage of the kings. The most well-known form of Neo-Assyrian monumental art are wall reliefs, carved stone artwork that lined the internal and external walls of temples and palaces. Another well-known form of Neo-Assyrian art are colossi, often human-headed lions or bulls ("lamassu"), that were placed at the gates of temples, palaces and cities. The earliest known examples of both wall reliefs and colossi are from the reign of Ashurnasirpal II, who might have been inspired by the Hittite monumental art that he saw on his campaigns to the Mediterranean. Wall paintings such as those made under Tukulti-Ninurta I in the Middle Assyrian period also continued to be used, sometimes to supplement wall reliefs and sometimes instead of them. Interior walls could be decorated by covering the mudbrick used in construction with painted mud plaster and exterior walls were at times decorated with glazed and painted tiles or bricks. The most extensive known surviving sets of wall reliefs are from the reign of Sennacherib. In terms of Neo-Assyrian artwork, modern scholars have paid particular attention to the reliefs produced under Ashurbanipal, which have been described as possessing a distinct "epic quality" unlike the art under his predecessors.
Scholarship and literature.
Ancient Assyrian literature drew heavily on Babylonian literary traditions. Both the Old and Middle Assyrian periods are limited in terms of surviving literary texts. The most important surviving Old Assyrian literary work is "Sargon, Lord of Lies", a text found in a well-preserved version on a cuneiform tablet from Kültepe. Once thought to have been a parody, the tale is a first-person narrative of the reign of Sargon of Akkad, the founder of the Akkadian Empire. The text follows Sargon as he gains strength from the god Adad, swears by Ishtar, the "lady of combat", and speaks with the gods. Surviving Middle Assyrian literature is only slightly more diverse. A distinct Assyrian scholarship tradition, though still drawing on Babylonian tradition, is conventionally placed as beginning around the time of the beginning of the Middle Assyrian period. The rising status of scholarship at this time might be connected to the kings beginning to regard amassing knowledge as a way to strengthen their power. Known Middle Assyrian works include the "Tukulti-Ninurta Epic" (a narrative of the reign of Tukulti-Ninurta I and his exploits), fragments of other royal epics, "The Hunter" (a short martial poem) and some royal hymns.
The clear majority of surviving ancient Assyrian literature is from the Neo-Assyrian period. The kings of the Neo-Assyrian Empire began to see preserving knowledge as one of their responsibilities, and not (as previous kings had) a responsibility of private individuals and temples. This development might have originated with the kings no longer viewing the divination performed by their diviners as enough and wishing to have access to the relevant texts themselves. The office of chief scholar is first attested in the reign of the Neo-Assyrian king Tukulti-Ninurta II.
Most of the surviving ancient Assyrian literature comes from the Neo-Assyrian Library of Ashurbanipal, which included more than 30,000 documents. Libraries were built in the Neo-Assyrian period to preserve knowledge of the past and maintain scribal culture. Neo-Assyrian texts fall into a wide array of genres, including divinatory texts, divination reports, treatments for the sick (either medical or magical), ritual texts, incantations, prayers and hymns, school texts and literary texts. An innovation of the Neo-Assyrian period was the annals, a genre of texts recording the events of a king's reign, particularly military exploits. Annals were disseminated throughout the empire and probably served propagandistic purposes, supporting the legitimacy of the king's rule. Various purely literary works, previously aligned by scholars with propaganda, are known from the Neo-Assyrian period. Such works include, among others, the "Underworld Vision of an Assyrian Crown Prince", the "Sin of Sargon" and the "Marduk Ordeal". In addition to their own works, the Assyrians also copied and preserved earlier Mesopotamian literature. The inclusion of texts such as the "Epic of Gilgamesh", the "Enûma Eliš" (the Babylonian creation myth), "Erra", the "Myth of Etana" and the "Epic of Anzu" in the Library of Ashurbanipal is the primary reason such texts have survived to the present day.
Religion.
Ancient Assyrian religion.
Knowledge of the ancient polytheistic Assyrian religion, referred to as "Ashurism" by some modern Assyrians, is mostly limited to state cults, given that little can be ascertained of the personal religious beliefs and practices of the common people of ancient Assyria. The Assyrians worshipped the same pantheon of gods as the Babylonians in southern Mesopotamia. The chief Assyrian deity was the national deity Ashur. Though the deity and the ancient capital city are commonly distinguished by modern historians by calling the god Ashur and the city Assur, both were inscribed in the exact same way in ancient times ("Aššur"). In documents from the preceding Old Assyrian period, the city and god are often not clearly differentiated, which suggests that Ashur originated sometime in the Early Assyrian period as a deified personification of the city itself. Below Ashur, the other Mesopotamian deities were organized in a hierarchy, with each having their own assigned roles (the sun-god Shamash was for instance regarded as a god of justice and Ishtar was seen as a goddess of love and war) and their own primary seats of worship (Ninurta was for instance primarily worshipped at Nimrud and Ishtar primarily at Arbela). Quintessentially Babylonian deities such as Enlil, Marduk, and Nabu were worshipped in Assyria just as much as in Babylonia, and several traditionally Babylonian rituals, such as the "akitu" festival, were borrowed in the north.
Ashur's role as the chief deity was flexible and changed with the changing culture and politics of the Assyrians themselves. In the Old Assyrian period, Ashur was mainly regarded as a god of death and revival, related to agriculture. Under the Middle and Neo-Assyrian Empire, Ashur's role was expanded and thoroughly altered. Possibly originating as a reaction to the period of suzerainty under the Mittani kingdom, Middle Assyrian theology presented Ashur as a god of war, who not only bestowed divine legitimacy on the Assyrian kings, something retained from the Old Assyrian period, but also commanded the kings to enlarge Assyria ("the land of Ashur") with Ashur's "just scepter", i.e. expand the Assyrian Empire through military conquest. This militarization of Ashur might also have derived from the Amorite conqueror Shamshi-Adad I equating Ashur with the southern Enlil during his rule over northern Mesopotamia in the 19th and 18th centuries BC. In the Middle Assyrian period, Ashur is attested with the title "king of the gods", a role previous civilizations in both northern and southern Mesopotamia ascribed to Enlil. The development of equating Ashur with Enlil, or at least transferring Enlil's role to Ashur, was paralleled in Babylon, where the previously unimportant local god Marduk was elevated in the reign of Hammurabi (18th century BC) to the head of the pantheon, modelled after Enlil.
Assyrian religion was centered in temples, monumental structures that included a central shrine which housed the cult statue of the temple's god, and several subordinate chapels with space for statues of other deities. Temples were typically self-contained communities; they had their own economic resources, chiefly in the form of land holdings, and their own hierarchically organized personnel. In later times, temples became increasingly dependent on royal benefits, in the shape of specific taxes, offerings and donations of booty and tribute. The head of a temple held the title of "chief administrator" and was responsible to the Assyrian king, since the king was regarded as Ashur's representative in the mortal world. Records from temples show that divination in the form of astrology and extispicy (studying the entrails of dead animals) was an important part of Assyrian religion, since these practices were believed to be means through which deities communicated with the mortal world.
Unlike many other ancient empires, the Neo-Assyrian Empire did not, even at its height, impose its culture and religion on conquered regions; there were no significant temples built for Ashur outside of northern Mesopotamia. In the post-imperial period, after the fall of the Neo-Assyrian Empire, the Assyrians continued to venerate Ashur and the rest of the pantheon, though without the Assyrian state, religious beliefs in many parts of the Assyrian heartland diverged and developed in different directions. From the time of Seleucid rule over the region (4th to 2nd century BC) onward, there was a strong influence of the ancient Greek religion, with many Greek deities becoming syncretized with Mesopotamian deities. There was also some influence of Judaism, given that the kings of Adiabene, a vassal kingdom covering much of the old Assyrian heartland, converted to Judaism in the 1st century AD. From the 1st century BC onward, as a frontier region between the Roman and the Parthian empires, Assyria was likely highly religiously complex and diverse. Under Parthian rule, both old and new gods were worshipped at Assur. As late as the time of the city's second destruction in the 3rd century AD, the most important deity was still Ashur, known during this time as "Assor" or "Asor". Worship of Ashur during this time was carried out in the same way as it had been in ancient times, per a cultic calendar effectively identical to that used under the Neo-Assyrian Empire 800 years prior. The ancient Mesopotamian religion persisted in some places for centuries after the end of the post-imperial period, such as at Harran until at least the 10th century (the "Sabians" of Harran) and at Mardin until as late as the 18th century (the "Shamsīyah").
Christianity.
The Church of the East developed early in Christian history. Though tradition holds that Christianity was first spread to Mesopotamia by Thomas the Apostle, exactly when the Assyrians were first Christianized is unknown. The city of Arbela was an important early Christian center; according to the later "Chronicle of Arbela", Arbela became the seat of a bishop as early as AD 100, but the reliability of this document is questioned among scholars. It is however known that both Arbela and Kirkuk later served as important Christian centers in the Sasanian and later Islamic periods. According to some traditions, Christianity took hold in Assyria when Saint Thaddeus of Edessa converted King Abgar V of Osroene in the mid-1st century AD. From the 3rd century AD onward, it is clear that Christianity was becoming the major religion of the region, with Christ replacing the old Mesopotamian deities. Assyrians had by this time already intellectually contributed to Christian thought; in the 2nd century AD, the Christian Assyrian writer Tatian composed the influential "Diatessaron", a synoptic rendition of the gospels.
Though Christianity is today an intrinsic part of Assyrian identity, Assyrian Christians have over the centuries splintered into a number of different Christian denominations. Though the prominent Assyrian Church of the East, the followers of which have often been termed "Nestorians", continues to exist, other prominent eastern churches include the Chaldean Catholic Church, which split off in the 16th century, the Syriac Orthodox Church, the Syriac Catholic Church, and the Ancient Church of the East, which branched off from the Assyrian Church of the East in 1968.
Though these churches have been distinct for centuries, they still follow much of the same liturgical, spiritual and theological foundation. There are also Assyrian followers of various denominations of Protestantism, chiefly due to missions by American missionaries of the Presbyterian Church.
Because the Assyrian Church of the East remains dismissed as "Nestorian" and heretical by many other branches of Christianity, it has not been admitted into the Middle East Council of Churches and it does not take part in the Joint International Commission for Theological Dialogue Between the Catholic Church and the Orthodox Church. Efforts toward ecumenism have nonetheless been undertaken. In 1994, Pope John Paul II and Patriarch Dinkha IV signed the Common Christological Declaration Between the Catholic Church and the Assyrian Church of the East, with some further efforts also having been made in the years since. Historically, the main obstacle in the way of ecumenism has been the ancient text "Liturgy of Addai and Mari", used in the Assyrian churches, wherein the anaphora does not contain the Words of Institution, seen as indispensable by the Catholic Church. This obstacle was removed in 2001, when the Catholic Congregation for the Doctrine of the Faith determined that the text could be considered valid in Catholicism as well, despite the absence of the words. Some efforts have also been made to approach reunification of the Assyrian and Chaldean churches. In 1996, Dinkha IV and Patriarch Raphael I Bidawid of the Chaldean Church signed a list of common proposals to move toward unity, approved by synods of both churches in 1997.
|
2086 | Abijah | Abijah is a Biblical Hebrew unisex name which means "my Father is Yah". A longer Hebrew form of the name also occurs in the Bible.
Russian name.
The Russian-language variant of the name, along with its older forms, was included into various, often handwritten, church calendars throughout the 17th–19th centuries, but it was omitted from the official Synodal Menologium at the end of the 19th century. In 1924–1930, a form of the name was included into various Soviet calendars, which included the new and often artificially created names promoting the new Soviet realities and encouraging the break with the tradition of using the names in the Synodal Menologia. In Russian it is only used as a female name; several diminutives of it are also attested.
|
2087 | Ark | Ark or ARK may refer to:
|
2088 | Aphasia | In aphasia, a person is unable to comprehend or unable to formulate language because of damage to specific brain regions. The major causes are stroke and head trauma; prevalence is hard to determine but aphasia due to stroke is estimated to be 0.1–0.4% in the Global North. Aphasia can also be the result of brain tumors, brain infections, or neurodegenerative diseases (such as dementias).
To be diagnosed with aphasia, a person's speech or language must be significantly impaired in one (or more) of the four aspects of communication following acquired brain injury. Alternatively, in the case of progressive aphasia, it must have significantly declined over a short period of time. The four aspects of communication are auditory comprehension, verbal expression, reading and writing, and functional communication.
The difficulties of people with aphasia can range from occasional trouble finding words, to losing the ability to speak, read, or write; intelligence, however, is unaffected. Expressive language and receptive language can both be affected as well. Aphasia also affects visual language such as sign language. In contrast, the use of formulaic expressions in everyday communication is often preserved. For example, while a person with aphasia, particularly expressive aphasia (Broca's aphasia), may not be able to ask a loved one when their birthday is, they may still be able to sing "Happy Birthday". One prevalent deficit in the aphasias is anomia, which is a difficulty in finding the correct word.
With aphasia, one or more modes of communication in the brain have been damaged and are therefore functioning incorrectly. Aphasia is not caused by damage to the brain that results in motor or sensory deficits producing abnormal speech; that is, aphasia is not related to the mechanics of speech but rather to the individual's language cognition (although a person can have both problems, for example after a haemorrhage that damaged a large area of the brain). An individual's language is the socially shared set of rules, as well as the thought processes behind communication (as it affects both verbal and nonverbal language). Aphasia is not a result of a more peripheral motor or sensory difficulty, such as paralysis affecting the speech muscles or a general hearing impairment.
Neurodevelopmental forms of auditory processing disorder are differentiable from aphasia in that aphasia is by definition caused by acquired brain injury, but acquired epileptic aphasia has been viewed as a form of APD.
Prevalence.
Aphasia affects about two million people in the U.S. and 250,000 people in Great Britain. Nearly 180,000 people acquire the disorder every year in the U.S., 170,000 due to stroke. A person of any age can develop aphasia, since it is often caused by traumatic injury. However, middle-aged and older people are the most likely to acquire aphasia, as the other etiologies are more likely at older ages. For example, approximately 75% of all strokes occur in individuals over the age of 65. Strokes account for most documented cases of aphasia: 25% to 40% of people who survive a stroke develop aphasia as a result of damage to the language-processing regions of the brain.
Aphasia and dysphasia.
Technically, dysphasia means impaired language and aphasia means lack of language. There have been calls to use the term 'aphasia' regardless of severity. Reasons for doing so include dysphasia being easily confused with the swallowing disorder dysphagia, consumers and speech pathologists preferring the term aphasia, and many languages other than English using a word similar to aphasia. The term "aphasia" is more commonly used in North America, while "dysphasia" is more frequently used elsewhere.
Signs and symptoms.
People with aphasia may experience any of the following behaviors due to an acquired brain injury, although some of these symptoms may be due to related or concomitant problems, such as dysarthria or apraxia, and not primarily due to aphasia. Aphasia symptoms can vary based on the location of damage in the brain. Signs and symptoms may or may not be present in individuals with aphasia and may vary in severity and level of disruption to communication. Often those with aphasia have difficulty naming objects, so they might use words such as "thing" or point at the objects. When asked to name a pencil, they may say it is a "thing used to write".
Related behaviors.
Given the previously stated signs and symptoms, the following behaviors are often seen in people with aphasia as a result of attempted compensation for incurred speech and language deficits:
Causes.
Aphasia is most often caused by stroke; about a quarter of patients who experience an acute stroke develop aphasia. However, any disease or damage to the parts of the brain that control language can cause aphasia. Some of these can include brain tumors, traumatic brain injury, and progressive neurological disorders. In rare cases, aphasia may also result from herpesviral encephalitis. The herpes simplex virus affects the frontal and temporal lobes, subcortical structures, and the hippocampal tissue, which can trigger aphasia. In acute disorders, such as head injury or stroke, aphasia usually develops quickly. When caused by brain tumor, infection, or dementia, it develops more slowly.
Substantial damage to tissue anywhere within the region shown in blue (on the figure in the infobox above) can potentially result in aphasia. Aphasia can also sometimes be caused by damage to subcortical structures deep within the left hemisphere, including the thalamus, the internal and external capsules, and the caudate nucleus of the basal ganglia. The area and extent of brain damage or atrophy will determine the type of aphasia and its symptoms. A very small number of people can experience aphasia after damage to the right hemisphere only. It has been suggested that these individuals may have had an unusual brain organization prior to their illness or injury, with perhaps greater overall reliance on the right hemisphere for language skills than in the general population.
Primary progressive aphasia (PPA), while its name can be misleading, is actually a form of dementia that has some symptoms closely related to several forms of aphasia. It is characterized by a gradual loss of language functioning while other cognitive domains, such as memory and personality, are mostly preserved. PPA usually begins with word-finding difficulties and progresses to a reduced ability to formulate grammatically correct sentences (syntax) and impaired comprehension. The etiology of PPA is not stroke, traumatic brain injury (TBI), or infectious disease; it is still uncertain what initiates the onset of PPA in those affected by it.
Epilepsy can also include transient aphasia as a prodromal or episodic symptom. Aphasia is also listed as a rare side-effect of the fentanyl patch, an opioid used to control chronic pain.
Classification.
Aphasia is best thought of as a collection of different disorders, rather than a single problem. Each individual with aphasia will present with their own particular combination of language strengths and weaknesses. Consequently, it is a major challenge just to document the various difficulties that can occur in different people, let alone decide how they might best be treated. Most classifications of the aphasias tend to divide the various symptoms into broad classes. A common approach is to distinguish between the fluent aphasias (where speech remains fluent, but content may be lacking, and the person may have difficulties understanding others), and the nonfluent aphasias (where speech is very halting and effortful, and may consist of just one or two words at a time).
However, no such broad-based grouping has proven fully adequate. There is wide variation among people even within the same broad grouping, and aphasias can be highly selective. For instance, people with naming deficits (anomic aphasia) might show an inability only for naming buildings, or people, or colors.
It is important to note that there are typical difficulties with speech and language that come with normal aging as well. As people age, language processing can slow, leading to slower verbal comprehension, reduced reading ability, and more frequent word-finding difficulties. With each of these, however, unlike in some aphasias, functionality within daily life remains intact.
Classical-localizationist approaches.
Localizationist approaches aim to classify the aphasias according to their major presenting characteristics and the regions of the brain that most probably gave rise to them. Inspired by the early work of nineteenth-century neurologists Paul Broca and Carl Wernicke, these approaches identify two major subtypes of aphasia and several more minor subtypes:
Recent classification schemes adopting this approach, such as the Boston-Neoclassical Model, also group these classical aphasia subtypes into two larger classes: the nonfluent aphasias (which encompasses Broca's aphasia and transcortical motor aphasia) and the fluent aphasias (which encompasses Wernicke's aphasia, conduction aphasia and transcortical sensory aphasia). These schemes also identify several further aphasia subtypes, including: anomic aphasia, which is characterized by a selective difficulty finding the names for things; and global aphasia, where both expression and comprehension of speech are severely compromised.
Many localizationist approaches also recognize the existence of additional, more "pure" forms of language disorder that may affect only a single language skill. For example, in pure alexia, a person may be able to write but not read, and in pure word deafness, they may be able to produce speech and to read, but not understand speech when it is spoken to them.
Cognitive neuropsychological approaches.
Although localizationist approaches provide a useful way of classifying the different patterns of language difficulty into broad groups, one problem is that a sizeable number of individuals do not fit neatly into one category or another. Another problem is that the categories, particularly the major ones such as Broca's and Wernicke's aphasia, still remain quite broad. Consequently, even amongst those who meet the criteria for classification into a subtype, there can be enormous variability in the types of difficulties they experience.
Instead of categorizing every individual into a specific subtype, cognitive neuropsychological approaches aim to identify the key language skills or "modules" that are not functioning properly in each individual. A person could potentially have difficulty with just one module, or with a number of modules. This type of approach requires a framework or theory as to what skills/modules are needed to perform different kinds of language tasks. For example, the model of Max Coltheart identifies a module that recognizes phonemes as they are spoken, which is essential for any task involving recognition of words. Similarly, there is a module that stores phonemes that the person is planning to produce in speech, and this module is critical for any task involving the production of long words or long strings of speech. Once a theoretical framework has been established, the functioning of each module can then be assessed using a specific test or set of tests. In the clinical setting, use of this model usually involves conducting a battery of assessments, each of which tests one or a number of these modules. Once a diagnosis is reached as to the skills/modules where the most significant impairment lies, therapy can proceed to treat these skills.
Progressive aphasias.
Primary progressive aphasia (PPA) is a neurodegenerative focal dementia that can be associated with progressive illnesses or dementias (the gradual loss of the ability to think), such as frontotemporal dementia/Pick complex, motor neuron disease, progressive supranuclear palsy, and Alzheimer's disease. Gradual loss of language function occurs in the context of relatively well-preserved memory, visual processing, and personality until the advanced stages. Symptoms usually begin with word-finding problems (naming) and progress to impaired grammar (syntax) and comprehension (sentence processing and semantics). The loss of language before the loss of memory differentiates PPA from typical dementias. People with PPA may have difficulties comprehending what others are saying. They can also have difficulty trying to find the right words to make a sentence. There are three classifications of primary progressive aphasia: progressive nonfluent aphasia (PNFA), semantic dementia (SD), and logopenic progressive aphasia (LPA).
Progressive jargon aphasia is a fluent or receptive aphasia in which the person's speech is incomprehensible but appears to make sense to them. Speech is fluent and effortless with intact syntax and grammar, but the person has problems with the selection of nouns. They will either replace the desired word with another that sounds or looks like the original or has some other connection to it, or they will replace it with sounds. As such, people with jargon aphasia often use neologisms, and may perseverate if they try to replace the words they cannot find with sounds. Substitutions commonly involve picking another (actual) word starting with the same sound (e.g., clocktower – colander), picking another semantically related to the first (e.g., letter – scroll), or picking one phonetically similar to the intended one (e.g., lane – late).
Deaf aphasia.
There have been many instances showing that a form of aphasia exists among deaf individuals. Sign languages are, after all, forms of language that have been shown to use the same areas of the brain as verbal forms of language. Mirror neurons become activated both when an animal acts in a particular way and when it watches another individual act in the same manner. These mirror neurons are important in giving an individual the ability to mimic movements of hands. Broca's area of speech production has been shown to contain several of these mirror neurons, resulting in significant similarities of brain activity between sign language and vocal speech communication. Facial communication is a significant portion of how animals interact with each other. Humans use facial movements to create what other humans perceive as expressions of emotion. Combining these facial movements with speech creates a fuller form of language, enabling the species to interact with a much more complex and detailed form of communication. Sign language also uses these facial movements and emotions along with the primary hand movements used to communicate. These facial-movement forms of communication come from the same areas of the brain, so when certain areas of the brain are damaged, vocal forms of communication are in jeopardy of severe forms of aphasia. Since these same areas of the brain are used for sign language, the same, or at least very similar, forms of aphasia can show in the Deaf community. Individuals can show a form of Wernicke's aphasia with sign language, with deficits in the ability to produce any form of signed expression. Broca's aphasia shows up in some people as well; these individuals have tremendous difficulty actually signing the linguistic concepts they are trying to express.
Severity.
The severity of aphasia varies depending on the size of the stroke, and it also varies considerably across the different types of aphasia; any type of aphasia can range from mild to profound. Regardless of the severity of aphasia, people can make improvements due to spontaneous recovery and treatment in the acute stages of recovery. Additionally, while most studies propose that the greatest outcomes occur in people with severe aphasia when treatment is provided in the acute stages of recovery, Robey (1998) also found that those with severe aphasia are capable of making strong language gains in the chronic stage of recovery as well. This finding implies that persons with aphasia have the potential to achieve functional outcomes regardless of how severe their aphasia may be. While there is no distinct pattern of outcomes based on severity alone, people with global aphasia typically make functional language gains, though progress may be gradual, since global aphasia affects many language areas.
Cognitive deficits in aphasia.
While aphasia has traditionally been described in terms of language deficits, there is increasing evidence that many people with aphasia commonly experience co-occurring non-linguistic cognitive deficits in areas such as attention, memory, executive functions and learning. By some accounts, cognitive deficits, such as those in attention and working memory, constitute the underlying cause of language impairment in people with aphasia. Others suggest that cognitive deficits often co-occur but are comparable to cognitive deficits in stroke patients without aphasia and reflect general brain dysfunction following injury. The degree to which deficits in attention and other cognitive domains underlie language deficits in aphasia is still unclear.
In particular, people with aphasia often demonstrate short-term and working memory deficits. These deficits can occur in both the verbal domain as well as the visuospatial domain. Furthermore, these deficits are often associated with performance on language-specific tasks such as naming, lexical processing, sentence comprehension, and discourse production. Other studies have found that most, but not all, people with aphasia demonstrate performance deficits on tasks of attention, and their performance on these tasks correlates with language performance and cognitive ability in other domains. Even patients with mild aphasia, who score near the ceiling on tests of language, often demonstrate slower response times and interference effects in non-verbal attention abilities.
In addition to deficits in short-term memory, working memory, and attention, people with aphasia can also demonstrate deficits in executive function. For instance, people with aphasia may demonstrate deficits in initiation, planning, self-monitoring, and cognitive flexibility. Other studies have found that people with aphasia demonstrate reduced speed and efficiency when completing executive function assessments.
Regardless of their role in the underlying nature of aphasia, cognitive deficits have a clear role in the study and rehabilitation of aphasia. For instance, the severity of cognitive deficits in people with aphasia has been associated with lower quality of life, even more so than the severity of language deficits. Furthermore, cognitive deficits may influence the learning process of rehabilitation and language treatment outcomes in aphasia. Non-linguistic cognitive deficits have also been the target of interventions directed at improving language ability, though outcomes are not definitive. While some studies have demonstrated language improvement secondary to cognitively-focused treatment, others have found little evidence that the treatment of cognitive deficits in people with aphasia has an influence on language outcomes.
One important caveat in the measurement and treatment of cognitive deficits in people with aphasia is the degree to which assessments of cognition rely on language abilities for successful performance. Most studies have attempted to circumvent this challenge by utilizing non-verbal cognitive assessments to evaluate cognitive ability in people with aphasia. However, the degree to which these tasks are truly 'non-verbal' and not mediated by language is unclear. For instance, Wall et al. found that language and non-linguistic performance were related, except when non-linguistic performance was measured by 'real life' cognitive tasks.
Prevention of aphasia.
Aphasia is largely caused by unavoidable events. However, some precautions can be taken to decrease the risk of experiencing one of the two major causes of aphasia: stroke and traumatic brain injury (TBI). To decrease the probability of having an ischemic or hemorrhagic stroke, one should take the following precautions:
To prevent aphasia due to traumatic injury, one should take precautionary measures when engaging in dangerous activities such as:
Additionally, one should always seek medical attention after sustaining head trauma due to a fall or accident. The sooner that one receives medical attention for a traumatic brain injury, the less likely one is to experience long-term or severe effects.
Management.
When addressing Wernicke's aphasia, according to Bakheit et al. (2007), the lack of awareness of the language impairments, a common characteristic of Wernicke's aphasia, may affect the rate and extent of therapy outcomes. Robey (1998) determined that at least 2 hours of treatment per week is recommended for making significant language gains. Spontaneous recovery may cause some language gains, but without speech-language therapy, the outcomes can be half as strong as those with therapy.
When addressing Broca's aphasia, better outcomes occur when the person participates in therapy, and treatment is more effective than no treatment for people in the acute period. Two or more hours of therapy per week in acute and post-acute stages produced the greatest results. High-intensity therapy was most effective, and low-intensity therapy was almost equivalent to no therapy.
People with global aphasia are sometimes referred to as having irreversible aphasic syndrome, often making limited gains in auditory comprehension and recovering no functional language modality with therapy. That said, people with global aphasia may retain gestural communication skills that can enable success when communicating with conversational partners in familiar conditions. Process-oriented treatment options are limited, and people may not become competent language users as readers, listeners, writers, or speakers no matter how extensive the therapy. Even so, people's daily routines and quality of life can be enhanced with reasonable and modest goals. After the first month, most people show limited or no further recovery of language abilities; the prognosis is grim, with 83% of those who are globally aphasic at one month remaining globally aphasic at one year. Some people are so severely impaired that existing process-oriented treatment approaches offer no signs of progress, and therefore cannot justify the cost of therapy.
Perhaps due to the relative rareness of conduction aphasia, few studies have specifically studied the effectiveness of therapy for people with this type of aphasia. From the studies performed, results showed that therapy can help to improve specific language outcomes. One intervention that has had positive results is auditory repetition training. Kohn et al. (1990) reported that drilled auditory repetition training related to improvements in spontaneous speech, Francis et al. (2003) reported improvements in sentence comprehension, and Kalinyak-Fliszar et al. (2011) reported improvements in auditory-visual short-term memory.
Most acute cases of aphasia recover some or most skills by working with a speech-language pathologist. Recovery and improvement can continue for years after the stroke. After the onset of aphasia, there is approximately a six-month period of spontaneous recovery; during this time, the brain is attempting to recover and repair the damaged neurons. Improvement varies widely, depending on the aphasia's cause, type, and severity. Recovery also depends on the person's age, health, motivation, handedness, and educational level.
There is no one treatment proven to be effective for all types of aphasias. The reason that there is no universal treatment for aphasia is because of the nature of the disorder and the various ways it is presented, as explained in the above sections. Aphasia is rarely exhibited identically, implying that treatment needs to be catered specifically to the individual. Studies have shown that, although there is no consistency on treatment methodology in literature, there is a strong indication that treatment, in general, has positive outcomes. Therapy for aphasia ranges from increasing functional communication to improving speech accuracy, depending on the person's severity, needs and support of family and friends. Group therapy allows individuals to work on their pragmatic and communication skills with other individuals with aphasia, which are skills that may not often be addressed in individual one-on-one therapy sessions. It can also help increase confidence and social skills in a comfortable setting.
Evidence does not support the use of transcranial direct current stimulation (tDCS) for improving aphasia after stroke. Moderate-quality evidence does indicate naming performance improvements for nouns, but not verbs, using tDCS.
Specific treatment techniques include the following:
Semantic feature analysis (SFA) – a type of aphasia treatment that targets word-finding deficits. It is based on the theory that neural connections can be strengthened by using related words and phrases that are similar to the target word, to eventually activate the target word in the brain. SFA can be implemented in multiple forms such as verbally, written, using picture cards, etc. The SLP provides prompting questions to the individual with aphasia in order for the person to name the picture provided. Studies show that SFA is an effective intervention for improving confrontational naming.
Melodic intonation therapy is used to treat non-fluent aphasia and has proved to be effective in some cases. However, there is still no evidence from randomized controlled trials confirming the efficacy of MIT in chronic aphasia. MIT is used to help people with aphasia vocalize themselves through speech song, which is then transferred as a spoken word. Good candidates for this therapy include people who have had left hemisphere strokes, non-fluent aphasias such as Broca's, good auditory comprehension, poor repetition and articulation, and good emotional stability and memory. An alternative explanation is that the efficacy of MIT depends on neural circuits involved in the processing of rhythmicity and formulaic expressions (examples taken from the MIT manual: "I am fine," "how are you?" or "thank you"); while rhythmic features associated with melodic intonation may engage primarily left-hemisphere subcortical areas of the brain, the use of formulaic expressions is known to be supported by right-hemisphere cortical and bilateral subcortical neural networks.
Systematic reviews support the effectiveness and importance of partner training. According to the National Institute on Deafness and Other Communication Disorders (NIDCD), involving family with the treatment of an aphasic loved one is ideal for all involved, because while it will no doubt assist in their recovery, it will also make it easier for members of the family to learn how best to communicate with them.
When a person's speech is insufficient, different kinds of augmentative and alternative communication could be considered such as alphabet boards, pictorial communication books, specialized software for computers or apps for tablets or smartphones.
Intensity of treatment.
The intensity of aphasia therapy is determined by the length of each session, total hours of therapy per week, and total weeks of therapy provided. There is no consensus about what "intense" aphasia therapy entails, or how intense therapy should be to yield the best outcomes. A 2016 Cochrane review of speech and language therapy for people with aphasia found that treatments that are higher intensity, higher dose or over a long duration of time led to significantly better functional communication but people were more likely to drop out of high intensity treatment (up to 15 hours per week).
Intensity of therapy is also dependent on the recency of stroke. People with aphasia react differently to intense treatment in the acute phase (0–3 months post stroke), sub-acute phase (3–6 months post stroke), or chronic phase (6+ months post stroke). Intensive therapy has been found to be effective for people with nonfluent and fluent chronic aphasia, but less effective for people with acute aphasia. People with sub-acute aphasia also respond well to intensive therapy of 100 hours over 62 weeks. This suggests people in the sub-acute phase can improve greatly in language and functional communication measures with intensive therapy compared to regular therapy.
Individualized service delivery.
Intensity of treatment should be individualized based on the recency of stroke, therapy goals, and other specific characteristics such as age, size of lesion, overall health status, and motivation. Each individual reacts differently to treatment intensity and is able to tolerate treatment at different times post-stroke. Intensity of treatment after a stroke should be dependent on the person's motivation, stamina, and tolerance for therapy.
Outcomes.
If the symptoms of aphasia last longer than two or three months after a stroke, a complete recovery is unlikely. However, it is important to note that some people continue to improve over a period of years and even decades. Improvement is a slow process that usually involves both helping the individual and family understand the nature of aphasia and learning compensatory strategies for communicating.
After a traumatic brain injury (TBI) or cerebrovascular accident (CVA), the brain undergoes several healing and re-organization processes, which may result in improved language function. This is referred to as spontaneous recovery. Spontaneous recovery is the natural recovery the brain makes without treatment, as the brain begins to reorganize and change in order to recover. There are several factors that contribute to a person's chance of recovery from aphasia caused by stroke, including stroke size and location. Age, sex, and education have not been found to be very predictive. There is also research pointing to damage in the left hemisphere healing more effectively than the right.
Specific to aphasia, spontaneous recovery varies among affected people and may not look the same in everyone, making it difficult to predict recovery.
Though some cases of Wernicke's aphasia have shown greater improvements than milder forms of aphasia, people with Wernicke's aphasia may not reach as high a level of speech abilities as those with milder forms of aphasia.
History.
The first recorded case of aphasia is from an Egyptian papyrus, the Edwin Smith Papyrus, which details speech problems in a person with a traumatic brain injury to the temporal lobe.
During the second half of the 19th century, aphasia was a major focus for scientists and philosophers who were working in the beginning stages of the field of psychology.
In medical research, speechlessness was described as an incorrect prognosis, and there was no assumption that underlying language complications existed. Broca and his colleagues were among the first to write about aphasia, but Wernicke was the first credited with writing extensively about aphasia as a disorder that included comprehension difficulties. Despite claims about who reported on aphasia first, it was F.J. Gall who gave the first full description of aphasia after studying wounds to the brain, as well as his observation of speech difficulties resulting from vascular lesions. A book covering the entire history of aphasia is available (Tesak, J. & Code, C. (2008) "Milestones in the History of Aphasia: Theories and Protagonists". Hove, East Sussex: Psychology Press).
Etymology.
"Aphasia" is from Greek "a-" ("without", negative prefix) + "phásis" ("φάσις", "speech").
The word "aphasia" comes from the word ἀφασία "aphasia", in Ancient Greek, which means "speechlessness", derived from ἄφατος "aphatos", "speechless" from ἀ- "a-", "not, un" and φημί "phemi", "I speak".
Neuroimaging methods.
Magnetic resonance imaging (MRI) and functional magnetic resonance imaging (fMRI) are the most common neuroimaging tools used in identifying aphasia and studying the extent of damage underlying the loss of language abilities. MRI scans locate the extent of lesions or damage within brain tissue, particularly within areas of the left frontal and temporal regions, where many language-related areas lie. In fMRI studies, a language-related task is often completed and the BOLD image is then analyzed. A lower-than-normal BOLD response indicates reduced blood flow to the affected area and can show quantitatively that the cognitive task is not being completed.
There are limitations to the use of fMRI in aphasic patients in particular. Because a high percentage of aphasic patients develop aphasia as a result of stroke, infarcts (regions with total loss of blood flow) may be present, caused by the thinning or complete blockage of blood vessels. This matters for fMRI because the technique relies on the BOLD response (the oxygen levels in the blood vessels), and an infarct can create a false hyporesponse in an fMRI study. Because of limitations such as low spatial resolution, fMRI can also suggest that some areas of the brain are inactive during a task when in reality they are active. Additionally, since stroke causes many cases of aphasia, the extent of damage to brain tissue can be difficult to quantify, so the effects of stroke-related brain damage on a patient's functioning can vary.
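The hyporesponse comparison described above can be sketched as a simple statistical check. This is an illustrative outline only, using synthetic numbers rather than real imaging data; the region of interest, signal values, and two-standard-deviation threshold are all assumptions for the sake of the example, not a description of any particular study's pipeline.

```python
import numpy as np

# Illustrative sketch: flag a BOLD hyporesponse in a region of interest
# (ROI) by comparing a patient's mean signal change against a healthy-
# control distribution. All values here are synthetic, not real data.

rng = np.random.default_rng(0)
control_roi = rng.normal(loc=1.0, scale=0.1, size=200)  # control signal change (%)
patient_roi = rng.normal(loc=0.6, scale=0.1, size=200)  # patient signal change (%)

# Assumed criterion: call it a hyporesponse when the patient mean falls
# more than two control standard deviations below the control mean.
threshold = control_roi.mean() - 2 * control_roi.std()
is_hyporesponse = patient_roi.mean() < threshold
print(is_hyporesponse)  # True for these synthetic values
```

A real analysis would operate on preprocessed voxel time series and correct for the vascular confounds noted above; the point here is only the shape of the comparison.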
Neural substrates of aphasia subtypes.
MRI is often used to predict or confirm the subtype of aphasia present. Researchers compared three subtypes of aphasia, nonfluent-variant primary progressive aphasia (nfPPA), logopenic-variant primary progressive aphasia (lvPPA), and semantic-variant primary progressive aphasia (svPPA), with primary progressive aphasia (PPA) and Alzheimer's disease. This was done by analyzing the MRIs of patients with each subtype of PPA. Images comparing subtypes of aphasia, and images used to find the extent of lesions, are generated by overlapping images of different participants' brains (if applicable) and isolating areas of lesions or damage using third-party software such as MRIcron. MRI has also been used to study the relationship between the type of aphasia developed and the age of the person with aphasia. Patients with fluent aphasia were found to be, on average, older than people with non-fluent aphasia. Among patients with lesions confined to the anterior portion of the brain, an unexpectedly large proportion presented with fluent aphasia and were remarkably older than those with non-fluent aphasia. This effect was not found when the posterior portion of the brain was studied.
Associated conditions.
In a study on the features associated with different disease trajectories in Alzheimer's disease (AD)-related primary progressive aphasia (PPA), it was found that metabolic patterns identified via PET SPM analysis can help predict the progression of total loss of speech and functional autonomy in AD and PPA patients. This was done by comparing an MRI or CT image of the brain, together with the presence of a radioactive biomarker, against normal levels in patients without Alzheimer's disease. Apraxia is another disorder often correlated with aphasia, owing to a subset of apraxia that affects speech; specifically, this subset affects the movement of the muscles associated with speech production. Apraxia and aphasia are often correlated due to the proximity of the neural substrates associated with each disorder. Researchers concluded that there were two areas of lesion overlap between patients with apraxia and aphasia: the anterior temporal lobe and the left inferior parietal lobe.
Treatment and neuroimaging.
Evidence for positive treatment outcomes can also be quantified using neuroimaging tools. The use of fMRI and an automatic classifier can help predict language recovery outcomes in stroke patients with 86% accuracy when coupled with age and language test scores. The stimuli tested were correct and incorrect sentences, and the subject had to press a button whenever a sentence was incorrect. The fMRI data collected focused on responses in regions of interest identified in healthy subjects. Recovery from aphasia can also be quantified using diffusion tensor imaging (DTI). The arcuate fasciculus (AF) connects the right and left superior temporal lobe, premotor regions/posterior inferior frontal gyrus, and the primary motor cortex. In a study that enrolled patients in a speech therapy program, an increase in AF fibers and volume was found in patients after six weeks in the program, which correlated with long-term improvement in those patients. The results of the experiment are pictured in Figure 2. This implies that DTI can be used to quantify improvement in patients after speech and language treatment programs are applied.
Conclusion.
Neuroimaging tools serve as a useful method for determining disorder progression, quantification of cortical damage, aphasia subtype, treatment effectiveness, and differentiating diagnosis with correlated disorders. Utilization of neuroimaging tools is necessary for the progression of knowledge of aphasia and its subtypes.
Further research.
Research is currently being done using functional magnetic resonance imaging (fMRI) to witness the difference in how language is processed in normal brains vs aphasic brains. This will help researchers to understand exactly what the brain must go through in order to recover from Traumatic Brain Injury (TBI) and how different areas of the brain respond after such an injury.
Another intriguing approach being tested is that of drug therapy. Research is in progress that will hopefully uncover whether or not certain drugs might be used in addition to speech-language therapy in order to facilitate recovery of proper language function. It's possible that the best treatment for aphasia might involve combining drug treatment with therapy, instead of relying on one over the other.
One other method being researched as a potential therapeutic combination with speech-language therapy is brain stimulation. One particular method, Transcranial Magnetic Stimulation (TMS), alters brain activity in whatever area it happens to stimulate, which has recently led scientists to wonder if this shift in brain function caused by TMS might help people re-learn languages.
The research being put into aphasia has only just begun. Researchers appear to have multiple ideas on how aphasia could be more effectively treated in the future.
|
2089 | Aorta | The aorta ( ) is the main and largest artery in the human body, originating from the left ventricle of the heart, branching upwards immediately after, and extending down to the abdomen, where it splits at the aortic bifurcation into two smaller arteries (the common iliac arteries). The aorta distributes oxygenated blood to all parts of the body through the systemic circulation.
Structure.
Sections.
In anatomical sources, the aorta is usually divided into sections.
One way of classifying a part of the aorta is by anatomical compartment, where the thoracic aorta (or thoracic portion of the aorta) runs from the heart to the diaphragm. The aorta then continues downward as the abdominal aorta (or abdominal portion of the aorta) from the diaphragm to the aortic bifurcation.
Another system divides the aorta with respect to its course and the direction of blood flow. In this system, the aorta starts as the ascending aorta, travels superiorly from the heart, and then makes a hairpin turn known as the aortic arch. Following the aortic arch, the aorta then travels inferiorly as the descending aorta. The descending aorta has two parts. The aorta begins to descend in the thoracic cavity and is consequently known as the thoracic aorta. After the aorta passes through the diaphragm, it is known as the abdominal aorta. The aorta ends by dividing into two major blood vessels, the common iliac arteries and a smaller midline vessel, the median sacral artery.
Ascending aorta.
The ascending aorta begins at the opening of the aortic valve in the left ventricle of the heart. It runs through a common pericardial sheath with the pulmonary trunk. These two blood vessels twist around each other, causing the aorta to start out posterior to the pulmonary trunk, but end by twisting to its right and anterior side.
The transition from ascending aorta to aortic arch is at the pericardial reflection on the aorta.
At the root of the ascending aorta, the lumen has three small pockets between the cusps of the aortic valve and the wall of the aorta, which are called the aortic sinuses or the sinuses of Valsalva. The left aortic sinus contains the origin of the left coronary artery and the right aortic sinus likewise gives rise to the right coronary artery. Together, these two arteries supply the heart. The posterior aortic sinus does not give rise to a coronary artery. For this reason the left, right and posterior aortic sinuses are also called left-coronary, right-coronary and non-coronary sinuses.
Aortic arch.
The aortic arch loops over the left pulmonary artery and the bifurcation of the pulmonary trunk, to which it remains connected by the ligamentum arteriosum, a remnant of the fetal circulation that is obliterated a few days after birth. In addition to these blood vessels, the aortic arch crosses the left main bronchus. Between the aortic arch and the pulmonary trunk is a network of autonomic nerve fibers, the cardiac plexus or "aortic plexus". The left vagus nerve, which passes anterior to the aortic arch, gives off a major branch, the recurrent laryngeal nerve, which loops under the aortic arch just lateral to the ligamentum arteriosum. It then runs back to the neck.
The aortic arch has three major branches: from proximal to distal, they are the brachiocephalic trunk, the left common carotid artery, and the left subclavian artery. The brachiocephalic trunk supplies the right side of the head and neck as well as the right arm and chest wall, while the latter two together supply the left side of the same regions.
The aortic arch ends, and the descending aorta begins at the level of the intervertebral disc between the fourth and fifth thoracic vertebrae.
Thoracic aorta.
The thoracic aorta gives rise to the intercostal and subcostal arteries, as well as to the superior and inferior left bronchial arteries and variable branches to the esophagus, mediastinum, and pericardium. Its lowest pair of branches are the superior phrenic arteries, which supply the diaphragm, and the subcostal arteries for the twelfth rib.
Abdominal aorta.
The abdominal aorta begins at the aortic hiatus of the diaphragm at the level of the twelfth thoracic vertebra. It gives rise to lumbar and musculophrenic arteries, renal and middle suprarenal arteries, and visceral arteries (the celiac trunk, the superior mesenteric artery and the inferior mesenteric artery). It ends in a bifurcation into the left and right common iliac arteries. At the point of the bifurcation, there also springs a smaller branch, the median sacral artery.
Development.
The ascending aorta develops from the outflow tract, which initially starts as a single tube connecting the heart with the aortic arches (which will form the great arteries) in early development but is then separated into the aorta and the pulmonary trunk.
The aortic arches start as five pairs of symmetrical arteries connecting the heart with the dorsal aorta, and then undergo a significant remodelling to form the final asymmetrical structure of the great arteries, with the 3rd pair of arteries contributing to the common carotids, the right 4th forming the base and middle part of the right subclavian artery and the left 4th being the central part of the aortic arch. The smooth muscle of the great arteries and the population of cells that form the aorticopulmonary septum that separates the aorta and pulmonary artery is derived from cardiac neural crest. This contribution of the neural crest to the great artery smooth muscle is unusual as most smooth muscle is derived from mesoderm. In fact the smooth muscle within the abdominal aorta is derived from mesoderm, and the coronary arteries, which arise just above the semilunar valves, possess smooth muscle of mesodermal origin. A failure of the aorticopulmonary septum to divide the great vessels results in persistent truncus arteriosus.
Microanatomy.
The aorta is an elastic artery, and as such is quite distensible. The aorta consists of a heterogeneous mixture of smooth muscle, nerves, intimal cells, endothelial cells, fibroblast-like cells, and a complex extracellular matrix. The vascular wall is subdivided into three layers known as the tunica externa, tunica media, and tunica intima. The aorta is covered by an extensive network of tiny blood vessels called vasa vasora, which feed the tunica externa and tunica media, the outer layers of the aorta. The aortic arch contains baroreceptors and chemoreceptors that relay information concerning blood pressure and blood pH and carbon dioxide levels to the medulla oblongata of the brain. This information along with information from baroreceptors and chemoreceptors located elsewhere is processed by the brain and the autonomic nervous system mediates appropriate homeostatic responses.
Within the tunica media, smooth muscle and the extracellular matrix are quantitatively the largest components, and these are arranged concentrically as musculoelastic layers (the elastic lamellae) in mammals. The elastic lamellae, which comprise smooth muscle and elastic matrix, can be considered the fundamental structural unit of the aorta and consist of elastic fibers, collagens (predominantly type III), proteoglycans, and glycosaminoglycans. The elastic matrix dominates the biomechanical properties of the aorta. The smooth muscle component, while contractile, does not substantially alter the diameter of the aorta, but rather serves to increase the stiffness and viscoelasticity of the aortic wall when activated.
Variation.
Variations may occur in the location of the aorta, and in the way arteries branch off the aorta. The aorta, normally on the left side of the body, may be found on the right in dextrocardia, in which the heart is found on the right, or in situs inversus, in which the locations of all the organs are flipped.
Variations in the branching of individual arteries may also occur. For example, the left vertebral artery may arise from the aorta, instead of the left common carotid artery.
In patent ductus arteriosus, a congenital disorder, the fetal ductus arteriosus fails to close, leaving an open vessel connecting the pulmonary artery to the proximal descending aorta.
Function.
The aorta supplies all of the systemic circulation, which means that the entire body, except for the respiratory zone of the lung, receives its blood from the aorta. Broadly speaking, branches from the ascending aorta supply the heart; branches from the aortic arch supply the head, neck, and arms; branches from the thoracic descending aorta supply the chest (excluding the heart and the respiratory zone of the lung); and branches from the abdominal aorta supply the abdomen. The pelvis and legs get their blood from the common iliac arteries.
Blood flow and velocity.
The contraction of the heart during systole is responsible for ejection and creates a (pulse) wave that is propagated down the aorta, into the arterial tree. The wave is reflected at sites of impedance mismatching, such as bifurcations, where reflected waves rebound to return to semilunar valves and the origin of the aorta. These return waves create the dicrotic notch displayed in the aortic pressure curve during the cardiac cycle as these reflected waves push on the aortic semilunar valve. With age, the aorta stiffens such that the pulse wave is propagated faster and reflected waves return to the heart faster before the semilunar valve closes, which raises the blood pressure. The stiffness of the aorta is associated with a number of diseases and pathologies, and noninvasive measures of the pulse wave velocity are an independent indicator of hypertension. Measuring the pulse wave velocity (invasively and non-invasively) is a means of determining arterial stiffness. Maximum aortic velocity may be noted as Vmax or less commonly as AoVmax.
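The non-invasive pulse wave velocity measurement mentioned above reduces to a simple ratio of path length to pulse transit time between two measurement sites. The following is a minimal illustrative sketch; the function name and the carotid-femoral numbers are assumptions chosen for the example, not clinical reference values.

```python
# Illustrative sketch: pulse wave velocity (PWV) estimated as the
# distance between two arterial measurement sites divided by the
# pulse transit time between them. Numbers are illustrative only.

def pulse_wave_velocity(path_length_m: float, transit_time_s: float) -> float:
    """Return PWV in metres per second."""
    if transit_time_s <= 0:
        raise ValueError("transit time must be positive")
    return path_length_m / transit_time_s

# e.g. an assumed carotid-to-femoral path of 0.5 m traversed in 0.06 s
pwv = pulse_wave_velocity(0.5, 0.06)
print(round(pwv, 2))  # 8.33 (m/s)
```

Higher values correspond to a stiffer aorta, which is consistent with the text's point that stiffening propagates the pulse wave faster.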
Mean arterial pressure (MAP) is highest in the aorta, and the MAP decreases across the circulation from aorta to arteries to arterioles to capillaries to veins back to atrium. The difference between aortic and right atrial pressure accounts for blood flow in the circulation. When the left ventricle contracts to force blood into the aorta, the aorta expands. This stretching gives the potential energy that will help maintain blood pressure during diastole, as during this time the aorta contracts passively. This Windkessel effect of the great elastic arteries has important biomechanical implications. The elastic recoil helps conserve the energy from the pumping heart and smooth out the pulsatile nature created by the heart. Pressure is highest in the aorta and becomes less pulsatile and lower as blood vessels divide into arteries, arterioles, and capillaries, such that flow is slow and smooth for gas and nutrient exchange.
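As a worked illustration of the MAP gradient described above, mean arterial pressure is commonly approximated from cuff readings as diastolic pressure plus one third of the pulse pressure, reflecting the fact that the heart spends more of each cycle in diastole. This formula is a standard physiological approximation rather than something stated in the text, and the readings used are illustrative.

```python
# Illustrative sketch: bedside approximation of mean arterial pressure
# (MAP) from systolic and diastolic cuff readings. Values are examples.

def mean_arterial_pressure(systolic_mmHg: float, diastolic_mmHg: float) -> float:
    """Approximate MAP as diastolic plus one third of the pulse pressure."""
    pulse_pressure = systolic_mmHg - diastolic_mmHg
    return diastolic_mmHg + pulse_pressure / 3.0

print(mean_arterial_pressure(120, 80))  # approximately 93.3 mmHg
```

Applying the same formula to readings taken further down the arterial tree would yield progressively lower values, matching the aorta-to-vein gradient described above.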
Other animals.
All amniotes have a broadly similar arrangement to that of humans, albeit with a number of individual variations. In fish, however, there are two separate vessels referred to as aortas. The ventral aorta carries de-oxygenated blood from the heart to the gills; part of this vessel forms the ascending aorta in tetrapods (the remainder forms the pulmonary artery). A second, dorsal aorta carries oxygenated blood from the gills to the rest of the body and is homologous with the descending aorta of tetrapods. The two aortas are connected by a number of vessels, one passing through each of the gills.
Amphibians also retain the fifth connecting vessel, so that the aorta has two parallel arches.
History.
The word "aorta" stems from the Late Latin "" from Classical Greek "aortē" (), from "aeirō", "I lift, raise" (). This term was first applied by Aristotle when describing the aorta and describes accurately how it seems to be "suspended" above the heart.
The function of the aorta is documented in the Talmud, where it is noted as one of three major vessels entering or leaving the heart, and where perforation is linked to death.
|
2093 | Abimelech | Abimelech (also spelled Abimelek or Avimelech; ) was the generic name given to all Philistine kings in the Hebrew Bible from the time of Abraham through King David. In the Book of Judges, Abimelech, son of Gideon, of the Tribe of Manasseh, is proclaimed king of Shechem after the death of his father.
Etymology.
The name or title "Abimelech" is formed from Hebrew words for "father" and "king," and may be interpreted in a variety of ways, including "Father-King", "My father is king," or "Father of a king." In the Pentateuch, it is used as a title for kings in the land of Canaan.
Abimelech can be translated in Arabic as well into "My father is king", "My father is owner" or "Father of a king," where () means father or my father while () means king or () for owner.
At the time of the Amarna tablets (mid-14th century BC), there was an Egyptian governor of Tyre similarly named Abimilki.
Abimelech of Gerar.
Abimelech was most prominently the name of a polytheistic king of Gerar who is mentioned in two of the three wife–sister narratives in the Book of Genesis, in connection with both Abraham and Isaac.
King Abimelech of Gerar also appears in an extra-biblical tradition recounted in texts such as the "Arabic Apocalypse of Peter", the "Cave of Treasures" and the "Conflict of Adam and Eve with Satan", as one of twelve regional kings in Abraham's time said to have built the city of Jerusalem for Melchizedek.
Abimelech Son of Jerubbaal.
The Book of Judges mentions Abimelech, son of judge Gideon (also known as Jerubbaal). According to the biblical narrative, Abimelech was an extremely conniving and evil person. He persuaded his mother's brothers to encourage the people of Shechem to back him in a plot to overthrow his family rule and make him sole ruler.
After slaying all but one of his seventy brothers, Abimelech was crowned king. The brother who escaped, Jotham, the youngest son of Jerubbaal, made a pronouncement against Abimelech and those who had crowned him.
The curse was that if they had not dealt righteously with the family of Jerubbaal, then fire would come against Abimelech from the people of Shechem, and fire would come out of Abimelech against the people who had backed him in this bloody coup.
After Abimelech had ruled for three years, the pronouncement came to pass. The people of Shechem set robbers to lie in wait and steal any goods or money on their way to Abimelech.
Then Gaal, son of Ebed, went to Shechem and drunkenly bragged that he would remove Abimelech from the throne. Zebul, ruler of Shechem, sent word to Abimelech along with a battle strategy. Once Zebul had taunted Gaal into fighting Abimelech, he shut Gaal and his brethren out of the city.
Abimelech then slew the field workers that came out of the city of Shechem the next day. When he heard that the people of Shechem had locked themselves in a strong tower, he and his men set fire to it, killing about a thousand men and women.
After this, Abimelech went to Thebez and camped against it. When he went close to the tower in Thebez to set it on fire, a woman dropped an upper millstone on Abimelech's head. He did not want to be known as having been killed by a woman, so he asked his armour bearer to run him through with a sword. His place of death is cited as being Thebez.
Other people with this name.
Apart from the king (or kings) of Gerar, the Bible also records this name for:
Other literary references include:
|
2099 | Andrew Tridgell | Andrew "Tridge" Tridgell (born 28 February 1967) is an Australian computer programmer. He is the author of and a contributor to the Samba file server, and co-inventor of the rsync algorithm.
He has analysed complex proprietary protocols and algorithms, to allow compatible free and open source software implementations.
Projects.
Tridgell was a major developer of the Samba software, analyzing the Server Message Block protocol used for workgroup and network file sharing by Microsoft Windows products. He developed the hierarchical memory allocator, originally as part of Samba.
For his PhD dissertation, he co-developed rsync, including the rsync algorithm, a highly efficient file transfer and synchronisation tool. He was also the original author of rzip, which uses a similar algorithm to rsync. He developed spamsum, based on locality-sensitive hashing algorithms.
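The key idea of the rsync algorithm is a weak checksum that can be "rolled" through a file one byte at a time in constant time, so every block offset can be tested cheaply against the checksums of the remote file's blocks. A minimal Python sketch of that idea, using an Adler-32-style pair of sums (illustrative only; function names are invented, and this is not Tridgell's implementation):

```python
# Weak rolling checksum in the style used by rsync: two running sums
# over a fixed-size window, reducible modulo 2^16.
M = 1 << 16

def weak_checksum(block):
    """Compute the two sums over a whole block from scratch."""
    a = sum(block) % M
    b = sum((len(block) - i) * byte for i, byte in enumerate(block)) % M
    return a, b

def roll(a, b, out_byte, in_byte, block_len):
    """Slide the window one byte in O(1): drop out_byte, add in_byte."""
    a = (a - out_byte + in_byte) % M
    b = (b - block_len * out_byte + a) % M
    return a, b

data = b"hello world, rolling checksums"
n = 8
a, b = weak_checksum(data[0:n])
# Rolling the window one byte gives the same result as recomputing
# the checksum of the shifted window from scratch.
a2, b2 = roll(a, b, data[0], data[n], n)
assert (a2, b2) == weak_checksum(data[1:1 + n])
```

In the full algorithm, a weak-checksum match at some offset is then confirmed with a stronger hash before the block is reused rather than retransmitted.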
He is the author of KnightCap, a reinforcement-learning based chess engine.
Tridgell was also a leader in hacking the TiVo to make it work in Australia, where television uses the PAL video format.
In April 2005, Tridgell tried to produce free software (now known as SourcePuller) that interoperated with the BitKeeper source code repository. This was cited as the reason that BitMover revoked a license allowing Linux developers free use of their BitKeeper product. Linus Torvalds, the creator of the Linux kernel, and Tridgell were thus involved in a public debate about the events, in which Tridgell stated that, not having bought or owned BitKeeper – and thus having never agreed to its license – he could not violate it, and was analyzing the protocol ethically, as he had done with Samba. Tridgell's involvement in the project resulted in Torvalds accusing him of playing dirty tricks with BitKeeper. Tridgell claimed his analysis started with simply telnetting to a BitKeeper server and typing codice_1.
In 2011 Tridgell got involved with the software development of ArduPilot Mega, an open source Arduino-based UAV controller board, working on an entry for the UAV Challenge Outback Rescue.
Academic achievements.
Tridgell completed a PhD at the Computer Sciences Laboratory of the Australian National University. His original doctoral work was in the area of speech recognition but was never completed. His submitted dissertation 'Efficient Algorithms for Sorting and Synchronization' was based on his work on the rsync algorithm.
|
2100 | Applesoft BASIC | Applesoft BASIC is a dialect of Microsoft BASIC, developed by Marc McDonald and Ric Weiland, supplied with the Apple II series of computers. It supersedes Integer BASIC and is the BASIC in ROM in all Apple II series computers after the original Apple II model. It is also referred to as FP BASIC (from floating point) because of the Apple DOS command used to invoke it, instead of codice_1 for Integer BASIC.
Applesoft BASIC was supplied by Microsoft and its name is derived from the names of both Apple Computer and Microsoft. Apple employees, including Randy Wigginton, adapted Microsoft's interpreter for the Apple II and added several features. The first version of Applesoft was released in 1977 on cassette tape and lacked proper support for high-resolution graphics. Applesoft II, which was made available on cassette and disk and in the ROM of the Apple II Plus and subsequent models, was released in 1978. It is this latter version, which has some syntax differences and support for the Apple II high-resolution graphics modes, that is usually synonymous with the term "Applesoft."
A compiler for Applesoft BASIC, "TASC" (The Applesoft Compiler), was released by Microsoft in 1981.
History.
When Steve Wozniak wrote Integer BASIC for the Apple II, he did not implement support for floating-point arithmetic because he was primarily interested in writing games, a task for which integers alone were sufficient. In 1976, Microsoft had developed Microsoft BASIC for the MOS Technology 6502, but at the time there was no production computer that used it. Upon learning that Apple had a 6502 machine, Microsoft asked if the company were interested in licensing BASIC, but Steve Jobs replied that Apple already had one.
The Apple II was unveiled to the public at the West Coast Computer Faire in April 1977 and became available for sale in June. One of the most common customer complaints about the computer was BASIC's lack of floating-point math. Compounding the problem, the rival Commodore PET personal computer had shipped with a floating-point-capable BASIC interpreter from the beginning. As Wozniak, the only person who understood Integer BASIC well enough to add floating-point features, was busy with the Disk II drive and controller and with Apple DOS, Apple turned to Microsoft.
Apple reportedly obtained an eight-year license for Applesoft BASIC from Microsoft for a flat fee of $31,000, renewing it in 1985 through an arrangement that gave Microsoft the rights and source code for Apple's Macintosh version of BASIC. Applesoft was designed to be backwards-compatible with Integer BASIC and uses the core of Microsoft's 6502 BASIC implementation, which includes using the GET command for detecting key presses and not requiring any spaces on program lines. While Applesoft BASIC is slower than Integer BASIC, it has many features that the older BASIC lacks:
Conversely, Applesoft lacks the codice_9 (remainder) operator from Integer BASIC.
Adapting BASIC for the Apple II was a tedious job, as the source listing Apple received for Microsoft 6502 BASIC proved to be buggy and also required the addition of Integer BASIC commands. Since Apple had no 6502 assembler on hand, the development team was forced to send the source code over the phone lines to Call Computer, an outfit that offered compiler services. This was an extremely slow process, and after Call Computer lost the source code due to an equipment malfunction, one of the programmers, Cliff Huston, used his own IMSAI 8080 computer to cross-assemble the BASIC source.
Features.
Applesoft is similar to Commodore's BASIC 2.0 aside from features inherited from Integer BASIC. There are a few minor differences such as Applesoft's lack of bitwise operators; otherwise most BASIC programs that do not use hardware-dependent features will run on both BASICs.
The statement redirects output to an expansion card, and redirects input from an expansion card. The slot number of the card is specified after the or within the statement. The computer locks up if there is no card present in the slot. restores output to the 40-column screen and to the keyboard.
The statement can be used to redirect output to the printer (e.g. ) where x is the slot number containing the printer port card. To send a BASIC program listing to the printer, the user types .
Using on a slot with a disk drive (usually in slot 6) causes Applesoft to boot the disk drive. Using on a slot with an 80 column card (usually in slot 1) switches to 80 column text mode.
As with Commodore BASIC, numeric variables are stored as 40-bit floating point; each variable requires five bytes of memory. The programmer may designate variables as integer by following them with a percent sign, in which case they use two bytes and are limited to a range of -32768 to 32767. However, BASIC internally converts them back to floating point when performing calculations, and each percent sign also takes an additional byte of program code, so in practice this feature is only useful for reducing the memory usage of large array variables; it offers no performance benefit.
The RND function generates a pseudorandom fractional number between 0 and 1. returns the most recently generated random number. with a negative number will jump to a point in the sequence determined by the particular negative number used. RND with any positive value generates the next number in the sequence, not dependent on the actual value given.
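The contract described above can be sketched in Python, using the standard random module as a stand-in for Applesoft's generator (a hypothetical mapping for illustration, not the actual ROM routine):

```python
# Sketch of the RND contract: a negative argument reseeds the sequence
# deterministically, zero repeats the last result, and any positive
# argument yields the next number regardless of its value.
import random

_last = 0.0

def rnd(x):
    global _last
    if x < 0:
        random.seed(x)      # negative: jump to a seed-determined point
    elif x == 0:
        return _last        # zero: repeat the most recent number
    _last = random.random() # negative or positive: produce a number
    return _last

a = rnd(-3)                 # reseed, then draw the first value
b = rnd(1)                  # next value in the sequence
assert rnd(0) == b          # RND(0) repeats the last value
assert rnd(-3) == a         # the same negative seed restarts the run
```

The value of a positive argument is ignored here, mirroring the description that RND with any positive value simply advances the sequence.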
Like other implementations of Microsoft BASIC, Applesoft discards spaces (outside of strings and comments) on program lines. codice_10 adds spaces when displaying code for the sake of readability. Since adds a space before and after every tokenized keyword, it often produces two spaces in a row where one would suffice for readability.
The default prompt for codice_11 is a question mark. codice_12 does not add a leading space in front of numbers.
Limitations.
Through several early models of the Apple II, Applesoft BASIC did not support the use of lowercase letters in programs, except in strings. codice_12 is a valid command but codice_14 and codice_15 result in a syntax error.
Applesoft lacks several commands and functions common to most of the non-6502 Microsoft BASIC interpreters, such as:
Applesoft does not have commands for file or disk handling, other than to save and load programs via cassette tape. The Apple II disk operating system, known simply as DOS, augments the language to provide such abilities.
Only the first two letters of variable names are significant. For example, "LOW" and "LOSS" are treated as the same variable, and attempting to assign a value to "LOSS" overwrites any value assigned to "LOW". A programmer also has to avoid consecutive letters that spell out an Applesoft command or operator. The name "SCORE" for a variable is interpreted as containing the codice_20 Boolean operator, rendered as codice_21, and "BACKGROUND" contains codice_22, the command to invoke the low-resolution graphics mode, resulting in a syntax error.
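The two-letter rule can be illustrated with a toy symbol table in Python (a simulation of the described behavior, with invented helper names, not Applesoft's actual code):

```python
# Simulate an Applesoft-style symbol table that keys each variable on
# only the first two characters of its (case-folded) name.
def applesoft_key(name):
    return name.upper()[:2]

variables = {}

def assign(name, value):
    variables[applesoft_key(name)] = value

def lookup(name):
    return variables[applesoft_key(name)]

assign("LOW", 10)
assign("LOSS", 25)    # same key "LO": silently overwrites LOW
print(lookup("LOW"))  # prints 25, not 10
```

The collision is silent, which is what makes it a pitfall: no error is reported, and the program simply computes with the wrong value.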
Sound and graphics.
The only built-in sound support is the option to codice_12 an ASCII bell character to sound the system alert beep.
Applesoft supports drawing in the Apple II's low resolution and high resolution modes. There are commands to plot pixels and draw horizontal and vertical lines in low resolution. High resolution allows arbitrary lines and vector-based shape tables for drawing scaled and rotated objects. The only provision for mixing text and graphics is the four lines of text at the bottom of a graphic display.
Beginning with the Apple IIe, a "double-high-resolution" mode became available on machines with 128 KB of memory. This mode doubles the horizontal resolution of the original high-resolution mode, while making all 16 colors of the low-resolution palette available. Applesoft does not provide direct support for this mode. Apple IIGS-specific modes are likewise not supported.
Extensions.
Applesoft BASIC can be extended by two means: the ampersand () command and the function. These are two features that call low-level machine-language routines stored in memory, which is useful for routines that need to be fast or require direct access to arbitrary functions or data in memory. The function takes one argument, and can be programmed to derive and return a calculated function value to be used in a numerical expression. is effectively a shorthand for , with an address that is predefined. By calling routines in the Applesoft ROM, it is possible for ampersand routines to parse values that follow the ampersand. Numerous third-party commercial packages were available to extend Applesoft using ampersand routines.
Bugs.
A deficiency with error-trapping via codice_24 means that the system stack is not reset if an error-handling routine does not invoke codice_25, potentially leading to a crash. The built-in pseudorandom number generator function codice_26 is capable of producing a predictable series of outputs due to the manner in which the generator is seeded when first powering on. This behavior is contrary to how Apple's documentation describes the function.
Performance.
Wozniak originally referred to his Integer BASIC as "Game BASIC" (having written it so he could implement a "Breakout" clone for his new computer). Few action games were written in Applesoft BASIC, in large part because the use of floating-point numbers for all math operations degrades performance.
Applesoft BASIC programs are stored as a linked list of lines, so executing a codice_27 or codice_28 requires the interpreter to scan the program from the beginning for the target line, taking linear time. Some programs place their subroutines at the top to reduce the time spent finding them.
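The cost can be sketched in Python as a linked list that must be walked from its head on every jump (a simulation of the storage scheme, not the interpreter's actual code):

```python
# Each program line holds its number, its code, and a pointer to the
# next line, mirroring Applesoft's in-memory format.
class Line:
    def __init__(self, number, code):
        self.number = number
        self.code = code
        self.next = None

def build_program(lines):
    head = prev = None
    for number, code in lines:
        node = Line(number, code)
        if prev is None:
            head = node
        else:
            prev.next = node
        prev = node
    return head

def goto(head, target):
    """Find a line the way the interpreter does: scan from the start."""
    steps = 0
    node = head
    while node is not None:
        steps += 1
        if node.number == target:
            return node, steps
        node = node.next
    raise ValueError("?UNDEF'D STATEMENT ERROR")

prog = build_program([(10, "GOSUB 1000"), (20, "END"), (1000, "RETURN")])
node, steps = goto(prog, 1000)   # scans all 3 lines to reach line 1000
```

A subroutine stored near the top of the program is reached in fewer steps per call, which is exactly the optimization some programs used.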
Unlike Integer BASIC, Applesoft does not convert literal numbers (like 100) in the source code to binary when a line is entered. Rather, the ASCII string is converted whenever the line is executed. Since variable lookup is often faster than this conversion, it can be faster to store numeric constants used inside loops in variables before the loop is entered.
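That difference can be simulated in Python by counting how often a stand-in for the interpreter's ASCII-to-float routine runs (illustrative only; `parse_literal` is an invented name):

```python
# A loop body containing a literal like 100 re-parses its ASCII digits
# on every iteration; hoisting the constant into a variable parses once.
conversions = 0

def parse_literal(text):
    """Stand-in for the interpreter's ASCII-to-float conversion."""
    global conversions
    conversions += 1
    return float(text)

# Literal inside the loop: converted on every pass.
s = 0.0
for _ in range(1000):
    s += parse_literal("100")
assert conversions == 1000

# Constant hoisted into a variable: converted once, then looked up.
conversions = 0
c = parse_literal("100")
for _ in range(1000):
    s += c
assert conversions == 1
```

In Applesoft the variable lookup is usually cheaper than the digit conversion, which is why the hoisted form runs faster inside hot loops.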
Sample code.
Hello World in Applesoft BASIC can be entered as the following:
10TEXT:HOME
20?"HELLO WORLD"
Multiple commands can be included on the same line of code if separated by a colon (codice_29). The codice_30 can be used in Applesoft BASIC (and almost all versions of Microsoft BASIC) as a shortcut for "PRINT", though spelling out the word is not only acceptable but canonical. Applesoft converts "?" in entered programs to the same token as "PRINT" (so no memory is actually saved by using "?"), and either form appears as "PRINT" when a program is listed. The program above appears in a codice_10 command as:
10 TEXT : HOME
20 PRINT "HELLO WORLD"
When Applesoft II BASIC was initially released in mid-1978, it came on cassette tape and could be loaded into memory via the Apple II's machine language monitor. When the enhanced Apple II+ replaced the original II in 1979, Applesoft was now included in ROM and automatically started on power-up if no bootable floppy disk was present. Conversely, Integer BASIC was now removed from ROM and turned into an executable file on the DOS 3.3 disk.
Early evolution.
The original Applesoft, stored in RAM as documented in its Reference Manual of November 1977, has smaller interpreter code than the later Applesoft II, occupying 8½ KB of memory instead of the later version's 10 KB. Consequently, it lacks a number of command features developed for the later, mainstream version:
as well as several features the later version would have that had already been present in Apple's Integer BASIC:
In addition, its low-resolution graphics commands have different names from their Integer BASIC/Applesoft II counterparts. All command names are of the form PLTx such that GR, COLOR=, PLOT, HLIN and VLIN are called PLTG, PLTC, PLTP, PLTH, and PLTV, respectively. The command for returning to text mode, known as TEXT in other versions, is simply TEX, and carries the proviso that it has to be the last statement in a program line.
Applesoft BASIC 1.x was closer to Microsoft's original 6502 BASIC code than the later Applesoft II; it retained the Memory Size? prompt and displayed a Microsoft copyright notice. To maintain consistency with Integer BASIC, the "Ok" prompt from Microsoft's code was replaced by a ] character. Applesoft 1.x also asked the user upon loading whether they wished to disable the REM statement and the LET keyword in assignment statements in exchange for lo-res graphics commands.
The USR() function is also defined differently, serving as a stand-in for the absent CALL command. Its argument is not a numerical value to pass to the machine-language routine, but the call address of the routine itself; there is no "hook" to pre-define the address. The manual's examples use the function only to access system monitor ROM routines, or short user routines that manipulate the ROM routines. No mention is made of any code to calculate the value returned by the function itself; the function is always shown being assigned to "dummy" variables, which, absent user code that sets a value, merely receive a meaningless value handed back to them. Even accessed ROM routines that return values (in the examples, those providing the services of the PDL() and SCRN() functions) simply have their values stored by user routines in locations that are separately PEEKed in a subsequent statement.
Unlike in Integer BASIC and Applesoft II, the Boolean operators AND, OR and NOT perform bitwise operations on 16-bit integer values. If they are given values outside that range, an error results.
The terms OUT and PLT (and the aforementioned IN) appear in the list of reserved words, but are not explained anywhere in the manual.
Related BASICs.
Coleco claimed that its Adam home computer's SmartBASIC was source-code compatible with Applesoft.
Microsoft licensed a BASIC compatible with Applesoft to VTech for its Laser 128 clone.
References.
This article includes text from Everything2, licensed under GFDL.
|
2101 | Asterix | Asterix or The Adventures of Asterix ( or , "Asterix the Gaul") is a "bande dessinée" comic book series about a village of indomitable Gaulish warriors who adventure around the world and fight the Roman Republic, with the aid of a magic potion, during the era of Julius Caesar, in an ahistorical telling of the time after the Gallic Wars. The series first appeared in the Franco-Belgian comic magazine "Pilote" on 29 October 1959. It was written by René Goscinny and illustrated by Albert Uderzo until Goscinny's death in 1977. Uderzo then took over the writing until 2009, when he sold the rights to publishing company Hachette; he died in 2020. In 2013, a new team consisting of Jean-Yves Ferri (script) and Didier Conrad (artwork) took over. A total of 39 volumes have been released, the most recent in October 2021.
Description.
Asterix comics usually start with the following introduction: " The year is 50 BC. Gaul is entirely occupied by the Romans. Well, not entirely... One small village of indomitable Gauls still holds out against the invaders. And life is not easy for the Roman legionaries who garrison the fortified camps of Totorum, Aquarium, Laudanum and Compendium..." The series follows the adventures of a village of Gauls as they resist Roman occupation in 50 BC. They do so using a magic potion, brewed by their druid Getafix (Panoramix in the French version), which temporarily gives the recipient superhuman strength. The protagonists, the title character Asterix and his friend Obelix, have various adventures. The "-ix" ending of both names (as well as all the other pseudo-Gaulish "-ix" names in the series) alludes to the "-rix" suffix (meaning "king", like "-rex" in Latin) present in the names of many real Gaulish chieftains such as Vercingetorix, Orgetorix, and Dumnorix.
In some of the stories they travel to foreign countries, whilst other tales are set in and around their village. For much of the history of the series (volumes 4 through 29), settings in Gaul and abroad alternate, with even-numbered volumes set abroad and odd-numbered volumes set in Gaul, mostly in the village.
The "Asterix" series is one of the most popular Franco-Belgian comics in the world, with the series translated into 111 languages and dialects.
The success of the series has led to the adaptation of its books into 15 films: ten animated, and five live action (two of which, "" and "Asterix and Obelix vs. Caesar", were major box office successes in France). There have also been a number of games based on the characters, and a theme park near Paris, Parc Astérix. The very first French satellite, Astérix, launched in 1965, was named after the character, whose name is close to Greek ἀστήρ and Latin "astrum", meaning a "star". As of 20 April 2022, 385 million copies of "Asterix" books had been sold worldwide and translated into 111 languages, making it the world's most widely translated comic book series, with co-creators René Goscinny and Albert Uderzo being France's best-selling authors abroad.
In April 2022, Céleste Surugue, the general director of Les Éditions Albert René, hosted a 45-minute talk entitled "The Next Incarnation of a Heritage Franchise: Asterix" about the success of the Asterix franchise, in which he noted: "The idea was to find a subject with a strong connection with French culture and, while looking at the country's history, they ended up choosing its first defeat, namely the Gaul's Roman colonisation". He also said that since 1989 Parc Astérix has attracted an average of 2.3 million visitors per year, and that the franchise includes 10 animated movies, which have recorded over 53 million viewers worldwide. The inception of Studios Idefix in 1974 and the opening of Studio 58 in 2016 were among the steps needed to make Asterix a "100% Gaulish production", considered the best way to keep the creative process under control from start to finish and to employ French manpower. He also noted that a new album is now published every two years, with print figures of 5 million and an estimated readership of 20 million.
History.
Prior to creating the "Asterix" series, Goscinny and Uderzo had had success with their series "Oumpah-pah", which was published in "Tintin" magazine.
"Astérix" was originally serialised in "Pilote" magazine, debuting in the first issue on 29 October 1959. In 1961 the first book was put together, titled "Asterix the Gaul". From then on, books were released generally on a yearly basis. Their success was exponential; the first book sold 6,000 copies in its year of publication; a year later, the second sold 20,000. In 1963, the third sold 40,000; the fourth, released in 1964, sold 150,000. A year later, the fifth sold 300,000; 1966's "Asterix and the Big Fight" sold 400,000 upon initial publication. The ninth "Asterix" volume, when first released in 1967, sold 1.2 million copies in two days.
Uderzo's first preliminary sketches portrayed Asterix as a huge and strong traditional Gaulish warrior. But Goscinny had a different picture in his mind, visualizing Asterix as a shrewd, compact warrior who would possess intelligence and wit more than raw strength. However, Uderzo felt that the downsized hero needed a strong but dim companion, to which Goscinny agreed. Hence, Obelix was born. Despite the growing popularity of "Asterix" with the readers, the financial backing for the publication "Pilote" ceased. "Pilote" was taken over by Georges Dargaud.
When Goscinny died in 1977, Uderzo continued the series by popular demand of the readers, who implored him to continue. He continued to issue new volumes of the series, but on a less frequent basis. Many critics and fans of the series prefer the earlier collaborations with Goscinny. Uderzo created his own publishing company, Éditions Albert René, which published every album drawn and written by Uderzo alone since then. However, Dargaud, the initial publisher of the series, kept the publishing rights on the first 24 albums made by both Uderzo and Goscinny. In 1990, the Uderzo and Goscinny families decided to sue Dargaud to take over the rights. In 1998, after a long trial, Dargaud lost the rights to publish and sell the albums. Uderzo decided to sell these rights to Hachette instead of Albert-René, but the publishing rights on new albums were still owned by Albert Uderzo (40%), Sylvie Uderzo (20%) and Anne Goscinny (40%).
In December 2008, Uderzo sold his stake to Hachette, which took over the company. In a letter published in the French newspaper "Le Monde" in 2009, Uderzo's daughter, Sylvie, attacked her father's decision to sell the family publishing firm and the rights to produce new "Astérix" adventures after his death. She said:
... the co-creator of "Astérix", France's comic strip hero, has betrayed the Gaulish warrior to the modern-day Romans – the men of industry and finance.
However, René Goscinny's daughter, Anne, also gave her agreement to the continuation of the series and sold her rights at the same time. She is reported to have said that ""Asterix" has already had two lives: one during my father's lifetime and one after it. Why not a third?". A few months later, Uderzo appointed three illustrators, who had been his assistants for many years, to continue the series. In 2011, Uderzo announced that a new "Asterix" album was due out in 2013, with Jean-Yves Ferri writing the story and Frédéric Mébarki drawing it. A year later, in 2012, the publisher Albert-René announced that Frédéric Mébarki had withdrawn from drawing the new album, due to the pressure he felt in following in the steps of Uderzo. Comic artist Didier Conrad was officially announced to take over drawing duties from Mébarki, with the due date of the new album in 2013 unchanged.
In January 2015, after the murders of seven cartoonists at the satirical Paris weekly "Charlie Hebdo", "Astérix" creator Albert Uderzo came out of retirement to draw two "Astérix" pictures honouring the memories of the victims.
List of titles.
Numbers 1–24, 32 and 34 are by Goscinny and Uderzo. Numbers 25–31 and 33 are by Uderzo alone. Numbers 35–39 are by Jean-Yves Ferri and Didier Conrad. Years stated are for their initial album release.
"Asterix Conquers Rome" is a comics adaptation of the animated film "The Twelve Tasks of Asterix". It was released in 1976 as the 23rd volume to be published, but it has been rarely reprinted and is not considered canonical to the series. Its only English translation appeared in the "Asterix Annual 1980"; it was never published as a standalone English volume. A picture-book version of the same story was published in English translation as "The Twelve Tasks of Asterix" by Hodder & Stoughton in 1978.
In 1996, a tribute album in honour of Albert Uderzo was released titled "Uderzo Croqué par ses Amis", a volume containing 21 short stories with Uderzo in Ancient Gaul. This volume was published by Soleil Productions and has not been translated into English.
In 2007, Éditions Albert René released a tribute volume titled "Astérix et ses Amis", a 60-page volume of one-to-four-page short stories. It was a tribute to Albert Uderzo on his 80th birthday by 34 European cartoonists. The volume was translated into nine languages, but it has not been translated into English.
In 2016, the French publisher Hachette, along with Anne Goscinny and Albert Uderzo, decided to make the special issue album "The XII Tasks of Asterix" for the 40th anniversary of the film "The Twelve Tasks of Asterix". There was no English edition.
Synopsis and characters.
The main setting for the series is an unnamed coastal village, rumoured to be inspired by Erquy in Armorica (present-day Brittany), a province of Gaul (modern France), in the year 50 BC. Julius Caesar has conquered nearly all of Gaul for the Roman Empire during the Gallic Wars. The little Armorican village, however, has held out because the villagers can gain temporary superhuman strength by drinking a magic potion brewed by the local village druid, Getafix. The village chief is Vitalstatistix.
The main protagonist and hero of the village is Asterix, who, because of his shrewdness, is usually entrusted with the most important affairs of the village. He is aided in his adventures by his rather corpulent and slower thinking friend, Obelix, who, because he fell into the druid's cauldron of the potion as a baby, has permanent superhuman strength (because of this, Getafix steadfastly refuses to allow Obelix to drink the potion, as doing so would have a dangerous and unpredictable result, as shown in Asterix and Obelix All at Sea). Obelix is usually accompanied by Dogmatix, his little dog. (Except for Asterix and Obelix, the names of the characters change with the language. For example, Obelix's dog's name is "Idéfix" in the original French edition.)
Asterix and Obelix (and sometimes other members of the village) go on various adventures both within the village and in far away lands. Places visited in the series include parts of Gaul (Lutetia, Corsica etc.), neighbouring nations (Belgium, Spain, Britain, Germany etc.), and far away lands (North America, Middle East, India etc.).
The series employs science-fiction and fantasy elements in the more recent books; for instance, the use of extraterrestrials in "Asterix and the Falling Sky" and the city of Atlantis in "Asterix and Obelix All at Sea".
With rare exceptions, the albums end with a big banquet gathering the village's inhabitants, with the sole exception of the bard Cacofonix, who is restrained and gagged to prevent him from singing (though in "Asterix and the Normans" it is the blacksmith Fulliautomatix who is tied up). The banquets are mostly held under the stars in the village, where roast boar is devoured and all but one join the merrymaking. There are, however, a few exceptions, such as in "Asterix and Cleopatra".
Humour.
The humour encountered in the "Asterix" comics often centers around puns, caricatures, and tongue-in-cheek stereotypes of contemporary European nations and French regions. Much of the humour in the initial Asterix books was French-specific, which delayed the translation of the books into other languages for fear of losing the jokes and the spirit of the story. Some translations have actually added local humour: In the Italian translation, the Roman legionaries are made to speak in 20th-century Roman dialect, and Obelix's famous "Ils sont fous, ces Romains" ("These Romans are crazy") is translated properly as "Sono pazzi questi romani", humorously alluding to the Roman abbreviation "SPQR". In another example: Hiccups are written onomatopoeically in French as "hips", but in English as "hic", allowing Roman legionaries in more than one of the English translations to decline their hiccups absurdly in Latin ("hic, haec, hoc"). The newer albums share a more universal humour, both written and visual.
Character names.
All the fictional characters in "Asterix" have names which are puns on their roles or personalities, and which follow certain patterns specific to nationality. Certain rules are followed (most of the time) such as Gauls (and their neighbours) having an "-ix" suffix for the men and ending in "-a" for the women; for example, Chief Vitalstatistix (so called due to his portly stature) and his wife Impedimenta (often at odds with the chief). The male Roman names end in "-us", echoing Latin nominative male singular form, as in Gluteus Maximus, a muscle-bound athlete whose name is literally the butt of the joke. Gothic names (present-day Germany) end in "-ic", after Gothic chiefs such as Alaric and Theoderic; for example Rhetoric the interpreter. Greek names end in "-os" or "-es"; for example, Thermos the restaurateur. British names usually end in "-ax" or "-os" and are often puns on the taxation associated with the later United Kingdom; examples include Mykingdomforanos, a British tribal chieftain, Valuaddedtax the druid, and Selectivemploymentax the mercenary. Names of Normans end with "-af", for example Nescaf or Cenotaf. Egyptian characters often end in "-is", such as the architects Edifis and Artifis, and the scribe Exlibris. Indic names, apart from the only Indic female characters Orinjade and Lemuhnade, exhibit considerable variation; examples include Watziznehm, Watzit, Owzat, and Howdoo. Other nationalities are treated to pidgin translations from their language, like Huevos y Bacon, a Spanish chieftain (whose name, meaning eggs and bacon, is often guidebook Spanish for tourists), or literary and other popular media references, like Dubbelosix (a sly reference to James Bond's codename "007").
Most of these jokes, and hence the names of the characters, are specific to the translation; for example, the druid named Getafix in English translation - "get a fix", referring to the character's role in dispensing the magic potion - is "Panoramix" in the original French and "Miraculix" in German. Even so, occasionally the wordplay has been preserved: Obelix's dog, known in the original French as "Idéfix" (from "idée fixe", a "fixed idea" or obsession), is called "Dogmatix" in English, which not only renders the original meaning strikingly closely ("dogmatic") but in fact adds another layer of wordplay with the syllable "Dog-" at the beginning of the name.
The name "Asterix", French "Astérix", comes from "astérisque", meaning "asterisk", the typographical symbol "*" indicating a footnote, ultimately from the Greek word ἀστήρ ("aster"), meaning "star". His name is usually left unchanged in translations, aside from accents and the use of local alphabets. For example, in Esperanto, Polish, Slovene, Latvian, and Turkish it is "Asteriks" (in Turkish he was first named "Bücür", meaning "shorty", but the name was then standardised). Two exceptions include Icelandic, in which he is known as "Ástríkur" ("Rich of love"), and Sinhala, where he is known as "Soora Pappa", which can be interpreted as "Hero". The name "Obelix" ("Obélix") may refer to "obelisk", a stone column from ancient Egypt, but also to another typographical symbol, the obelisk or obelus ("†").
For explanations of some of the other names, see List of "Asterix" characters.
Ethnic stereotypes.
Many of the "Asterix" adventures take place in other countries away from their homeland in Gaul. In every album that takes place abroad, the characters meet (usually modern-day) stereotypes for each country, as seen by the French.
When the Gauls encounter foreigners speaking their own languages, these languages are given different visual representations in the cartoon speech bubbles.
Translations.
The various volumes have been translated into more than 100 languages and dialects. Besides the original French language, most albums are available in Bengali, Estonian, English, Czech, Dutch, German, Galician, Danish, Icelandic, Norwegian, Swedish, Finnish, Spanish, Catalan, Basque, Portuguese, Italian, Greek, Hungarian, Polish, Romanian, Turkish, Slovene, Bulgarian, Serbian, Croatian, Latvian, Welsh, and also in Latin.
Some books have also been translated into languages including Esperanto, Scottish Gaelic, Irish, Scots, Indonesian, Persian, Mandarin, Korean, Japanese, Bengali, Afrikaans, Arabic, Hindi, Hebrew, Frisian, Romansch, Vietnamese, Sinhala, Ancient Greek, and Luxembourgish.
In Europe, several volumes were translated into a variety of regional languages and dialects, such as Alsatian, Breton, Chtimi (Picard), and Corsican in France; Bavarian, Swabian, and Low German in Germany; and the Savo, Karelian, and Rauma dialects and Helsinki slang in Finland. In Portugal, a special edition of the first volume, Asterix the Gaul, was translated into the local Mirandese language. In Greece, a number of volumes have appeared in the Cretan Greek, Cypriot Greek, and Pontic Greek dialects. In the Italian version, while the Gauls speak standard Italian, the legionaries speak the Romanesco dialect of Rome. In the former Yugoslavia, the "Forum" publishing house translated the Corsican text in "Asterix in Corsica" into the Montenegrin dialect of Serbo-Croatian (today called Montenegrin).
In the Netherlands, several volumes were translated into West Frisian, a Germanic language spoken in the province of Friesland; into Limburgish, a regional language spoken not only in Dutch Limburg but also in Belgian Limburg and North Rhine-Westphalia, Germany; and into Tweants, a dialect in the region of Twente in the eastern province of Overijssel. Hungarian-language books were published in Yugoslavia for the Hungarian minority living in Serbia. Although not translated into a fully autonomous dialect, the books differ slightly from the language of the books issued in Hungary. In Sri Lanka, the cartoon series was adapted into Sinhala as "Sura Pappa".
Most volumes have been translated into Latin and Ancient Greek, with accompanying teachers' guides, as a way of teaching these ancient languages.
English translation.
Before Asterix became famous, translations of some strips were published in British comics including "Valiant", "Ranger", and "Look & Learn", under the names "Little Fred and Big Ed" and "Beric the Bold", set in Roman-occupied Britain. These were included in an exhibition on Goscinny's life and career, and on Asterix, held at London's Jewish Museum in 2018.
In 1970 William Morrow published English translations in hardback of three Asterix albums for the American market. These were "Asterix the Gaul", "Asterix and Cleopatra" and "Asterix the Legionary". Lawrence Hughes in a letter to "The New York Times" stated, "Sales were modest, with the third title selling half the number of the first. I was publisher at the time, and Bill Cosby tried to buy film and television rights. When that fell through, we gave up the series."
The first 33 Asterix albums were translated into English by Anthea Bell and Derek Hockridge (including the three volumes reprinted by William Morrow), who were widely praised for maintaining the spirit and humour of the original French versions. Hockridge died in 2013, so Bell translated books 34 to 36 by herself, before retiring in 2016 for health reasons. She died in 2018. Adriana Hunter became translator.
US publisher Papercutz in December 2019 announced it would begin publishing "all-new more American translations" of the Asterix books, starting on 19 May 2020. The launch was postponed to 15 July 2020 as a result of the COVID-19 pandemic. The new translator is Joe Johnson, a professor of French and Spanish at Clayton State University.
Adaptations.
The series has been adapted into various media. There are 18 films, 15 board games, 40 video games, and 1 theme park.
Television series.
On 17 November 2018, a computer-animated series of 52 eleven-minute episodes centred around Dogmatix was announced to be in production by Studio 58 and Futurikon for broadcast on France Télévisions in 2020. On 21 December 2020, it was confirmed that "Dogmatix and the Indomitables" had been pushed back to fall 2021, with o2o Studio producing the animation. The show is distributed globally by LS Distribution. The series premiered on the Okoo streaming service on 2 July before beginning its linear broadcast on France 4 on 28 August 2021.
On 3 March 2021, it was announced that Asterix the Gaul would star in a new Netflix animated series directed by Alain Chabat. The series will be adapted from one of the classic volumes, "Asterix and the Big Fight", in which the Romans, after being constantly embarrassed by Asterix and his village cohorts, organize a brawl between rival Gaulish chiefs and try to fix the result by kidnapping a druid along with his much-needed magic potion. The CG-animated series will debut in 2023.
Games.
Many gamebooks, board games and video games are based upon the "Asterix" series. In particular, many video games were released by various computer game publishers.
Theme park.
Parc Astérix, a theme park 22 miles north of Paris, based upon the series, was opened in 1989. It is one of the most visited sites in France, with around 2.3 million visitors per year.
Arizona Cardinals.
The Arizona Cardinals are a professional American football team based in the Phoenix metropolitan area. The Cardinals compete in the National Football League (NFL) as a member of the National Football Conference (NFC) West division, and play their home games at State Farm Stadium in Glendale, a suburb northwest of Phoenix.
The team was established in Chicago in 1898 as the Morgan Athletic Club, and joined the NFL as a charter member on September 17, 1920. The Cardinals are the oldest continuously run professional football franchise in the United States, as well as one of only two NFL charter member franchises still in operation since the league's founding, the other being the Chicago Bears (the Green Bay Packers were an independent team and did not join the NFL until 1921, a year after its creation). The team moved to St. Louis in 1960 and played there until 1987. In St. Louis, the team was commonly referred to as the "Football Cardinals", the "Gridbirds" or the "Big Red" to avoid confusion with Major League Baseball's (MLB) St. Louis Cardinals. Before the 1988 season, the team moved west to Tempe, Arizona, a suburb east of Phoenix, and played their home games for the next 18 seasons at Sun Devil Stadium on the campus of Arizona State University. In 2006, the team moved to their current home field in suburban Glendale, although their executive offices and training facility remain in Tempe. From 1988 to 2012 (except 2005, when they trained in Prescott), the Cardinals conducted their annual summer training camp at Northern Arizona University in Flagstaff. The Cardinals moved their training camp to State Farm Stadium (then University of Phoenix Stadium) in 2013.
The Cardinals have won two NFL championships, both while the team was in Chicago. The first came in 1925, but is the subject of controversy, with supporters of the Pottsville Maroons believing that Pottsville should have won the title. Their second, and the first to be won in a championship game, came in 1947, nearly two decades before the first Super Bowl. They returned to the title game to defend their championship in 1948, but lost the rematch 7–0 in a snowstorm in Philadelphia.
Since winning the championship in 1947, the team has suffered many losing seasons, and currently holds the longest active championship drought in North American sports at 75 consecutive seasons. In 2012, the Cardinals became the first NFL franchise to lose 700 games since its inception. The team's all-time win–loss record (including regular season and playoff games) at the conclusion of the 2022 season is ( in the regular season, in the playoffs). They have been to the playoffs eleven times and have won seven playoff games, three of which were victories during their run in the 2008–09 NFL playoffs. During that season, they won their only NFC Championship Game since the 1970 AFL–NFL merger, and reached Super Bowl XLIII in 2009, losing 27–23 to the Pittsburgh Steelers. The team has also won five division titles (, , , and ) since their 1947–48 NFL championship game appearances. The Cardinals are the only NFL team that has never lost a playoff game at home, with a 5–0 record: the 1947 NFL Championship Game, two postseason victories during the aforementioned 2008–09 NFL playoffs, one during the 2009–10 playoffs, and one during the 2015–16 playoffs. The Cardinals have a total of six playoff appearances, three division titles, and one NFC championship in their 35 seasons since relocating to the Valley of the Sun in 1988.
Franchise history.
Chicago Cardinals (1920–1959).
The franchise's inception dates back to 1898, when a neighborhood group gathered to play on the South Side of Chicago, calling themselves the Morgan Athletic Club. Chicago painting and building contractor Chris O'Brien acquired the team, which he relocated to Normal Field on Racine Avenue. The team was known as the Racine Normals until 1901, when O'Brien bought used jerseys from the University of Chicago. After he described the faded maroon clothing as "Cardinal red", the team became the Racine Street Cardinals. Eventually in 1920, the team became a charter member of the American Professional Football Association (APFA), which was rechristened the National Football League (NFL) two years later. The team entered the league as the Racine Cardinals, but changed their name to the Chicago Cardinals in 1922 to avoid confusion with the Horlick-Racine Legion, who entered the league two years earlier.
NFL champions (1925).
In 1925, the Cardinals were awarded the NFL Championship after the Pottsville Maroons were suspended for playing a game in what was deemed another team's territory. Having beaten the Cardinals in a head-to-head game earlier in the season, the Pottsville Maroons won their extra game against the University of Notre Dame, helping them finish the year with the same record as the Cardinals. The Cardinals were also guilty of breaking NFL rules when they scheduled two additional games, against the Hammond Pros and the Milwaukee Badgers, both of whom had already disbanded for the season. The game against the Badgers spurred a scandal when the Badgers filled out their roster with four high school players, in violation of NFL rules. The Cardinals nevertheless experienced some success on the playing field during their first 26 seasons in the league.
NFL Champions (1947).
During the post-World War II years, the team reached two straight NFL finals against the Philadelphia Eagles, winning in 1947 (eight months after Charles Bidwill's death) but losing the following year. In the late 1950s, after years of bad seasons and losing fans to their crosstown rivals, the Chicago Bears, the Cardinals were almost bankrupt, and owner Violet Bidwill Wolfner became interested in a relocation.
St. Louis Cardinals (1960–1987).
Due to the formation of the rival American Football League, the NFL allowed Bidwill to relocate the team to St. Louis, Missouri, where they became the St. Louis Cardinals (they were locally called the "Big Red", the "Gridbirds" or the "Football Cardinals" to avoid confusion with the baseball team of the same name). During the Cardinals' 28-year stay in St. Louis, they advanced to the playoffs just three times (1974, 1975 and 1982), never hosting or winning in any appearance. The overall mediocrity of the Cardinals, combined with a then-21-year-old stadium, caused game attendance to dwindle, and owner Bill Bidwill decided to move the team to Arizona.
Phoenix/Arizona Cardinals (1988–present).
Not long after the end of the 1987 NFL season, Bidwill agreed to move to Phoenix on a handshake deal with state and local officials, and the team became the Phoenix Cardinals (the franchise has never played in the city of Phoenix proper; however, there are several NFL teams which do not play in their market's central cities). The team changed their geographic name to the Arizona Cardinals on March 17, 1994. The 1998 NFL season saw the Cardinals break two long droughts, qualifying for the playoffs for the first time in 16 years. The team got their first postseason win since 1947 by defeating the Dallas Cowboys 20–7 in the Wild Card round.
In the 2008 season, the Cardinals, led by quarterback Kurt Warner, won the NFC Championship Game against the Philadelphia Eagles to advance to the Super Bowl for the first time in their history. They lost Super Bowl XLIII 27–23 to the Pittsburgh Steelers in the final seconds of the game.
After their historic 2008 season, the Cardinals posted a 10–6 record in 2009, their first season with 10 wins in Arizona. The Cardinals clinched their second consecutive NFC West title, but were defeated by the eventual Super Bowl champion, the New Orleans Saints, 45–14 in the divisional playoffs. They would not make the playoffs again until 2014, when they qualified as a wild card. They set their best regular-season record since moving to Arizona at 11–5, but were defeated by the 7–8–1 NFC South champions, the Carolina Panthers.
The next year, the Cardinals set a franchise-best 13–3 record and clinched their first-ever first-round playoff bye as the NFC's second seed. They defeated the Green Bay Packers in the divisional round, giving quarterback Carson Palmer his first playoff victory. The Cardinals then advanced to the second NFC Championship Game in their history, but were blown out by the top-seeded 15–1 Panthers 49–15, committing seven turnovers.
The Cardinals then fell to 7–8–1 in 2016 and 8–8 in 2017 before dropping to 3–13 in 2018, tying the franchise record set in 2000 for the worst record in a 16-game season. The team improved to 5–10–1 in 2019 and 8–8 in 2020. In 2021, the Cardinals went 11–6, posting a winning record and returning to the postseason for the first time since 2015, but lost to the Los Angeles Rams in the Wild Card round. They failed to improve their record in 2022, dropping to the bottom of the NFC West at 4–13 and missing the playoffs.
Logos and uniforms.
Starting in , the team had a logo of a cardinal bird perched on the laces of a football.
The Cardinals moved to Arizona in 1988, and the flag of Arizona was added to the sleeves the following year. In 1990, the team began wearing red pants with their white jerseys, as new coach Joe Bugel wanted to emulate his former employer, the Washington Redskins, who at the time wore burgundy pants with their white jerseys (the Redskins later returned to their 1970s gold pants with all their jerseys).
In 1994, the Cardinals participated in the NFL's 75th-anniversary throwback uniform program. The jerseys were similar to those of the 1920s Chicago Cardinals, with an interlocking "CC" logo and three stripes on each sleeve. The uniform numbers were relocated to the right chest. The pants were khaki to simulate the color and material used in that era. The Cardinals also stripped the logos from their helmets for two games: at Cleveland and home vs. Pittsburgh.
The Cardinal head on the helmet also appeared on the sleeve of the white jersey from 1982 to 1995. In 1996, the state flag of Arizona was moved higher on the sleeve after the Cardinal head was eliminated as sleeves on football jerseys became shorter, and black was removed as an accent color, instead replaced with a blue to match the predominant color of the state flag. In 2002, the Cardinals began to wear all-red and all-white combinations, and continued to do so through 2004, prior to the team's makeover.
In 2005, the team unveiled its first major changes in a century. The cardinal-head logo was updated to look sleeker and meaner than its predecessor; numerous fans had derisively called the previous version a "parakeet". Black again became an accent color after an eight-year absence, while trim lines were added to the outside shoulders, sleeves, and sides of the jerseys and pants. Both the red and white jerseys have the option of red or white pants.
Hoping to break a six-game losing streak, the Cardinals wore the red pants for the first time on October 29, 2006, in a game at Lambeau Field against the Green Bay Packers. The Packers won 31–14, and the Cards headed into their bye week with a 1–7 mark. Following the bye week, the Cardinals came out in an all-red combination at home against the Dallas Cowboys and lost, 27–10. Arizona did not wear the red pants for the remainder of the season and won four of their last seven games. The following season, in 2007, the Cardinals again wore their red pants for their final three home games. They wore red pants with white jerseys in road games at the Cincinnati Bengals and Seattle Seahawks, and paired red pants with red jerseys, the all-red combination, for home games against the Detroit Lions, San Francisco 49ers, Cleveland Browns, and St. Louis Rams. The red pants were not worn at all in 2008, but they were used in home games vs. Seattle, Minnesota, and St. Louis in 2009. The red pants were paired with the white road jersey for the first time in three years during a 2010 game at Carolina, but the white jersey/red pants combination was not used again until 2018, when they broke out the combination against the Kansas City Chiefs.
The Cardinals' first home game in Arizona, in 1988, saw them play in red jerseys. Thereafter, for the next 18 years in Arizona, the Cardinals, like a few other NFL teams in warm climates, wore their white jerseys at home during the first half of the season—forcing opponents to suffer in their darker jerseys during Arizona autumns that frequently see temperatures over 100 °F (38 °C). However, this tradition did not continue when the Cardinals moved from Sun Devil Stadium to State Farm Stadium in 2006, as early-season games (and some home games late in the season) were played with the roof closed. With the temperature inside at a comfortable 70 °F (21 °C), the team opted to wear red jerseys at home full-time. The Cardinals wore white jerseys at home for the first time at State Farm Stadium on August 29, 2008, in a preseason game against the Denver Broncos.
The Cardinals wore white at home for the first time in a regular-season game at State Farm Stadium against the Houston Texans on October 11, 2009. That month, the NFL recognized Breast Cancer Awareness Month, and players wore pink-accented items, including gloves, wristbands, and shoes. The team thought the pink accents looked better with white uniforms than with red.
From 1970 through 1983, and again in many seasons between 1989 and 2002, the Cardinals would wear white when hosting the Dallas Cowboys in order to force the Cowboys to don their "jinxed" blue jerseys. They have not done this since moving into State Farm Stadium, however.
The season saw the Cardinals debut a new, alternate black jersey. In , the Cardinals debuted an all-black set for the NFL Color Rush program. While the regular black alternates featured white lettering and are paired with white pants, the Cardinals' Color Rush alternates used red lettering and black pants for the occasion. Starting in 2022, both black uniforms would be paired with an alternate black helmet.
Before the season, the Cardinals unveiled new uniforms. Most notably, the team opted to wear all-red uniforms at home and all-white uniforms on the road, with all-black uniforms as the alternate. The red uniform featured the state name in front in addition to white numbers with silver trim. The white uniform featured red numbers with black trim, and red and silver stripes along the pants and sleeves. The black alternate uniform design mirrored that of the white uniform, featuring red numbers with silver trim, and red and silver stripes along the pants and sleeves. On both uniforms, the silver sleeve stripe contained the team name. Both the red and white uniforms are worn with white helmets and silver facemasks, while the black uniform is worn with black helmets and black facemasks.
Fans.
Despite the team's long playoff droughts, many Cardinals fans have shown longtime devotion to the team. Fans of the Cardinals are often referred to as the Red Sea or the Bird Gang, and include notable figures such as Blake Shelton and Jordin Sparks. In memory of former safety Pat Tillman, who died while serving in the U.S. Army, the Cardinals have strengthened their relationship with the armed forces community. The team regularly markets to military personnel and frequently visits nearby Luke Air Force Base in support of Arizona's servicemen.
Rivalries.
Los Angeles Rams.
This is one of the Cardinals' oldest matchups; the teams first met during the 1937 NFL season, when the Rams played in Cleveland and the Cardinals were still in Chicago. The rivalry with the Los Angeles Rams has resurged in recent years as both teams have found playoff success, though despite the Cardinals' best efforts, the Rams have gone 9–1 against them since hiring head coach Sean McVay in 2017. The Week 17 matchup of the 2020 season saw both teams playing for a playoff berth; despite an injury to Rams quarterback Jared Goff, the Cardinals lost 18–7 and were eliminated from postseason contention. The following season, the Cardinals started 7–0 and took the NFC lead over the Rams, but the Rams won the next matchup on Monday Night Football, and the Cardinals lost 6 of 10 games after their 7–0 start. The Cardinals clinched a wild-card berth with a Week 17 win over the Dallas Cowboys, then played the Rams in Los Angeles and lost 34–11, as Kyler Murray threw two interceptions, one returned for a touchdown.
Seattle Seahawks.
One of the newer rivalries in the NFL, the Cardinals and Seahawks became divisional rivals after both were relocated to the NFC West as a result of the league's realignment in 2002. This rivalry has become one of the NFL's more bitter in recent years, as the mid-to-late 2010s often saw the Seahawks and Cardinals squaring off for NFC West supremacy. Many Cardinals fans see the Seahawks as their top rival due to their 2010s dominance under quarterback Russell Wilson and head coach Pete Carroll, although Seattle shares more intense rivalries with the San Francisco 49ers, Los Angeles Rams, and even the Green Bay Packers. Seattle leads the series 23–22–1, and the two teams have yet to meet in the playoffs.
San Francisco 49ers.
The 49ers lead the series 34–29. Though the teams first met in 1951 and played occasionally until 2000, this did not become a full-fledged rivalry until both were placed in the NFC West division in 2002. While the series is close overall, it has often been lopsided in stretches for both sides: after the 49ers won nine of ten meetings between 2009 and 2013, the Cardinals won eight straight meetings between 2014 and 2018.
The two teams have yet to meet in the playoffs.
Chicago Bears.
The Cardinals' rivalry with the Bears features the only two teams that remain from the league's inception in 1920. At that time, the Bears were known as the Decatur Staleys, and the Cardinals were the Racine Cardinals. In 1922, the matchup between the teams became known as "The Battle of Chicago" for 38 years, making it the first true rivalry in the league's history. The Bears lead the all-time series 59–28–6.
Seasons and overall records.
Single-season records.
Points Scored: 489 ()
Career records.
Players of note.
Retired numbers.
Pro Football Hall of Famers.
"Italics" = played a portion of career with the Cardinals and enshrined representing another team
Dierdorf, Smith, Wehrli and Wilson were members of the St. Louis Football Ring of Fame in The Dome at America's Center when the Rams played there from 1995 to 2015.
Ring of Honor.
The Cardinals' Ring of Honor was started in 2006 to mark the opening of State Farm Stadium. It honors former Cardinals greats from all eras of the franchise's history.
Staff.
The Cardinals have had 42 head coaches throughout their history. Their first head coach was Paddy Driscoll, who compiled a 17–8–4 record with the team from 1920 to 1922. Jimmy Conzelman, Jim Hanifan, and Ken Whisenhunt are tied as the longest-serving head coaches in Cardinals history. On April 14, 2022, Mark Ahlemeier, the Cardinals' equipment manager, retired after working with the organization for 41 seasons.
Radio and television.
The Cardinals' flagship radio station is KMVP-FM; Dave Pasch, Ron Wolfley, and Paul Calvisi handle the radio broadcast. Spanish-language radio broadcasts are heard on the combo of KQMR/KHOV-FM "Latino Mix" under a contract with Univisión, signed in 2015. Prior to 2015, they were heard on KDVA/KVVA-FM "José FM", as well as co-owned KBMB AM 710. The Cardinals were the first NFL team to offer all 20 preseason and regular season games on Spanish-language radio, doing so in 2000. Luis Hernandez and Rolando Cantú are the Spanish broadcast team. The Cardinals have the most extensive Mexican affiliate network in the NFL, with contracts with Grupo Larsa (in the state of Sonora) and Grupo Radiorama (outside Sonora) and stations in 20 cities, including Hermosillo, Guadalajara and Mexico City.
As of the 2017 season, NBC affiliate KPNX broadcasts the team's preseason games on television (which, that year, included the Hall of Fame Game broadcast by NBC), called by Pasch and Wolfley, with station anchor Paul Gerke as sideline reporter. The broadcasts are syndicated regionally to KTTU and KMSB-TV in Tucson, and KVVU-TV in Las Vegas.
Atlanta Falcons.
The Atlanta Falcons are a professional American football team based in Atlanta. The Falcons compete in the National Football League (NFL) as a member club of the league's National Football Conference (NFC) South division. The Falcons were founded on June 30, 1965, and joined the NFL in 1966 as an expansion team, after the NFL offered then-owner Rankin Smith a franchise to keep him from joining the rival American Football League (AFL).
In their 55 years of existence, the Falcons have compiled a record of 379–487–6 ( in the regular season and in the playoffs), winning division championships in 1980, 1998, 2004, 2010, 2012, and 2016. The Falcons have appeared in two Super Bowls: the first during the 1998 season in Super Bowl XXXIII, where they lost to the Denver Broncos, and the second 18 years later, an overtime loss to the New England Patriots in Super Bowl LI.
The Falcons' current home field is Mercedes-Benz Stadium, which opened for the 2017 season; the team's headquarters and practice facilities are located at a site in Flowery Branch, northeast of Atlanta in Hall County.
Franchise history.
Professional football comes to Atlanta (1962).
Professional football first came to Atlanta in 1962, when the American Football League (AFL) staged two preseason contests, with one featuring the Denver Broncos vs. the Houston Oilers and the second pitting the Dallas Texans against the Oakland Raiders. Two years later, the AFL held another exhibition, this time with the New York Jets taking on the San Diego Chargers.
In 1965, after the Atlanta–Fulton County Stadium (then known simply as Atlanta Stadium) was built, the city of Atlanta felt the time was right to start pursuing professional football. One independent group which had been active in NFL exhibition promotions in Atlanta applied for franchises in both the AFL and NFL, acting entirely on its own with no guarantee of stadium rights. Another group reported it had deposited earnest money for a team in the AFL.
With everyone running in different directions, a group of local businessmen (Cox Broadcasting) worked out a deal and were awarded an AFL franchise, contingent upon acquiring exclusive stadium rights from the city. NFL Commissioner Pete Rozelle, who had been moving slowly in Atlanta matters, was spurred by the AFL interest and took the next plane down to Atlanta to block the rival league's claim on the city. He forced the city to make a choice between the two leagues; by June 30, the city picked Rankin Smith and the NFL.
The AFL's original expansion plans in June 1965 called for two new teams, one of them in Atlanta; these plans later evolved into the Miami Dolphins in 1966 and the Cincinnati Bengals in 1968. The NFL had planned to add two teams in 1967; the competition with the AFL for Atlanta forced the first to be added a year early, in 1966. The odd number of teams (15) resulted in one idle team (bye) each week, with each team playing 14 games over 15 weeks (similar to earlier seasons with an odd number of teams, when each team played 12 games over 13 weeks). The second expansion team, the New Orleans Saints, joined the NFL as planned in 1967 as its sixteenth franchise.
The Atlanta Falcons franchise began when it was approved to begin play in 1966 by a unanimous vote of the NFL club owners on June 21, 1965. Rozelle granted ownership nine days later on June 30 to 40-year-old Rankin Smith Sr., an Executive Vice President of Life Insurance Company of Georgia. He paid $8.5 million, the highest price in NFL history at the time for a franchise. Rozelle and Smith made the deal in about five minutes and the Atlanta Falcons brought the largest and most popular sport to the city of Atlanta. The Atlanta expansion team became the 15th NFL franchise, and they were awarded the first overall pick in the 1966 NFL Draft as well as the final pick in each of the first five rounds. They selected consensus All-American linebacker Tommy Nobis from the University of Texas, making him the first-ever Falcon. The league also held the expansion draft six weeks later in which Atlanta selected unprotected players from the 14 existing franchises. Although the Falcons selected many good players in those drafts, they still were not able to win right away.
The Atlanta team received its nickname on August 29, 1965. Miss Julia Elliott, a school teacher from Griffin, was singled out from many people who suggested "Falcons" as the nickname for the new franchise. She wrote: "the Falcon is proud and dignified, with great courage and fight. It never drops its prey. It is deadly and has a great sporting tradition."
Smith family era (1966–2001).
The Falcons' inaugural season was in 1966, and their first preseason game was on August 1, a loss to the Philadelphia Eagles. Under head coach Norb Hecker, Atlanta lost their first nine regular-season games in 1966; their first victory came on the road against the struggling New York Giants on November 20 in Yankee Stadium. Two weeks later, Atlanta won at Minnesota, and beat St. Louis in Atlanta the next week for their first home win. The team finished the 1960s with 12 wins in four seasons.
The Falcons had their first Monday Night Football game in Atlanta during the 1970 season, a 20–7 loss to the Miami Dolphins. The only two winning seasons in their first 12 years came in 1971 and 1973.
In the 1978 season, the Falcons qualified for the playoffs for the first time and won the Wild Card game against the Eagles 14–13. The following week, they lost to the Dallas Cowboys 27–20 in the Divisional Playoffs.
In the 1980 season, after a nine-game winning streak, the Falcons posted a then-franchise-best record of 12–4 and captured their first NFC West division title. Their dream season then ended at home with a 30–27 loss to the Cowboys in the divisional playoffs. In the strike-shortened 1982 season, the Falcons made the playoffs but lost to the Minnesota Vikings, 30–24. Falcons coach Leeman Bennett was fired after the loss. The team then had losing seasons for the next eight years.
In the 1989 NFL Draft, the Falcons selected cornerback Deion Sanders in the first round; he helped them for the next four years, setting many records for the franchise. "Neon Deion" (a.k.a. "Prime Time") had a flashy appeal and helped bring media attention to one of the league's most anonymous franchises. Sanders was also famous for playing for Major League Baseball teams (the New York Yankees and the Atlanta Braves) while simultaneously playing in the NFL.
After defeating the New Orleans Saints in the NFC Wild Card game, the Falcons' 1991 season ended in a divisional playoff loss to the Washington Redskins. In the 1991 NFL Draft, the Falcons selected quarterback Brett Favre with the 33rd overall pick. During his rookie season, he played in two games, amassing four passing attempts with no completions and two interceptions. The following February, Favre was traded to the Green Bay Packers.
In 1992, the Atlanta Falcons opened a new chapter in their history, moving into the newly constructed Georgia Dome, where the team defeated all 31 other NFL teams at least once during its tenure.
Dan Reeves years (1997–2003).
In 1998, led by recently hired head coach Dan Reeves, quarterback Chris Chandler, and running back Jamal Anderson, the "Dirty Bird" Falcons had their greatest season to date. On November 8, they beat the New England Patriots 41–10, ending a streak of 22 losses at cold-weather sites. The team finished with a franchise-best 14–2 regular-season record and the NFC West division championship. On January 17, 1999, the Falcons upset the top-seeded Vikings at the Hubert H. Humphrey Metrodome in the NFC Championship Game, 30–27, in an exciting overtime victory. However, in their first-ever Super Bowl appearance, they lost 34–19 to the defending champion Denver Broncos in Super Bowl XXXIII.
In the second game of the Falcons 1999 season, running back Jamal Anderson, who had been a key player in the Falcons' 1998 success, suffered a season-ending knee injury. The Falcons finished the season with a very disappointing 5–11 regular-season record. In 2000, the Falcons suffered through another horrendous season finishing 4–12 and once again missing the playoffs.
In the 2001 NFL draft, the Falcons orchestrated a trade with the San Diego Chargers, acquiring the first overall pick (which was used on quarterback Michael Vick) in exchange for wide receiver-return specialist Tim Dwight and the fifth overall pick (used on running back LaDainian Tomlinson).
The Falcons finished the 2001 season with a record of 7–9 and missed the playoffs. Jessie Tuggle retired following 14 seasons in Atlanta.
Arthur Blank era (2002–present).
On December 6, 2001, Arthur M. Blank reached a preliminary agreement with the Falcons' Taylor Smith to purchase the team. In a special meeting prior to Super Bowl XXXVI in New Orleans on February 2, 2002, NFL owners voted unanimously to approve the purchase.
The 2002 season saw the Falcons return to the playoffs with a regular-season record of 9–6–1, the tie coming against the Pittsburgh Steelers. It was Vick's first year as the starter, and the team, with newly acquired running back Warrick Dunn, handed the Green Bay Packers their first home playoff loss ever. A 20–6 loss to the Donovan McNabb-led Philadelphia Eagles the following week, however, ended the Falcons' season.
On March 19, 2003, the Falcons presented their new logo. During the 2003 preseason Vick broke his leg and missed the first 12 games of the season. After losing 7 straight games, the decision was made to release head coach Dan Reeves. Wade Phillips acted as interim coach for the final 3 games. Although the Falcons won 3 of their last 4 games after the return of Vick, they ended up with a 5–11 record that year. In 2004, a new head coach, Jim L. Mora, was hired and Vick returned for the full season. The Falcons went 11–5, winning their third division title and earning a first-round bye into the playoffs. In the divisional playoffs, the Falcons defeated the St. Louis Rams, 47–17, in the Georgia Dome, advancing to the NFC Championship Game, which they lost to the Eagles, 27–10.
The Falcons again fell short of achieving back-to-back winning seasons in 2005, going 8–8. In 2006, Michael Vick became the first quarterback in league history to rush for more than 1,000 yards in a season, with 1,039. After the Falcons finished that season 7–9, however, coach Jim Mora was dismissed and Bobby Petrino, the University of Louisville's football coach, replaced him. Before the 2007 season began, Vick was suspended indefinitely by the NFL after pleading guilty to charges involving dog fighting in the state of Virginia. On December 10, 2007, Vick received a 23-month prison sentence and was officially cut from the Atlanta roster.
For the 2007 season, the Falcons were forced to start Joey Harrington at quarterback. On December 11, 13 games into his first NFL season as head coach, Bobby Petrino resigned without notice to coach at the University of Arkansas, leaving the beleaguered players only a note in the locker room. Secondary Coach Emmitt Thomas was named interim coach for the final three games of the season on December 12. The Falcons ended the year with a dismal 4–12 record.
After the tumultuous and disappointing 2007 season, the Falcons made a number of moves, hiring a new General Manager and head coach, drafting a new starting quarterback, and signing a starting running back.
On January 13, 2008, the Falcons named former Patriots director of college football scouting Thomas Dimitroff general manager. On January 23, Mike Smith, the Jacksonville Jaguars' defensive coordinator and former linebackers coach for the 2000 Super Bowl champion Baltimore Ravens, was named the Falcons' new head coach. Chargers backup running back Michael Turner agreed to a six-year, $30 million deal on March 2. On April 26, quarterback Matt Ryan of Boston College was drafted third overall by the Falcons in the 2008 NFL draft.
The Falcons finished the 2008 regular season with a record of 11–5, and the #5 seed in the playoffs. On December 21, 2008, Atlanta beat the Minnesota Vikings 24–17 to clinch a wild card spot, earning a trip to the playoffs for the first time since 2004. The Falcons would go on to lose in the wild-card round of the 2008 NFL playoffs to the eventual NFC champion Arizona Cardinals, 30–24.
Matt Ryan started all 16 games in his rookie season and was named the Associated Press Offensive Rookie of the Year. First-year head coach Mike Smith was named 2008 NFL Coach of the Year.
Although they failed to make the playoffs in 2009, the team rallied to win their final three regular-season games and record back-to-back winning seasons for the first time in franchise history. The Falcons defeated the Tampa Bay Buccaneers 20–10 in the final game of the season to finish 9–7.
In 2010, with a regular-season record of 13–3, the Falcons secured a third straight winning season, their fourth overall divisional title, and the top overall seed in the NFC playoffs; however, the Falcons were overpowered by the eventual Super Bowl XLV champion Green Bay Packers in the NFC Divisional Playoffs 48–21. The Falcons scored 414 points – the fifth-most in franchise history. The team sent an NFL-high and franchise-best nine players to the 2011 Pro Bowl.
The Falcons made a surprise trade up with the Cleveland Browns in the 2011 NFL draft to select Alabama wide receiver Julio Jones sixth overall. In exchange, the Falcons gave up their first-, second- and fourth-round draft picks in 2011, and their first- and fourth-round picks in 2012. Jones, along with teammates Tony Gonzalez and Roddy White, was subsequently dubbed part of Atlanta's "Big Three" (based on their total receiving yards). On August 30, 2011, Sports Illustrated senior writer Peter King, who had correctly predicted the previous Super Bowl, made his predictions for the 2011 season and picked the Falcons to defeat the San Diego Chargers in the 2012 Super Bowl. The Falcons finished the season at 10–6, securing the fifth seed after a Week 17 rout of Tampa Bay in which the Falcons pulled their starters after leading 42–0 just 23 minutes into the game.
The Falcons then went on to play the New York Giants in a 2011 NFC Wild Card Game at MetLife Stadium in East Rutherford, New Jersey. The first half was a defensive struggle, with the first points coming on a safety by the Falcons, giving Atlanta a 2–0 lead. In the second quarter, though, Eli Manning connected with Hakeem Nicks for a short touchdown pass to make it 7–2 Giants heading into the second half. The Giants then took control, as Manning threw two more touchdown passes, to Mario Manningham and Nicks, and the defense completed its shutout of the Falcons to give New York a 24–2 win and the Falcons their third straight playoff loss under Matt Ryan and Mike Smith. After the season, defensive coordinator Brian VanGorder accepted a coaching job at Auburn University, and offensive coordinator Mike Mularkey took the head coaching job in Jacksonville.
In 2012, Atlanta exploded out of the gate, going a franchise-best 8–0 and remaining the last unbeaten team in the NFL that year. Their hopes for an undefeated season ended with a 31–27 loss to the division rival Saints. Julio Jones had a remarkable second year, with 1,198 receiving yards and 10 touchdowns. The Falcons finished the season 13–3 and clinched the number one seed in the NFC playoffs.
The Falcons played the Seattle Seahawks in their first playoff game. Although they fell behind 28–27 with only 31 seconds left on the clock, Matt Ryan led the team to a 30–28 victory. It would be the only playoff win of the Mike Smith era.
The Falcons then advanced to face the San Francisco 49ers in the NFC Championship Game. Atlanta seized control early with a Matt Bryant field goal and three Matt Ryan touchdown passes to Julio Jones and Tony Gonzalez, coupled with strong defensive play, and led 24–14 at halftime. The game shifted in the second half as the 49ers rallied with a pair of Frank Gore touchdown runs while their defense shut down Atlanta's offense. Late in the fourth quarter, Atlanta faced fourth-and-4 at the 49ers' 10-yard line, needing a conversion to keep alive their bid for a first Super Bowl berth in 14 years. Matt Ryan's pass to Roddy White was broken up by inside linebacker NaVorro Bowman, sealing a 28–24 defeat.
Following the success of the previous season, the Falcons entered 2013 as an expected Super Bowl contender. However, injuries hampered the team's performance and the team finished the season 4–12. With that, the streak of consecutive winning seasons came to an end, and Mike Smith had his first losing season as a head coach. Tony Gonzalez, in his final season in the NFL, was selected to the 2014 Pro Bowl as a starter representing Team Rice. Earlier, following the conclusion of the 2012 season, director of player personnel Les Snead had departed to join the St. Louis Rams, and Dave Caldwell, assistant to general manager Thomas Dimitroff, had left to join the Jacksonville Jaguars. Scott Pioli, former GM of the Kansas City Chiefs, was announced as the Falcons' new assistant GM. Mike Smith was given a one-year extension on his contract as head coach. The Falcons had the 6th overall pick in the 2014 NFL draft, with which they selected Jake Matthews, an offensive tackle from Texas A&M.
Despite another rough season in 2014, the Falcons still had an opportunity to qualify for the playoffs at the end of the regular season. The Falcons hosted the Carolina Panthers in their regular-season finale, with the winner clinching the NFC South division. The Falcons lost in a 34–3 blowout as Matt Ryan threw two interceptions that were returned for touchdowns and was sacked six times. The Falcons finished the season 6–10, marking a second consecutive losing season. The following day, Mike Smith was fired after seven seasons as head coach. The Falcons soon hired Seattle Seahawks defensive coordinator Dan Quinn as the team's 16th head coach. The Falcons had the 8th overall pick in the 2015 NFL draft, with which they selected Vic Beasley, a defensive end from Clemson University.
Dan Quinn years (2015–2020).
In February 2015, the team was investigated by the NFL for alleged use of artificial crowd noise in the Georgia Dome. The Falcons lost a 2016 NFL Draft selection as a result of the league's investigation.
Dan Quinn's first season saw a 5–0 start, the team's best start in four years. They would then struggle throughout the rest of the season by losing 8 of their last 11 games, resulting in an 8–8 record. They did, however, give the Panthers their only regular-season loss. The Falcons used their first-round pick in the 2016 NFL Draft on safety Keanu Neal from the University of Florida.
In 2016, the Falcons' 25th and final season in the Georgia Dome, Atlanta lost their Week 1 game to the Buccaneers 31–24. The Falcons then won their next four, including one over the Panthers in which the franchise set new records: Matt Ryan threw for 503 yards, and Julio Jones caught 12 passes for 300 yards. Beating the San Francisco 49ers 41–13 in Week 15, the Falcons improved to 9–5 and secured their first winning season since 2012. One week later, the Falcons defeated the Panthers in Charlotte, North Carolina, and clinched their first NFC South division title since 2012. In their last regular-season game at the Georgia Dome, the Falcons defeated the New Orleans Saints to secure an 11–5 record and a first-round bye.
In the divisional round of the playoffs, Atlanta defeated the Seahawks 36–20 in the Georgia Dome, then hosted the final game at the Dome against the Green Bay Packers in the NFC Championship Game on January 22, 2017. The Falcons defeated the Packers 44–21 to advance to Super Bowl LI as the NFC champions. There, Atlanta led 28–3 late in the third quarter, but the New England Patriots scored 31 unanswered points, the last six in the first overtime in Super Bowl history, to win 34–28. The Patriots' 25-point comeback was the largest in Super Bowl history.
In 2016, the Falcons scored 540 points in the regular season, the seventh-most in NFL history, tied with the Greatest Show on Turf (the 2000 St. Louis Rams). However, the Falcons defense gave up 406 points, 27th in the league.
In 2017, the Falcons moved into their new home, Mercedes-Benz Stadium. Their first game at the new stadium was a preseason loss to the Arizona Cardinals. The first regular-season game there was a rematch of the 2016–17 NFC Championship, with Atlanta defeating Green Bay 34–23. Their first loss of the season was a 23–17 home defeat to the Buffalo Bills in Week 4. The team returned to the playoffs with a 10–6 record (albeit with a third-place finish in the NFC South). The Falcons defeated the Los Angeles Rams 26–13 in the Wild Card round, but their 2017 season came to an end a week later in the Divisional round at the hands of the eventual Super Bowl champion Philadelphia Eagles, 15–10.
In the 2020 opener, their first game with new uniforms, the Falcons lost to the Seattle Seahawks at home 38–25. The Falcons then blew late leads against the Cowboys on the road (a 40–39 loss) and back in Atlanta against the Bears (a 30–26 loss). On October 11, after the team suffered a 23–16 home loss to the Carolina Panthers and fell to 0–5, the Falcons announced the firings of Quinn and Dimitroff. Defensive coordinator Raheem Morris took over for the rest of the season, with the team finishing 4–12. Morris was not retained after the season and soon joined the Los Angeles Rams as their defensive coordinator.
Arthur Smith years (2021–present).
On January 15, 2021, the Falcons announced that Tennessee Titans offensive coordinator Arthur Smith had been named the 18th head coach in franchise history. Four days later, New Orleans Saints executive Terry Fontenot was named the Falcons' new general manager. Tight end Kyle Pitts was selected with the 4th pick of the 2021 draft, and longtime star receiver Julio Jones was traded to the Titans, after publicly requesting a trade from Atlanta. The Falcons improved on their record from the prior year, finishing the season with a 7–10 record.
On March 21, 2022, the Falcons traded longtime star quarterback Matt Ryan to the Indianapolis Colts.
Stadiums.
The Falcons have called three stadiums home in their history; the third opened in the late summer of 2017. The first was Atlanta–Fulton County Stadium, which the team shared with Major League Baseball's Atlanta Braves through the 1991 season. The Georgia Dome was built in 1992, and the Falcons played there from its opening through the 2016 season. The Dome was also frequently used for college football, including Georgia State football and college bowl games such as the Peach Bowl.
In an effort to replace the aging Georgia Dome and potentially host a future Super Bowl, team owner Arthur Blank proposed a deal with the city of Atlanta to build a new state-of-the-art stadium not far from the Georgia Dome. Under the deal, Blank would contribute $800 million and the city of Atlanta an additional $200 million, via bonds backed by the city's hotel/motel tax, toward the construction of a retractable-roof stadium, with Blank covering any cost overruns. The team would provide up to $50 million toward infrastructure costs not included in the construction budget and toward retiring the remaining debt on the Georgia Dome. In addition, Blank's foundation and the city would each provide $15 million for development in surrounding neighborhoods. Though the total cost of the stadium was initially estimated at around $1 billion, Blank later revised the figure to $1.5 billion. In March 2013, the Atlanta City Council voted 11–4 in favor of building the stadium. The retractable-roof Mercedes-Benz Stadium broke ground in May 2014 and, upon opening in 2017, became the third home stadium for the Falcons and the first for the new Atlanta United FC Major League Soccer club.
Logo and uniforms.
The Atlanta Falcons' colors are black, red, silver and white. When the team began play in 1966, the Falcons wore red helmets with a black falcon crest logo. Down the center of the helmet ran a black stripe flanked by two gold stripes and two white stripes. These colors represented the state of Georgia's two rival college programs: the Georgia Tech Yellow Jackets (white and gold) and the Georgia Bulldogs (red and black). Although the gold was removed after several seasons, the white remains to this day. They wore white pants and either black or white jerseys. At first, the falcon crest logo was also placed on the jersey sleeves, but it was replaced by a red and white stripe pattern four years later. The club switched from black to red jerseys in 1971 and began wearing silver pants in 1978. The facemasks on the helmets were initially gray, becoming white in 1978 and then black in 1984; the team wore black facemasks until its 2020 redesign.
A prototype white helmet was developed for the team prior to the 1974 season, but was never worn.
In 1990, the uniform design changed to black helmets, silver pants, and either black or white jerseys. The numbers on the white jerseys were black, but were changed to red in 1997. (The red numerals could be seen on the away jerseys briefly in 1990.)
Both the logo and uniforms changed in 2003. The logo was redesigned with red and silver accents to depict a more powerful, aggressive falcon, which now more closely resembles the capital letter "F".
Although the Falcons still wore black helmets, the new uniforms featured jerseys and pants with red trim down the sides. The uniform design consisted of either black or white jerseys, and either black or white pants. During that same year, a red alternate jersey with black trim was also introduced. The Falcons also started wearing black cleats with these uniforms.
In 2004, the red jerseys became the primary jerseys and the black ones became the alternate, both worn with white pants. In select road games, the Falcons wore black pants with white jerseys. The Falcons wore an all-black combination for home games against their archrivals, the New Orleans Saints, for four straight seasons starting in 2004, winning the first two contests (24–21 and 36–17) before losses of 31–13 and, in the final such game in 2007, 34–14. They also wore the combination in 2006 against the Tampa Bay Buccaneers in Week 2, a 14–3 Falcons win; in 2007 against the New York Giants; and in 2008 against the Carolina Panthers and (for a second time) the Tampa Bay Buccaneers. After that, the black pants and black combination were retired, and the white pants were used full-time with the regular uniforms.
In the 1980s, the Falcons wore their white uniforms at home most of the time because of the heat. After moving into a dome, the team switched to dark uniforms for home games, though they have worn white at home a few times since. At the 2009 state-of-the-franchise meeting, it was announced that the Falcons would wear 1966 throwback uniforms for two home games during the 2009 season: against the Carolina Panthers on September 20 and against the Tampa Bay Buccaneers on November 29. The Falcons won both of those games. They donned the throwbacks again for two games in 2010, against Baltimore and San Francisco, winning both of those games as well. The throwbacks were used twice each in 2011 and 2012, against the Panthers and the Saints. However, the throwbacks were retired following a 2013 NFL rule permitting only one helmet shell per team.
The Falcons unveiled an all-red Color Rush uniform on September 13, 2016; however, because the Falcons and the Tampa Bay Buccaneers had similar all-red Color Rush uniforms, the Falcons were unable to wear theirs until the 2017 season.
Also in 2016, the Falcons unveiled a mixed throwback uniform set. The uniform tops, pants and socks closely resembled their 1960s kits. From 2016 to 2021, due to the NFL's one-shell rule, the Falcons wore the black helmets with the original logo decal similar to the design they wore in the 1990s. However, starting in 2022, with the NFL now reinstating the use of alternate helmets, the Falcons brought back the original red helmets to pair with their throwback uniforms.
It was revealed in January 2020 that the Falcons would change uniforms for the 2020 NFL season. The new design marked the return of black as the primary home uniform color for the first time since 2003. Both the primary home and road uniforms feature the "ATL" abbreviation in red above either white or black numbers with red drop shadows. The white and black tops are usually paired with either white or black pants. The alternate uniform features a red/black gradient design with the "ATL" abbreviation in white above white numbers with black drop shadows; black pants are only used with this uniform. All three uniforms feature red side stripes. The throwback uniform was also retained. In addition, the Falcons switched to matte helmets with an enlarged falcon logo and gray facemasks.
Rivalries.
New Orleans Saints.
The Falcons share a heated divisional rivalry with the New Orleans Saints (first in the NFC West, and now in the NFC South). The two teams were often basement-dwellers in the division, but the rivalry grew as a matter of pride between the two cities, which for multiple decades hosted the only two NFL teams in the Deep South. The series is the oldest and most iconic rivalry in the NFC South, and the two teams have long harbored bad blood against one another. Atlanta leads the series 52–48 and won the teams' lone postseason meeting.
Carolina Panthers.
In addition, the Falcons share a similar rivalry with the Carolina Panthers, with both teams having been in the NFC West from the Panthers' founding in 1995 until the NFL realignment in 2002. As with the Saints, the Falcons have fought several competitive divisional battles with the Panthers for the lead of the NFC South, though the two have yet to meet in the postseason. The series is also known as the "I-85 Rivalry" because Atlanta and Charlotte are only about four hours apart on Interstate 85. The Falcons lead the series 27–17.
Tampa Bay Buccaneers.
The Falcons have shared a less intense divisional rivalry with the Tampa Bay Buccaneers since the NFL realignment in 2002. The two had long been regional opponents, but little animosity developed between them while the Buccaneers played in the former NFC Central before realignment. The two teams have since competed over staff and players alike, particularly during the 2000s after the Falcons lured away general manager Rich McKay following Tampa Bay's win in Super Bowl XXXVII the season prior. McKay's ties with Tampa extend to his family: his father, John McKay, was head coach of the Buccaneers for nine seasons.
Philadelphia Eagles.
The Eagles lead the Falcons 21–15–1, with a 3–1 lead in playoff games. The rivalry first emerged after the Falcons upset the Eagles in the 1978 Wild Card Game, and it intensified in the 2000s thanks to the rivalry between prominent dual-threat quarterbacks Donovan McNabb and Michael Vick. Most recently, the Falcons lost to the Eagles in the 2017 Divisional Playoff round.
Green Bay Packers.
The Falcons have also shared a playoff rivalry with the Green Bay Packers; much of the connection between the two teams stems from Atlanta trading future Hall of Fame quarterback Brett Favre to Green Bay on February 11, 1992, in exchange for a first-round pick. The two teams have met four times in the postseason, most recently in the 2016–17 NFC Championship Game, the final game played at the Georgia Dome. The Packers lead the all-time series 19–16, while the teams are tied 2–2 in the postseason.
Statistics.
Record vs. opponents.
Includes postseason records
Total: 378 wins, 455 losses, 6 ties (regular season); 10–14 in the postseason.
Players.
Pro Football Hall of Famers.
Sanders, Humphrey, Andersen, and Gonzalez are the only players in the Hall of Fame who were inducted based substantially on their service with the Falcons. Andersen spent eight of his 25 NFL seasons with the Falcons and was previously the team's all-time scoring leader, but he also played his first 13 NFL seasons with the New Orleans Saints, where he leads that team's career scoring list.
Ring of Honor.
The Atlanta Falcons organization does not officially retire jersey numbers; instead, it honors individual players whose jerseys are deemed worthy of recognition through the Falcons Ring of Honor.
Coaching staff.
Head coaches.
In their history, the Atlanta Falcons have had 18 head coaches.
Radio and television.
The Falcons' flagship radio station is WZGC 92.9 The Game. Wes Durham, son of longtime North Carolina Tar Heels voice Woody Durham, is the Falcons' play-by-play announcer, with former Falcons quarterback and pro football veteran Dave Archer serving as color commentator.
In 2014, The CW affiliate WUPA became the official television station of the Falcons, gaining rights to its preseason games, which are produced by CBS Sports.
In the regular season, the team's games are seen on Fox's owned-and-operated affiliate WAGA. When the Falcons host an AFC team, CBS affiliate WANF airs those games, while Sunday night games are televised on WXIA, the local NBC affiliate.
Radio affiliates.
Heathenry in the United States.
Heathenry is a modern Pagan new religious movement that has been active in the United States since at least the early 1970s. Although the term "Heathenry" is often employed to cover the entire religious movement, different Heathen groups within the United States often prefer the terms "Ásatrú" or "Odinism" as self-designations.
Heathenry appeared in the United States during the 1960s, at the same time as the wider emergence of modern Paganism in the country. Among the earliest American groups was the Odinist Fellowship, founded by Danish migrant Else Christensen in 1969.
History.
Ásatrú grew steadily in the United States during the 1970s. In 1969, the Danish Odinist Else Christensen established the Odinist Fellowship from her home in Florida. Heavily influenced by Alexander Rud Mills' writings, she began publication of a magazine, "The Odinist", although it focused more on right-wing and racialist ideas than on theological ones. Stephen McNallen founded the Viking Brotherhood in the early 1970s before creating the Ásatrú Free Assembly (AFA) in 1976; the latter broke up in 1986 amid widespread political disagreements after McNallen's repudiation of neo-Nazis within the group. In the 1990s, McNallen founded the Ásatrú Folk Assembly (AFA), an ethnically oriented Heathen group headquartered in California.
Meanwhile, Valgard Murray and his kindred in Arizona founded the Ásatrú Alliance (AA) in the late 1980s, which shared the AFA's perspectives on race and which published the "Vor Tru" newsletter. In 1987, Edred Thorsson and James Chisholm founded The Troth, which was incorporated in Texas. Taking an inclusive, non-racialist view, it soon grew into an international organisation.
Terminology.
In English usage, the genitive "Ásatrúar" ("of Æsir faith") is often used on its own to denote adherents (both singular and plural). This term is favored by practitioners who focus on the deities of Scandinavia, although it is problematic, as many Asatruar worship deities and entities other than the Æsir, such as the Vanir, Valkyries, Elves, and Dwarves. Other practitioners term their religion "Vanatrú", meaning "those who honour the Vanir", or "Dísitrú", meaning "those who honour the Goddesses", depending on their particular theological emphasis.
Within the community it is sometimes stated that the term "Ásatrú" pertains to groups which are not racially focused, while "Odinism" is the term preferred by racially oriented groups. However, in practice, there is no such neat division in terminology.
There are notable differences of emphasis between "Ásatrú" as practiced in the US and in Scandinavia. American Asatruar tend to prefer a more devotional form of worship and a more emotional conception of the Nordic gods than Scandinavian practitioners, reflecting the parallel tendency of highly emotional forms of Christianity prevalent in the United States.
Demographics.
Although deeming it impossible to calculate the exact size of the Heathen community in the US, sociologist Jeffrey Kaplan estimated that, in the mid-1990s, there were around 500 active practitioners in the country, with a further thousand individuals on the periphery of the movement. He noted that the overwhelming majority of individuals in the movement were white, male, and young. Most had at least an undergraduate degree, and worked in a mix of white collar and blue collar jobs. From her experience within the community, Snook concurred that the majority of American Heathens were male, adding also that most were also white and middle-aged, but believed that there had been a growth in the proportion of Heathen women in the US since the mid-1990s.
In 2003, the Pagan Census Project led by Helen A. Berger, Evan A. Leach, and Leigh S. Shaffer gained 60 responses from Heathens in the US, noting that 65% were male and 35% female, which they saw as the "opposite" of the rest of the country's Pagan community. The majority had a college education, but were generally less well educated than the wider Pagan community, with a lower median income than the wider Pagan community too.
Politics and controversies.
Ásatrú organizations have memberships which span the entire political and spiritual spectrum. There is a history of political controversy within organized US Ásatrú, mostly surrounding the question of how to deal with adherents who place themselves in the context of the far right and white supremacy, notably resulting in the fragmentation of the Asatru Free Assembly in 1986.
Externally, political activity on the part of Ásatrú organizations has surrounded campaigns against alleged religious discrimination, such as the call for the introduction of an Ásatrú "emblem of belief" by the United States Department of Veterans Affairs to parallel the Wiccan pentacle granted to the widow of Patrick Stewart in 2006. In May 2013, the "Hammer of Thor" was added to the list of United States Department of Veterans Affairs emblems for headstones and markers. It was reported in early 2019 that a Heathen service was held aboard the U.S. Navy's USS "John C. Stennis".
Folkish Ásatrú, Universalism and racialism.
Historically, the main dispute between the national organizations has generally centered on the interpretation of "Nordic heritage" as either something cultural, or as something genetic or racial. In the internal discourse within American Ásatrú, this cultural/racial divide has long been known as "universalist" vs. "folkish" Ásatrú.
The Troth takes the "universalist" position, claiming "Ásatrú" as a synonym for "Northern European Heathenry" taken to comprise "many variations, names, and practices, including Theodism, Irminism, Odinism, and Anglo-Saxon Heathenry". The Asatru Folk Assembly takes the folkish position, claiming that Ásatrú and the Germanic beliefs are ancestral in nature, and as an indigenous religion of the European Folk should only be accessed by the descendants of Europe. In the UK, Germanic Neopaganism is more commonly known as Odinism or as "Heathenry". This is mostly a matter of terminology, and US Ásatrú may be equated with UK Odinism for practical purposes, as is evident in the short-lived International Asatru-Odinic Alliance of folkish Ásatrú/Odinist groups.
Some groups identifying as Ásatrú have been associated with national socialist and white nationalist movements. Wotansvolk, for example, is an explicitly racial form.
More recently, however, many Ásatrú groups have taken a harder stance against these elements of their community. Declaration 127, named for the corresponding stanza of the "Hávamál" ("When you see misdeeds, speak out against them, and give your enemies no frið"), is a collective statement denouncing and disassociating from the Asatru Folk Assembly over alleged racially and sexually discriminatory practices and beliefs. It has been signed by over 150 Ásatrú religious organizations from over 15 different nations, mainly represented on Facebook.
Discrimination charges.
In 2001, inmates of the "Intensive Management Unit" at Washington State Penitentiary who were adherents of Ásatrú were deprived of their Thor's Hammer medallions.
In 2007, a federal judge confirmed that Ásatrú adherents in US prisons have the right to possess a Thor's Hammer pendant. An inmate sued the Virginia Department of Corrections after he was denied it while members of other religions were allowed their medallions.
In the Georgacarakos v. Watts case Peter N. Georgacarakos filed a pro se civil-rights complaint in the United States District Court for the District of Colorado against 19 prison officials for "interference with the free exercise of his Ásatrú religion" and "discrimination on the basis of his being Ásatrú".
|
2106 | Ansible | An ansible is a category of fictional devices or technology capable of near-instantaneous or faster-than-light communication. It can send and receive messages to and from a corresponding device over any distance or obstacle whatsoever with no delay, even between star systems. As a name for such a device, the word "ansible" first appeared in a 1966 novel by Ursula K. Le Guin. Since that time, the term has been broadly used in the works of numerous science fiction authors, across a variety of settings and continuities. A related term is ultrawave.
Coinage by Ursula Le Guin.
Ursula K. Le Guin coined the word "ansible" in her 1966 novel "Rocannon's World". The word was a contraction of "answerable", as the device would allow its users to receive answers to their messages in a reasonable amount of time, even over interstellar distances.
The ansible was the basis for creating a specific kind of interstellar civilization: one where communications between far-flung stars are instantaneous, but humans can only travel at relativistic speeds. Under these conditions, a full-fledged galactic empire is not possible, but there is a looser interstellar organization, in which several of Le Guin's protagonists are involved.
Although Le Guin invented the name "ansible" for this type of device, fleshed out with specific details in her fictional works, the broader concept of instantaneous or faster-than-light communication had previously existed in science fiction. For example, similar communication functions were included in a device called an interocitor in the 1952 novel "This Island Earth" by Raymond F. Jones, and the 1955 film based on that novel, and in the "Dirac Communicator", which first appeared in James Blish's short story "Beep" (1954), which was later expanded into the novel "The Quincunx of Time" (1973). Robert A. Heinlein in "Time for the Stars" (1958) employed instantaneous telepathic communication between identical twin pairs over interstellar distances, and like Le Guin, provided a technical explanation based on a non-Einsteinian principle of simultaneity.
In Le Guin's works.
In her subsequent works, Le Guin continued to develop the concept of the ansible:
Any ansible may be used to communicate through any other, by setting its coordinates to those of the receiving ansible. They have a limited bandwidth, which only allows for at most a few hundred characters of text to be communicated in any transaction of a dialog session, and are attached to a keyboard and small display to perform text messaging.
Use by later authors.
Since Le Guin's conception of the ansible, the name of the device has been borrowed by numerous authors. While Le Guin's ansible was said to communicate "instantaneously", the name has also been adopted for devices capable of communication at finite speeds that are faster than light.
Orson Scott Card's works.
Orson Scott Card, in his 1977 novelette and 1985 novel "Ender's Game" and its sequels, used the term "ansible" as an unofficial name for the philotic parallax instantaneous communicator, a machine capable of communicating across infinite distances with no time delay. In "Ender's Game", a character states that "somebody dredged the name "ansible" out of an old book somewhere".
In the universe of the "Ender's Game" series, the ansible's functions involved a fictional subatomic particle, the philote. The two quarks inside a pi meson can be separated by an arbitrary distance, while remaining connected by "philotic rays". This concept is similar to quantum teleportation due to entanglement; however, in reality, quark confinement prevents quarks from being separated by any observable distance.
Card's version of the ansible was also featured in the video game "Advent Rising", for which Card helped write the story, and in the movie "Ender's Game", which was based on the book.
Other writers.
Numerous other writers have included faster-than-light communication devices in their fictional works. Notable examples include:
|
2108 | Adalbert of Prague | Adalbert of Prague (c. 956 – 23 April 997), known in the Czech Republic, Poland and Slovakia by his birth name Vojtěch, was a Czech missionary and Christian saint. He was the Bishop of Prague and a missionary to the Hungarians, Poles, and Prussians, who was martyred in his efforts to convert the Baltic Prussians to Christianity. He is said to be the composer of the oldest Czech hymn, "Hospodine, pomiluj ny", and of "Bogurodzica", the oldest known Polish hymn, but his authorship of neither has been confirmed.
Adalbert was later declared the patron saint of the Czech Republic, Poland, and the Duchy of Prussia. He is also the patron saint of the Archdiocese of Esztergom in Hungary.
Life.
Early years.
Born as "Vojtěch" in 952 or ca. 956 in gord Libice, he belonged to the Slavnik clan, one of the two most powerful families in Bohemia. Events from his life were later recorded by a Bohemian priest Cosmas of Prague (1045–1125). Vojtěch's father was Slavník (d. 978–981), a duke ruling a province centred at Libice. His mother was Střezislava (d. 985–987), and according to David Kalhous belonged to the Přemyslid dynasty. He had five brothers: Soběslav, Spytimír, Dobroslav, Pořej, and Čáslav. Cosmas also refers to Radim (later Gaudentius) as a brother; who is believed to have been a half-brother by his father's liaison with another woman. After he survived a grave illness in childhood, his parents decided to dedicate him to the service of God. Adalbert was well educated, having studied for approximately ten years (970-80) in Magdeburg under Adalbert of Magdeburg. The young Vojtěch took his tutor's name "Adalbert" at his Confirmation.
Episcopacy.
In 981 Adalbert of Magdeburg died, and his young protege Adalbert returned to Bohemia. Later Bishop Dietmar of Prague ordained him a Catholic priest. In 982, Bishop Dietmar died, and Adalbert, despite being under canonical age, was chosen to succeed him as Bishop of Prague. Amiable and somewhat worldly, he was not expected to trouble the secular powers by making excessive claims for the Church. Although Adalbert was from a wealthy family, he avoided comfort and luxury, and was noted for his charity and austerity. After six years of preaching and prayer, he had made little headway in evangelizing the Bohemians, who maintained deeply embedded pagan beliefs.
Adalbert opposed the participation of Christians in the slave trade and complained of polygamy and idolatry, which were common among the people. Once he started to propose reforms, he was met with opposition from both the secular powers and the clergy. His family refused to support Duke Boleslaus in an unsuccessful war against Poland. Adalbert was no longer welcome and was eventually forced into exile. In 988 he went to Rome, where he lived as a hermit at the Benedictine monastery of Saint Alexis. Five years later, Boleslaus requested that the Pope send Adalbert back to Prague, in hopes of securing his family's support. Pope John XV agreed, with the understanding that Adalbert was free to leave Prague if he continued to encounter entrenched resistance. Adalbert returned as bishop of Prague, where he was initially received with demonstrations of apparent joy. Together with a group of Italian Benedictine monks he had brought with him, he founded on 14 January 993 a monastery in Břevnov (then situated west of Prague, now part of the city), the second oldest monastery on Czech territory.
In 995, the Slavniks' former rivalry with the Přemyslids, who were allied with the powerful Bohemian clan of the Vršovids, resulted in the storming of the Slavnik town of Libice nad Cidlinou, which was led by the Přemyslid Boleslaus II the Pious. During the struggle four or five of Adalbert's brothers were killed. The Zlič Principality became part of the Přemyslids' estate. Adalbert unsuccessfully attempted to protect a noblewoman caught in adultery. She had fled to a convent, where she was killed. In upholding the right of sanctuary, Bishop Adalbert responded by excommunicating the murderers. Butler suggests that the incident was orchestrated by enemies of his family.
After this, Adalbert could not safely stay in Bohemia and escaped from Prague. Strachkvas was eventually appointed to be his successor. However, Strachkvas suddenly died during the liturgy at which he was to accede to his episcopal office in Prague. The cause of his death is still ambiguous. The Pope directed Adalbert to resume his see, but believing that he would not be allowed back, Adalbert requested a brief as an itinerant missionary.
Adalbert then traveled to Hungary and probably baptized Géza of Hungary and his son Stephen in Esztergom. Then he went to Poland where he was cordially welcomed by then-Duke Boleslaus I and installed as Bishop of Gniezno.
Mission and martyrdom in Prussia.
Adalbert again relinquished his diocese, namely that of Gniezno, and set out as a missionary to preach to the inhabitants near Prussia. Bolesław I, Duke (and, later, King) of Poland, sent soldiers with Adalbert on his mission to the Prussians. The bishop and his companions entered Prussian territory and traveled along the coast of the Baltic Sea to Gdańsk. At the border of the Polish realm, at the mouth of the Vistula River, Adalbert, his half-brother Radim (Gaudentius), Benedict-Bogusza (who was probably a Pole), and at least one interpreter ventured into Prussia alone, as Bolesław had only sent his soldiers to escort them to the border.
Adalbert achieved some success upon his arrival, but his presence mostly strained the local Prussian populations. This was partly because of the imperious manner in which he preached, and perhaps because he preached from a book: the Prussians had an oral society in which communication was face to face, and to the locals Adalbert's reading from a book may have come across as a manifestation of an evil action. He was forced to leave this first village after a local chieftain struck him in the back of the head with an oar, scattering the pages of his book upon the ground. He and his companions then fled across a river.
In the next place where Adalbert tried to preach, his message was met with the locals banging their sticks upon the ground and calling for the death of Adalbert and his companions. Retreating once again, Adalbert and his companions went to a marketplace of Truso (near modern-day Elbląg). Here they were met with a similar response. On 23 April 997, after mass, while Adalbert and his companions lay in the grass eating a snack, they were set upon by a pagan mob. The mob was led by a man named Sicco, possibly a pagan priest, who delivered the first blow against Adalbert before the others joined in. After he was dead, they removed Adalbert's head from his body and mounted it on a pole as they returned home. This encounter may also have taken place in Tenkitten and Fischhausen (now Primorsk, Kaliningrad Oblast, Russia). It is recorded that his body was bought back for its weight in gold by King Boleslaus I of Poland.
Veneration and relics.
A few years after his martyrdom, Adalbert was canonized as Saint Adalbert of Prague. His life was written in "Vita Sancti Adalberti Pragensis" by various authors, the earliest being traced to imperial Aachen and the Bishop of Liège, Notger von Lüttich, although it was previously assumed that the Roman monk John Canaparius wrote the first "Vita" in 999. Another famous biographer of Adalbert was Bruno of Querfurt who wrote a hagiography of him in 1001–4.
Notably, the Přemyslid rulers of Bohemia initially refused to ransom Adalbert's body from the Prussians who murdered him, and it was therefore purchased by the Poles. This may be explained by Adalbert's belonging to the Slavnik family, a rival of the Přemyslids. Adalbert's bones were thus preserved in Gniezno, which assisted Boleslaus I of Poland in increasing Polish political and diplomatic power in Europe.
According to Bohemian accounts, in 1039 the Bohemian Duke Bretislav I looted the bones of Adalbert from Gniezno in a raid and translated them to Prague. According to Polish accounts, however, he stole the wrong relics, namely those of Gaudentius, while the Poles concealed Adalbert's relics which remain in Gniezno. In 1127 his severed head, which was not in the original purchase according to "Roczniki Polskie", was discovered and translated to Gniezno. In 1928, one of the arms of Adalbert, which Bolesław I had given to Holy Roman Emperor Otto III in 1000, was added to the bones preserved in Gniezno. Therefore, today Adalbert has two elaborate shrines in the Prague Cathedral and Royal Cathedral of Gniezno, each of which claims to possess his relics, but which of these bones are his authentic relics is unknown. For example, pursuant to both claims two skulls are attributed to Adalbert. The one in Gniezno was stolen in 1923.
The massive bronze doors of Gniezno Cathedral, dating from around 1175, are decorated with eighteen reliefs of scenes from Adalbert's life. They are the only Romanesque ecclesiastical doors in Europe depicting a cycle illustrating the life of a saint, and are therefore a precious relic documenting Adalbert's martyrdom. The doors can be read both literally and theologically.
The one thousandth anniversary of Adalbert's martyrdom was on 23 April 1997. It was commemorated in Poland, the Czech Republic, Germany, Russia, and other nations. Representatives of Catholic, Eastern Orthodox, and Evangelical churches traveled on a pilgrimage to Adalbert's tomb located in Gniezno. Pope John Paul II visited the cathedral and celebrated a liturgy there in which heads of seven European nations and approximately one million faithful participated.
A ten-meter cross was erected near the village of Beregovoe (formerly Tenkitten), Kaliningrad Oblast, where Adalbert is thought to have been martyred by the Prussians.
Feast day.
He is commemorated on 23 April by the Evangelical Church in Germany and the Eastern Orthodox Church.
In popular culture and society.
The Dagmar and Václav Havel VIZE 97 Foundation Prize, given annually to a distinguished thinker "whose work exceeds the traditional framework of scientific knowledge, contributes to the understanding of science as an integral part of general culture and is concerned with unconventional ways of asking fundamental questions about cognition, being and human existence", includes a massive replica of Adalbert's crozier by Czech artist Jiří Plieštík.
The St. Vojtech Fellowship was established in 1870 by the Slovak Catholic priest Andrej Radlinský. It has supported Slovak Catholic thinkers and authors, and continues to publish original religious works and translations to this day. It is the official publishing body of the Episcopal Conference of Slovakia.
|
2110 | Ælfheah of Canterbury | Ælfheah (c. 953 – 19 April 1012), more commonly known today as Alphege, was an Anglo-Saxon Bishop of Winchester, later Archbishop of Canterbury. He became an anchorite before being elected abbot of Bath Abbey. His reputation for piety and sanctity led to his promotion to the episcopate and, eventually, to his becoming archbishop. Ælfheah furthered the cult of Dunstan and also encouraged learning. He was captured by Viking raiders in 1011 during the siege of Canterbury and killed by them the following year after refusing to allow himself to be ransomed. Ælfheah was canonised as a saint in 1078. Thomas Becket, a later Archbishop of Canterbury, prayed to him just before his own murder in Canterbury Cathedral in 1170.
Life.
Ælfheah was born around 953, supposedly in Weston on the outskirts of Bath, and became a monk early in life. He first entered the monastery of Deerhurst, but then moved to Bath, where he became an anchorite. He was noted for his piety and austerity and rose to become abbot of Bath Abbey. The 12th-century chronicler William of Malmesbury recorded that Ælfheah was a monk and prior at Glastonbury Abbey, but this is not accepted by all historians. Indications are that Ælfheah became abbot at Bath by 982, perhaps as early as around 977. He perhaps shared authority with his predecessor Æscwig after 968.
Probably due to the influence of Dunstan, the Archbishop of Canterbury (959–988), Ælfheah was elected Bishop of Winchester in 984, and was consecrated on 19 October that year. While bishop he was largely responsible for the construction of a large organ in the cathedral, audible from over a mile (1600 m) away and said to require more than 24 men to operate. He also built and enlarged the city's churches, and promoted the cult of Swithun and his own predecessor, Æthelwold of Winchester. One act promoting Æthelwold's cult was the translation of Æthelwold's body to a new tomb in the cathedral at Winchester, which Ælfheah presided over on 10 September 996.
Following a Viking raid in 994, a peace treaty was agreed with one of the raiders, Olaf Tryggvason. Besides receiving danegeld, Olaf converted to Christianity and undertook never to raid or fight the English again. Ælfheah may have played a part in the treaty negotiations, and it is certain that he confirmed Olaf in his new faith.
In 1006, Ælfheah succeeded Ælfric as Archbishop of Canterbury, taking Swithun's head with him as a relic for the new location. He went to Rome in 1007 to receive his pallium—symbol of his status as an archbishop—from Pope John XVIII, but was robbed during his journey. While at Canterbury, he promoted the cult of Dunstan, ordering the writing of the second "Life of Dunstan", which Adelard of Ghent composed between 1006 and 1011. He also introduced new practices into the liturgy, and was instrumental in the Witenagemot's recognition of Wulfsige of Sherborne as a saint in about 1012.
Ælfheah sent Ælfric of Eynsham to Cerne Abbey to take charge of its monastic school. He was present at the council of May 1008 at which Wulfstan II, Archbishop of York, preached his "Sermo Lupi ad Anglos" ("The Sermon of the Wolf to the English"), castigating the English for their moral failings and blaming the latter for the tribulations afflicting the country.
In 1011, the Danes again raided England, and from 8–29 September they laid siege to Canterbury. Aided by the treachery of Ælfmaer, whose life Ælfheah had once saved, the raiders succeeded in sacking the city. Ælfheah was taken prisoner and held captive for seven months. Godwine (Bishop of Rochester), Leofrun (abbess of St Mildrith's), and the king's reeve, Ælfweard were captured also, but the abbot of St Augustine's Abbey, Ælfmær, managed to escape. Canterbury Cathedral was plundered and burned by the Danes following Ælfheah's capture.
Death.
Ælfheah refused to allow a ransom to be paid for his freedom, and as a result was killed on 19 April 1012 at Greenwich, reputedly on the site of St Alfege's Church. The account of Ælfheah's death appears in the E version of the "Anglo-Saxon Chronicle":
Ælfheah was the first Archbishop of Canterbury to die a violent death. A contemporary report tells that Thorkell the Tall attempted to save Ælfheah from the mob about to kill him by offering everything he owned except for his ship, in exchange for Ælfheah's life; Thorkell's presence is not mentioned in the "Anglo-Saxon Chronicle", however. Some sources record that the final blow, with the back of an axe, was delivered as an act of kindness by a Christian convert known as "Thrum". Ælfheah was buried in Old St Paul's Cathedral. In 1023, his body was moved by King Cnut to Canterbury, with great ceremony. Thorkell the Tall was appalled at the brutality of his fellow raiders, and switched sides to the English king Æthelred the Unready following Ælfheah's death.
Veneration.
Pope Gregory VII canonised Ælfheah in 1078, with a feast day of 19 April. Lanfranc, the first post-Conquest archbishop, was dubious about some of the saints venerated at Canterbury. He was persuaded of Ælfheah's sanctity, but Ælfheah and Augustine of Canterbury were the only pre-conquest Anglo-Saxon archbishops kept on Canterbury's calendar of saints. Ælfheah's shrine, which had become neglected, was rebuilt and expanded in the early 12th century under Anselm of Canterbury, who was instrumental in retaining Ælfheah's name in the church calendar. After the 1174 fire in Canterbury Cathedral, Ælfheah's remains together with those of Dunstan were placed around the high altar, at which Thomas Becket is said to have commended his life into Ælfheah's care shortly before his martyrdom during the Becket controversy. The new shrine was sealed in lead, and was north of the high altar, sharing the honour with Dunstan's shrine, which was located south of the high altar. A "Life of Saint Ælfheah" in prose and verse was written by a Canterbury monk named Osbern, at Lanfranc's request. The prose version has survived, but the "Life" is very much a hagiography; many of the stories it contains have obvious Biblical parallels, making them suspect as a historical record.
In the late medieval period, Ælfheah's feast day was celebrated in Scandinavia, perhaps because of the saint's connection with Cnut. Few church dedications to him are known, with most of them occurring in Kent and one each in London and Winchester; as well as St Alfege's Church in Greenwich, a nearby hospital (1931–1968) was named after him. In Kent, there are two 12th-century parish churches dedicated to St Alphege at Seasalter and Canterbury. Reputedly his body lay in these churches overnight on his way back to Canterbury Cathedral for burial. In the town of Solihull in the West Midlands, St Alphege Church is dedicated to Ælfheah dating back to approximately 1277. In 1929, a new Roman Catholic church in Bath, the Church of Our Lady & St Alphege, was designed by Giles Gilbert Scott in homage to the ancient Roman church of Santa Maria in Cosmedin, and dedicated to Ælfheah under the name of Alphege. St George the Martyr with St Alphege & St Jude stands in Borough in London.
Artistic representations of Ælfheah often depict him holding a pile of stones in his chasuble, in reference to his martyrdom.
|
2112 | Associative algebra | In mathematics, an associative algebra "A" is an algebraic structure with compatible operations of addition, multiplication (assumed to be associative), and a scalar multiplication by elements in some field "K". The addition and multiplication operations together give "A" the structure of a ring; the addition and scalar multiplication operations together give "A" the structure of a vector space over "K". In this article we will also use the term ""K"-algebra" to mean an associative algebra over the field "K". A standard first example of a "K"-algebra is a ring of square matrices over a field "K", with the usual matrix multiplication.
A commutative algebra is an associative algebra that has a commutative multiplication, or, equivalently, an associative algebra that is also a commutative ring.
In this article associative algebras are assumed to have a multiplicative identity, denoted 1; they are sometimes called unital associative algebras for clarification. In some areas of mathematics this assumption is not made, and we will call such structures non-unital associative algebras. We will also assume that all rings are unital, and all ring homomorphisms are unital.
Many authors consider the more general concept of an associative algebra over a commutative ring "R", instead of a field: an "R"-algebra is an "R"-module with an associative "R"-bilinear binary operation, which also contains a multiplicative identity. For example, if "S" is any ring with center "C", then "S" is an associative "C"-algebra.
Definition.
Let "R" be a commutative ring (so "R" could be a field). An associative "R"-algebra (or more simply, an "R"-algebra) is a ring
that is also an "R"-module in such a way that the two additions (the ring addition and the module addition) are the same operation, and scalar multiplication satisfies "r" · ("xy") = ("r" · "x")"y" = "x"("r" · "y")
for all "r" in "R" and "x", "y" in the algebra. (This definition implies that the algebra is unital, since rings are supposed to have a multiplicative identity.)
Equivalently, an associative algebra "A" is a ring together with a ring homomorphism from "R" to the center of "A". If "f" is such a homomorphism, the scalar multiplication is formula_2 (here the multiplication is the ring multiplication); if the scalar multiplication is given, the ring homomorphism is given by formula_3 (See also below).
Every ring is an associative formula_4-algebra, where formula_4 denotes the ring of the integers.
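This canonical Z-algebra structure can be sketched concretely: the integer action "n" · "x" is iterated ring addition, and it agrees with multiplying by "n" · 1 inside the ring. The ring Z/6, represented as Python ints mod 6, is an illustrative choice; the function names are not from any library.

```python
# The Z-algebra structure on the ring Z/6.
def add(x, y):
    return (x + y) % 6

def mul(x, y):
    return (x * y) % 6

def zscal(n, x):
    # n·x as n-fold ring addition (sketched for n >= 0)
    out = 0
    for _ in range(n):
        out = add(out, x)
    return out

x, n = 4, 5
# n·x = (n·1)x: the integer action factors through the element n·1
assert zscal(n, x) == mul(zscal(n, 1), x) == 2  # 5·4 = 20 ≡ 2 (mod 6)
```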
A commutative algebra is an associative algebra that is also a commutative ring.
As a monoid object in the category of modules.
The definition is equivalent to saying that a unital associative "R"-algebra is a monoid object in "R"-Mod (the monoidal category of "R"-modules). By definition, a ring is a monoid object in the category of abelian groups; thus, the notion of an associative algebra is obtained by replacing the category of abelian groups with the category of modules.
Pushing this idea further, some authors have introduced a "generalized ring" as a monoid object in some other category that behaves like the category of modules. Indeed, this reinterpretation allows one to avoid making an explicit reference to elements of an algebra "A". For example, the associativity can be expressed as follows. By the universal property of a tensor product of modules, the multiplication (the "R"-bilinear map) corresponds to a unique "R"-linear map "m" : "A" ⊗ "A" → "A".
The associativity then refers to the identity "m" ∘ (id ⊗ "m") = "m" ∘ ("m" ⊗ id).
From ring homomorphisms.
An associative algebra amounts to a ring homomorphism whose image lies in the center. Indeed, starting with a ring "A" and a ring homomorphism formula_8 whose image lies in the center of "A", we can make "A" an "R"-algebra by defining "r" · "x" = "η"("r")"x"
for all "r" ∈ "R" and "x" ∈ "A". If "A" is an "R"-algebra, taking "x" = 1, the same formula in turn defines a ring homomorphism formula_8 whose image lies in the center.
If a ring is commutative then it equals its center, so that a commutative "R"-algebra can be defined simply as a commutative ring "A" together with a commutative ring homomorphism formula_8.
The ring homomorphism "η" appearing in the above is often called a structure map. In the commutative case, one can consider the category whose objects are ring homomorphisms "R" → "A"; i.e., commutative "R"-algebras and whose morphisms are ring homomorphisms "A" → "A" that are under "R"; i.e., "R" → "A" → "A" is "R" → "A" (i.e., the coslice category of the category of commutative rings under "R".) The prime spectrum functor Spec then determines an anti-equivalence of this category to the category of affine schemes over Spec "R".
How to weaken the commutativity assumption is a subject matter of noncommutative algebraic geometry and, more recently, of derived algebraic geometry. See also: generic matrix ring.
Algebra homomorphisms.
A homomorphism between two "R"-algebras is an "R"-linear ring homomorphism. Explicitly, a map "φ": "A"1 → "A"2 is an associative algebra homomorphism if "φ" is "R"-linear, "φ"("xy") = "φ"("x")"φ"("y") for all "x", "y", and "φ"(1) = 1.
The class of all "R"-algebras together with algebra homomorphisms between them form a category, sometimes denoted "R"-Alg.
The subcategory of commutative "R"-algebras can be characterized as the coslice category "R"/CRing where CRing is the category of commutative rings.
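A concrete sketch: complex conjugation is an R-algebra homomorphism from C to itself (a ring homomorphism that is also R-linear), checked numerically here.

```python
# Complex conjugation as an R-algebra homomorphism C -> C: a ring
# homomorphism that is also R-linear (it fixes real scalars).

def conj(z: complex) -> complex:
    return z.conjugate()

x, y, r = 2 + 3j, -1 + 4j, 2.5
assert conj(x + y) == conj(x) + conj(y)   # additive
assert conj(x * y) == conj(x) * conj(y)   # multiplicative
assert conj(r * x) == r * conj(x)         # R-linear
assert conj(1 + 0j) == 1                  # unital
```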
Examples.
The most basic example is a ring itself; it is an algebra over its center or any subring lying in the center. In particular, any commutative ring is an algebra over any of its subrings. Other examples abound both from algebra and other fields of mathematics.
Dual of an associative algebra.
Let "A" be an associative algebra over a commutative ring "R". Since "A" is in particular a module, we can take the dual module "A"* of "A". A priori, the dual "A"* need not have a structure of an associative algebra. However, "A" may come with an extra structure (namely, that of a Hopf algebra) so that the dual is also an associative algebra.
For example, take "A" to be the ring of continuous functions on a compact group "G". Then, not only "A" is an associative algebra, but it also comes with the co-multiplication formula_38 and co-unit formula_39. The "co-" refers to the fact that they satisfy the dual of the usual multiplication and unit in the algebra axiom. Hence, the dual formula_40 is an associative algebra. The co-multiplication and co-unit are also important in order to form a tensor product of representations of associative algebras (see below).
Enveloping algebra.
Given an associative algebra "A" over a commutative ring "R", the enveloping algebra formula_41 of "A" is the algebra formula_42 or formula_43, depending on authors.
Note that a bimodule over "A" is exactly a left module over formula_41.
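A hedged sketch of this identification for A = M2(Z) (the helpers are ours): letting the simple tensor a ⊗ b° act on m as amb, the composition of two such actions matches multiplication in the enveloping algebra, where the second tensor factor multiplies in the reversed (opposite) order.

```python
# An (A,A)-bimodule is a left module over A^e = A (x) A^op: let the
# simple tensor a (x) b^op act on m by a m b and check that composing
# actions matches multiplication in A^e, where
# (a1 (x) b1^op)(a2 (x) b2^op) = (a1 a2) (x) (b2 b1)^op.

def matmul(x, y):
    n = len(x)
    return [[sum(x[i][k] * y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def act(a, b, m):                 # action of a (x) b^op on the bimodule A
    return matmul(matmul(a, m), b)

a1, b1 = [[1, 2], [0, 1]], [[1, 0], [3, 1]]
a2, b2 = [[2, 1], [1, 1]], [[0, 1], [1, 2]]
m = [[5, 7], [11, 13]]

lhs = act(a1, b1, act(a2, b2, m))             # act twice
rhs = act(matmul(a1, a2), matmul(b2, b1), m)  # act once by the product in A^e
assert lhs == rhs
```

The reversal b2 b1 in the second factor is the reason the opposite algebra appears in the definition.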
Separable algebra.
Let "A" be an algebra over a commutative ring "R". Then the algebra "A" is a right module over formula_45 with the action formula_46. Then, by definition, "A" is said to be separable if the multiplication map formula_47 splits as an formula_41-linear map, where formula_49 is an formula_41-module by formula_51. Equivalently,
formula_18 is separable if it is a projective module over formula_41; thus, the formula_41-projective dimension of "A", sometimes called the bidimension of "A", measures the failure of separability.
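As a hedged illustration with A = M2(k): the matrix algebra is separable, and a standard splitting is given by the "separability idempotent" e = Σi Ei1 ⊗ E1i (matrix units). The sketch below models simple tensors by Kronecker products, which faithfully represent M2 ⊗ M2.

```python
# M_2(k) is separable: e = E_11 (x) E_11 + E_21 (x) E_12 (matrix units,
# 1-based indices in the comment) satisfies mu(e) = 1 and a.e = e.a,
# where mu is the multiplication map.  Simple tensors x (x) y are
# modelled by Kronecker products.

def matmul(x, y):
    n = len(x)
    return [[sum(x[i][k] * y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def madd(x, y):
    return [[a + b for a, b in zip(r, s)] for r, s in zip(x, y)]

def kron(x, y):                    # Kronecker product of 2x2 matrices -> 4x4
    return [[x[i // 2][j // 2] * y[i % 2][j % 2] for j in range(4)]
            for i in range(4)]

def E(i, j):                       # matrix unit (0-based indices here)
    return [[1 if (r, c) == (i, j) else 0 for c in range(2)] for r in range(2)]

# mu(e) = E_11 E_11 + E_21 E_12 = identity
mu_e = madd(matmul(E(0, 0), E(0, 0)), matmul(E(1, 0), E(0, 1)))
assert mu_e == [[1, 0], [0, 1]]

# a.e = e.a:  sum_i (a E_i1) (x) E_1i  ==  sum_i E_i1 (x) (E_1i a)
a = [[2, 3], [5, 7]]
lhs = madd(kron(matmul(a, E(0, 0)), E(0, 0)),
           kron(matmul(a, E(1, 0)), E(0, 1)))
rhs = madd(kron(E(0, 0), matmul(E(0, 0), a)),
           kron(E(1, 0), matmul(E(0, 1), a)))
assert lhs == rhs
```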
Finite-dimensional algebra.
Let "A" be a finite-dimensional algebra over a field "k". Then "A" is an Artinian ring.
Commutative case.
As "A" is Artinian, if it is commutative, then it is a finite product of Artinian local rings whose residue fields are algebras over the base field "k". Now, a reduced Artinian local ring is a field and thus the following are equivalent
Let formula_62, the profinite group of finite Galois extensions of "k". Then formula_63 is an anti-equivalence of the category of finite-dimensional separable "k"-algebras to the category of finite sets with continuous formula_64-actions.
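As a concrete instance of the product decomposition above (a sketch; the encoding of polynomials as coefficient pairs is ours): the commutative Q-algebra Q[x]/(x² − 1) splits as Q × Q via evaluation at x = 1 and x = −1.

```python
# A commutative finite-dimensional algebra splitting as a product:
# Q[x]/(x^2 - 1)  ~=  Q x Q  via  f -> (f(1), f(-1)).
# Polynomials a0 + a1*x are encoded as coefficient pairs (a0, a1).

def mul(p, q):
    a0, a1 = p
    b0, b1 = q
    # (a0 + a1 x)(b0 + b1 x) = a0 b0 + a1 b1 x^2 + (a0 b1 + a1 b0) x,
    # and x^2 = 1 in the quotient
    return (a0 * b0 + a1 * b1, a0 * b1 + a1 * b0)

def split(p):
    a0, a1 = p
    return (a0 + a1, a0 - a1)   # (f(1), f(-1))

p, q = (2, 3), (5, -1)
# split is multiplicative, as an isomorphism onto the product ring must be
assert split(mul(p, q)) == (split(p)[0] * split(q)[0], split(p)[1] * split(q)[1])
```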
Noncommutative case.
Since a simple Artinian ring is a (full) matrix ring over a division ring, if "A" is a simple algebra, then "A" is a (full) matrix algebra over a division algebra "D" over "k"; i.e., formula_65. More generally, if "A" is a semisimple algebra, then it is a finite product of matrix algebras (over various division "k"-algebras), the fact known as the Artin–Wedderburn theorem.
The fact that "A" is Artinian simplifies the notion of a Jacobson radical; for an Artinian ring, the Jacobson radical of "A" is the intersection of all (two-sided) maximal ideals (in contrast, in general, a Jacobson radical is the intersection of all left maximal ideals or the intersection of all right maximal ideals.)
The Wedderburn principal theorem states: for a finite-dimensional algebra "A" with a nilpotent ideal "I", if the projective dimension of formula_66 as a module over the enveloping algebra formula_67 is at most one, then the natural surjection formula_68 splits; i.e., formula_18 contains a subalgebra formula_70 such that formula_71 is an isomorphism. Taking "I" to be the Jacobson radical, the theorem says in particular that the Jacobson radical is complemented by a semisimple algebra. The theorem is an analog of Levi's theorem for Lie algebras.
Lattices and orders.
Let "R" be a Noetherian integral domain with field of fractions "K" (for example, they can be formula_72). A "lattice" "L" in a finite-dimensional "K"-vector space "V" is a finitely generated "R"-submodule of "V" that spans "V"; in other words, formula_73.
Let formula_74 be a finite-dimensional "K"-algebra. An "order" in formula_74 is an "R"-subalgebra that is a lattice. In general, there are a lot fewer orders than lattices; e.g., formula_76 is a lattice in formula_77 but not an order (since it is not an algebra).
A "maximal order" is an order that is maximal among all the orders.
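A small illustration of the lattice/order distinction in the spirit of the example above, with R = Z and the one-dimensional K-algebra Q (the specific lattice (1/2)Z is our choice): a lattice need not be closed under multiplication.

```python
from fractions import Fraction

# The Z-submodule L = (1/2)Z of the Q-algebra Q is a lattice (finitely
# generated over Z, spans Q) but not an order: it is not closed under
# multiplication.  The subring Z, by contrast, is an order.

def in_L(q: Fraction) -> bool:
    return (2 * q).denominator == 1      # q in (1/2)Z  iff  2q is an integer

half = Fraction(1, 2)
assert in_L(half)                        # 1/2 lies in L
assert not in_L(half * half)             # but (1/2)*(1/2) = 1/4 does not
assert in_L(Fraction(1)) and in_L(Fraction(-3, 2))
```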
Related concepts.
Coalgebras.
An associative algebra over "K" is given by a "K"-vector space "A" endowed with a bilinear map "A" × "A" → "A" having two inputs (multiplicator and multiplicand) and one output (product), as well as a morphism "K" → "A" identifying the scalar multiples of the multiplicative identity. If the bilinear map "A" × "A" → "A" is reinterpreted as a linear map (i. e., morphism in the category of "K"-vector spaces) "A" ⊗ "A" → "A" (by the universal property of the tensor product), then we can view an associative algebra over "K" as a "K"-vector space "A" endowed with two morphisms (one of the form "A" ⊗ "A" → "A" and one of the form "K" → "A") satisfying certain conditions that boil down to the algebra axioms. These two morphisms can be dualized using categorial duality by reversing all arrows in the commutative diagrams that describe the algebra axioms; this defines the structure of a coalgebra.
There is also an abstract notion of "F"-coalgebra, where "F" is a functor. This is vaguely related to the notion of coalgebra discussed above.
Representations.
A representation of an algebra "A" is an algebra homomorphism "ρ" : "A" → End("V") from "A" to the endomorphism algebra of some vector space (or module) "V". The property of "ρ" being an algebra homomorphism means that "ρ" preserves the multiplicative operation (that is, "ρ"("xy") = "ρ"("x")"ρ"("y") for all "x" and "y" in "A"), and that "ρ" sends the unit of "A" to the unit of End("V") (that is, to the identity endomorphism of "V").
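A concrete sketch: the real algebra C has a representation on R² sending a + bi to the 2×2 matrix [[a, −b], [b, a]]; the checks below confirm it preserves products and sends 1 to the identity.

```python
# A representation of the R-algebra C on R^2: rho(a + bi) = [[a, -b], [b, a]].
# rho is an algebra homomorphism into End(R^2) = M_2(R).

def matmul(m, n):
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def rho(z: complex):
    return [[z.real, -z.imag], [z.imag, z.real]]

x, y = 2 + 3j, -1 + 4j
assert rho(x * y) == matmul(rho(x), rho(y))     # preserves multiplication
assert rho(1 + 0j) == [[1.0, 0.0], [0.0, 1.0]]  # sends 1 to the identity
```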
If "A" and "B" are two algebras, and "ρ" : "A" → End("V") and "τ" : "B" → End("W") are two representations, then there is a (canonical) representation "A" formula_78 "B" → End("V" formula_78 "W") of the tensor product algebra "A formula_78 B" on the vector space "V formula_78 W". However, there is no natural way of defining a tensor product of two representations of a single associative algebra in such a way that the result is still a representation of that same algebra (not of its tensor product with itself), without somehow imposing additional conditions. Here, by "tensor product of representations", the usual meaning is intended: the result should be a linear representation of the same algebra on the product vector space. Imposing such additional structure typically leads to the idea of a Hopf algebra or a Lie algebra, as demonstrated below.
Motivation for a Hopf algebra.
Consider, for example, two representations "ρ": "A" → End("V") and "τ": "A" → End("W"). One might try to form a tensor product representation "σ": "x" ↦ "ρ"("x") ⊗ "τ"("x") according to how it acts on the product vector space, so that
"σ"("x")("v" ⊗ "w") = ("ρ"("x")"v") ⊗ ("τ"("x")"w").
However, such a map would not be linear, since one would have
"σ"("kx") = "ρ"("kx") ⊗ "τ"("kx") = "k""ρ"("x") ⊗ "k""τ"("x") = "k"²"σ"("x")
for "k" ∈ "K". One can rescue this attempt and restore linearity by imposing additional structure, by defining an algebra homomorphism Δ: "A" → "A" ⊗ "A", and defining the tensor product representation as
"σ" = ("ρ" ⊗ "τ") ∘ Δ.
Such a homomorphism Δ is called a comultiplication if it satisfies certain axioms. The resulting structure is called a bialgebra. To be consistent with the definitions of the associative algebra, the coalgebra must be co-associative, and, if the algebra is unital, then the co-algebra must be co-unital as well. A Hopf algebra is a bialgebra with an additional piece of structure (the so-called antipode), which allows not only to define the tensor product of two representations, but also the Hom module of two representations (again, similarly to how it is done in the representation theory of groups).
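A sketch of why the group-like comultiplication Δ(g) = g ⊗ g works for group algebras: the tensor-product representation g ↦ ρ(g) ⊗ τ(g) is multiplicative thanks to the Kronecker mixed-product identity, checked below on arbitrary stand-in matrices.

```python
# The Kronecker mixed-product identity (A (x) B)(C (x) D) = (AC) (x) (BD)
# is what makes g -> rho(g) (x) tau(g) (i.e. the pullback along
# Delta(g) = g (x) g) multiplicative whenever rho and tau are.

def matmul(x, y):
    n = len(x)
    return [[sum(x[i][k] * y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def kron(x, y):                    # Kronecker product of 2x2 matrices -> 4x4
    return [[x[i // 2][j // 2] * y[i % 2][j % 2] for j in range(4)]
            for i in range(4)]

# arbitrary stand-ins for rho(g), tau(g), rho(h), tau(h)
A, B = [[1, 2], [3, 4]], [[0, 1], [1, 0]]
C, D = [[2, 0], [1, 1]], [[1, 1], [0, 1]]
assert matmul(kron(A, B), kron(C, D)) == kron(matmul(A, C), matmul(B, D))
```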
Motivation for a Lie algebra.
One can try to be more clever in defining a tensor product. Consider, for example, the map
"σ": "x" ↦ "ρ"("x") ⊗ Id + Id ⊗ "τ"("x"),
so that the action on the tensor product space is given by
"σ"("x")("v" ⊗ "w") = ("ρ"("x")"v") ⊗ "w" + "v" ⊗ ("τ"("x")"w").
This map is clearly linear in "x", and so it does not have the problem of the earlier definition. However, it fails to preserve multiplication:
"σ"("xy") = "ρ"("x")"ρ"("y") ⊗ Id + Id ⊗ "τ"("x")"τ"("y").
But, in general, this does not equal
"σ"("x")"σ"("y") = "ρ"("x")"ρ"("y") ⊗ Id + "ρ"("x") ⊗ "τ"("y") + "ρ"("y") ⊗ "τ"("x") + Id ⊗ "τ"("x")"τ"("y").
This shows that this definition of a tensor product is too naive; the obvious fix is to define it such that it is antisymmetric, so that the middle two terms cancel. This leads to the concept of a Lie algebra.
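A numerical sketch of this remedy (with ρ and τ both taken to be the identity representation of 2×2 matrices, an illustrative choice): the map s(x) = x ⊗ I + I ⊗ x fails to preserve products but does preserve commutators.

```python
# With Delta(x) = x (x) 1 + 1 (x) x, the induced action
# s(x) = x (x) I + I (x) x preserves commutators even though it does
# not preserve ordinary products.

def matmul(x, y):
    n = len(x)
    return [[sum(x[i][k] * y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def kron(x, y):
    return [[x[i // 2][j // 2] * y[i % 2][j % 2] for j in range(4)]
            for i in range(4)]

def madd(x, y):
    return [[a + b for a, b in zip(r, s)] for r, s in zip(x, y)]

def msub(x, y):
    return [[a - b for a, b in zip(r, s)] for r, s in zip(x, y)]

I2 = [[1, 0], [0, 1]]

def s(x):
    return madd(kron(x, I2), kron(I2, x))

def comm(a, b):
    return msub(matmul(a, b), matmul(b, a))

x, y = [[0, 1], [0, 0]], [[0, 0], [1, 0]]
assert s(comm(x, y)) == comm(s(x), s(y))      # commutators are preserved
assert s(matmul(x, y)) != matmul(s(x), s(y))  # ordinary products are not
```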
Non-unital algebras.
Some authors use the term "associative algebra" to refer to structures which do not necessarily have a multiplicative identity, and hence consider homomorphisms which are not necessarily unital.
One example of a non-unital associative algebra is given by the set of all functions "f": R → R whose limit as "x" nears infinity is zero.
Another example is the vector space of continuous periodic functions, together with the convolution product.
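A minimal sketch of the first example (the particular functions are our choice): functions vanishing at infinity are closed under pointwise products, but the would-be unit, the constant function 1, does not vanish at infinity.

```python
import math

# Functions R -> R vanishing at infinity form a non-unital algebra under
# pointwise operations: products of such functions still vanish at
# infinity, but the constant function 1 (the would-be unit) does not.

f = lambda x: 1.0 / (1.0 + x * x)
g = lambda x: math.exp(-abs(x))
h = lambda x: f(x) * g(x)                  # pointwise product stays in the algebra

big = 1e6
assert all(abs(fn(big)) < 1e-9 for fn in (f, g, h))
one = lambda x: 1.0                        # constant 1: NOT in the algebra
assert abs(one(big)) == 1.0
```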
Axiom of regularity.
In mathematics, the axiom of regularity (also known as the axiom of foundation) is an axiom of Zermelo–Fraenkel set theory that states that every non-empty set "A" contains an element that is disjoint from "A". In first-order logic, the axiom reads:
The axiom of regularity together with the axiom of pairing implies that no set is an element of itself, and that there is no infinite sequence ("a"n) such that "a"n+1 is an element of "a"n for all "n". With the axiom of dependent choice (which is a weakened form of the axiom of choice), this result can be reversed: if there are no such infinite sequences, then the axiom of regularity is true. Hence, in this context the axiom of regularity is equivalent to the sentence that there are no downward infinite membership chains.
The axiom was introduced by von Neumann (1925); it was adopted in a formulation closer to the one found in contemporary textbooks by Zermelo (1930). Virtually all results in the branches of mathematics based on set theory hold even in the absence of regularity. However, regularity makes some properties of ordinals easier to prove; and it not only allows induction to be done on well-ordered sets but also on proper classes that are well-founded relational structures such as the lexicographical ordering on formula_2
Given the other axioms of Zermelo–Fraenkel set theory, the axiom of regularity is equivalent to the axiom of induction. The axiom of induction tends to be used in place of the axiom of regularity in intuitionistic theories (ones that do not accept the law of the excluded middle), where the two axioms are not equivalent.
In addition to omitting the axiom of regularity, non-standard set theories have indeed postulated the existence of sets that are elements of themselves.
Elementary implications of regularity.
No set is an element of itself.
Let "A" be a set, and apply the axiom of regularity to {"A"}, which is a set by the axiom of pairing. We see that there must be an element of {"A"} which is disjoint from {"A"}. Since the only element of {"A"} is "A", it must be that "A" is disjoint from {"A"}. So, since formula_3, we cannot have "A" ∈ "A" (by the definition of disjoint).
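The regularity property can be checked concretely on hereditarily finite sets, modelled here with Python frozensets (a sketch; the helper names are ours). A frozenset also cannot contain itself, mirroring the result just proved.

```python
# Model hereditarily finite sets as frozensets and check the regularity
# property on von Neumann ordinals: every non-empty set contains an
# element disjoint from it.

def ordinal(n: int) -> frozenset:
    s = frozenset()
    for _ in range(n):
        s = frozenset(s | {s})           # successor: n+1 = n U {n}
    return s

def has_regular_witness(s: frozenset) -> bool:
    # vacuously true for the empty set; otherwise look for a disjoint element
    return any(not (e & s) for e in s) if s else True

for n in range(6):
    assert has_regular_witness(ordinal(n))
```

For each non-empty ordinal, the empty set is itself the disjoint witness.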
No infinite descending sequence of sets exists.
Suppose, to the contrary, that there is a function, "f", on the natural numbers with "f"("n"+1) an element of "f"("n") for each "n". Define "S" = {"f"("n"): "n" a natural number}, the range of "f", which can be seen to be a set from the axiom schema of replacement. Applying the axiom of regularity to "S", let "B" be an element of "S" which is disjoint from "S". By the definition of "S", "B" must be "f"("k") for some natural number "k". However, we are given that "f"("k") contains "f"("k"+1) which is also an element of "S". So "f"("k"+1) is in the intersection of "f"("k") and "S". This contradicts the fact that they are disjoint sets. Since our supposition led to a contradiction, there must not be any such function, "f".
The nonexistence of a set containing itself can be seen as a special case where the sequence is infinite and constant.
Notice that this argument only applies to functions "f" that can be represented as sets as opposed to undefinable classes. The hereditarily finite sets, Vω, satisfy the axiom of regularity (and all other axioms of ZFC except the axiom of infinity). So if one forms a non-trivial ultrapower of Vω, then it will also satisfy the axiom of regularity. The resulting model will contain elements, called non-standard natural numbers, that satisfy the definition of natural numbers in that model but are not really natural numbers. They are "fake" natural numbers which are "larger" than any actual natural number. This model will contain infinite descending sequences of elements. For example, suppose "n" is a non-standard natural number, then formula_4 and formula_5, and so on. For any actual natural number "k", formula_6. This is an unending descending sequence of elements. But this sequence is not definable in the model and thus not a set. So no contradiction to regularity can be proved.
Simpler set-theoretic definition of the ordered pair.
The axiom of regularity enables defining the ordered pair ("a","b") as {"a",{"a","b"}}; see ordered pair for specifics. This definition eliminates one pair of braces from the canonical Kuratowski definition ("a","b") = {{"a"},{"a","b"}}.
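Both encodings can be tried out on small hereditarily finite sets (a sketch with frozensets; injectivity of the shorter encoding in general relies on regularity):

```python
# Both ordered-pair encodings with frozensets: Kuratowski's
# (a,b) = {{a},{a,b}} and the shorter (a,b) = {a,{a,b}}.

def kuratowski(a, b):
    return frozenset({frozenset({a}), frozenset({a, b})})

def short_pair(a, b):
    return frozenset({a, frozenset({a, b})})

zero = frozenset()
one = frozenset({zero})
two = frozenset({zero, one})
elems = [zero, one, two]
pairs = [(a, b) for a in elems for b in elems]
for enc in (kuratowski, short_pair):
    # distinct pairs give distinct encodings on these small sets
    assert len({enc(a, b) for (a, b) in pairs}) == len(pairs)
```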
Every set has an ordinal rank.
This was actually the original form of the axiom in von Neumann's axiomatization.
Suppose "x" is any set. Let "t" be the transitive closure of {"x"}. Let "u" be the subset of "t" consisting of unranked sets. If "u" is empty, then "x" is ranked and we are done. Otherwise, apply the axiom of regularity to "u" to get an element "w" of "u" which is disjoint from "u". Since "w" is in "u", "w" is unranked. "w" is a subset of "t" by the definition of transitive closure. Since "w" is disjoint from "u", every element of "w" is ranked. Applying the axioms of replacement and union to combine the ranks of the elements of "w", we get an ordinal rank for "w", to wit formula_7. This contradicts the conclusion that "w" is unranked. So the assumption that "u" was non-empty must be false and "x" must have rank.
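For hereditarily finite sets the rank recursion can be computed directly (a sketch: rank("x") is the supremum of rank("y") + 1 over the elements "y" of "x", which in the finite case is a maximum).

```python
# Compute the ordinal rank of a hereditarily finite set modelled as
# nested frozensets: rank(x) = max over y in x of rank(y) + 1 (0 if empty).

def rank(s: frozenset) -> int:
    return max((rank(e) + 1 for e in s), default=0)

zero = frozenset()                 # rank 0
one = frozenset({zero})            # rank 1
two = frozenset({zero, one})       # rank 2
pair = frozenset({one})            # {{{}}} also has rank 2
assert [rank(x) for x in (zero, one, two, pair)] == [0, 1, 2, 2]
```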
For every two sets, only one can be an element of the other.
Let "X" and "Y" be sets. Then apply the axiom of regularity to the set {"X","Y"} (which exists by the axiom of pairing). We see there must be an element of {"X","Y"} which is also disjoint from it. It must be either "X" or "Y". By the definition of disjoint then, we must have either "Y" is not an element of "X" or vice versa.
The axiom of dependent choice and no infinite descending sequence of sets implies regularity.
Let the non-empty set "S" be a counter-example to the axiom of regularity; that is, every element of "S" has a non-empty intersection with "S". We define a binary relation "R" on "S" by formula_8, which is entire by assumption. Thus, by the axiom of dependent choice, there is some sequence ("a"n) in "S" satisfying "a"n "R" "a"n+1 for all "n" in N. As this is an infinite descending chain, we arrive at a contradiction and so, no such "S" exists.
Regularity and the rest of ZF(C) axioms.
Regularity was shown to be relatively consistent with the rest of ZF by Skolem (1923) and von Neumann (1929), meaning that if ZF without regularity is consistent, then ZF (with regularity) is also consistent.
The axiom of regularity was also shown to be independent from the other axioms of ZF(C), assuming they are consistent. The result was announced by Paul Bernays in 1941, although he did not publish a proof until 1954. The proof involves (and led to the study of) Rieger–Bernays permutation models (or method), which were used for other proofs of independence for non-well-founded systems.
Regularity and Russell's paradox.
Naive set theory (the axiom schema of unrestricted comprehension and the axiom of extensionality) is inconsistent due to Russell's paradox. In early formalizations of sets, mathematicians and logicians have avoided that contradiction by replacing the axiom schema of comprehension with the much weaker axiom schema of separation. However, this step alone takes one to theories of sets which are considered too weak. So some of the power of comprehension was added back via the other existence axioms of ZF set theory (pairing, union, powerset, replacement, and infinity) which may be regarded as special cases of comprehension. So far, these axioms do not seem to lead to any contradiction. Subsequently, the axiom of choice and the axiom of regularity were added to exclude models with some undesirable properties. These two axioms are known to be relatively consistent.
In the presence of the axiom schema of separation, Russell's paradox becomes a proof that there is no set of all sets. The axiom of regularity together with the axiom of pairing also prohibit such a universal set. However, Russell's paradox yields a proof that there is no "set of all sets" using the axiom schema of separation alone, without any additional axioms. In particular, ZF without the axiom of regularity already prohibits such a universal set.
If a theory is extended by adding an axiom or axioms, then any (possibly undesirable) consequences of the original theory remain consequences of the extended theory. In particular, if ZF without regularity is extended by adding regularity to get ZF, then any contradiction (such as Russell's paradox) which followed from the original theory would still follow in the extended theory.
The existence of Quine atoms (sets that satisfy the equation "x" = {"x"}, i.e. have themselves as their only elements) is consistent with the theory obtained by removing the axiom of regularity from ZFC. Various non-wellfounded set theories allow "safe" circular sets, such as Quine atoms, without becoming inconsistent by means of Russell's paradox.
Regularity, the cumulative hierarchy, and types.
In ZF it can be proven that the class formula_9, called the von Neumann universe, is equal to the class of all sets. This statement is even equivalent to the axiom of regularity (if we work in ZF with this axiom omitted). From any model which does not satisfy axiom of regularity, a model which satisfies it can be constructed by taking only sets in formula_9.
It has been written that "The idea of rank is a descendant of Russell's concept of "type"". Comparing ZF with type theory, Alasdair Urquhart wrote that "Zermelo's system has the notational advantage of not containing any explicitly typed variables, although in fact it can be seen as having an implicit type structure built into it, at least if the axiom of regularity is included. The details of this implicit typing are spelled out in [Zermelo 1930], and again in a well-known article of George Boolos [Boolos 1971]."
Dana Scott went further, showing in the same paper that an axiomatic system based on the inherent properties of the cumulative hierarchy turns out to be equivalent to ZF, including regularity.
History.
The concepts of well-foundedness and the rank of a set were both introduced by Dmitry Mirimanoff (1917). Mirimanoff called a set "x" "regular" (French: "ordinaire") if every descending chain "x" ∋ "x"1 ∋ "x"2 ∋ ... is finite. Mirimanoff, however, did not consider his notion of regularity (and well-foundedness) as an axiom to be observed by all sets; in later papers Mirimanoff also explored what are now called non-well-founded sets ("extraordinaire" in Mirimanoff's terminology).
Skolem (1923) and von Neumann (1925) pointed out that non-well-founded sets are superfluous (on p. 404 in van Heijenoort's translation), and in the same publication von Neumann gives an axiom (p. 412 in translation) which excludes some, but not all, non-well-founded sets. In a subsequent publication, von Neumann (1928) gave his axiom of foundation (rendered in modern notation by A. Rieger).
Regularity in the presence of urelements.
Urelements are objects that are not sets, but which can be elements of sets. In ZF set theory, there are no urelements, but in some other set theories such as ZFA, there are. In these theories, the axiom of regularity must be modified. The statement "formula_12" needs to be replaced with a statement that formula_13 is not empty and is not an urelement. One suitable replacement is formula_14, which states that "x" is inhabited.
IBM AIX.
AIX (Advanced Interactive eXecutive, pronounced "ay-eye-ex") is a series of proprietary Unix operating systems developed and sold by IBM for several of its computer platforms.
Background.
Originally released for the IBM RT PC RISC workstation in 1986, AIX has supported a wide variety of hardware platforms, including the IBM RS/6000 series and later Power and PowerPC-based systems, IBM System i, System/370 mainframes, PS/2 personal computers, and the Apple Network Server. It is currently supported on IBM Power Systems alongside IBM i and Linux.
AIX is based on UNIX System V with 4.3BSD-compatible extensions. It is certified to the UNIX 03 and UNIX V7 marks of the Single UNIX Specification, beginning with AIX versions 5.3 and 7.2 TL5 respectively. Older versions were previously certified to the UNIX 95 and UNIX 98 marks.
AIX was the first operating system to have a journaling file system, and IBM has continuously enhanced the software with features such as processor, disk and network virtualization, dynamic hardware resource allocation (including fractional processor units), and reliability engineering ported from its mainframe designs.
History.
Unix started life at AT&T's Bell Labs research center in the early 1970s, running on DEC minicomputers. By 1976, the operating system was in use at various academic institutions, including Princeton, where Tom Lyon and others ported it to the S/370, to run as a guest OS under VM/370. This port would later grow out to become UTS, a mainframe Unix offering by IBM's competitor Amdahl Corporation.
IBM's own involvement in Unix can be dated to 1979, when it assisted Bell Labs in doing its own Unix port to the 370 (to be used as a build host for the 5ESS switch's software). In the process, IBM made modifications to the TSS/370 hypervisor to better support Unix.
It took until 1985 for IBM to offer its own Unix on the S/370 platform, IX/370, which was developed by Interactive Systems Corporation and intended by IBM to compete with Amdahl UTS. The operating system offered special facilities for interoperating with PC/IX, Interactive/IBM's version of Unix for IBM PC compatible hardware, and was licensed at $10,000 per sixteen concurrent users.
AIX Version 1, introduced in 1986 for the IBM RT PC workstation, was based on UNIX System V Releases 1 and 2. In developing AIX, IBM and Interactive Systems Corporation (whom IBM contracted) also incorporated source code from 4.2 and 4.3 BSD UNIX.
Among other variants, IBM later produced AIX Version 3 (also known as AIX/6000), based on System V Release 3, for their POWER-based RS/6000 platform. Since 1990, AIX has served as the primary operating system for the RS/6000 series (later renamed "IBM eServer pSeries", then "IBM System p", and now "IBM Power Systems"). AIX Version 4, introduced in 1994, added symmetric multiprocessing with the introduction of the first RS/6000 SMP servers and continued to evolve through the 1990s, culminating with AIX 4.3.3 in 1999. Version 4.1, in a slightly modified form, was also the standard operating system for the Apple Network Server systems sold by Apple Computer to complement the Macintosh line.
In the late 1990s, under Project Monterey, IBM and the Santa Cruz Operation planned to integrate AIX and UnixWare into a single 32-bit/64-bit multiplatform UNIX with particular emphasis on running on Intel IA-64 (Itanium) architecture CPUs. A beta test version of AIX 5L for IA-64 systems was released, but according to documents released in the "SCO v. IBM" lawsuit, less than forty licenses for the finished Monterey Unix were ever sold before the project was terminated in 2002. In 2003, the SCO Group alleged that (among other infractions) IBM had misappropriated licensed source code from UNIX System V Release 4 for incorporation into AIX; SCO subsequently withdrew IBM's license to develop and distribute AIX. IBM maintains that their license was irrevocable, and continued to sell and support the product until the litigation was adjudicated.
AIX was a component of the 2003 "SCO v. IBM" lawsuit, in which the SCO Group filed a lawsuit against IBM, alleging IBM contributed SCO's intellectual property to the Linux codebase. The SCO Group, who argued they were the rightful owners of the copyrights covering the Unix operating system, attempted to revoke IBM's license to sell or distribute the AIX operating system. In March 2010, a jury returned a verdict finding that Novell, not the SCO Group, owns the rights to Unix.
AIX 6 was announced in May 2007, and it ran as an open beta from June 2007 until the general availability (GA) of AIX 6.1 on November 9, 2007. Major new features in AIX 6.1 included full role-based access control, workload partitions (which enable application mobility), enhanced security (Addition of AES encryption type for NFS v3 and v4), and Live Partition Mobility on the POWER6 hardware.
AIX 7.1 was announced in April 2010, and an open beta ran until general availability of AIX 7.1 in September 2010. Several new features, including better scalability, enhanced clustering and management capabilities were added. AIX 7.1 includes a new built-in clustering capability called Cluster Aware AIX. AIX is able to organize multiple LPARs through the multipath communications channel to neighboring CPUs, enabling very high-speed communication between processors. This enables multi-terabyte memory address range and page table access to support global petabyte shared memory space for AIX POWER7 clusters so that software developers can program a cluster as if it were a single system, without using message passing (i.e. semaphore-controlled Inter-process Communication). AIX administrators can use this new capability to cluster a pool of AIX nodes. By default, AIX V7.1 pins kernel memory and includes support to allow applications to pin their kernel stack. Pinning kernel memory and the kernel stack for applications with real-time requirements can provide performance improvements by ensuring that the kernel memory and kernel stack for an application is not paged out.
AIX 7.2 was announced in October 2015, and released in December 2015. The principal feature of AIX 7.2 is the Live Kernel Update capability, which allows OS fixes to replace the entire AIX kernel with no impact to applications, by live migrating workloads to a temporary surrogate AIX OS partition while the original OS partition is patched. AIX 7.2 was also restructured to remove obsolete components. The networking component, bos.net.tcp.client was repackaged to allow additional installation flexibility. Unlike AIX 7.1, AIX 7.2 is only supported on systems based on POWER7 or later processors.
In January 2023, IBM moved development of AIX to its Indian subsidiary.
Supported hardware platforms.
IBM RT PC.
The original AIX (sometimes called AIX/RT) was developed for the IBM RT PC workstation by IBM in conjunction with Interactive Systems Corporation, who had previously ported UNIX System III to the IBM PC for IBM as PC/IX. According to its developers, the AIX source (for this initial version) consisted of one million lines of code. Installation media consisted of eight 1.2M floppy disks. The RT was based on the IBM ROMP microprocessor, the first commercial RISC chip. This was based on a design pioneered at IBM Research (the IBM 801).
One of the novel aspects of the RT design was the use of a microkernel, called Virtual Resource Manager (VRM). The keyboard, mouse, display, disk drives and network were all controlled by a microkernel. One could "hotkey" from one operating system to the next using the Alt-Tab key combination. Each OS in turn would get possession of the keyboard, mouse and display. Besides AIX v2, the PICK OS also included this microkernel.
Much of the AIX v2 kernel was written in the PL/8 programming language, which proved troublesome during the migration to AIX v3. AIX v2 included full TCP/IP networking, as well as SNA and two networking file systems: NFS, licensed from Sun Microsystems, and Distributed Services (DS). DS had the distinction of being built on top of SNA, and thereby being fully compatible with DS on midrange systems running OS/400 through IBM i. For the graphical user interfaces, AIX v2 came with the X10R3 and later the X10R4 and X11 versions of the X Window System from MIT, together with the Athena widget set. Compilers for Fortran and C were available.
IBM PS/2 series.
AIX PS/2 (also known as AIX/386) was developed by Locus Computing Corporation under contract to IBM. AIX PS/2, first released in October 1988, ran on IBM PS/2 personal computers with Intel 386 and compatible processors.
The product was announced in September 1988 with a baseline tag price of $595, although some utilities like uucp were included in a separate Extension package priced at $250. nroff and troff for AIX were also sold separately in a Text Formatting System package priced at $200. The TCP/IP stack for AIX PS/2 retailed for another $300. The X Window System package was priced at $195, and featured a graphical environment called the AIXwindows Desktop, based on IXI's X.desktop. The C and FORTRAN compilers each had a price tag of $275. Locus also made available their DOS Merge virtual machine environment for AIX, which could run MS DOS 3.3 applications inside AIX; DOS Merge was sold separately for another $250. IBM also offered a $150 AIX PS/2 DOS Server Program, which provided file server and print server services for client computers running PC DOS 3.3.
The last version of PS/2 AIX is 1.3, released in 1992; it was announced as adding support for non-IBM (non-Micro Channel) computers as well. Support for PS/2 AIX ended in March 1995.
IBM mainframes.
In 1988, IBM announced AIX/370, also developed by Locus Computing. AIX/370 was IBM's fourth attempt to offer Unix-like functionality for their mainframe line, specifically the System/370 (the prior versions were a TSS/370-based Unix system developed jointly with AT&T c.1980, a VM/370-based system named VM/IX developed jointly with Interactive Systems Corporation c.1984, and a VM/370-based version of TSS/370 named IX/370 which was upgraded to be compatible with UNIX System V). AIX/370 was released in 1990 with functional equivalence to System V Release 2 and 4.3BSD as well as IBM enhancements. With the introduction of the ESA/390 architecture, AIX/370 was replaced by AIX/ESA in 1991, which was based on OSF/1, and also ran on the System/390 platform. This development effort was made partly to allow IBM to compete with Amdahl UTS. Unlike AIX/370, AIX/ESA ran both natively as the host operating system, and as a guest under VM. AIX/ESA, while technically advanced, had little commercial success, partially because UNIX functionality was added as an option to the existing mainframe operating system, MVS, as MVS/ESA SP Version 4 Release 3 OpenEdition in 1994, and continued as an integral part of MVS/ESA SP Version 5, OS/390 and z/OS, with the name eventually changing from "OpenEdition" to "Unix System Services". IBM also provided OpenEdition in VM/ESA Version 2 through z/VM.
IA-64 systems.
As part of Project Monterey, IBM released a beta test version of AIX 5L for the IA-64 (Itanium) architecture in 2001, but this never became an official product due to lack of interest.
Apple Network Servers.
The Apple Network Server (ANS) systems were PowerPC-based systems designed by Apple Computer to have numerous high-end features that standard Apple hardware did not have, including swappable hard drives, redundant power supplies, and external monitoring capability. These systems were more or less based on the Power Macintosh hardware available at the time but were designed to use AIX (versions 4.1.4 or 4.1.5) as their native operating system in a specialized version specific to the ANS called AIX for Apple Network Servers.
AIX was only compatible with the Network Servers and was not ported to standard Power Macintosh hardware. It should not be confused with A/UX, Apple's earlier version of Unix for 68k-based Macintoshes.
POWER ISA/PowerPC/Power ISA-based systems.
The release of AIX version 3 (sometimes called AIX/6000) coincided with the announcement of the first POWER1-based IBM RS/6000 models in 1990.
AIX v3 innovated in several ways on the software side. It was the first operating system to introduce the idea of a journaling file system, JFS, which allowed for fast boot times by avoiding the need to ensure the consistency of the file systems on disks (see fsck) on every reboot. Another innovation was shared libraries which avoid the need for static linking from an application to the libraries it used. The resulting smaller binaries used less of the hardware RAM to run, and used less disk space to install. Besides improving performance, it was a boon to developers: executable binaries could be in the tens of kilobytes instead of a megabyte for an executable statically linked to the C library. AIX v3 also scrapped the microkernel of AIX v2, a contentious move that resulted in v3 containing no PL/8 code and being somewhat more "pure" than v2.
A number of other notable subsystems were also included.
In addition, AIX applications can run in the PASE subsystem under IBM i.
Source code.
IBM formerly made the AIX for RS/6000 source code available to customers for an additional fee; in 1991, IBM customers could order the AIX 3.0 source code for a one-time charge of US$60,000; subsequently, IBM released the AIX 3.1 source code in 1992, and AIX 3.2 in 1993. These source code distributions excluded certain files (authored by third-parties) which IBM did not have rights to redistribute, and also excluded layered products such as the MS-DOS emulator and the C compiler. Furthermore, in order to be able to license the AIX source code, the customer first had to procure source code license agreements with AT&T and the University of California, Berkeley.
User interfaces.
The default shell was Bourne shell up to AIX version 3, but was changed to KornShell (ksh88) in version 4 for XPG4 and POSIX compliance.
Graphical.
The Common Desktop Environment (CDE) is AIX's default graphical user interface. As part of Linux Affinity and the free AIX Toolbox for Linux Applications (ATLA), open-source KDE Plasma Workspaces and GNOME desktop are also available.
System Management Interface Tool.
SMIT is the System Management Interface Tool for AIX. It allows a user to navigate a menu hierarchy of commands, rather than using the command line. Invocation is typically achieved with the command "smit". Experienced system administrators make use of the F6 function key, which generates the command line that SMIT will invoke to complete the task.
SMIT also generates a log of the commands that are performed in the "smit.script" file. The "smit.script" file automatically records the commands with the command flags and parameters used, and can be used as an executable shell script to rerun system configuration tasks. SMIT also creates the "smit.log" file, which contains additional detailed information that can be used by programmers in extending the SMIT system.
"smit" and "smitty" refer to the same program, though "smitty" invokes the text-based version, while "smit" will invoke an X Window System based interface if possible; however, if "smit" determines that X Window System capabilities are not present, it will present the text-based version instead of failing. Determination of X Window System capabilities is typically performed by checking for the existence of the "DISPLAY" environment variable.
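This fallback logic can be sketched in a few lines (an illustrative sketch, not IBM's actual implementation; the function name is invented here, and checking the DISPLAY environment variable is the usual heuristic for X11 availability):

```python
import os

def choose_interface(environ=os.environ):
    """Pick a graphical or text interface the way smit is described to:
    use X11 only when a DISPLAY variable indicates an X server."""
    return "x11" if environ.get("DISPLAY") else "text"

# With DISPLAY set, the X interface is chosen; otherwise fall back to text.
print(choose_interface({"DISPLAY": ":0"}))  # x11
print(choose_interface({}))                 # text
```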
Database.
Object Data Manager (ODM) is a database of system information integrated into AIX, analogous to the registry in Microsoft Windows. A good understanding of the ODM is essential for managing AIX systems.
Data managed in ODM is stored and maintained as objects with associated attributes. Interaction with ODM is possible via application programming interface (API) library for programs, and command-line utilities such as "odmshow", "odmget", "odmadd", "odmchange" and "odmdelete" for shell scripts and users. SMIT and its associated AIX commands can also be used to query and modify information in the ODM. ODM is stored on disk using Berkeley DB files.
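The object-with-attributes model can be illustrated with a toy in-memory sketch (this is not the real ODM API; the class name "CuDv" is modeled on AIX's customized-devices class, but the attribute values here are invented):

```python
# Toy model of ODM object classes: each class holds objects, and each
# object is a set of attribute/value pairs.
odm = {
    "CuDv": [  # customized-devices style class (contents illustrative)
        {"name": "hdisk0", "status": "available", "location": "00-08-00"},
        {"name": "hdisk1", "status": "defined",   "location": "00-08-01"},
    ],
}

def odmget(object_class, **criteria):
    """Return objects in a class matching all given attribute values,
    loosely mimicking what the "odmget" utility does for shell users."""
    return [obj for obj in odm.get(object_class, [])
            if all(obj.get(k) == v for k, v in criteria.items())]

print(odmget("CuDv", status="available"))  # only hdisk0 matches
```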
Examples of information stored in the ODM database include device configuration and SMIT menu definitions.
AppleTalk is a discontinued proprietary suite of networking protocols developed by Apple Computer for their Macintosh computers. AppleTalk includes a number of features that allow local area networks to be connected with no prior setup or the need for a centralized router or server of any sort. Connected AppleTalk-equipped systems automatically assign addresses, update the distributed namespace, and configure any required inter-networking routing.
AppleTalk was released in 1985 and was the primary protocol used by Apple devices through the 1980s and 1990s. Versions were also released for the IBM PC and compatibles and the Apple IIGS. AppleTalk support was also available in most networked printers (especially laser printers), some file servers, and a number of routers.
The rise of TCP/IP during the 1990s led to a reimplementation of most of these types of support on that protocol, and AppleTalk became unsupported as of the release of Mac OS X v10.6 in 2009. Many of AppleTalk's more advanced autoconfiguration features have since been introduced in Bonjour, while Universal Plug and Play serves similar needs.
History.
AppleNet.
After the release of the Apple Lisa computer in January 1983, Apple invested considerable effort in the development of a local area networking (LAN) system for the machines. Known as AppleNet, it was based on the seminal Xerox XNS protocol stack but running on a custom 1 Mbit/s coaxial cable system rather than Xerox's 2.94 Mbit/s Ethernet. AppleNet was announced early in 1983 with a full introduction at the target price of $500 for plug-in AppleNet cards for the Lisa and the Apple II.
At that time, early LAN systems were just coming to market, including Ethernet, Token Ring, Econet, and ARCNET. This was a topic of major commercial effort at the time, dominating shows like the National Computer Conference (NCC) in Anaheim in May 1983. All of the systems were jockeying for position in the market, but even at this time, Ethernet's widespread acceptance suggested it was to become a "de facto" standard. It was at this show that Steve Jobs asked Gursharan Sidhu a seemingly innocuous question: "Why has networking not caught on?"
Four months later, in October, AppleNet was cancelled. At the time, they announced that "Apple realized that it's not in the business to create a networking system. We built and used AppleNet in-house, but we realized that if we had shipped it, we would have seen new standards coming up." In January, Jobs announced that they would instead be supporting IBM's Token Ring, which he expected to come out in a "few months".
AppleBus.
Through this period, Apple was deep in development of the Macintosh computer. During development, engineers had made the decision to use the Zilog 8530 serial controller chip (SCC) instead of the lower-cost and more common UART to provide serial port connections. The SCC cost about $5 more than a UART, but offered much higher speeds of up to 250 kilobits per second (or higher with additional hardware) and internally supported a number of basic networking-like protocols like IBM's Bisync.
The SCC was chosen because it would allow multiple devices to be attached to the port. Peripherals equipped with similar SCCs could communicate using the built-in protocols, interleaving their data with other peripherals on the same bus. This would eliminate the need for more ports on the back of the machine, and allowed for the elimination of expansion slots for supporting more complex devices. The initial concept was known as AppleBus, envisioning a system controlled by the host Macintosh polling "dumb" devices in a fashion similar to the modern Universal Serial Bus.
AppleBus networking.
The Macintosh team had already begun work on what would become the LaserWriter and had considered a number of other options to answer the question of how to share these expensive machines and other resources. A series of memos from Bob Belleville clarified these concepts, outlining the Mac, LaserWriter, and a file server system which would become the Macintosh Office. By late 1983 it was clear that IBM's Token Ring would not be ready in time for the launch of the Mac, and might miss the launch of these other products as well. In the end, Token Ring would not ship until October 1985.
Jobs' earlier question to Sidhu had already sparked a number of ideas. When AppleNet was cancelled in October, Sidhu led an effort to develop a new networking system based on the AppleBus hardware. This new system would not have to conform to any existing preconceptions, and was designed to be worthy of the Mac – a system that was user-installable, had zero configuration, and no fixed network addresses – in short, a true plug-and-play network. Considerable effort was needed, but by the time the Mac was released, the basic concepts had been outlined, and some of the low-level protocols were on their way to completion. Sidhu mentioned the work to Belleville only two hours after the Mac was announced.
The "new" AppleBus was announced in early 1984, allowing direct connection from the Mac or Lisa through a small box that plugged into the serial port and connected via cables to the next computer upstream and downstream. Adaptors for the Apple II and Apple III were also announced. Apple also announced that AppleBus networks could be attached to, and would appear to be a single node within, a Token Ring system. Details of how this would work were sketchy.
AppleTalk Personal Network.
Just prior to its release in early 1985, AppleBus was renamed AppleTalk. Initially marketed as AppleTalk Personal Network, it comprised a family of network protocols and a physical layer.
The physical layer had a number of limitations, including a speed of only 230.4 kbit/s, a maximum distance of about 1,000 feet (300 m) from end to end, and only 32 nodes per LAN. But as the basic hardware was built into the Mac, adding nodes only cost about $50 for the adaptor box. In comparison, Ethernet or Token Ring cards cost hundreds or thousands of dollars. Additionally, the entire networking stack required only about 6 kB of RAM, allowing it to run on any Mac.
The relatively slow speed of AppleTalk allowed further reductions in cost. Instead of using RS-422's balanced transmit and receive circuits, the AppleTalk cabling used a single common electrical ground, which limited speeds to about 500 kbit/s, but allowed one conductor to be removed. This meant that common three-conductor cables could be used for wiring. Additionally, the adaptors were designed to be "self-terminating", meaning that nodes at the end of the network could simply leave their last connector unconnected. There was no need for the wires to be connected back together into a loop, nor the need for hubs or other devices.
The system was designed for future expansion; the addressing system allowed for expansion to 255 nodes in a LAN (although only 32 could be used at that time), and by using "bridges" (which came to be known as "routers", although technically not the same) one could interconnect LANs into larger collections. "Zones" allowed devices to be addressed within a bridge-connected internet. Additionally, AppleTalk was designed from the start to allow use with any potential underlying physical link, and within a few years, the physical layer would be renamed LocalTalk, so as to differentiate it from the AppleTalk protocols.
The main advantage of AppleTalk was that it was completely maintenance-free. To join a device to a network, a user simply plugged the adaptor into the machine, then connected a cable from it to any free port on any other adaptor. The AppleTalk network stack negotiated a network address, assigned the computer a human-readable name, and compiled a list of the names and types of other machines on the network so the user could browse the devices through the Chooser. AppleTalk was so easy to use that ad hoc networks tended to appear whenever multiple Macs were in the same room. Apple would later use this in an advertisement showing a network being created between two seats in an airplane.
PhoneNet and other adaptors.
A thriving 3rd party market for AppleTalk devices developed over the next few years. One particularly notable example was an alternate adaptor designed by BMUG and commercialized by Farallon as PhoneNet in 1987. This was essentially a replacement for Apple's connector that had conventional phone jacks instead of Apple's round connectors. PhoneNet allowed AppleTalk networks to be connected together using normal telephone wires, and with very little extra work, could run analog phones and AppleTalk on a single four-conductor phone cable.
Other companies took advantage of the SCC's ability to read external clocks in order to support higher transmission speeds, up to 1 Mbit/s. In these systems, the external adaptor also included its own clock, and used that to signal the SCC's clock input pins. The best-known such system was Centram's FlashTalk, which ran at 768 kbit/s, and was intended to be used with their TOPS networking system. A similar solution was the 850 kbit/s DaynaTalk, which used a separate box that plugged in between the computer and a normal LocalTalk/PhoneNet box. Dayna also offered a PC expansion card that ran up to 1.7 Mbit/s when talking to other Dayna PC cards. Several other systems also existed with even higher performance, but these often required special cabling that was incompatible with LocalTalk/PhoneNet, and also required patches to the networking stack that often caused problems.
AppleTalk over Ethernet.
As Apple expanded into more commercial and education markets, they needed to integrate AppleTalk into existing network installations. Many of these organizations had already invested in a very expensive Ethernet infrastructure and there was no direct way to connect a Macintosh to Ethernet. AppleTalk included a protocol structure for interconnecting AppleTalk subnets and so as a solution, EtherTalk was initially created to use the Ethernet as a backbone between LocalTalk subnets. To accomplish this, organizations would need to purchase a LocalTalk-to-Ethernet bridge and Apple left it to third parties to produce these products. A number of companies responded, including Hayes and a few newly formed companies like Kinetics.
LocalTalk, EtherTalk, TokenTalk, and AppleShare.
By 1987, Ethernet was clearly winning the standards battle over Token Ring, and in the middle of that year, Apple introduced EtherTalk 1.0, an implementation of the AppleTalk protocol over the Ethernet physical layer. Introduced for the newly released Macintosh II computer, Apple's first Macintosh with expansion slots, the operating system included a new Network control panel that allowed the user to select which physical connection to use for networking (from "Built-in" or "EtherTalk"). At introduction, Ethernet interface cards were available from 3Com and Kinetics that plugged into a Nubus slot in the machine. The new networking stack also expanded the system to allow a full 255 nodes per LAN. With EtherTalk's release, AppleTalk Personal Network was renamed LocalTalk, the name it would be known under for the bulk of its life. Token Ring would later be supported with a similar TokenTalk product, which used the same Network control panel and underlying software. Over time, many third-party companies would introduce compatible Ethernet and Token Ring cards that used these same drivers.
The appearance of a Macintosh with a direct Ethernet connection also magnified the Ethernet and LocalTalk compatibility problem: networks with new and old Macs needed some way to communicate with each other. This could be as simple as a network of Ethernet-connected Mac IIs trying to talk to a LaserWriter that only connected to LocalTalk. Apple initially relied on the aforementioned LocalTalk-to-Ethernet bridge products, but contrary to Apple's belief that these would be low-volume products, by the end of 1987, 130,000 such networks were in use. AppleTalk was at that time the most used networking system in the world, with over three times the installations of any other vendor.
1987 also marked the introduction of the AppleShare product, a dedicated file server that ran on any Mac with 512 kB of RAM or more. A common AppleShare machine was the Mac Plus with an external SCSI hard drive. AppleShare was the #3 network operating system in the late 1980s, behind Novell NetWare and Microsoft's MS-Net. AppleShare was effectively the replacement for the failed Macintosh Office efforts, which had been based on a dedicated file server device.
AppleTalk Phase II and other developments.
A significant re-design was released in 1989 as AppleTalk Phase II. In many ways, Phase II can be considered an effort to make the earlier version (never called Phase I) more generic. LANs could now support more than 255 nodes, and zones were no longer associated with physical networks but were entirely virtual constructs used simply to organize nodes. For instance, one could now make a "Printers" zone that would list all the printers in an organization, or one might want to place that same device in the "2nd Floor" zone to indicate its physical location. Phase II also included changes to the underlying inter-networking protocols to make them less "chatty", which had previously been a serious problem on networks that bridged over wide-area networks.
By this point, Apple had a wide variety of communications products under development, and many of these were announced along with AppleTalk Phase II. These included updates to EtherTalk and TokenTalk, AppleTalk software and LocalTalk hardware for the IBM PC, EtherTalk for Apple's A/UX operating system allowing it to use LaserPrinters and other network resources, and the Mac X.25 and MacX products.
Ethernet had become almost universal by 1990, and it was time to build Ethernet into Macs direct from the factory. However, the physical wiring used by these networks was not yet completely standardized. Apple solved this problem using a single port on the back of the computer into which the user could plug an adaptor for any given cabling system. This FriendlyNet system was based on the industry-standard Attachment Unit Interface or AUI, but deliberately chose a non-standard connector that was smaller and easier to use, which they called "Apple AUI", or AAUI. FriendlyNet was first introduced on the Quadra 700 and Quadra 900 computers, and used across much of the Mac line for some time. As with LocalTalk, a number of 3rd party FriendlyNet adaptors quickly appeared.
As 10BASE-T became the de facto cabling system for Ethernet, second-generation Power Macintosh machines added a 10BASE-T port in addition to AAUI. The PowerBook 3400c and lower-end Power Macs also added 10BASE-T. The Power Macintosh 7300/8600/9600 were the final Macs to include AAUI, and 10BASE-T became universal starting with the Power Macintosh G3 and PowerBook G3.
The capital-I Internet.
From the beginning of AppleTalk, users wanted to connect the Macintosh to TCP/IP network environments. In 1984, Bill Croft at Stanford University pioneered the development of IP packets encapsulated in DDP as part of the SEAGATE (Stanford Ethernet–AppleTalk Gateway) project. SEAGATE was commercialized by Kinetics in their LocalTalk-to-Ethernet bridge as an additional routing option. A few years later, MacIP was separated from the SEAGATE code and became the de facto method for IP packets to be routed over LocalTalk networks. By 1986, Columbia University released the first version of the Columbia AppleTalk Package (CAP), which allowed higher integration of Unix, TCP/IP, and AppleTalk environments. In 1988, Apple released MacTCP, a system that allowed the Mac to support TCP/IP on machines with suitable Ethernet hardware. However, this left many universities with the problem of supporting IP on their many LocalTalk-equipped Macs. It was soon common to include MacIP support in LocalTalk-to-Ethernet bridges. MacTCP would not become a standard part of the Classic Mac OS until 1994, by which time it also supported SNMP and PPP.
For some time in the early 1990s, the Mac was a primary client on the rapidly expanding Internet. Among the better-known programs in wide use were Fetch, Eudora, eXodus, NewsWatcher, and the NCSA packages, especially NCSA Mosaic and its offspring, Netscape Navigator. Additionally, a number of server products appeared that allowed the Mac to host Internet content. Through this period, Macs had about 2 to 3 times as many clients connected to the Internet as any other platform, despite the relatively small overall microcomputer market share.
As the world quickly moved to IP for both LAN and WAN uses, Apple was faced with maintaining two increasingly outdated code bases on an ever-wider group of machines as well as the introduction of the PowerPC based machines. This led to the Open Transport efforts, which re-implemented both MacTCP and AppleTalk on an entirely new code base adapted from the Unix standard STREAMS. Early versions had problems and did not become stable for some time. By that point, Apple was deep in their ultimately doomed Copland efforts.
Legacy and abandonment.
With the purchase of NeXT and subsequent development of Mac OS X, AppleTalk was strictly a legacy system. Support was added to OS X in order to accommodate the large number of existing AppleTalk devices, notably laser printers and file shares, but alternate connection solutions common in this era, notably USB for printers, limited demand for it. As Apple abandoned many of these product categories, and all new systems were based on IP, AppleTalk became less and less common. AppleTalk support was finally removed from the Mac operating system in Mac OS X v10.6 in 2009.
However, the loss of AppleTalk did not reduce the desire for networking solutions that combined its ease of use with IP routing. Apple has led the development of many such efforts, from the introduction of the AirPort router to the development of the Zero-configuration networking system and their implementation of it, Bonjour.
With macOS 11 Big Sur in 2020, the last vestiges of AppleTalk support were removed from the operating system.
Design.
The AppleTalk design rigorously followed the OSI model of protocol layering. Unlike most of the early LAN systems, AppleTalk was not built using the archetypal Xerox XNS system. The intended target was not Ethernet, and it did not have 48-bit addresses to route. Nevertheless, many portions of the AppleTalk system have direct analogs in XNS.
One key differentiation for AppleTalk was that it contained two protocols aimed at making the system completely self-configuring. The "AppleTalk Address Resolution Protocol" ("AARP") allowed AppleTalk hosts to automatically generate their own network addresses, and the "Name Binding Protocol" ("NBP") was a dynamic system for mapping network addresses to user-readable names. Systems similar to AARP existed elsewhere, in Banyan VINES for instance, and beginning about 2002, Multicast DNS provided capabilities similar to NBP.
Both AARP and NBP had defined ways to allow "controller" devices to override the default mechanisms. The concept was to allow routers to provide the information or "hardwire" the system to known addresses and names. On larger networks where AARP could cause problems as new nodes searched for free addresses, the addition of a router could reduce "chattiness." Together AARP and NBP made AppleTalk an easy-to-use networking system. New machines were added to the network by plugging them in and optionally giving them a name. The NBP lists were examined and displayed by a program known as the "Chooser" which would display a list of machines on the local network, divided into classes such as file-servers and printers.
Addressing.
An AppleTalk address was a four-byte quantity. This consisted of a two-byte network number, a one-byte node number, and a one-byte socket number. Of these, only the network number required any configuration, being obtained from a router. Each node dynamically chose its own node number, according to a protocol (originally the LocalTalk Link Access Protocol LLAP and later, for Ethernet/EtherTalk, the AppleTalk Address Resolution Protocol, AARP) which handled contention between different nodes accidentally choosing the same number. For socket numbers, a few well-known numbers were reserved for special purposes specific to the AppleTalk protocol itself. Apart from these, all application-level protocols were expected to use dynamically-assigned socket numbers at both the client and server end.
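The four-byte layout described above can be sketched with a simple pack/unpack pair (an illustrative sketch; the function names are invented, and big-endian byte order is assumed here for the on-the-wire form):

```python
import struct

def pack_address(network, node, socket):
    """Pack the 4-byte AppleTalk address described above:
    2-byte network number, 1-byte node number, 1-byte socket number."""
    return struct.pack(">HBB", network, node, socket)

def unpack_address(data):
    """Split a 4-byte address back into (network, node, socket)."""
    return struct.unpack(">HBB", data)

addr = pack_address(4660, 17, 253)
print(len(addr))               # 4 bytes total
print(unpack_address(addr))    # (4660, 17, 253)
```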
Because of this dynamism, users could not be expected to access services by specifying their address. Instead, all services had "names" which, being chosen by humans, could be expected to be meaningful to users, and also could be sufficiently long to minimize the chance of conflicts.
As NBP names translated to an address, which included a socket number as well as a node number, a name in AppleTalk mapped directly to a "service" being provided by a machine, which was entirely separate from the name of the machine itself. Thus, services could be moved to a different machine and, so long as they kept the same service name, there was no need for users to do anything different in order to continue accessing the service. And the same machine could host any number of instances of services of the same type, without any network connection conflicts.
Contrast this with "A records" in the DNS, in which a name translates to a machine's address, not including the port number that might be providing a service. Thus, if people are accustomed to using a particular machine name to access a particular service, their access will break when the service is moved to a different machine. This can be mitigated somewhat by insistence on using "CNAME records" indicating service rather than actual machine names to refer to the service, but there is no way of guaranteeing that users will follow such a convention. Some newer protocols, such as Kerberos and Active Directory use DNS SRV records to identify services by name, which is much closer to the AppleTalk model.
Protocols.
AppleTalk Address Resolution Protocol.
AARP resolves AppleTalk addresses to link layer addresses. It is functionally equivalent to ARP and obtains address resolution by a method very similar to ARP.
AARP is a fairly simple system. When powered on, an AppleTalk machine broadcasts an "AARP probe packet" asking for a network address, intending to hear back from controllers such as routers. If no address is provided, one is picked at random from the "base subnet", 0. It then broadcasts another packet saying "I am selecting this address", and then waits to see if anyone else on the network complains. If another machine has that address, it will pick another address, and keep trying until it finds a free one. On a network with many machines it may take several tries before a free address is found, so for performance purposes the successful address is "written down" in NVRAM and used as the default address in the future. This means that in most real-world setups where machines are added a few at a time, only one or two tries are needed before the address effectively become constant.
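The probe-and-retry procedure can be sketched as follows (a deliberate simplification: a set of already-claimed node numbers stands in for peers "complaining" about a probe, and the function name is invented):

```python
import random

def select_node_address(claimed, max_node=254, rng=random):
    """Keep proposing random node numbers until one goes unclaimed,
    mimicking the AARP probe/retry loop described above."""
    while True:
        candidate = rng.randint(1, max_node)
        if candidate not in claimed:   # no peer "complained" at the probe
            return candidate           # cache this in NVRAM in real AARP

# On a sparsely populated network, the first or second try usually wins.
addr = select_node_address({1, 2, 3})
print(addr)
```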
AppleTalk Data Stream Protocol.
This was a comparatively late addition to the AppleTalk protocol suite, done when it became clear that a TCP-style reliable connection-oriented transport was needed. It differed from TCP in several significant ways.
Apple Filing Protocol.
The Apple Filing Protocol (AFP), formerly AppleTalk Filing Protocol, is the protocol for communicating with AppleShare file servers. Built on top of AppleTalk Session Protocol (for legacy AFP over DDP) or the Data Stream Interface (for AFP over TCP), it provides services for authenticating users (extensible to different authentication methods including two-way random-number exchange) and for performing operations specific to the Macintosh HFS filesystem. AFP is still in use in macOS, even though most other AppleTalk protocols have been deprecated.
AppleTalk Session Protocol.
ASP was an intermediate protocol, built on top of ATP, which in turn was the foundation of AFP. It provided basic services for requesting responses to arbitrary "commands" and performing out-of-band status queries. It also allowed the server to send asynchronous "attention" messages to the client.
Datagram Delivery Protocol.
DDP was the lowest-level data-link-independent transport protocol. It provided a datagram service with no guarantees of delivery. All application-level protocols, including the infrastructure protocols NBP, RTMP and ZIP, were built on top of DDP. AppleTalk's DDP corresponds closely to the Network layer of the Open Systems Interconnection (OSI) communication model.
Name Binding Protocol.
Name Binding Protocol was a dynamic, distributed system for managing AppleTalk names. When a service started up on a machine, it registered a name for itself as chosen by a human administrator. At this point, NBP provided a system for checking that no other machine had already registered the same name. Later, when a client wanted to access that service, it used NBP to query machines to find that service. NBP provided browsability ("what are the names of all the services available?") as well as the ability to find a service with a particular name. Names were human readable, containing spaces, upper and lower case letters, and including support for searching.
AppleTalk Echo Protocol.
AEP (AppleTalk Echo Protocol) is a transport-layer protocol designed to test the reachability of network nodes. AEP generates packets, identified as AEP packets in the Type field, and sends them to the destination node. At the destination, the DDP layer examines the packet; once it is identified as an AEP packet, it is copied, a field is altered to mark it as an AEP reply, and the packet is returned to the source node.
Printer Access Protocol.
PAP was the standard way of communicating with PostScript printers. It was built on top of ATP. When a PAP connection was opened, each end sent the other an ATP request which basically meant "send me more data". The client's response to the server was to send a block of PostScript code, while the server could respond with any diagnostic messages that might be generated as a result, after which another "send-more-data" request was sent. This use of ATP provided automatic flow control; each end could only send data to the other end if there was an outstanding ATP request to respond to.
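This flow-control discipline amounts to a credit scheme: a block of data may be sent only in reply to an outstanding "send-more-data" request. A toy model (class and method names invented for illustration, not the actual PAP API):

```python
class PapSender:
    """Toy model of PAP's ATP-based flow control: one data block may be
    sent per outstanding 'send-more-data' request from the peer."""

    def __init__(self):
        self.credits = 0           # outstanding requests from the peer

    def receive_send_more(self):
        """Peer issued an ATP request asking for more data."""
        self.credits += 1

    def send_block(self, block):
        """Send a block only if a request is outstanding, else hold it."""
        if self.credits == 0:
            return None            # must wait for the peer to ask again
        self.credits -= 1
        return block

s = PapSender()
print(s.send_block(b"%!PS"))   # None: no outstanding request yet
s.receive_send_more()
print(s.send_block(b"%!PS"))   # the block goes out, consuming the credit
```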
PAP also provided for out-of-band status queries, handled by separate ATP transactions. Even while it was busy servicing a print job from one client, a PAP server could continue to respond to status requests from any number of other clients. This allowed other Macintoshes on the LAN that were waiting to print to display status messages indicating that the printer was busy, and what the job was that it was busy with.
Routing Table Maintenance Protocol.
RTMP was the protocol by which routers kept each other informed about the topology of the network. This was the only part of AppleTalk that required periodic unsolicited broadcasts: every 10 seconds, each router had to send out a list of all the network numbers it knew about and how far away it thought they were.
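The periodic broadcasts make RTMP a distance-vector protocol: a router that hears a neighbour's table keeps a route only if going through that neighbour is shorter than what it already knows. A minimal sketch of that update step (data layout is an assumption for illustration):

```python
# Sketch of the distance-vector update RTMP implies (illustrative): every 10
# seconds each router broadcasts (network number, hop count) pairs, and a
# receiving router adopts a route only if the advertised path is shorter.

def rtmp_update(routing_table, neighbor, advertised):
    """routing_table: {net: (hops, next_hop)}; advertised: {net: hops} heard from neighbor."""
    for net, hops in advertised.items():
        new_hops = hops + 1  # one extra hop to reach it through the neighbor
        if net not in routing_table or new_hops < routing_table[net][0]:
            routing_table[net] = (new_hops, neighbor)
    return routing_table

table = {1: (0, None)}  # network 1 is directly attached
table = rtmp_update(table, "router-B", {2: 0, 3: 1})
print(table)  # {1: (0, None), 2: (1, 'router-B'), 3: (2, 'router-B')}
```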
Zone Information Protocol.
ZIP was the protocol by which AppleTalk network numbers were associated with zone names. A "zone" was a subdivision of the network that made sense to humans (for example, "Accounting Department"); but while a network number had to be assigned to a topologically-contiguous section of the network, a zone could include several different discontiguous portions of the network.
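The mapping ZIP maintained was therefore not one-to-one: one zone name could cover several discontiguous network numbers, and (on Phase 2 extended networks) a single network could carry more than one zone. A sketch of that two-way mapping, with invented example data:

```python
# Illustrative sketch of ZIP's network-number/zone mapping: a zone may span
# several discontiguous networks, and an extended network may carry
# several zones. Example networks and zone names are invented.

from collections import defaultdict

zone_to_nets = defaultdict(set)
net_to_zones = defaultdict(set)

def add_mapping(net, zone):
    zone_to_nets[zone].add(net)
    net_to_zones[net].add(zone)

add_mapping(10, "Accounting Department")
add_mapping(42, "Accounting Department")  # discontiguous portion, same zone
add_mapping(42, "Printers")               # extended network, second zone

print(sorted(zone_to_nets["Accounting Department"]))  # [10, 42]
print(sorted(net_to_zones[42]))  # ['Accounting Department', 'Printers']
```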
Physical implementation.
The initial default hardware implementation for AppleTalk was a high-speed serial protocol known as "LocalTalk" that used the Macintosh's built-in RS-422 ports at 230.4 kbit/s. LocalTalk used a splitter box in the RS-422 port to provide an upstream and downstream cable from a single port. The topology was a bus: cables were daisy-chained from each connected machine to the next, up to the maximum of 32 permitted on any LocalTalk segment. The system was slow by today's standards, but at the time the additional cost and complexity of networking on PC machines was such that it was common that Macs were the only networked personal computers in an office. Other larger computers, such as UNIX or VAX workstations, would commonly be networked via Ethernet.
Other physical implementations were also available. A very popular replacement for LocalTalk was "PhoneNet", a 3rd party solution from Farallon Computing, Inc. (renamed Netopia, acquired by Motorola in 2007) that also used the RS-422 port and was indistinguishable from LocalTalk as far as Apple's LocalTalk port drivers were concerned, but ran over the two unused wires in standard four-wire phone cabling. Foreshadowing today's network hubs and switches, Farallon provided solutions for PhoneNet to be used in "star" as well as bus configurations, with both "passive" star connections (with the phone wires simply bridged to each other at a central point), and "active" star with "PhoneNet Star Controller" hub hardware. Apple's LocalTalk connectors didn't have a locking feature, so connectors could easily come loose, and the bus configuration resulted in any loose connector bringing down the whole network, and being hard to track down. PhoneNet RJ-11 connectors, on the other hand, snapped into place, and in a star configuration any wiring issue only affected one device, and problems were easy to pinpoint. PhoneNet's low cost, flexibility, and easy troubleshooting resulted in it being the dominant choice for Mac networks into the early 1990s.
AppleTalk protocols also came to run over Ethernet (first coaxial and then twisted pair) and Token Ring physical layers, labeled by Apple as "EtherTalk" and "TokenTalk", respectively. EtherTalk gradually became the dominant implementation method for AppleTalk as Ethernet became generally popular in the PC industry throughout the 1990s. Besides AppleTalk and TCP/IP, any Ethernet network could also simultaneously carry other protocols such as DECnet and IPX.
Cross-platform solutions.
When AppleTalk was first introduced, the dominant office computing platform was the PC compatible running MS-DOS. Apple introduced the AppleTalk PC Card in early 1987, allowing PCs to join AppleTalk networks and print to LaserWriter printers. A year later AppleShare PC was released, allowing PCs to access AppleShare file servers.
The "TOPS Teleconnector" MS-DOS networking system enabled MS-DOS PCs to communicate over AppleTalk network hardware; it comprised an AppleTalk interface card for the PC and a suite of networking software allowing such functions as file, drive and printer sharing. As well as allowing the construction of a PC-only AppleTalk network, it allowed communication between PCs and Macs with TOPS software installed. (Macs without TOPS installed could use the same network but only to communicate with other Apple machines.) The Mac TOPS software did not match the quality of Apple's own either in ease of use or in robustness and freedom from crashes, but the DOS software was relatively simple to use in DOS terms, and was robust.
The BSD and Linux operating systems support AppleTalk through an open source project called Netatalk, which implements the complete protocol suite and allows them to both act as native file or print servers for Macintosh computers, and print to LocalTalk printers over the network.
The Windows Server operating systems supported AppleTalk starting with Windows NT and ending after Windows Server 2003. Miramar included AppleTalk in its PC MacLAN product, which was discontinued by CA in 2007. GroupLogic continues to bundle its AppleTalk protocol with its ExtremeZ-IP server software for Macintosh-Windows integration, which supports Windows Server 2008 and Windows Vista as well as prior versions. HELIOS Software GmbH offers a proprietary implementation of the AppleTalk protocol stack, as part of their HELIOS UB2 server. This is essentially a file and print server suite that runs on a whole range of different platforms.
In addition, Columbia University released the Columbia AppleTalk Package (CAP) which implemented the protocol suite for various Unix flavors including Ultrix, SunOS, BSD and IRIX. This package is no longer actively maintained.
Apple II series.
The Apple II series (trademarked with square brackets as "Apple ][" and rendered on later models as "Apple //") is a family of home computers, one of the first highly successful mass-produced microcomputer products, designed primarily by Steve Wozniak, manufactured by Apple Computer (now Apple Inc.), and launched in 1977 with the original Apple II.
In terms of ease of use, features, and expandability, the Apple II was a major advancement over its predecessor, the Apple I, a limited-production bare circuit board computer for electronics hobbyists. Through 1988, a number of models were introduced, with the most popular, the Apple IIe, remaining relatively unchanged into the 1990s.
A model with more advanced graphics and sound and a 16-bit processor, the Apple IIGS, was added in 1986. It remained compatible with earlier Apple II models, but the IIGS had more in common with mid-1980s systems like the Atari ST, Amiga, and Acorn Archimedes.
The Apple II was first sold on June 10, 1977. By the end of production in 1993, somewhere between five and six million Apple II series computers (including about 1.25 million Apple IIGS models) had been produced. The Apple II was one of the longest running mass-produced home computer series, with models in production for just under 17 years.
The Apple II became one of several recognizable and successful computers during the 1980s and early 1990s, although this was mainly limited to the US. It was aggressively marketed through volume discounts and manufacturing arrangements to educational institutions, which made it the first computer in widespread use in American secondary schools, displacing the early leader Commodore PET. The effort to develop educational and business software for the Apple II, including the 1979 release of the popular VisiCalc spreadsheet, made the computer especially popular with business users and families.
Despite the introduction of the Motorola 68000-based Macintosh in 1984, the Apple II series still reportedly accounted for 85% of the company's hardware sales in the first quarter of fiscal 1985. Apple continued to sell Apple II systems alongside the Macintosh until terminating the IIGS in December 1992 and the IIe in November 1993. The last II-series Apple in production, the IIe card for Macintoshes, was discontinued on October 15, 1993. The total Apple II sales of all of its models during its 16-year production run were about 6 million units, with the peak occurring in 1983 when 1 million were sold.
Hardware.
All the machines in the series, except the //c, shared similar overall design elements. The plastic case was designed to look more like a home appliance than a piece of electronic equipment, and the machine could be opened without the use of tools, allowing access to the computer's internals.
The motherboard held eight expansion slots and an array of random access memory (RAM) sockets that could hold up to 48 kilobytes. Over the course of the Apple II series' life, an enormous amount of first- and third-party hardware was made available to extend the capabilities of the machine.
The //c was designed as a compact, portable unit, not intended to be disassembled, and could not use most of the expansion hardware sold for the other machines in the series.
All machines in the Apple II series had a built-in keyboard, with the exception of the IIgs which had a separate keyboard.
Apple IIs had color and high-resolution graphics modes, sound capabilities and a built-in BASIC programming language. The Apple II was targeted for the masses rather than just hobbyists and engineers, and influenced many of the microcomputers that followed it. Unlike preceding home microcomputers, it was sold as a finished consumer appliance rather than as a kit (unassembled or preassembled). The Apple II series eventually supported over 1,500 software programs.
Apple marketed the machine as a durable product, including a 1981 ad in which an Apple II survived a fire started when a cat belonging to one early user knocked over a lamp.
Software.
The original Apple II provided an operating system in ROM along with a BASIC variant called Integer BASIC. The only form of storage available was cassette tape.
When the Disk II floppy disk drive was released in 1978, a new operating system, Apple DOS, was commissioned from Shepardson Microsystems and developed by Paul Laughton, adding support for the disk drive. The final and most popular version of this software was Apple DOS 3.3.
Apple DOS was superseded by ProDOS, which supported a hierarchical filesystem and larger storage devices. With an optional third-party Z80-based expansion card, the Apple II could boot into the CP/M operating system and run WordStar, dBase II, and other CP/M software. With the release of MousePaint in 1984 and the Apple IIGS in 1986, the platform took on the look of the Macintosh user interface, including a mouse.
Apple eventually released Applesoft BASIC, a more advanced variant of the language which users could run instead of Integer BASIC for more capabilities.
Some commercial Apple II software booted directly and did not use standard DOS disk formats. This discouraged the copying or modifying of the software on the disks, and improved loading speed.
Models.
Apple II.
The first Apple II computers went on sale on June 10, 1977 with a MOS Technology 6502 (later Synertek) microprocessor running at 1.023 MHz, 4 KB of RAM, an audio cassette interface for loading programs and storing data, and the Integer BASIC programming language built into the ROMs. The video controller displayed 40 columns by 24 lines of monochrome, upper-case-only (the original character set matches ASCII characters 0x20 to 0x5F) text on the screen, with NTSC composite video output suitable for display on a TV monitor, or on a regular TV set by way of a separate RF modulator. The original retail price of the computer was US$1298 (with 4 KB of RAM) and US$2638 (with the maximum 48 KB of RAM). To reflect the computer's color graphics capability, the Apple logo on the casing was represented using rainbow stripes, which remained a part of Apple's corporate logo until early 1998. The earliest Apple IIs were assembled in Silicon Valley, and later in Texas; printed circuit boards were manufactured in Ireland and Singapore.
An external 5.25-inch floppy disk drive, the Disk II, attached via a controller card that plugged into one of the computer's expansion slots (usually slot 6), was used for data storage and retrieval to replace cassettes. The Disk II interface, created by Steve Wozniak, was regarded as an engineering masterpiece for its economy of electronic components.
Rather than having a dedicated sound-synthesis chip, the Apple II had a toggle circuit that could only emit a click through a built-in speaker or a line out jack; all other sounds (including two, three and, eventually, four-voice music and playback of audio samples and speech synthesis) were generated entirely by software that clicked the speaker at just the right times.
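Because the hardware could only click, pitch came entirely from timing: toggling the speaker twice per waveform cycle at the right rate produces a tone. A back-of-the-envelope calculation for the delay a software loop needed between toggles (the clock figure is the Apple II's approximate 6502 speed):

```python
# Back-of-the-envelope timing for software-generated sound on the Apple II
# (illustrative): a tone of frequency f needs two speaker toggles per cycle,
# so the delay loop between clicks is half a period, measured in CPU cycles.

CPU_HZ = 1_023_000  # approximate 1.023 MHz 6502 clock of the Apple II

def cycles_between_toggles(freq_hz):
    # two toggles per waveform cycle -> half a period between clicks
    return round(CPU_HZ / (2 * freq_hz))

print(cycles_between_toggles(440))  # 1162 CPU cycles per half-period for A440
```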
The Apple II's multiple expansion slots permitted a wide variety of third-party devices, including Apple II peripheral cards such as serial controllers, display controllers, memory boards, hard disks, networking components, and realtime clocks. There were plug-in expansion cards – such as the Z-80 SoftCard – that permitted the Apple to use the Z80 processor and run a multitude of programs developed under the CP/M operating system, including the dBase II database and the WordStar word processor. There was also a third-party 6809 card that would allow OS-9 Level One to be run. Third-party sound cards greatly improved audio capabilities, allowing simple music synthesis and text-to-speech functions. Eventually, Apple II accelerator cards were created to double or quadruple the computer's speed.
Rod Holt designed the Apple II's power supply. He employed a switched-mode power supply design, which was far smaller and generated less unwanted heat than the linear power supply some other home computers used.
The original Apple II was discontinued at the start of 1981, having been superseded by the Apple II+. By 1984, over six million machines had been sold.
Apple II Plus.
The Apple II Plus, introduced in June 1979, included the Applesoft BASIC programming language in ROM. This Microsoft-authored dialect of BASIC, which was previously available as an upgrade, supported floating-point arithmetic, and became the standard BASIC dialect on the Apple II series (though it ran at a noticeably slower speed than Steve Wozniak's Integer BASIC).
Except for improved graphics and disk-booting support in the ROM, and the removal of the 2k 6502 assembler/disassembler to make room for the floating point BASIC, the II+ was otherwise identical to the original II. RAM prices fell during 1980–81 and all II+ machines came from the factory with a full 48k of memory already installed.
Apple II Europlus and J-Plus.
After the success of the first Apple II in the United States, Apple expanded its market to include Europe, Australia and the Far East in 1979, with the Apple II Europlus (Europe, Australia) and the Apple II J-Plus (Japan). In these models, Apple made the necessary hardware, software and firmware changes to comply with standards outside the US.
Apple IIe.
The Apple II Plus was followed in 1983 by the Apple IIe, a cost-reduced yet more powerful machine that used newer chips to reduce the component count and add new features, such as the display of upper and lowercase letters and a standard 64 KB of RAM.
The IIe RAM was configured as if it were a 48 KB Apple II Plus with a language card. The machine had no slot 0, but instead had an auxiliary slot that could accept a 1 KB memory card to enable the 80-column display. This card contained only RAM; the hardware and firmware for the 80-column display was built into the Apple IIe. An "extended 80-column card" with more memory increased the machine's RAM to 128 KB.
The Apple IIe was the most popular machine in the Apple II series. It has the distinction of being the longest-lived Apple computer of all time—it was manufactured and sold with only minor changes for nearly 11 years. The IIe was the last Apple II model to be sold, and was discontinued in November 1993.
During its lifespan two variations were introduced: the Apple IIe Enhanced (four replacement chips to give it some of the features of the later model Apple IIc) and the Apple IIe Platinum (a modernized case color to match other Apple products of the era, along with the addition of a numeric keypad).
Some of the features of the IIe were carried over from the less successful "Apple III", among them the ProDOS operating system.
Apple IIc.
The Apple IIc was released in April 1984, billed as a portable Apple II because it could be easily carried due to its size and carrying handle, which could be flipped down to prop the machine up into a typing position. Unlike modern portables it lacked a built-in display and battery. It was the first of three Apple II models to be made in the Snow White design language, and the only one that used its unique creamy off-white color.
The Apple IIc was the first Apple II to use the 65C02 low-power variant of the 6502 processor, and featured a built-in 5.25-inch floppy drive and 128 KB RAM, with a built-in disk controller that could control external drives, composite video (NTSC or PAL), serial interfaces for modem and printer, and a port usable by either a joystick or mouse. Unlike previous Apple II models, the IIc had no internal expansion slots at all.
Two different monochrome LCD displays were sold for use with the IIc's video expansion port, although both were short-lived due to high cost and poor legibility. The IIc had an external power supply that converted AC power to 15 V DC, though the IIc itself will accept between 12 V and 17 V DC, allowing third parties to offer battery packs and automobile power adapters that connected in place of the supplied AC adapter.
Apple IIGS.
The Apple IIGS, released on September 15, 1986, is the last model in the Apple II series, and a radical departure from prior models. It uses a 16-bit microprocessor, the 65C816 operating at 2.8 MHz with 24-bit addressing, allowing expansion up to 8 MB of RAM. The graphics are significantly improved, with 4096 colors and new modes with resolutions of 320×200 and 640×400.
The Apple IIGS evolved the platform while still maintaining near-complete backward compatibility. Its Mega II chip contains the functional equivalent of an entire Apple IIe computer (sans processor). This, combined with the 65816's ability to execute 65C02 code directly, provides full support for legacy software, while also supporting 16-bit software running under a new OS.
The OS eventually included a Macintosh-like graphical Finder for managing disks and files and opening documents and applications, along with desk accessories. Later, the IIGS gained the ability to read and write Macintosh disks and, through third-party software, a multitasking Unix-like shell and TrueType font support.
The IIGS includes a 32-voice Ensoniq 5503 DOC sample-based sound synthesizer chip with 64 KB dedicated RAM, 256 KB (or later 1.125 MB) of standard RAM, built-in peripheral ports (switchable between IIe-style card slots and IIc-style onboard controllers for disk drives, mouse, RGB video, and serial devices), and built-in AppleTalk networking.
Apple IIc Plus.
The final Apple II model was the Apple IIc Plus, introduced in 1988. It was the same size and shape as the IIc that came before it, but the 5.25-inch floppy drive had been replaced with a 3.5-inch drive, the power supply was moved inside the case, and the processor was a fast 4 MHz 65C02 that actually ran 8-bit Apple II software faster than the IIGS.
The IIc Plus also featured a new keyboard layout that matched the Platinum IIe and IIGS. Unlike the IIe, IIc and IIGS, the IIc Plus came in only one version (American) and was not officially sold anywhere outside the US. The Apple IIc Plus ceased production in 1990, its two-year production run being the shortest of all the Apple II computers.
Apple IIe Card.
Although not an extension of the Apple II line, in 1990 the Apple IIe Card, an expansion card for the LC line of Macintosh computers, was released. Essentially a miniaturized Apple IIe computer on a card (using the Mega II chip from the Apple IIGS), it allowed the Macintosh to run 8-bit Apple IIe software through hardware emulation (although video was emulated in software and was slower at times than a IIe).
Many of the LC's built-in Macintosh peripherals could be "borrowed" by the card when in Apple II mode (i.e. extra RAM, 3.5-inch floppy, AppleTalk networking, hard disk). The IIe card could not, however, run software intended for the 16-bit Apple II.
Advertising, marketing, and packaging.
Mike Markkula, a retired Intel marketing manager, provided the early critical funding for Apple Computer. From 1977 to 1981, Apple used the Regis McKenna agency for its advertisements and marketing. In 1981, Chiat-Day acquired Regis McKenna's advertising operations and Apple used Chiat-Day. At Regis McKenna Advertising, the team assigned to launch the Apple II consisted of Rob Janoff, art director, Chip Schafer, copywriter and Bill Kelley, account executive. Janoff came up with the Apple logo with a bite out of it. The design was originally an olive green with matching company logotype all in lower case. Steve Jobs insisted on promoting the color capability of the Apple II by putting rainbow stripes on the Apple logo. In its letterhead and business card implementation, the rounded "a" of the logotype echoed the "bite" in the logo. This logo was developed simultaneously with an advertisement and a brochure; the latter being produced for distribution initially at the first West Coast Computer Faire.
Since the original Apple II, Apple has paid high attention to its quality of packaging, partly because of Steve Jobs' personal preferences and opinions on packaging and final product appearance. All of Apple's packaging for the Apple II series looked similar, featuring much clean white space and showing the Apple rainbow logo prominently. For several years up until the late 1980s, Apple used the Motter Tektura font for packaging, until changing to the Apple Garamond font.
Apple ran the first advertisement for the Apple II, a two-page spread ad titled "Introducing Apple II", in "BYTE" in July 1977. The first brochure was entitled "Simplicity", and the copy in both the ad and brochure pioneered "demystifying" language intended to make the new idea of a home computer more "personal". The Apple II introduction ad was later run in the September 1977 issue of "Scientific American".
Apple later aired eight television commercials for the Apple II, emphasizing its benefits to education and students, along with some print ads.
Clones.
The Apple II was frequently cloned, both in the United States and abroad, in a similar way to the IBM PC. According to some sources (see below), more than 190 different models of Apple II clones were manufactured. Most could not be legally imported into the United States. Apple sued and sought criminal charges against clone makers in more than a dozen countries.
Data storage.
Cassette.
Originally the Apple II used Compact Cassette tapes for program and data storage. A dedicated tape recorder along the lines of the Commodore Datasette was never produced; Apple recommended using the Panasonic RQ309 in some of its early printed documentation. The use of common consumer cassette recorders and a standard video monitor or television set (with a third-party RF modulator) made owning an Apple II less expensive and helped contribute to the Apple II's success.
Cassette storage may have been inexpensive, but it was also slow and unreliable. The Apple II's lack of a disk drive was "a glaring weakness" in what was otherwise intended to be a polished, professional product. Recognizing that the II needed a disk drive to be taken seriously, Apple set out to develop a disk drive and a DOS to run it. Wozniak spent the 1977 Christmas holidays designing a disk controller that reduced the number of chips used by a factor of 10 compared to existing controllers. Still lacking a DOS, and with Wozniak inexperienced in operating system design, Jobs approached Shepardson Microsystems with the project. On April 10, 1978, Apple signed a contract for $13,000 with Shepardson to develop the DOS.
Even after disk drives made the cassette tape interfaces obsolete, they were still used by enthusiasts as simple one-bit audio input-output ports. Ham radio operators used the cassette input to receive slow-scan TV (single-frame images). A commercial speech-recognition Blackjack program was available; after some user-specific voice training it would recognize simple commands ("hit", "stand"). Bob Bishop's "Music Kaleidoscope" was a simple program that monitored the cassette input port and, based on zero-crossings, created color patterns on the screen, a predecessor to current audio visualization plug-ins for media players. Music Kaleidoscope was especially popular on projection TV sets in dance halls.
The Disk II.
Apple and many third-party developers made software available on tape at first, but after the Disk II became available in 1978, tape-based Apple II software essentially disappeared from the market. The initial price of the Disk II drive and controller was US$595, although a $100 off coupon was available through the Apple newsletter "Contact". The controller could handle two drives and a second drive (without controller) retailed for $495.
The Disk II single-sided floppy drive used 5.25-inch floppy disks; double-sided disks could be used, one side at a time, by turning them over and notching a hole for the write-protect sensor. The first disk operating systems for the Apple II were DOS 3.1 and DOS 3.2, which stored 113.75 KB on each disk, organized into 35 tracks of 13 256-byte sectors each. After about two years, DOS 3.3 was introduced, storing 140 KB thanks to a minor firmware change on the disk controller that allowed it to store 16 sectors per track. (This upgrade was user-installable as two PROMs on older controllers.) After the release of DOS 3.3, the user community discontinued use of DOS 3.2 except for running legacy software. Programs that required DOS 3.2 were fairly rare; however, as DOS 3.3 was not a major architectural change aside from the number of sectors per track, a program called MUFFIN was provided with DOS 3.3 to allow users to copy files from DOS 3.2 disks to DOS 3.3 disks. It was possible for software developers to create a DOS 3.2 disk which would also boot on a system with DOS 3.3 firmware.
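The capacities quoted above follow directly from the disk geometry, as a quick check shows:

```python
# The DOS 3.x capacities follow from tracks x sectors x bytes-per-sector.

def capacity_kb(tracks, sectors_per_track, bytes_per_sector=256):
    return tracks * sectors_per_track * bytes_per_sector / 1024

print(capacity_kb(35, 13))  # 13-sector layout -> 113.75 KB
print(capacity_kb(35, 16))  # DOS 3.3's 16-sector layout -> 140.0 KB
```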
Later, double-sided drives, with heads to read both sides of the disk, became available from third-party companies. (Apple only produced double-sided 5.25-inch disks for the Lisa 1 computer).
On a DOS 3.x disk, tracks 0, 1, and most of track 2 were reserved to store the operating system. (It was possible, with a special utility, to reclaim most of this space for data if a disk did not need to be bootable.) A short ROM program on the disk controller seeked to track zero without regard for the read/write head's current position (the head hitting the rubber stop block at the end of the rail produced the characteristic "chattering" sound of a Disk II boot), then read and executed the code in sector 0, which pulled in the rest of the operating system. DOS stored the disk's directory on track 17, smack in the middle of the 35-track disks, in order to reduce the average seek time to the frequently used directory track. The directory was fixed in size and could hold a maximum of 105 files. Subdirectories were not supported.
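The choice of track 17 can be sanity-checked with a little arithmetic: assuming the head starts at a uniformly random track, the middle of the disk roughly halves the average travel to the directory compared with keeping it at the edge:

```python
# Quick check of why DOS put the directory on track 17 of a 35-track disk:
# average head travel from a uniformly random track to the directory track.

def avg_seek(directory_track, tracks=35):
    return sum(abs(t - directory_track) for t in range(tracks)) / tracks

print(avg_seek(0))   # 17.0 tracks on average if the directory sat at the edge
print(avg_seek(17))  # ~8.7 tracks with the directory in the middle
```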
Most game publishers did not include DOS on their floppy disks, since they needed the memory it occupied more than its capabilities; instead, they often wrote their own boot loaders and read-only file systems. This also served to discourage "crackers" from snooping around in the game's copy-protection code, since the data on the disk was not in files that could be accessed easily.
Some third-party manufacturers produced floppy drives that could write 40 tracks to most 5.25-inch disks, yielding 160 KB of storage per disk, but the format did not catch on widely, and no known commercial software was published on 40-track media. Most drives, even Disk IIs, could write 36 tracks; a two-byte modification to DOS to format the extra track was common.
The Apple Disk II stored 140 KB on single-sided, "single-density" floppy disks, but it was very common for Apple II users to extend the capacity of a single-sided floppy disk to 280 KB by cutting out a second write-protect notch on the side of the disk using a "disk notcher" or hole puncher and inserting the disk flipped over. Double-sided disks, with notches on both sides, were available at a higher price, but in practice the magnetic coating on the reverse of nominally single-sided disks was usually of good enough quality to be used (both sides were coated in the same way to prevent warping, although only one side was certified for use). Early on, diskette manufacturers routinely warned that this technique would damage the read/write head of the drives or wear out the disk faster, and these warnings were frequently repeated in magazines of the day. In practice, however, this method was an inexpensive way to store twice as much data for no extra cost, and was widely used for commercially released floppies as well.
Later, Apple IIs were able to use 3.5-inch disks with a total capacity of 800 KB, as well as hard disks. DOS 3.3 did not support these drives natively; third-party software was required, and disks larger than about 400 KB had to be split up into multiple "virtual disk volumes".
DOS 3.3 was succeeded by ProDOS, a 1983 descendant of the Apple ///'s SOS. It added support for subdirectories and volumes up to 32 MB in size. ProDOS became the DOS of choice; AppleWorks and other newer programs required it.
Legacy.
Industry impact.
The Apple II series of computers had an enormous impact on the technology industry and expanded the role of microcomputers in society. The Apple II was the first personal computer many people ever saw. Its price was within the reach of many middle-class families, and a partnership with MECC helped make the Apple II popular in schools. By the end of 1980 Apple had already sold over 100,000 Apple IIs. Its popularity bootstrapped the computer game and educational software markets and began the boom in the word processor and computer printer markets. The first spreadsheet application, VisiCalc, was initially released for the Apple II, and many businesses bought them just to run VisiCalc. Its success drove IBM in part to create the IBM PC, which many businesses purchased to run spreadsheet and word processing software, at first ported from Apple II versions.
The Apple II's slots, allowing any peripheral card to take control of the bus and directly access memory, enabled an independent industry of card manufacturers who together created a flood of hardware products that let users build systems that were far more powerful and useful (at a lower cost) than any competing system, most of which were not nearly as expandable and were universally proprietary. The first peripheral card was a blank prototyping card intended for electronics enthusiasts who wanted to design their own peripherals for the Apple II.
Specialty peripherals kept the Apple II in use in industry and education environments for many years after Apple Computer stopped supporting the Apple II. Well into the 1990s every clean-room (the super-clean facility where spacecraft are prepared for flight) at the Kennedy Space Center used an Apple II to monitor the environment and air quality. Most planetariums used Apple IIs to control their projectors and other equipment.
Even the game port was unusually powerful and could be used for digital and analog input and output. The early manuals included instructions for how to build a circuit with only four commonly available components (one transistor and three resistors) and a software routine to drive a common Teletype Model 33 machine. Don Lancaster used the game I/O to drive a LaserWriter printer.
Modern use.
Today, emulators for various Apple II models are available to run Apple II software on macOS, Linux, Microsoft Windows, the homebrew-enabled Nintendo DS, and other operating systems. Numerous disk images of Apple II software are available free over the Internet for use with these emulators. AppleWin and MESS are among the best emulators compatible with most Apple II images. The MESS emulator supports recording and playing back of Apple II emulation sessions, as does Home Action Replay Page (a.k.a. HARP).
In addition, an active retrocomputing community of vintage Apple II collectors and users continues to restore, maintain, and develop hardware and software for daily use of these original computers. There is still a small annual convention, KansasFest, dedicated to the platform.
In 2017, the band 8 Bit Weapon released the world's first 100% Apple II-based music album, "Class Apples". The album features dance-oriented cover versions of classical music by Bach, Beethoven, and Mozart, recorded directly off the Apple II motherboard.
|
2117 | Apple III | The Apple III (styled as apple ///) is a business-oriented personal computer produced by Apple Computer and released in 1980. Running the Apple SOS operating system, it was intended as the successor to the Apple II series, but was largely considered a failure in the market. It was designed to provide key features business users wanted in a personal computer: a true typewriter-style upper/lowercase keyboard (the Apple II only supported uppercase) and an 80-column display.
Work on the Apple III started in late 1978 under the guidance of Dr. Wendell Sander. It had the internal code name of "Sara", named after Sander's daughter. The system was announced on May 19, 1980 and released in late November that year. Serious stability issues required a design overhaul and a recall of the first 14,000 machines produced. The Apple III was formally reintroduced on November 9, 1981.
Damage to the computer's reputation had already been done, however, and it failed to do well commercially. Development stopped, and the Apple III was discontinued on April 24, 1984. Its last successor, the III Plus, was dropped from the Apple product line in September 1985.
An estimated 65,000–75,000 Apple III computers were sold. The Apple III Plus brought this up to approximately 120,000. Apple co-founder Steve Wozniak stated that the primary reason for the Apple III's failure was that the system was designed by Apple's marketing department, unlike Apple's previous engineering-driven projects. The Apple III's failure led Apple to reevaluate its plan to phase out the Apple II, prompting the eventual continuation of development of the older machine. As a result, later Apple II models incorporated some hardware and software technologies of the Apple III.
Overview.
Design.
Steve Wozniak and Steve Jobs expected hobbyists to purchase the Apple II, but because of VisiCalc and the Disk II, small businesses purchased 90% of the computers. The Apple III was designed as a business computer and successor. Though the Apple II inspired several important business products, such as VisiCalc, Multiplan, and Apple Writer, its hardware architecture, operating system, and developer environment were limited. Apple management intended to clearly establish market segmentation by designing the Apple III to appeal to the 90% business market, leaving the Apple II to home and education users. Management believed that "once the Apple III was out, the Apple II would stop selling in six months", Wozniak said.
The Apple III is powered by a 1.8-megahertz Synertek 6502A or B 8-bit CPU and, like some of the later machines in the Apple II family, uses bank switching techniques to address memory beyond the 6502's traditional 64 KB limit, up to 256 KB in the III's case. Third-party vendors produced memory upgrade kits that allow the Apple III to reach up to 512 KB of random-access memory (RAM). Other Apple III built-in features include an 80-column, 24-line display with upper and lowercase characters, a numeric keypad, dual-speed (pressure-sensitive) cursor control keys, 6-bit (DAC) audio, and a built-in 140-kilobyte 5.25-inch floppy disk drive. Graphics modes include 560×192 in black and white, and 280×192 with 16 colors or shades of gray. Unlike the Apple II, the Disk III controller is part of the logic board.
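The bank-switching technique mentioned above can be sketched in a few lines of Python. This is a conceptual model only: the window base address, window size, and bank-select interface below are illustrative assumptions, not the Apple III's actual memory map.

```python
# Conceptual model of bank switching: a CPU with a 16-bit address bus
# sees only 64 KB at once, but a bank-select register maps one of
# several physical banks into a fixed window of the address space.
# The 0x2000 window base and 32 KB window size are assumed for
# illustration, not taken from the Apple III's hardware.

BANK_BASE = 0x2000     # start of the switchable window (assumed)
BANK_SIZE = 0x8000     # 32 KB window (assumed)

class BankedMemory:
    def __init__(self, num_banks):
        self.fixed = bytearray(0x10000)   # the 64 KB CPU-visible space
        self.banks = [bytearray(BANK_SIZE) for _ in range(num_banks)]
        self.current = 0                  # selected bank number

    def select_bank(self, n):
        # Equivalent to writing a bank-select register on real hardware.
        self.current = n

    def _in_window(self, addr):
        return BANK_BASE <= addr < BANK_BASE + BANK_SIZE

    def read(self, addr):
        if self._in_window(addr):
            return self.banks[self.current][addr - BANK_BASE]
        return self.fixed[addr]

    def write(self, addr, value):
        if self._in_window(addr):
            self.banks[self.current][addr - BANK_BASE] = value
        else:
            self.fixed[addr] = value

# Eight 32 KB banks give 256 KB of switchable storage, all reachable
# through the same 16-bit addresses.
mem = BankedMemory(num_banks=8)
mem.select_bank(0); mem.write(0x2000, 0xAA)
mem.select_bank(1); mem.write(0x2000, 0xBB)
mem.select_bank(0)
assert mem.read(0x2000) == 0xAA   # same address, different physical byte
```

The same idea underlies the later Apple II models' bank switching as well; only the window layout and register addresses differ.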
The Apple III is the first Apple product to allow the user to choose both a screen font and a keyboard layout: either QWERTY or Dvorak. These choices cannot be changed while programs are running, unlike on the Apple IIc, which has a keyboard switch directly above the keyboard, allowing the user to switch on the fly.
Software.
The Apple III introduced an advanced operating system called Apple SOS, pronounced "apple sauce". Its ability to address resources by name allows the Apple III to be more scalable than the Apple II's addressing by physical location, such as a slot and drive number. Apple SOS allows the full capacity of a storage device to be used as a single volume, such as the Apple ProFile hard disk drive, and it supports a hierarchical file system. Some of the features and code base of Apple SOS were later adopted into the Apple II's ProDOS and GS/OS operating systems, as well as Lisa 7/7 and Macintosh system software.
With a starting price between , the Apple III was more expensive than many of the CP/M-based business computers that were available at the time. Few software applications other than VisiCalc were available for the computer; according to a presentation at KansasFest 2012, fewer than 50 Apple III-specific software packages were ever published, most shipping when the III Plus was released. Because Apple did not view the Apple III as suitable for hobbyists, it did not provide much of the technical software information that accompanied the Apple II. Originally intended as a direct replacement for the Apple II series, the Apple III was designed to be backward compatible with Apple II software. However, since Apple did not want to encourage continued development of the II platform, Apple II compatibility exists only in a special Apple II Mode, which is limited to emulating a basic Apple II Plus configuration with 48 KB of RAM. Special chips were intentionally added to prevent access from Apple II Mode to the III's advanced features such as its larger amount of memory.
Peripherals.
The Apple III has four expansion slots, a number that "inCider" in 1986 called "miserly". Apple II cards are compatible but risk violating government RFI regulations, and require Apple III-specific device drivers; "BYTE" stated that "Apple provides virtually no information on how to write them". As with software, Apple provided little hardware technical information with the computer but Apple III-specific products became available, such as one that made the computer compatible with the Apple IIe. Several new Apple-produced peripherals were developed for the Apple III. The original Apple III has a built-in real-time clock, which is recognized by Apple SOS. The clock was later removed from the "revised" model, and was instead made available as an add-on.
Along with the built-in floppy drive, the Apple III can also handle up to three additional external Disk III floppy disk drives. The Disk III is only officially compatible with the Apple III. The Apple III Plus requires an adaptor from Apple to use the Disk III with its DB-25 disk port.
With the introduction of the revised Apple III a year after launch, Apple began offering the ProFile external hard disk system. Priced at $3,499 for 5 MB of storage, it also required a peripheral slot for its controller card.
Backward compatibility.
The Apple III has the built-in hardware capability to run Apple II software. In order to do so, an emulation boot disk is required that functionally turns the machine into a standard 48-kilobyte Apple II Plus until it is powered off. The keyboard, internal floppy drive (and one external Disk III), display (color is provided through the 'B/W video' port) and speaker all act as Apple II peripherals. The paddle and serial ports can also function in Apple II mode, though with some limitations and compatibility issues.
Apple engineers added specialized circuitry with the sole purpose of blocking access to the machine's advanced features when running in Apple II emulation mode. This was done primarily to discourage further development of, and interest in, the Apple II line, and to push the Apple III as its successor. For example, no more than 48 KB of RAM can be accessed, no matter how much RAM is present in the machine. Many Apple II programs require more RAM than this, making them impossible to run on the Apple III. Similarly, access to lowercase support, 80-column text, and the machine's more advanced graphics and sound is blocked by this hardware circuitry, making it impossible for even skilled software programmers to bypass Apple's lockout. A third-party company, Titan Technologies, sold an expansion board called the III Plus II that allows Apple II mode to access more memory and a standard game port, and, with a later-released companion card, even to emulate the Apple IIe.
Certain Apple II slot cards can be installed in the Apple III and used in native III-mode with custom written SOS device drivers, including Grappler Plus and Liron 3.5 Controller.
Revisions.
After overheating issues were attributed to serious design flaws, a redesigned logic board was introduced in mid-December 1981 – which included a lower power supply requirement, wider circuit traces and better-designed chip sockets. The $3,495 revised model also includes 256 KB of RAM as the standard configuration. The 14,000 units of the original Apple III sold were returned and replaced with the entirely new revised model.
Apple III Plus.
Apple discontinued the III in October 1983 because it violated FCC regulations, and the FCC required the company to change the redesigned computer's name. It introduced the Apple III Plus in December 1983 at a price of US$2,995. This newer version includes a built-in clock, video interlacing, standardized rear port connectors, 55-watt power supply, 256 KB of RAM as standard, and a redesigned, Apple IIe-like keyboard.
Owners of the Apple III could purchase individual III Plus upgrades, like the clock and interlacing feature, and obtain the newer logic board as a service replacement. A keyboard upgrade kit, dubbed "Apple III Plus upgrade kit", was also made available, which included the keyboard, cover, keyboard encoder ROM, and logo replacements. This upgrade had to be installed by an authorized service technician.
Design flaws.
According to Wozniak, the Apple III "had 100 percent hardware failures". Former Apple executive Taylor Pohlman stated that:
Jobs insisted that the computer have no fan or air vents, in order to make it run quietly. He would later push this same design philosophy onto almost all Apple models he had control of, from the Apple Lisa and Macintosh 128K to the iMac. To allow the computer to dissipate heat, the base of the Apple III was made of heavy cast aluminum, which supposedly acts as a heat sink. One advantage of the aluminum case was a reduction in radio-frequency interference (RFI), a problem that had plagued the Apple II series throughout its history. Unlike in the Apple II series, the power supply was mounted – without its own shell – in a compartment separate from the logic board. The decision to use an aluminum shell ultimately led to engineering issues that resulted in the Apple III's reliability problems. The lead time for manufacturing the shells was long, and they had to be made before the motherboard was finalized. Later, it was realized that there was not enough room on the motherboard for all of the components unless narrow traces were used.
Many Apple IIIs were thought to have failed due to their inability to properly dissipate heat. "inCider" stated in 1986 that "Heat has always been a formidable enemy of the Apple ///", and some users reported that their Apple IIIs became so hot that the chips started dislodging from the board, causing the screen to display garbled data or their disk to come out of the slot "melted". "BYTE" wrote, "the integrated circuits tended to wander out of their sockets". It has been rumored that Apple advised customers to tilt the front of the Apple III six inches above the desk and then drop it, to reseat the chips as a temporary solution. Other analyses blame a faulty automatic chip insertion process, not heat.
Case designer Jerry Manock denied the design flaw charges, insisting that tests proved that the unit adequately dissipated the internal heat. The primary cause, he claimed, was a major logic board design problem. The logic board used "fineline" technology that was not fully mature at the time, with narrow, closely spaced traces. When chips were "stuffed" into the board and wave-soldered, solder bridges would form between traces that were not supposed to be connected. This caused numerous short circuits, which required hours of costly diagnosis and hand rework to fix. Apple designed a new circuit board with more layers and normal-width traces. The new logic board was laid out by one designer on a huge drafting board, rather than using the costly CAD-CAM system used for the previous board, and the new design worked.
Earlier Apple III units came with a built-in real time clock. The hardware, however, would fail after prolonged use. Assuming that National Semiconductor would test all parts before shipping them, Apple did not perform this level of testing. Apple was soldering chips directly to boards and could not easily replace a bad chip if one was found. Eventually, Apple solved this problem by removing the real-time clock from the Apple III's specification rather than shipping the Apple III with the clock pre-installed, and then sold the peripheral as a level 1 technician add-on.
BASIC.
Microsoft and Apple each developed their own versions of BASIC for the Apple III. Apple III Microsoft BASIC was designed to run on the CP/M platform available for the Apple III. Apple Business BASIC shipped with the Apple III. Donn Denman ported Applesoft BASIC to SOS and reworked it to take advantage of the extended memory of the Apple III.
Both languages introduced a number of new or improved features over Applesoft BASIC. Each replaced Applesoft's 5-byte single-precision floating-point variables with somewhat-reduced-precision 4-byte variables, while also adding a larger numeric format: Apple III Microsoft BASIC provides double-precision floating-point variables taking 8 bytes of storage, while Apple Business BASIC offers an extra-long integer type, also taking 8 bytes. Both languages also retain 2-byte integers and strings of up to 255 characters.
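The byte-count versus precision tradeoff can be illustrated with modern IEEE 754 single (4-byte) and double (8-byte) formats. The Apple BASICs used their own proprietary layouts rather than IEEE 754, so the exact digit counts differed, but the principle is the same:

```python
# Round-trip a value through 4-byte and 8-byte float encodings to see
# how much precision each retains. (IEEE 754 formats, used here only
# as an illustration; the BASICs' formats were proprietary.)
import struct

value = 1.0 / 3.0

single = struct.unpack('<f', struct.pack('<f', value))[0]  # 4 bytes
double = struct.unpack('<d', struct.pack('<d', value))[0]  # 8 bytes

assert double == value       # 8 bytes preserve this value exactly
assert single != value       # 4 bytes keep only ~7 decimal digits
print(f"4-byte: {single:.17f}")
print(f"8-byte: {double:.17f}")
```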
Other new features common to both languages include:
Some features work differently in each language:
Microsoft BASIC additional features.
There is no support for graphics provided within the language, nor for reading analog controls or buttons; nor is there a means of defining the active window of the text screen.
Business BASIC additional features.
Apple Business BASIC eliminates all references to absolute memory addresses. Thus, the POKE command and PEEK() function are not included in the language, and new features replace the CALL statement and USR() function. The functionality of certain Applesoft features that had been achieved with various PEEK and POKE locations is instead provided by:
External binary subroutines and functions are loaded into memory by a single INVOKE disk command that loads separately assembled code modules. A PERFORM statement is then used to call an INVOKEd procedure by name, with an argument list. INVOKEd functions are referenced in expressions by EXFN. (floating-point) or EXFN%. (integer), with the function name appended, plus the argument list for the function.
Graphics are supported with an INVOKEd module, with features including displaying text within graphics in various fonts, within four different graphics modes available on the Apple III.
Reception.
Apple devoted the majority of its R&D to the Apple III, neglecting the II so thoroughly that for a while dealers had difficulty obtaining the latter; even so, the III's technical problems made the computer difficult to market. Ed Smith, who after designing the APF Imagination Machine worked as a distributor's representative, described the III as "a complete disaster". He recalled that he "was responsible for going to every dealership, setting up the Apple III in their showroom, and then explaining to them the functions of the Apple III, which in many cases didn't really work".
Sales.
Pohlman reported that Apple was only selling 500 units a month by late 1981, mostly as replacements. The company was able to eventually raise monthly sales to 5,000, but the IBM PC's successful launch had encouraged software companies to develop for it instead, prompting Apple to shift focus to the Lisa and Macintosh. The PC almost ended sales of the Apple III, the most closely comparable Apple computer model. By early 1984, sales were primarily to existing III owners, Apple itself—its 4,500 employees were equipped with some 3,000–4,500 units—and some small businesses. Apple finally discontinued the Apple III series on April 24, 1984, four months after introducing the III Plus, after selling only 65,000–75,000 units and replacing 14,000 defective units.
Jobs said that the company lost "infinite, incalculable amounts" of money on the Apple III. Wozniak estimated that Apple had spent $100 million on the III, instead of improving the II and better competing against IBM. Pohlman claimed that there was a "stigma" at Apple associated with having contributed to the computer. Most employees who worked on the III reportedly left Apple.
Legacy.
The file system and some design ideas from Apple SOS, the Apple III's operating system, were part of Apple ProDOS and Apple GS/OS, the major operating systems for the Apple II series following the demise of the Apple III, as well as the Apple Lisa, which was the de facto business-oriented successor to the Apple III. The hierarchical file system influenced the evolution of the Macintosh: while the original Macintosh File System (MFS) was a flat file system designed for a floppy disk without subdirectories, subsequent file systems were hierarchical. By comparison, the IBM PC's first file system (again designed for floppy disks) was also flat and later versions (designed for hard disks) were hierarchical.
In popular culture.
At the start of the Walt Disney Pictures film "TRON", lead character Kevin Flynn (played by Jeff Bridges) is seen hacking into the ENCOM mainframe using an Apple III.
|
2118 | AVL tree | In computer science, an AVL tree (named after inventors Adelson-Velsky and Landis) is a self-balancing binary search tree. In an AVL tree, the heights of the two child subtrees of any node differ by at most one; if at any time they differ by more than one, rebalancing is done to restore this property. Lookup, insertion, and deletion all take time logarithmic in formula_1, the number of nodes in the tree prior to the operation, in both the average and worst cases. Insertions and deletions may require the tree to be rebalanced by one or more tree rotations.
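The rebalancing-by-rotation idea can be sketched in Python. This is a minimal insertion-only illustration (no deletion), not a production implementation:

```python
# Minimal AVL insertion sketch: after each insert, node heights are
# updated and a rotation restores the balance-factor invariant |bf| <= 1.
class Node:
    def __init__(self, key):
        self.key, self.left, self.right, self.height = key, None, None, 1

def height(n): return n.height if n else 0
def balance(n): return height(n.left) - height(n.right) if n else 0
def update(n): n.height = 1 + max(height(n.left), height(n.right))

def rotate_right(y):
    x = y.left
    y.left, x.right = x.right, y
    update(y); update(x)
    return x

def rotate_left(x):
    y = x.right
    x.right, y.left = y.left, x
    update(x); update(y)
    return y

def insert(root, key):
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    update(root)
    bf = balance(root)
    if bf > 1 and key < root.left.key:        # left-left case
        return rotate_right(root)
    if bf < -1 and key > root.right.key:      # right-right case
        return rotate_left(root)
    if bf > 1:                                # left-right case
        root.left = rotate_left(root.left)
        return rotate_right(root)
    if bf < -1:                               # right-left case
        root.right = rotate_right(root.right)
        return rotate_left(root)
    return root

root = None
for k in range(1, 8):      # ascending inserts would degrade a plain BST
    root = insert(root, k)
assert root.key == 4 and root.height == 3   # AVL keeps height logarithmic
```

Inserting the keys 1 through 7 in ascending order would degrade a plain binary search tree into a linked list of height 7; the rotations above leave the AVL tree perfectly balanced at height 3.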
The AVL tree is named after its two Soviet inventors, Georgy Adelson-Velsky and Evgenii Landis, who published it in their 1962 paper "An algorithm for the organization of information". It is the oldest self-balancing binary search tree data structure to be invented.
AVL trees are often compared with red–black trees because both support the same set of operations and take formula_2 time for the basic operations. For lookup-intensive applications, AVL trees are faster than red–black trees because they are more strictly balanced. Similar to red–black trees, AVL trees are height-balanced. Both are, in general, neither weight-balanced nor formula_3-balanced for any formula_4.
AVL trees are more rigidly balanced than RB trees, with an asymptotic relation AVL/RB ≈ 0.720 of the maximal heights. For insertions and deletions, Ben Pfaff shows in 79 measurements a relation of AVL/RB between 0.677 and 1.077, with median ≈ 0.947 and geometric mean ≈ 0.910.
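The ≈ 0.720 height ratio can be checked numerically from the standard bounds: the sparsest AVL tree of height h satisfies the Fibonacci-like recurrence N(h) = N(h-1) + N(h-2) + 1, giving a maximal height of about 1.4405·log2(n), while a red–black tree's height is at most about 2·log2(n). The sketch below is a numeric check of these bounds, not a reproduction of Pfaff's measurement methodology:

```python
# Verify the maximal-height ratio AVL/RB ~ 1.4405 / 2 ~ 0.720 from the
# minimal-node recurrence for AVL trees of a given height.
import math

def avl_min_nodes(h):
    """Minimal number of nodes in an AVL tree of height h (in levels)."""
    if h < 1:
        return 0
    a, b = 0, 1                 # heights 0 (empty) and 1 (single node)
    for _ in range(h - 1):
        a, b = b, a + b + 1     # sparsest tree: one subtree a level shorter
    return b

# A height-40 AVL tree needs at least n nodes, so height / log2(n)
# bounds the worst case; the limit is 1/log2(phi) ~ 1.4405.
n = avl_min_nodes(40)
ratio = 40 / math.log2(n)
assert 1.40 < ratio < 1.45
# Red-black trees allow height up to about 2*log2(n), so the maximal
# heights relate as roughly 1.4405 / 2 = 0.720.
assert abs(ratio / 2 - 0.720) < 0.01
```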
|
2120 | Aliphatic compound | In organic chemistry, hydrocarbons (compounds composed solely of carbon and hydrogen) are divided into two classes: aromatic compounds and aliphatic compounds (from Greek "aleiphar", fat or oil). Aliphatic compounds can be saturated (in which all the carbon–carbon bonds are single, so that the structure is completed, or 'saturated', with hydrogen), like hexane, or unsaturated, like hexene and hexyne. Open-chain compounds, whether straight or branched, which contain no rings of any type, are always aliphatic. Cyclic compounds can be aliphatic if they are not aromatic.
Structure.
Aliphatic compounds can be saturated, joined by single bonds (alkanes), or unsaturated, with double bonds (alkenes) or triple bonds (alkynes). If other elements (heteroatoms) are bound to the carbon chain, the most common being oxygen, nitrogen, sulfur, and chlorine, it is no longer a hydrocarbon, and therefore no longer an aliphatic compound.
The least complex aliphatic compound is methane (CH4).
Properties.
Most aliphatic compounds are flammable, allowing the use of hydrocarbons as fuel, such as methane in natural gas for stoves or heating; butane in torches and lighters; various aliphatic (as well as aromatic) hydrocarbons in liquid transportation fuels like petrol/gasoline, diesel, and jet fuel; and other uses such as ethyne (acetylene) in welding.
Examples of aliphatic compounds.
The most important aliphatic compounds are:
Important examples of low-molecular aliphatic compounds can be found in the list below (sorted by the number of carbon-atoms):
|
2122 | Astrology | Astrology is a range of divinatory practices, recognized as pseudoscientific since the 18th century, that claim to discern information about human affairs and terrestrial events by studying the apparent positions of celestial objects. Different cultures have employed forms of astrology since at least the 2nd millennium BCE, these practices having originated in calendrical systems used to predict seasonal shifts and to interpret celestial cycles as signs of divine communications. Most, if not all, cultures have attached importance to what they observed in the sky, and some—such as the Hindus, Chinese, and the Maya—developed elaborate systems for predicting terrestrial events from celestial observations. Western astrology, one of the oldest astrological systems still in use, can trace its roots to 19th–17th century BCE Mesopotamia, from where it spread to Ancient Greece, Rome, the Islamic world, and eventually Central and Western Europe. Contemporary Western astrology is often associated with systems of horoscopes that purport to explain aspects of a person's personality and predict significant events in their lives based on the positions of celestial objects; the majority of professional astrologers rely on such systems.
Throughout most of its history, astrology was considered a scholarly tradition and was common in academic circles, often in close relation with astronomy, alchemy, meteorology, and medicine. It was present in political circles and is mentioned in various works of literature, from Dante Alighieri and Geoffrey Chaucer to William Shakespeare, Lope de Vega, and Calderón de la Barca. During the Enlightenment, however, astrology lost its status as an area of legitimate scholarly pursuit. Following the end of the 19th century and the wide-scale adoption of the scientific method, researchers have successfully challenged astrology on both theoretical and experimental grounds, and have shown it to have no scientific validity or explanatory power. Astrology thus lost its academic and theoretical standing in the western world, and common belief in it largely declined, until a continuing resurgence starting in the 1960s. In India, belief in astrology is long-standing, widespread and continuing.
Etymology.
The word "astrology" comes from the early Latin word "astrologia", which derives from the Greek ἄστρον "astron" ("star") and -λογία "-logia" ("study of"), together "account of the stars". The word entered the English language via Latin and medieval French, and its use overlapped considerably with that of "astronomy" (derived from the Latin "astronomia"). By the 17th century, "astronomy" became established as the scientific term, with "astrology" referring to divinations and schemes for predicting human affairs.
History.
Many cultures have attached importance to astronomical events, and the Indians, Chinese, and Maya developed elaborate systems for predicting terrestrial events from celestial observations. A form of astrology was practised in the Old Babylonian period of Mesopotamia. "Vedāṅga Jyotiṣa" is one of the earliest known Hindu texts on astronomy and astrology ("Jyotisha"). The text is dated between 1400 BCE and the final centuries BCE by various scholars, according to astronomical and linguistic evidence. Chinese astrology was elaborated in the Zhou dynasty (1046–256 BCE). Hellenistic astrology after 332 BCE mixed Babylonian astrology with Egyptian Decanic astrology in Alexandria, creating horoscopic astrology. Alexander the Great's conquest of Asia allowed astrology to spread to Ancient Greece and Rome. In Rome, astrology was associated with "Chaldean wisdom". After the conquest of Alexandria in the 7th century, astrology was taken up by Islamic scholars, and Hellenistic texts were translated into Arabic and Persian. In the 12th century, Arabic texts were imported to Europe and translated into Latin. Major astronomers including Tycho Brahe, Johannes Kepler and Galileo practised as court astrologers. Astrological references appear in literature in the works of poets such as Dante Alighieri and Geoffrey Chaucer, and of playwrights such as Christopher Marlowe and William Shakespeare.
Throughout most of its history, astrology was considered a scholarly tradition. It was accepted in political and academic contexts, and was connected with other studies, such as astronomy, alchemy, meteorology, and medicine. At the end of the 17th century, new scientific concepts in astronomy and physics (such as heliocentrism and Newtonian mechanics) called astrology into question. Astrology thus lost its academic and theoretical standing, and common belief in astrology has largely declined.
Ancient world.
Astrology, in its broadest sense, is the search for meaning in the sky. Early evidence for humans making conscious attempts to measure, record, and predict seasonal changes by reference to astronomical cycles, appears as markings on bones and cave walls, which show that lunar cycles were being noted as early as 25,000 years ago. This was a first step towards recording the Moon's influence upon tides and rivers, and towards organising a communal calendar. Farmers addressed agricultural needs with increasing knowledge of the constellations that appear in the different seasons—and used the rising of particular star-groups to herald annual floods or seasonal activities. By the 3rd millennium BCE, civilisations had sophisticated awareness of celestial cycles, and may have oriented temples in alignment with heliacal risings of the stars.
Scattered evidence suggests that the oldest known astrological references are copies of texts made in the ancient world. The Venus tablet of Ammisaduqa is thought to have been compiled in Babylon around 1700 BCE. A scroll documenting an early use of electional astrology is doubtfully ascribed to the reign of the Sumerian ruler Gudea of Lagash ( – 2124 BCE). This describes how the gods revealed to him in a dream the constellations that would be most favourable for the planned construction of a temple. However, there is controversy about whether these were genuinely recorded at the time or merely ascribed to ancient rulers by posterity. The oldest undisputed evidence of the use of astrology as an integrated system of knowledge is therefore attributed to the records of the first dynasty of Mesopotamia (1950–1651 BCE). This astrology had some parallels with Hellenistic Greek (western) astrology, including the zodiac, a norming point near 9 degrees in Aries, the trine aspect, planetary exaltations, and the dodekatemoria (the twelve divisions of 30 degrees each). The Babylonians viewed celestial events as possible signs rather than as causes of physical events.
The system of Chinese astrology was elaborated during the Zhou dynasty (1046–256 BCE) and flourished during the Han Dynasty (2nd century BCE to 2nd century CE), during which all the familiar elements of traditional Chinese culture – the Yin-Yang philosophy, theory of the five elements, Heaven and Earth, Confucian morality – were brought together to formalise the philosophical principles of Chinese medicine and divination, astrology, and alchemy.
The ancient Arabs who inhabited the Arabian Peninsula before the advent of Islam professed a widespread belief in fatalism ("ḳadar"), alongside a fearful consideration of the sky and the stars, which they held to be ultimately responsible for every phenomenon that occurs on Earth and for the destiny of humankind. Accordingly, they shaped their entire lives in accordance with their interpretations of astral configurations and phenomena.
Ancient objections.
The Hellenistic schools of philosophical skepticism criticized the rationality of astrology. Criticism of astrology by academic skeptics such as Cicero, Carneades, and Favorinus; and Pyrrhonists such as Sextus Empiricus has been preserved.
Carneades argued that belief in fate denies free will and morality; that people born at different times can all die in the same accident or battle; and that contrary to uniform influences from the stars, tribes and cultures are all different.
Cicero stated the twins objection (that with close birth times, personal outcomes can be very different), later developed by Augustine. He argued that since the other planets are much more distant from the Earth than the Moon, they could have only very tiny influence compared to the Moon's. He also argued that if astrology explains everything about a person's fate, then it wrongly ignores the visible effect of inherited ability and parenting, changes in health worked by medicine, or the effects of the weather on people.
Favorinus argued that it was absurd to imagine that stars and planets would affect human bodies in the same way as they affect the tides, and equally absurd that small motions in the heavens cause large changes in people's fates.
Sextus Empiricus argued that it was absurd to link human attributes with myths about the signs of the zodiac, and wrote an entire book, "Against the Astrologers", compiling arguments against astrology.
Plotinus, a neoplatonist, argued that since the fixed stars are much more distant than the planets, it is laughable to imagine the planets' effect on human affairs should depend on their position with respect to the zodiac. He also argues that the interpretation of the moon's conjunction with a planet as good when the moon is full, but bad when the moon is waning, is clearly wrong, as from the moon's point of view, half of its surface is always in sunlight; and from the planet's point of view, waning should be better, as then the planet sees some light from the moon, but when the moon is full to us, it is dark, and therefore bad, on the side facing the planet in question.
Hellenistic Egypt.
In 525 BCE, Egypt was conquered by the Persians. The 1st century BCE Egyptian Dendera Zodiac shares two signs – the Balance and the Scorpion – with Mesopotamian astrology.
With the occupation by Alexander the Great in 332 BCE, Egypt became Hellenistic. The city of Alexandria was founded by Alexander after the conquest, becoming the place where Babylonian astrology was mixed with Egyptian Decanic astrology to create Horoscopic astrology. This contained the Babylonian zodiac with its system of planetary exaltations, the triplicities of the signs and the importance of eclipses. It used the Egyptian concept of dividing the zodiac into thirty-six decans of ten degrees each, with an emphasis on the rising decan, and the Greek system of planetary Gods, sign rulership and four elements. 2nd century BCE texts predict positions of planets in zodiac signs at the time of the rising of certain decans, particularly Sothis. The astrologer and astronomer Ptolemy lived in Alexandria. Ptolemy's work the "Tetrabiblos" formed the basis of Western astrology, and, "...enjoyed almost the authority of a Bible among the astrological writers of a thousand years or more."
Greece and Rome.
The conquest of Asia by Alexander the Great exposed the Greeks to ideas from Syria, Babylon, Persia and central Asia. Around 280 BCE, Berossus, a priest of Bel from Babylon, moved to the Greek island of Kos, teaching astrology and Babylonian culture. By the 1st century BCE, there were two varieties of astrology, one using horoscopes to describe the past, present and future; the other, theurgic, emphasising the soul's ascent to the stars. Greek influence played a crucial role in the transmission of astrological theory to Rome.
The first definite reference to astrology in Rome comes from the orator Cato, who in 160 BCE warned farm overseers against consulting with Chaldeans, who were described as Babylonian 'star-gazers'. Among both Greeks and Romans, Babylonia (also known as Chaldea) became so identified with astrology that 'Chaldean wisdom' became synonymous with divination using planets and stars. The 2nd-century Roman poet and satirist Juvenal complains about the pervasive influence of Chaldeans, saying, "Still more trusted are the Chaldaeans; every word uttered by the astrologer they will believe has come from Hammon's fountain."
One of the first astrologers to bring Hermetic astrology to Rome was Thrasyllus, astrologer to the emperor Tiberius, the first emperor to have had a court astrologer, though his predecessor Augustus had used astrology to help legitimise his Imperial rights.
Medieval world.
Hindu.
The main texts upon which classical Indian astrology is based are early medieval compilations, notably the "Horāshastra" and the "Sārāvalī".
The "Horāshastra" is a composite work of 71 chapters, of which the first part (chapters 1–51) dates to the 7th to early 8th centuries and the second part (chapters 52–71) to the later 8th century. The "Sārāvalī" likewise dates to around 800 CE. English translations of these texts were published by N.N. Krishna Rau and V.B. Choudhari in 1963 and 1961, respectively.
Islamic.
Astrology was taken up by Islamic scholars following the fall of Alexandria to the Arabs in the 7th century and the founding of the Abbasid empire in the 8th. The second Abbasid caliph, Al Mansur (754–775), founded the city of Baghdad to act as a centre of learning, and included in its design a library-translation centre known as "Bayt al-Hikma" ('House of Wisdom'), which his heirs continued to develop and which provided a major impetus for Arabic-Persian translations of Hellenistic astrological texts. The early translators included Mashallah, who helped to elect the time for the foundation of Baghdad, and Sahl ibn Bishr ("a.k.a." Zael), whose texts were directly influential upon later European astrologers such as Guido Bonatti in the 13th century and William Lilly in the 17th century. Knowledge of Arabic texts began to be imported into Europe during the Latin translations of the 12th century.
Europe.
In the seventh century, Isidore of Seville argued in his "Etymologiae" that astronomy described the movements of the heavens, while astrology had two parts: one was scientific, describing the movements of the sun, the moon and the stars, while the other, making predictions, was theologically erroneous.
The first astrological book published in Europe was the "Liber Planetis et Mundi Climatibus" ("Book of the Planets and Regions of the World"), which appeared between 1010 and 1027 AD, and may have been authored by Gerbert of Aurillac. Ptolemy's second-century AD "Tetrabiblos" was translated into Latin by Plato of Tivoli in 1138. The Dominican theologian Thomas Aquinas followed Aristotle in proposing that the stars ruled the imperfect 'sublunary' body, while attempting to reconcile astrology with Christianity by stating that God ruled the soul. The thirteenth-century mathematician Campanus of Novara is said to have devised a system of astrological houses that divides the prime vertical into 'houses' of equal 30° arcs, though the system was used earlier in the East. The thirteenth-century astronomer Guido Bonatti wrote a textbook, the "Liber Astronomicus", a copy of which King Henry VII of England owned at the end of the fifteenth century.
In "Paradiso", the final part of the "Divine Comedy", the Italian poet Dante Alighieri referred "in countless details" to the astrological planets, though he adapted traditional astrology to suit his Christian viewpoint, for example using astrological thinking in his prophecies of the reform of Christendom.
John Gower in the fourteenth century defined astrology as essentially limited to the making of predictions. The influence of the stars was in turn divided into natural astrology, with for example effects on tides and the growth of plants, and judicial astrology, with supposedly predictable effects on people. The fourteenth-century sceptic Nicole Oresme however included astronomy as a part of astrology in his "Livre de divinacions". Oresme argued that current approaches to prediction of events such as plagues, wars, and weather were inappropriate, but that such prediction was a valid field of inquiry. However, he attacked the use of astrology to choose the timing of actions (so-called interrogation and election) as wholly false, and rejected the determination of human action by the stars on grounds of free will. The friar Laurens Pignon (c. 1368–1449) similarly rejected all forms of divination and determinism, including by the stars, in his 1411 "Contre les Devineurs". This was in opposition to the tradition carried by the Arab astronomer Albumasar (787–886), whose "Introductorium in Astronomiam" and "De Magnis Coniunctionibus" argued the view that both individual actions and larger-scale history are determined by the stars.
In the late 15th century, Giovanni Pico della Mirandola forcefully attacked astrology in "Disputationes contra Astrologos", arguing that the heavens neither caused, nor heralded earthly events. His contemporary, Pietro Pomponazzi, a "rationalistic and critical thinker", was much more sanguine about astrology and critical of Pico's attack.
Renaissance and Early Modern.
Renaissance scholars commonly practised astrology. Gerolamo Cardano cast the horoscope of king Edward VI of England, while John Dee was the personal astrologer to queen Elizabeth I of England. Catherine de Medici paid Michel Nostradamus in 1566 to verify the prediction of the death of her husband, king Henry II of France, made by her astrologer Lucas Gauricus. Major astronomers who practised as court astrologers included Tycho Brahe in the royal court of Denmark, Johannes Kepler to the Habsburgs, Galileo Galilei to the Medici, and Giordano Bruno, who was burnt at the stake for heresy in Rome in 1600. The distinction between astrology and astronomy was not entirely clear. Advances in astronomy were often motivated by the desire to improve the accuracy of astrology. Kepler, for example, was driven by a belief in harmonies between Earthly and celestial affairs, yet he disparaged the activities of most astrologers as "evil-smelling dung".
Ephemerides with complex astrological calculations, and almanacs interpreting celestial events for use in medicine and for choosing times to plant crops, were popular in Elizabethan England. In 1597, the English mathematician and physician Thomas Hood made a set of paper instruments that used revolving overlays to help students work out relationships between fixed stars or constellations, the midheaven, and the twelve astrological houses. Hood's instruments also illustrated, for pedagogical purposes, the supposed relationships between the signs of the zodiac, the planets, and the parts of the human body adherents believed were governed by the planets and signs. While Hood's presentation was innovative, his astrological information was largely standard and was taken from Gerard Mercator's astrological disc made in 1551, or a source used by Mercator. Despite its popularity, Renaissance astrology was the subject of what historian Gabor Almasi calls an "elite debate", exemplified by the polemical letters of the Swiss physician Thomas Erastus, who fought against astrology, calling it "vanity" and "superstition". Then, around the time of the new star of 1572 and the comet of 1577, there began what Almasi calls an "extended epistemological reform", which started the process of excluding religion, astrology and anthropocentrism from scientific debate. By 1679, the yearly publication "La Connoissance des temps" eschewed astrology as a legitimate topic.
Enlightenment period and onwards.
During the Enlightenment, intellectual sympathy for astrology fell away, leaving only a popular following supported by cheap almanacs. One English almanac compiler, Richard Saunders, followed the spirit of the age by printing a derisive "Discourse on the Invalidity of Astrology", while in France Pierre Bayle's "Dictionnaire" of 1697 stated that the subject was puerile. The Anglo-Irish satirist Jonathan Swift ridiculed the Whig political astrologer John Partridge.
In the second half of the seventeenth century, the Society of Astrologers (1647–1684), a trade, educational, and social organization, sought to unite London's often fractious astrologers in the task of revitalizing astrology. Following the template of the popular "Feasts of Mathematicians", they endeavored to defend their art in the face of growing religious criticism. The Society hosted banquets, exchanged "instruments and manuscripts", proposed research projects, and funded the publication of sermons that depicted astrology as a legitimate biblical pursuit for Christians, arguing that it was divine, Hebraic, and scripturally supported by Bible passages about the Magi and the sons of Seth. According to historian Michelle Pfeffer, "The society's public relations campaign ultimately failed." Modern historians have mostly neglected the Society of Astrologers in favor of the still extant Royal Society (1660), even though both organizations initially had some of the same members.
Astrology saw a popular revival starting in the 19th century, as part of a general revival of spiritualism and, later, New Age philosophy, and through the influence of mass media such as newspaper horoscopes. Early in the 20th century the psychiatrist Carl Jung developed some concepts concerning astrology, which led to the development of psychological astrology.
Principles and practice.
Advocates have defined astrology as a symbolic language, an art form, a science, and a method of divination. Though most cultural astrology systems share common roots in ancient philosophies that influenced each other, many use methods that differ from those in the West. These include Hindu astrology (also known as "Indian astrology" and in modern times referred to as "Vedic astrology") and Chinese astrology, both of which have influenced the world's cultural history.
Western.
Western astrology is a form of divination based on the construction of a horoscope for an exact moment, such as a person's birth. It uses the tropical zodiac, which is aligned to the equinoctial points.
Western astrology is founded on the movements and relative positions of celestial bodies such as the Sun, Moon and planets, which are analysed by their movement through signs of the zodiac (twelve spatial divisions of the ecliptic) and by their aspects (based on geometric angles) relative to one another. They are also considered by their placement in houses (twelve spatial divisions of the sky). Astrology's modern representation in western popular media is usually reduced to sun sign astrology, which considers only the zodiac sign of the Sun at an individual's date of birth, and represents only 1/12 of the total chart.
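The sun-sign reduction described above is mechanical enough to express in a few lines. A minimal sketch, with names of my own choosing; the cutoff dates follow the common newspaper convention, and exact boundaries vary by year and publication:

```python
# Sketch of the sun-sign lookup behind newspaper horoscopes:
# the birth date alone selects one of twelve signs.
SIGN_CUTOFFS = [  # (month, last day of the outgoing sign, outgoing sign)
    (1, 19, "Capricorn"), (2, 18, "Aquarius"), (3, 20, "Pisces"),
    (4, 19, "Aries"), (5, 20, "Taurus"), (6, 20, "Gemini"),
    (7, 22, "Cancer"), (8, 22, "Leo"), (9, 22, "Virgo"),
    (10, 22, "Libra"), (11, 21, "Scorpio"), (12, 21, "Sagittarius"),
]
SIGNS = [s for _, _, s in SIGN_CUTOFFS]

def sun_sign(month: int, day: int) -> str:
    """Zodiac sign for a birth month/day under the table above."""
    _, cutoff, sign = SIGN_CUTOFFS[month - 1]
    if day <= cutoff:
        return sign
    return SIGNS[month % 12]  # after the cutoff, the next sign begins

print(sun_sign(3, 25))   # Aries
print(sun_sign(12, 25))  # Capricorn
```

The triviality of the lookup is the point: a sun-sign column partitions everyone into just twelve groups by calendar date, which is the 1/12 of the chart referred to above.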
The horoscope visually expresses the set of relationships for the time and place of the chosen event. These relationships are between the seven 'planets', signifying tendencies such as war and love; the twelve signs of the zodiac; and the twelve houses. Each planet is in a particular sign and a particular house at the chosen time, when observed from the chosen place, creating two kinds of relationship. A third kind is the aspect of each planet to every other planet, where for example two planets 120° apart (in 'trine') are in a harmonious relationship, but two planets 90° apart ('square') are in a conflicted relationship. Together these relationships and their interpretations are said to form "...the language of the heavens speaking to learned men."
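The aspect geometry described above is plain angular arithmetic: compare two planets' ecliptic longitudes and match the separation against the classical aspect angles. A minimal sketch; the 6° "orb" (the tolerance astrologers allow) and the function names are my own assumptions:

```python
# Classical aspect angles, in degrees of angular separation.
ASPECTS = {0: "conjunction", 60: "sextile", 90: "square",
           120: "trine", 180: "opposition"}

def angular_separation(lon_a: float, lon_b: float) -> float:
    """Smallest angle between two ecliptic longitudes, in degrees."""
    diff = abs(lon_a - lon_b) % 360
    return min(diff, 360 - diff)

def aspect(lon_a: float, lon_b: float, orb: float = 6.0):
    """Return the named aspect, if any, between two longitudes."""
    sep = angular_separation(lon_a, lon_b)
    for angle, name in ASPECTS.items():
        if abs(sep - angle) <= orb:
            return name
    return None

print(aspect(10, 130))   # 120 degrees apart -> "trine" (harmonious)
print(aspect(10, 100))   # 90 degrees apart  -> "square" (conflicted)
```

With seven 'planets' this yields 21 pairwise aspect checks per chart, which, combined with sign and house placements, produces the web of relationships the text describes.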
Along with tarot divination, astrology is one of the core studies of Western esotericism, and as such has influenced systems of magical belief not only among Western esotericists and Hermeticists, but also belief systems such as Wicca, which have borrowed from or been influenced by the Western esoteric tradition. Tanya Luhrmann has said that "all magicians know something about astrology," and refers to a table of correspondences in Starhawk's "The Spiral Dance", organised by planet, as an example of the astrological lore studied by magicians.
Hindu.
The earliest Vedic text on astronomy is the "Vedanga Jyotisha"; Vedic thought later came to include astrology as well.
Hindu natal astrology originated with Hellenistic astrology by the 3rd century BCE, though incorporating the Hindu lunar mansions. The names of the signs (e.g. Greek 'Krios' for Aries, Hindi 'Kriya'), the planets (e.g. Greek 'Helios' for Sun, astrological Hindi 'Heli'), and astrological terms (e.g. Greek 'apoklima' and 'sunaphe' for declination and planetary conjunction, Hindi 'apoklima' and 'sunapha' respectively) in Varaha Mihira's texts are considered conclusive evidence of a Greek origin for Hindu astrology. The Indian techniques may also have been augmented with some of the Babylonian techniques.
Chinese and East Asian.
Chinese astrology has a close relation with Chinese philosophy (theory of the three harmonies: heaven, earth and man) and uses concepts such as yin and yang, the Five phases, the 10 Celestial stems, the 12 Earthly Branches, and shichen (時辰 a form of timekeeping used for religious purposes). The early use of Chinese astrology was mainly confined to political astrology, the observation of unusual phenomena, identification of portents and the selection of auspicious days for events and decisions.
The constellations of the Zodiac of western Asia and Europe were not used; instead the sky is divided into Three Enclosures (三垣 sān yuán) and Twenty-Eight Mansions (二十八宿 èrshíbā xiù) in twelve Ci. The Chinese zodiac of twelve animal signs is said to represent twelve different types of personality. It is based on cycles of years, lunar months, and two-hour periods of the day (the shichen). The zodiac traditionally begins with the sign of the Rat, and the cycle proceeds through 11 other animal signs: the Ox, Tiger, Rabbit, Dragon, Snake, Horse, Goat, Monkey, Rooster, Dog, and Pig. Complex systems of predicting fate and destiny based on one's birthday, birth season, and birth hours, such as "ziping" and Zi Wei Dou Shu, are still used regularly in modern-day Chinese astrology. They do not rely on direct observations of the stars.
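The twelve-animal year cycle described above can be sketched as a simple modular lookup. A rough illustration (1984 is used as a reference Rat year; note that the Chinese year actually begins at the lunar New Year, so for January/February dates this Gregorian-year mapping is only approximate):

```python
# The cycle begins with the Rat and repeats every twelve years.
ANIMALS = ["Rat", "Ox", "Tiger", "Rabbit", "Dragon", "Snake",
           "Horse", "Goat", "Monkey", "Rooster", "Dog", "Pig"]

def zodiac_animal(year: int) -> str:
    """Animal sign for a Gregorian year, ignoring the lunar New Year."""
    # Python's % always returns a non-negative result here,
    # so years before 1984 are handled correctly too.
    return ANIMALS[(year - 1984) % 12]

print(zodiac_animal(1984))  # Rat
print(zodiac_animal(2000))  # Dragon
```

The full "ziping"-style systems mentioned above layer the year sign with lunar-month, day, and shichen cycles, but each layer is the same kind of fixed repeating sequence.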
The Korean zodiac is identical to the Chinese one. The Vietnamese zodiac is almost identical to the Chinese, except that the second animal is the "Water Buffalo" instead of the "Ox", and the fourth animal is the "Cat" instead of the "Rabbit". The Japanese have since 1873 celebrated the beginning of the new year on 1 January, as per the Gregorian calendar. The Thai zodiac begins not at Chinese New Year but either on the first day of the fifth month in the Thai lunar calendar or during the Songkran festival (now celebrated every 13–15 April), depending on the purpose of the use.
Theological viewpoints.
Ancient.
Augustine (354–430) believed that the determinism of astrology conflicted with the Christian doctrines of man's free will and responsibility, and with God not being the cause of evil. He also grounded his opposition philosophically, citing the failure of astrology to explain twins who behave differently although conceived at the same moment and born at approximately the same time.
Medieval.
Some of the practices of astrology were contested on theological grounds by medieval Muslim astronomers such as Al-Farabi (Alpharabius), Ibn al-Haytham (Alhazen) and Avicenna. They said that the methods of astrologers conflicted with orthodox religious views of Islamic scholars, by suggesting that the Will of God can be known and predicted. For example, Avicenna's 'Refutation against astrology', "Risāla fī ibṭāl aḥkām al-nojūm", argues against the practice of astrology while supporting the principle that planets may act as agents of divine causation. Avicenna considered that the movement of the planets influenced life on earth in a deterministic way, but argued against the possibility of determining the exact influence of the stars. Essentially, Avicenna did not deny the core dogma of astrology, but denied our ability to understand it to the extent that precise and fatalistic predictions could be made from it. Ibn Qayyim al-Jawziyya (1292–1350), in his "Miftah Dar al-Sa'adah", also used physical arguments in astronomy to question the practice of judicial astrology. He recognised that the stars are much larger than the planets, and argued: "And if you astrologers answer that it is precisely because of this distance and smallness that their influences are negligible, then why is it that you claim a great influence for the smallest heavenly body, Mercury? Why is it that you have given an influence to [the head] and [the tail], which are two imaginary points [ascending and descending nodes]?"
Modern.
Martin Luther denounced astrology in his "Table Talk". He asked why twins like Esau and Jacob had two different natures yet were born at the same time. Luther also compared astrologers to those who say their dice will always land on a certain number: although the dice may land on that number a few times, the predictors stay silent about all the times the dice fail to do so.
The Catechism of the Catholic Church maintains that divination, including predictive astrology, is incompatible with modern Catholic beliefs such as free will.
Scientific analysis and criticism.
The scientific community rejects astrology as having no explanatory power for describing the universe, and considers it a pseudoscience. Scientific testing of astrology has been conducted, and no evidence has been found to support any of the premises or purported effects outlined in astrological traditions. There is no proposed mechanism of action by which the positions and motions of stars and planets could affect people and events on Earth that does not contradict basic and well understood aspects of biology and physics. Those who have faith in astrology have been characterised by scientists including Bart J. Bok as doing so "...in spite of the fact that there is no verified scientific basis for their beliefs, and indeed that there is strong evidence to the contrary".
Confirmation bias is a form of cognitive bias, and a psychological factor that contributes to belief in astrology. Believers tend to selectively remember the predictions that turn out to be true and forget those that turn out to be false. A second, separate form of confirmation bias also plays a role: believers often fail to distinguish between messages that demonstrate special ability and those that do not. Both forms are under study with respect to astrological belief.
Demarcation.
Under the criterion of falsifiability, first proposed by the philosopher of science Karl Popper, astrology is a pseudoscience. Popper regarded astrology as "pseudo-empirical" in that "it appeals to observation and experiment," but "nevertheless does not come up to scientific standards." In contrast to scientific disciplines, astrology has not responded to falsification through experiment.
In contrast to Popper, the philosopher Thomas Kuhn argued that it was not lack of falsifiability that makes astrology unscientific, but rather that the process and concepts of astrology are non-empirical. Kuhn thought that, though astrologers had, historically, made predictions that categorically failed, this in itself does not make astrology unscientific, nor do attempts by astrologers to explain away failures by claiming that creating a horoscope is very difficult. Rather, in Kuhn's eyes, astrology is not science because it was always more akin to medieval medicine; astrologers followed a sequence of rules and guidelines for a seemingly necessary field with known shortcomings, but they did no research because the fields are not amenable to research, and so "they had no puzzles to solve and therefore no science to practise." While an astronomer could correct for failure, an astrologer could not. An astrologer could only explain away failure but could not revise the astrological hypothesis in a meaningful way. As such, to Kuhn, even if the stars could influence the path of humans through life, astrology is not scientific.
The philosopher Paul Thagard asserts that astrology cannot be regarded as falsified in this sense until it has been replaced with a successor. In the case of predicting behaviour, psychology is the alternative. To Thagard, a further criterion of demarcation of science from pseudoscience is that the state of the art must progress and that the community of researchers should be attempting to compare the current theory to alternatives, and not be "selective in considering confirmations and disconfirmations." Progress is defined here as explaining new phenomena and solving existing problems, yet astrology has failed to progress, having changed little in nearly 2000 years. To Thagard, astrologers are acting as though engaged in normal science, believing that the foundations of astrology were well established despite the "many unsolved problems", and in the face of better alternative theories (psychology). For these reasons Thagard views astrology as pseudoscience.
For the philosopher Edward W. James, astrology is irrational not because of the numerous problems with mechanisms and falsification due to experiments, but because an analysis of the astrological literature shows that it is infused with fallacious logic and poor reasoning.
Effectiveness.
Astrology has not demonstrated its effectiveness in controlled studies and has no scientific validity. Where it has made falsifiable predictions under controlled conditions, they have been falsified. One famous experiment included 28 astrologers who were asked to match over a hundred natal charts to psychological profiles generated by the California Psychological Inventory (CPI) questionnaire. The double-blind experimental protocol used in this study was agreed upon by a group of physicists and a group of astrologers nominated by the National Council for Geocosmic Research, who advised the experimenters, helped ensure that the test was fair and helped draw the central proposition of natal astrology to be tested. They also chose 26 out of the 28 astrologers for the tests (two more volunteered afterwards). The study, published in "Nature" in 1985, found that predictions based on natal astrology were no better than chance, and that the testing "...clearly refutes the astrological hypothesis."
In 1955, the astrologer and psychologist Michel Gauquelin stated that though he had failed to find evidence that supported indicators like zodiacal signs and planetary aspects in astrology, he did find positive correlations between the diurnal positions of some planets and success in professions that astrology traditionally associates with those planets. The best-known of Gauquelin's findings is based on the positions of Mars in the natal charts of successful athletes and became known as the "Mars effect". A study conducted by seven French scientists attempted to replicate the claim, but found no statistical evidence. They attributed the effect to selective bias on Gauquelin's part, accusing him of attempting to persuade them to add or delete names from their study.
Geoffrey Dean has suggested that the effect may be caused by parents' self-reporting of birth dates rather than by any issue with Gauquelin's study. The suggestion is that a small subset of parents may have changed their children's reported birth times to be consistent with better astrological charts for a related profession. The number of births under astrologically undesirable conditions was also lower, indicating that parents chose dates and times to suit their beliefs. The sample group was taken from a time when belief in astrology was more common. Gauquelin had failed to find the Mars effect in more recent populations, where a nurse or doctor recorded the birth information.
Dean, a scientist and former astrologer, and psychologist Ivan Kelly conducted a large scale scientific test that involved more than one hundred cognitive, behavioural, physical, and other variables—but found no support for astrology. Furthermore, a meta-analysis pooled 40 studies that involved 700 astrologers and over 1,000 birth charts. Ten of the tests—which involved 300 participants—had the astrologers pick the correct chart interpretation out of a number of others that were not the astrologically correct chart interpretation (usually three to five others). When date and other obvious clues were removed, no significant results suggested there was any preferred chart.
Lack of mechanisms and consistency.
Testing the validity of astrology can be difficult, because there is no consensus amongst astrologers as to what astrology is or what it can predict. Most professional astrologers are paid to predict the future or describe a person's personality and life, but most horoscopes only make vague untestable statements that can apply to almost anyone.
Many astrologers claim that astrology is scientific, while some have proposed conventional causal agents such as electromagnetism and gravity. Scientists reject these mechanisms as implausible since, for example, the magnetic field, when measured from Earth, of a large but distant planet such as Jupiter is far smaller than that produced by ordinary household appliances.
Western astrology has taken the earth's axial precession (also called precession of the equinoxes) into account since Ptolemy's "Almagest", so the "first point of Aries", the start of the astrological year, continually moves against the background of the stars. The tropical zodiac has no connection to the stars, and as long as no claims are made that the constellations themselves are in the associated sign, astrologers avoid the concept that precession seemingly moves the constellations. Charpak and Broch, noting this, referred to astrology based on the tropical zodiac as being "...empty boxes that have nothing to do with anything and are devoid of any consistency or correspondence with the stars." Sole use of the tropical zodiac is inconsistent with references made, by the same astrologers, to the Age of Aquarius, which depends on when the vernal point enters the constellation of Aquarius.
Astrologers usually have only a small knowledge of astronomy, and often do not take into account basic principles such as the precession of the equinoxes, which changes the position of the sun with time. Charpak and Broch commented on the example of Élizabeth Teissier, who claimed that "The sun ends up in the same place in the sky on the same date each year" as the basis for claims that two people with the same birthday, but a number of years apart, should be under the same planetary influence. Charpak and Broch noted that "There is a difference of about twenty-two thousand miles between Earth's location on any specific date in two successive years", and that thus the two people should not be under the same influence according to astrology. Over a 40-year period the difference would be greater than 780,000 miles.
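Charpak and Broch's arithmetic above can be checked directly: at roughly twenty-two thousand miles of positional drift per year, the offset accumulated over 40 years comfortably exceeds their 780,000-mile figure. A trivial sketch using their stated estimate (the linear accumulation and the function name are simplifying assumptions for illustration):

```python
# Charpak and Broch's rough estimate: Earth's location on a given
# calendar date shifts by about 22,000 miles from one year to the next.
DRIFT_PER_YEAR_MILES = 22_000

def positional_drift(years: int) -> int:
    """Accumulated positional offset after the given number of years,
    assuming the per-year drift simply adds up."""
    return years * DRIFT_PER_YEAR_MILES

print(positional_drift(40))  # 880,000 miles, well over 780,000
```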
Reception in the social sciences.
The general consensus of astronomers and other natural scientists is that astrology is a pseudoscience which carries no predictive capability, with many philosophers of science considering it a "paradigm or prime example of pseudoscience." Some scholars in the social sciences have cautioned against categorizing astrology, especially ancient astrology, as "just" a pseudoscience or projecting the distinction backwards into the past. Thagard, while demarcating it as a pseudoscience, notes that astrology "should be judged as not pseudoscientific in classical or Renaissance times...Only when the historical and social aspects of science are neglected does it become plausible that pseudoscience is an unchanging category." Historians of science such as Tamsyn Barton, Roger Beck, Francesca Rochberg, and Wouter J. Hanegraaff argue that such a wholesale description is anachronistic when applied to historical contexts, stressing that astrology was not pseudoscience before the 18th century and the importance of the discipline to the development of medieval science. R. J. Hankinson writes in the context of Hellenistic astrology that "the belief in the possibility of [astrology] was, at least some of the time, the result of careful reflection on the nature and structure of the universe."
Nicholas Campion, both an astrologer and an academic historian of astrology, argues that Indigenous astronomy is largely used as a synonym for astrology in academia, and that modern Indian and Western astrology are better understood as modes of cultural astronomy or ethnoastronomy. Roy Willis and Patrick Curry draw a distinction between propositional and metaphoric knowledge in the ancient world, identifying astrology with the latter and noting that the central concern of astrology "is not knowledge (factual, let alone scientific) but wisdom (ethical, spiritual and pragmatic)". Similarly, historian of science Justin Niermeier-Dohoney writes that astrology was "more than simply a science of prediction using the stars and comprised a vast body of beliefs, knowledge, and practices with the overarching theme of understanding the relationship between humanity and the rest of the cosmos through an interpretation of stellar, solar, lunar, and planetary movement." Scholars such as Assyriologist Matthew Rutz have begun using the term "astral knowledge" rather than astrology "to better describe a category of beliefs and practices much broader than the term 'astrology' can capture."
Cultural impact.
Western politics and society.
In the West, political leaders have sometimes consulted astrologers. For example, the British intelligence agency MI5 employed Louis de Wohl as an astrologer after claims surfaced that Adolf Hitler used astrology to time his actions. The War Office was "...interested to know what Hitler's own astrologers would be telling him from week to week." In fact, de Wohl's predictions were so inaccurate that he was soon labelled a "complete charlatan", and later evidence showed that Hitler considered astrology "complete nonsense". After John Hinckley's attempted assassination of US President Ronald Reagan, first lady Nancy Reagan commissioned astrologer Joan Quigley to act as the secret White House astrologer. However, Quigley's role ended in 1988 when it became public through the memoirs of former chief of staff, Donald Regan.
There was a boom in interest in astrology in the late 1960s. The sociologist Marcello Truzzi described three levels of involvement of "astrology-believers" to account for its revived popularity in the face of scientific discrediting. He found that most astrology-believers did not claim it was a scientific explanation with predictive power. Instead, those superficially involved, knowing "next to nothing" about astrology's 'mechanics', read newspaper astrology columns and could benefit from "tension-management of anxieties" and "a cognitive belief-system that transcends science." Those at the second level usually had their horoscopes cast and sought advice and predictions. They were much younger than those at the first level, and could benefit from knowledge of the language of astrology and the resulting ability to belong to a coherent and exclusive group. Those at the third level were highly involved and usually cast horoscopes for themselves. Astrology provided this small minority of astrology-believers with a "'meaningful' view of their universe and [gave] them an 'understanding' of their place in it." This third group took astrology seriously, possibly as an overarching religious worldview (a "sacred canopy", in Peter L. Berger's phrase), whereas the other two groups took it playfully and irreverently.
In 1953, the sociologist Theodor W. Adorno conducted a study of the astrology column of a Los Angeles newspaper as part of a project examining mass culture in capitalist society. Adorno believed that popular astrology, as a device, invariably leads to statements that encouraged conformity—and that astrologers who go against conformity, by discouraging performance at work etc., risk losing their jobs. Adorno concluded that astrology is a large-scale manifestation of systematic irrationalism, where individuals are subtly led—through flattery and vague generalisations—to believe that the author of the column is addressing them directly. Adorno drew a parallel with the phrase opium of the people, by Karl Marx, by commenting, "occultism is the metaphysic of the dopes."
A 2005 Gallup poll and a 2009 survey by the Pew Research Center reported that 25% of US adults believe in astrology, while a 2018 Pew survey found a figure of 29%. According to data released in the National Science Foundation's 2014 "Science and Engineering Indicators" study, "Fewer Americans rejected astrology in 2012 than in recent years." The NSF study noted that in 2012, "slightly more than half of Americans said that astrology was 'not at all scientific,' whereas nearly two-thirds gave this response in 2010. The comparable percentage has not been this low since 1983." Astrology apps became popular in the late 2010s, some receiving millions of dollars in Silicon Valley venture capital.
India and Japan.
In India, there is a long-established and widespread belief in astrology. It is commonly used for daily life, particularly in matters concerning marriage and career, and makes extensive use of electional, horary and karmic astrology. Indian politics have also been influenced by astrology. It is still considered a branch of the Vedanga. In 2001, Indian scientists and politicians debated and critiqued a proposal to use state money to fund research into astrology, resulting in permission for Indian universities to offer courses in Vedic astrology.
In February 2011, the Bombay High Court reaffirmed astrology's standing in India when it dismissed a case that challenged its status as a science.
In Japan, strong belief in astrology has led to dramatic changes in the fertility rate and the number of abortions in the years of Fire Horse. Adherents believe that women born in "hinoeuma" years are unmarriageable and bring bad luck to their father or husband. In 1966, the number of babies born in Japan dropped by over 25% as parents tried to avoid the stigma of having a daughter born in the hinoeuma year.
Literature and music.
The fourteenth-century English poets John Gower and Geoffrey Chaucer both referred to astrology in their works, including Gower's "Confessio Amantis" and Chaucer's "The Canterbury Tales". Chaucer commented explicitly on astrology in his "Treatise on the Astrolabe", demonstrating personal knowledge of one area, judicial astrology, with an account of how to find the ascendant or rising sign.
In the fifteenth century, references to astrology, such as with similes, became "a matter of course" in English literature.
In the sixteenth century, John Lyly's 1597 play, "The Woman in the Moon", is wholly motivated by astrology, while Christopher Marlowe makes astrological references in his plays "Doctor Faustus" and "Tamburlaine" (both c. 1590), and Sir Philip Sidney refers to astrology at least four times in his romance "The Countess of Pembroke's Arcadia" (c. 1580). Edmund Spenser uses astrology both decoratively and causally in his poetry, revealing "...unmistakably an abiding interest in the art, an interest shared by a large number of his contemporaries." George Chapman's play, "Byron's Conspiracy" (1608), similarly uses astrology as a causal mechanism in the drama. William Shakespeare's attitude towards astrology is unclear, with contradictory references in plays including "King Lear", "Antony and Cleopatra", and "Richard II". Shakespeare was familiar with astrology and made use of his knowledge of astrology in nearly every play he wrote, assuming a basic familiarity with the subject in his commercial audience. Outside theatre, the physician and mystic Robert Fludd practised astrology, as did the quack doctor Simon Forman. In Elizabethan England, "The usual feeling about astrology ... [was] that it is the most useful of the sciences."
In seventeenth-century Spain, Lope de Vega, with a detailed knowledge of astronomy, wrote plays that ridicule astrology. In his pastoral romance "La Arcadia" (1598), it leads to absurdity; in his novela "Guzman el Bravo" (1624), he concludes that the stars were made for man, not man for the stars. Calderón de la Barca wrote the 1641 comedy "Astrologo Fingido" (The Pretended Astrologer); the plot was borrowed by the French playwright Thomas Corneille for his 1651 comedy "Feint Astrologue".
The most famous piece of music influenced by astrology is the orchestral suite "The Planets". Written by the British composer Gustav Holst (1874–1934), and first performed in 1918, the framework of "The Planets" is based upon the astrological symbolism of the planets. Each of the seven movements of the suite is based upon a different planet, though the movements are not in the order of the planets from the Sun. The composer Colin Matthews wrote an eighth movement entitled "Pluto, the Renewer", first performed in 2000. In 1937, another British composer, Constant Lambert, wrote a ballet on astrological themes, called "Horoscope". In 1974, the New Zealand composer Edwin Carr wrote "The Twelve Signs: An Astrological Entertainment" for orchestra without strings. Camille Paglia acknowledges astrology as an influence on her work of literary criticism "Sexual Personae" (1990).
Astrology features strongly in Eleanor Catton's "The Luminaries", recipient of the 2013 Man Booker Prize.
|
2123 | Abyssinia (disambiguation) | Abyssinia is a historical name for the Ethiopian Empire.
Abyssinia may also refer to:
|
2125 | Algebraic extension | In mathematics, an algebraic extension is a field extension "L"/"K" such that every element of the larger field "L" is algebraic over the smaller field "K"; that is, every element of "L" is a root of a non-zero polynomial with coefficients in "K". A field extension that is not algebraic is said to be transcendental, and must contain transcendental elements, that is, elements that are not algebraic.
The algebraic extensions of the field "Q" of the rational numbers are called algebraic number fields and are the main objects of study of algebraic number theory. Another example of a common algebraic extension is the extension "C"/"R" of the real numbers by the complex numbers.
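As a concrete illustration (my own sketch, not part of the article): in the algebraic number field Q(√2), every element a + b√2 with rational a, b is a root of the rational polynomial x² − 2ax + (a² − 2b²), so the extension Q(√2)/Q is algebraic. The check below verifies this with exact rational arithmetic:

```python
from fractions import Fraction

def annihilating_poly(a: Fraction, b: Fraction):
    """Coefficients (c0, c1, c2) of a rational polynomial
    c2*x^2 + c1*x + c0 with x = a + b*sqrt(2) as a root.

    From x - a = b*sqrt(2), squaring gives
    x^2 - 2*a*x + (a^2 - 2*b^2) = 0.
    """
    return (a * a - 2 * b * b, -2 * a, Fraction(1))

def evaluate_at(coeffs, a: Fraction, b: Fraction):
    """Evaluate the polynomial at a + b*sqrt(2), returning (p, q)
    meaning p + q*sqrt(2)."""
    c0, c1, c2 = coeffs
    # (a + b*sqrt(2))^2 = (a^2 + 2*b^2) + (2*a*b)*sqrt(2)
    sq = (a * a + 2 * b * b, 2 * a * b)
    return (c2 * sq[0] + c1 * a + c0, c2 * sq[1] + c1 * b)

# The element 3 - (1/2)*sqrt(2) of Q(sqrt(2)) is a root of a
# rational quadratic, hence algebraic over Q.
x = (Fraction(3), Fraction(-1, 2))
assert evaluate_at(annihilating_poly(*x), *x) == (0, 0)
```

The same conjugate-squaring trick works for any quadratic extension Q(√d); higher-degree elements need a minimal polynomial computed by other means.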
Some properties.
All transcendental extensions are of infinite degree. This in turn implies that all finite extensions are algebraic. The converse is not true, however: there are infinite extensions which are algebraic. For instance, the field of all algebraic numbers is an infinite algebraic extension of the rational numbers.
Let "L" be an extension field of "K", and "a" an element of "L". The smallest subfield of "L" that contains "K" and "a" is commonly denoted "K"("a"). If "a" is algebraic over "K", then the elements of "K"("a") can be expressed as polynomials in "a" with coefficients in "K"; that is, "K"("a") equals "K"["a"], the smallest ring containing "K" and "a". In this case, "K"("a") is a finite extension of "K" (it is a finite dimensional "K"-vector space), and all its elements are algebraic over "K". These properties do not hold if "a" is not algebraic. For example, "Q"(π) ≠ "Q"[π], and they are both infinite dimensional vector spaces over "Q".
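The claim that, for algebraic "a", the ring of polynomials in "a" is already a field can be made concrete in Q(√2): the inverse of a nonzero a + b√2 is again of the form c + d√2, obtained by multiplying by the conjugate. A minimal sketch (the coefficient-pair representation is my own illustration, not the article's):

```python
from fractions import Fraction

def mul(x, y):
    """Product of a + b*sqrt(2) and c + d*sqrt(2) as a coefficient pair."""
    a, b = x
    c, d = y
    return (a * c + 2 * b * d, a * d + b * c)

def invert(a: Fraction, b: Fraction):
    """Inverse of a nonzero a + b*sqrt(2), returned as (c, d) with
    1/(a + b*sqrt(2)) = c + d*sqrt(2).

    Multiply numerator and denominator by the conjugate a - b*sqrt(2);
    the denominator a^2 - 2*b^2 is a nonzero rational because sqrt(2)
    is irrational.
    """
    n = a * a - 2 * b * b
    if n == 0:
        raise ZeroDivisionError("element is zero")
    return (a / n, -b / n)

x = (Fraction(1), Fraction(1))        # 1 + sqrt(2)
assert mul(x, invert(*x)) == (1, 0)   # the inverse stays inside Q[sqrt(2)]
```

For a transcendental element such as π no such rationalization exists, which is exactly why Q(π) is strictly larger than the polynomial ring Q[π].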
An algebraically closed field "F" has no proper algebraic extensions, that is, no algebraic extensions "E" with "F" < "E". An example is the field of complex numbers. Every field has an algebraic extension which is algebraically closed (called its algebraic closure), but proving this in general requires some form of the axiom of choice.
An extension "L"/"K" is algebraic if and only if every sub "K"-algebra of "L" is a field.
Properties.
The following three properties hold: if "E" is an algebraic extension of "F" and "F" is an algebraic extension of "K", then "E" is an algebraic extension of "K"; if "E" and "F" are algebraic extensions of "K" contained in a common overfield, then the compositum "EF" is an algebraic extension of "K"; and if "E" is an algebraic extension of "F" and "E" ⊇ "L" ⊇ "F", then "E" is an algebraic extension of "L".
These finitary results can be generalized using transfinite induction: the union of any chain of algebraic extensions over a base field is itself an algebraic extension over the same base field.
This fact, together with Zorn's lemma (applied to an appropriately chosen poset), establishes the existence of algebraic closures.
Generalizations.
Model theory generalizes the notion of algebraic extension to arbitrary theories: an embedding of "M" into "N" is called an algebraic extension if for every "x" in "N" there is a formula "p" with parameters in "M", such that "p"("x") is true and the set of elements "y" of "N" for which "p"("y") holds
is finite. It turns out that applying this definition to the theory of fields gives the usual definition of algebraic extension. The Galois group of "N" over "M" can again be defined as the group of automorphisms, and it turns out that most of the theory of Galois groups can be developed for the general case.
Relative algebraic closures.
Given a field "k" and a field "K" containing "k", one defines the relative algebraic closure of "k" in "K" to be the subfield of "K" consisting of all elements of "K" that are algebraic over "k", that is, all elements of "K" that are a root of some nonzero polynomial with coefficients in "k".
|
2126 | Ani DiFranco | Angela Maria "Ani" DiFranco (; born September 23, 1970) is an American-Canadian singer-songwriter. She has released more than 20 albums. DiFranco's music has been classified as folk rock and alternative rock, although it has additional influences from punk, funk, hip hop and jazz. She has released all her albums on her own record label, Righteous Babe.
DiFranco supports many social and political movements by performing benefit concerts, appearing on benefit albums and speaking at rallies. Through the Righteous Babe Foundation, DiFranco has backed grassroots cultural and political organizations supporting causes including abortion rights and LGBT visibility. She counts American folk singer and songwriter Pete Seeger among her mentors.
DiFranco released a memoir, "No Walls and the Recurring Dream", on May 7, 2019, via Viking Books and made "The New York Times" Best Seller list.
Early life and education.
DiFranco was born in Buffalo, New York, on September 23, 1970, the daughter of Elizabeth (Ross) and Dante Americo DiFranco, who had met while attending the Massachusetts Institute of Technology. Her father was of Italian descent, and her mother was from Montreal. DiFranco started playing Beatles covers at local bars and busking with her guitar teacher, Michael Meldrum, at the age of nine. By 14 she was writing her own songs. She played them at bars and coffee houses throughout her teens. DiFranco graduated from the Buffalo Academy for Visual and Performing Arts high school at 16 and began attending classes at Buffalo State College. She was living by herself, having moved out of her mother's apartment after she became an emancipated minor when she was 15.
Career.
DiFranco started her own record company, Righteous Babe Records, in 1989 at age 19. She released her self-titled debut album in the winter of 1990, shortly after relocating to New York City. There, she took poetry classes at The New School, where she met poet Sekou Sundiata, who was to become a friend and mentor. She toured steadily for the next 15 years, pausing only to record albums. Appearances at Canadian folk festivals and increasingly larger venues in the U.S. reflected her increasing popularity on the North American folk and roots scene. Throughout the early and mid-1990s DiFranco toured solo and also as a duo with Canadian drummer Andy Stochansky.
In September 1995, DiFranco participated in a concert at the Rock and Roll Hall of Fame in Cleveland, Ohio, marking the opening of the Woody Guthrie Archives in New York City. She later released a CD of the concert, "Til We Outnumber Em", on Righteous Babe, featuring artists such as DiFranco, Billy Bragg, Ramblin' Jack Elliott, Arlo Guthrie, Indigo Girls, Dave Pirner, Tim Robbins, and Bruce Springsteen, with 100 percent of proceeds going to the Woody Guthrie Foundation and Archives and the Rock and Roll Hall of Fame Museum educational department.
In 1996, bassist Sara Lee joined the touring group, whose live rapport is showcased on the 1997 album "Living in Clip". DiFranco would later release Lee's solo album "Make It Beautiful" on Righteous Babe. In 1998, Stochansky left to pursue a solo career as a singer-songwriter. A new touring ensemble consisting of Jason Mercer on bass, Julie Wolf on keyboards, and Daren Hahn on drums, augmented at times by a horn section, accompanied DiFranco on tour between 1998 and 2002.
The 1990s were a period of heightened exposure for DiFranco, as she continued playing ever larger venues around the world and attracted international attention of the press, including cover stories in "Spin", "Ms.", and "Magnet", among others, as well as appearances on MTV and VH1. Her playfully ironic cover of the Bacharach/David song "Wishin' and Hopin'" appeared under the opening titles of the film "My Best Friend's Wedding".
She guest starred on a 1998 episode of the Fox sitcom "King of the Hill", as the voice of Peggy's feminist guitar teacher, Emily.
Beginning in 1999, Righteous Babe Records began releasing albums by other artists including Sara Lee, Sekou Sundiata, Arto Lindsay, Bitch and Animal, That One Guy, Utah Phillips, Hamell on Trial, Andrew Bird, Kurt Swinghammer, Buddy Wakefield, Anaïs Mitchell and Nona Hendryx.
On September 11, 2001, DiFranco was in Manhattan and later penned the poem "Self Evident" about the experience. The poem was featured in the book "It's a Free Country: Personal Freedom in America After September 11". The poem's title also became the name of DiFranco's first book of poetry released exclusively in Italy by Minimum Fax. It was later also featured in "Verses", a book of her poetry published in the U.S. by Seven Stories press. DiFranco has written and performed many spoken-word pieces throughout her career and was showcased as a poet on the HBO series "Def Poetry" in 2005.
Since her 2005 release "Knuckle Down" (co-produced by Joe Henry) DiFranco's touring band and recordings have featured bass player Todd Sickafoose and in turns other musicians such as Allison Miller, Andy Borger, Herlin Riley, and Terence Higgins on drums and Mike Dillon on percussion and vibes.
On September 11, 2007, she released the first retrospective of her career, a two-disc compilation entitled "Canon" and simultaneously a retrospective collection of poetry book "Verses". On September 30, 2008, she released "Red Letter Year".
In 2009, DiFranco appeared at Pete Seeger's 90th birthday celebration at Madison Square Garden, debuting her revamped version of the 1930s labor anthem "Which Side Are You On?" in a duet with Bruce Cockburn and also duetting with Kris Kristofferson on the folk classic "There's a Hole in the Bucket".
DiFranco released an album on January 17, 2012, "¿Which Side Are You On?". It includes collaborations with Pete Seeger, Ivan Neville, Cyril Neville, Skerik, Adam Levy, Righteous Babe recording artist Anaïs Mitchell, CC Adcock, and a host of New Orleans-based horn players known for their work in such outfits as Galactic, Bonerama, and Rebirth Brass Band.
In 2014, she released her eighteenth album, "Allergic to Water". In 2017, she released her nineteenth, "Binary".
On May 7, 2019, DiFranco released a memoir, "No Walls and the Recurring Dream", via Viking Books. It is described as a "coming-of-age story".
In 2021, DiFranco released the album "Revolutionary Love" which was largely inspired by Valarie Kaur's book "See No Stranger."
Personal life.
DiFranco came out as bisexual in her twenties, and has written songs about love and sex with women and men. She addressed the controversy about her sexuality in the song "In or Out" on the album "Imperfectly" (1992). However, in 2015 she told the blog GoPride.com that she was "not so queer anymore, but definitely a woman-centered woman and just a human rights-centered artist." In a 2019 interview with "Jezebel", she stated that she preferred the term "queer" because "bisexual" "always sounded very medical, like something you do to a frog in 9th grade science or something", and further added that "the irony is I'm pretty fuckin' hetero, which is unfortunate for me because many of my deepest connections are with women. But, naw, I just like what's in boys' pants better." In 1998, she married her sound engineer Andrew Gilchrist in a Unitarian Universalist service in Canada. DiFranco and Gilchrist divorced in 2003.
In 1990, she wrote "Lost Woman Song", which was inspired by her abortions at ages eighteen and twenty.
DiFranco's father died in the summer of 2004. In July 2005, DiFranco developed tendinitis and took a nine-month hiatus from touring. In January 2007 DiFranco gave birth to her first child, a daughter, at her Buffalo home. She married the child's father, Mike Napolitano, also her regular producer, in 2009. In an interview on September 13, 2012, DiFranco mentioned that she was pregnant with her second child. In April 2013, she gave birth to her second child, a son.
DiFranco has resided in the Bywater, New Orleans, neighborhood since 2008.
DiFranco has described herself as an atheist. On the subject of religion, DiFranco has stated:
DiFranco has spoken critically of cancel culture, saying it is "just gonna get us nowhere" and "The human family can't divorce each other". DiFranco herself has received criticism for planning a 2013 songwriting retreat at Nottoway, a former slave plantation, and wrote that she "[sympathized] with both sides" regarding the controversial trans-exclusionary policies of the Michigan Womyn's Music Festival.
Critical reception.
DiFranco has been a critical success for much of her career, with a career album average of 72 on Metacritic. "Living in Clip", DiFranco's 1997 double live album, is the only one to achieve gold record status to date. DiFranco was praised by "The Buffalo News" in 2006 as "Buffalo's leading lady of rock music".
Starting in 2003, DiFranco was nominated four consecutive times for Best Recording Package at the Grammy Awards, winning in 2004 for "Evolve".
On July 21, 2006, DiFranco received the Woman of Courage Award at the National Organization for Women (NOW) Conference and Young Feminist Summit in Albany, New York. DiFranco was one of the first musicians to receive the award, given each year to a woman who has set herself apart by her contributions to the feminist movement.
In 2009, DiFranco received the Woody Guthrie Award for being a voice of positive social change.
Music.
Style.
DiFranco's guitar playing is often characterized by a signature staccato style, rapid fingerpicking and many alternate tunings. She delivers many of her lines in a speaking style notable for its rhythmic variation. Her lyrics, which often include alliteration, metaphor, word play and a more or less gentle irony, have also received praise for their sophistication.
Although DiFranco's music has been classified as both folk rock and alternative rock, she has reached across genres since her earliest albums, incorporating first punk, then funk, hip hop, and jazz influences.
While primarily an acoustic guitarist, she has used a variety of instruments and styles: brass instrumentation was prevalent in 1998's "Little Plastic Castle"; a simple walking bass in her 1997 cover of Hal David and Burt Bacharach's "Wishin' and Hopin' "; strings on the 1997 live album "Living in Clip" and 2005's "Knuckle Down"; and electronics and synthesizers in 1999's "To the Teeth" and 2006's "Reprieve".
DiFranco has stated that "folk music is not an acoustic guitar – that's not where the heart of it is. I use the word 'folk' in reference to punk music and rap music. It's an attitude, it's an awareness of one's heritage, and it's a community. It's subcorporate music that gives voice to different communities and their struggle against authority."
Musical collaborations, cover versions, and samples.
DiFranco has collaborated with a wide range of artists. In 1997, she appeared on Canadian songwriter Bruce Cockburn's "Charity of Night" album. In 1998, she produced fellow folksinger Dan Bern's album "Fifty Eggs".
She developed a deep association with folksinger and social activist Utah Phillips throughout the mid-1990s, sharing her stage and her audience with the older musician until his death in 2008 and resulting in two collaborative albums: "The Past Didn't Go Anywhere" (1996) and "Fellow Workers" (1999, with liner notes by Howard Zinn). "The Past" is built around Phillips's storytelling, an important part of his art that had not previously been documented on recordings; on the album, DiFranco provides musical settings for his speaking voice. The followup, "Fellow Workers", was recorded live in Daniel Lanois's Kingsway Studio in New Orleans and features Phillips fronting DiFranco's touring band for a collection of songs and stories.
Prince recorded two songs with DiFranco in 1999, "Providence" on her "To the Teeth" album, and "Eye Love U, But Eye Don't Trust U Anymore" on Prince's "Rave Un2 the Joy Fantastic" album. Funk and soul jazz musician Maceo Parker and rapper Corey Parker have both appeared on DiFranco's albums and featured appearances by her on theirs. Parker and DiFranco toured together in 1999.
She has appeared on several compilations of the songs of Pete Seeger and frequented his Hudson Clearwater Revival Festival. In 2001, she appeared on Brazilian artist Lenine's album "Falange Canibal". In 2002, her rendition of Greg Brown's "The Poet Game" appeared on "Going Driftless: An Artist's Tribute to Greg Brown". Also in 2002 she recorded a duet with Jackie Chan of the Irving Gordon song "Unforgettable" for a record of unlikely collaborations, "When Pigs Fly: Songs You Never Thought You'd Hear".
In 2005, she appeared on Dar Williams' record "My Better Self", duetting on Williams' cover of Pink Floyd's "Comfortably Numb". She performed with Cyndi Lauper on "Sisters of Avalon" a track from Lauper's 2005 "The Body Acoustic" album. In 2006, she produced Hamell on Trial's album "Songs for Parents Who Enjoy Drugs". In 2008, she appeared on Todd Sickafoose's album "Tiny Resisters". In 2010, she co-produced a track with Margaret Cho called "Captain Cameltoe" for the comedian's "Cho Dependant" album. In 2011, she appeared on Rob Wasserman's album "Note of Hope", an exploration of the writings of Woody Guthrie with musical accompaniment, though the track in which she appeared, "Voice", was actually recorded 13 years earlier. Also in 2011 she duetted with Greg Dulli on the Twilight Singers record "Dynamite Steps".
Other artists have covered and sampled DiFranco's work throughout the years. Her spoken word poem "Self Evident" was covered by Public Enemy founder Chuck D's group called Impossebulls. Alana Davis had some commercial success with DiFranco's song "32 Flavors".
Samples from the track "Coming Up" were used by DJ Spooky in his album "Live Without Dead Time", produced for AdBusters Magazine in 2003.
In 2010, DiFranco played Persephone on Anaïs Mitchell's album "Hadestown".
DiFranco was approached by Zoe Boekbinder to work on their "Prison Music Project", an album of collaborations between incarcerated and formerly incarcerated writers and musicians on the outside. DiFranco co-produced the project with Boekbinder and co-wrote and performed "Nowhere but Barstow and Prison." The album "Long Time Gone" was released on Righteous Babe Records in 2020 after ten years in the making.
Lyrical content.
Although much of DiFranco's material is autobiographical, it is often also strongly political. Many of her songs are concerned with contemporary social issues such as racism, sexism, sexual abuse, homophobia, reproductive rights, poverty, and war. In 2008, she donated a song to Aid Still Required's CD to assist with the restoration of the devastation done to Southeast Asia from the 2004 tsunami.
The combination of personal and political is partially responsible for DiFranco's early popularity among politically active college students, particularly those of the left wing, some of whom set up fan pages on the web to document DiFranco's career as early as 1994. DiFranco's rapid rise in popularity in the mid-1990s was fueled mostly by personal contact and word of mouth rather than mainstream media.
Label independence.
Ani cites her anti-corporate ethos as the main reason she decided to start her own label. This has allowed her a considerable degree of creative freedom over the years, including, for example, providing all instrumentals and vocals and recording the album herself at her home on an analog 8-track reel-to-reel, and handling much of the artwork and packaging design for her 2004 album "Educated Guess". She has referenced this independence from major labels in song more than once, including "The Million You Never Made" ("Not a Pretty Girl"), which discusses the act of turning down a lucrative contract, "The Next Big Thing" ("Not So Soft"), which describes an imagined meeting with a label head-hunter who evaluates the singer based on her looks, and "Napoleon" ("Dilate"), which sympathizes sarcastically with an unnamed friend who did sign with a label.
The business grew organically starting in 1990 with the first cassette tape. Connections were made when women in colleges started duplicating and sharing tapes. Offers to play at colleges started coming in and her popularity grew largely by word of mouth and through women's groups or organizations. Zango and Goldenrod, two music distributors specializing in women's music, started carrying DiFranco's music. In general they sold music to independent music stores and women's book stores. In 1995, Righteous Babe Records signed with Koch International for DiFranco's release of "Not a Pretty Girl". Her records could then be found in large and small record stores alike.
DiFranco has occasionally joined with Prince in discussing publicly the problems associated with major record companies. Righteous Babe Records employs a number of people in her hometown of Buffalo. In a 1997 open letter to "Ms." magazine she expressed displeasure that what she considers a way to ensure her own artistic freedom was seen by others solely in terms of its financial success.
Activism.
From the earliest days of her career, DiFranco has lent her voice and her name to a broad range of social movements, performing benefit concerts, appearing on benefit albums, speaking at rallies, and offering info table space to organizations at her concerts and the virtual equivalent on her website, among other methods and actions. In 1999, she created her own not-for-profit organization; as the Buffalo News has reported, "Through the Righteous Babe Foundation, DiFranco has backed various grassroots cultural and political organizations, supporting causes ranging from abortion rights to gay visibility."
During the first Gulf War, DiFranco participated in the anti-war movement. In early 1993 she played Pete Seeger's Clearwater Folk Festival for the first time. In 1998, she was a featured performer in the Dead Man Walking benefit concert series raising money for Sister Helen Prejean's "Not in Our Name" anti-death penalty organization. DiFranco's commitment to opposing the death penalty is longstanding; she has also been a long time supporter of the Southern Center for Human Rights.
During the 2000 U.S. presidential election, she actively supported and voted for Green Party candidate Ralph Nader, though in an open letter she made clear that if she lived in a swing state, she would vote for Al Gore to prevent George W. Bush from being elected.
In 2004, DiFranco visited Burma in order to learn about the Burmese resistance movement and the country's fight for democracy. During her travels she met with then-detained resistance leader Aung San Suu Kyi. Her song "In The Way" was later featured on "For the Lady", a benefit CD that donated all proceeds to the United States Campaign for Burma.
During the 2004 presidential primaries, she supported liberal, anti-war Democrat Dennis Kucinich, who appeared on stage with her during several of her concerts. After the primary season ended, and John Kerry was the clear Democratic candidate, DiFranco launched a "Vote Dammit!" tour of swing states encouraging audience members to vote. In 2005, she lobbied Congress against the proliferation of nuclear power in general and the placement of nuclear waste dumps on Indian land in particular. In 2008, she again backed Kucinich in his bid for the presidency.
In 2002, Righteous Babe Records established the "Aiding Buffalo's Children" program in conjunction with members of the local community to raise funds for Buffalo's public school system. To kick off the program, DiFranco donated "a day's pay" (the performance fee from her concert that year at Shea's Performing Arts Center) to ABC and challenged her fans to do the same. Aiding Buffalo's Children has since been folded into the Community Foundation of Greater Buffalo, contributing to a variety of charitable funds.
In 2005, when Hurricane Katrina devastated DiFranco's newly adopted home town of New Orleans, she collected donations from fans around the world through The Righteous Babe Store website for the Katrina Piano Fund, helping musicians replace instruments lost in the hurricane, raising over $47,500 for the cause.
In 2010, after the Deepwater Horizon oil spill, she performed at the "For Our Coast" benefit concert joining Marianne Faithfull, C. C. Adcock and others at the Acadiana Center for the Arts Theater in Lafayette, raising money for Gulf Aid Acadiana, and the Gulf Aid show with Lenny Kravitz, Mos Def, and others at Mardi Gras World River City in New Orleans, both shows raising money to help protect the wetlands, clean up the coast and to assist the fishermen and their families affected by the spill.
DiFranco also sits on the board for The Roots of Music, founded by Rebirth Brass Band drummer Derrick Tabb. The organization provides free marching band instruction to children in the New Orleans area in addition to academic tutoring and mentoring.
DiFranco joined about 500,000 people at the March for Women's Lives in DC in April 2004. As an honored guest she marched in the front row for the three-mile route, along with Margaret Cho, Janeane Garofalo, Whoopi Goldberg, Gloria Steinem and others. Later in the day, Ani played a few songs on the main stage in front of the Capitol, including "Your Next Bold Move".
Scot Fisher, formerly Righteous Babe label president and DiFranco's manager for many years, has been a longtime advocate of the preservation movement in Buffalo. In 1999, he and DiFranco purchased a decaying church on the verge of demolition in downtown Buffalo and began the lengthy process of restoring it. In 2006, the building opened its doors again, first briefly as "The Church" and then as "Babeville," housing two concert venues, the record label's business office, and Hallwalls Contemporary Arts Center.
DiFranco is also a member of the Toronto-based charity Artists Against Racism for which she participated in a radio PSA.
|
2127 | Arene (disambiguation) | An arene, or aromatic hydrocarbon, is a hydrocarbon with alternating double and single bonds between carbon atoms forming rings.
Arene may also refer to:
|
2129 | Arizona Diamondbacks | The Arizona Diamondbacks (colloquially known as the D-backs) are an American professional baseball team based in Phoenix, Arizona. The Diamondbacks compete in Major League Baseball (MLB) as a member club of the National League (NL) West division. The franchise was established on March 9, 1995, and began play in 1998 as an expansion team. The team plays its home games at Chase Field, formerly known as Bank One Ballpark. Along with the Tampa Bay Rays, the Diamondbacks are one of the newest teams in MLB.
After a fifth-place finish in their inaugural season, the Diamondbacks made several off-season acquisitions, including future Hall of Fame pitcher Randy Johnson, who won four consecutive Cy Young Awards in his first four seasons with the team. In 1999, Arizona won 100 games and their first division championship. In 2001, they won the World Series over the three-time defending champion New York Yankees, becoming the fastest expansion team in major league history to win the World Series, and the only men's major professional sports team in the state of Arizona to win a championship.
From 1998 to 2022, the Diamondbacks compiled an overall record of 1,914–2,034 (.485).
Franchise history.
On March 9, 1995, Phoenix was awarded an expansion franchise to begin play for the season. A $130 million franchise fee was paid to Major League Baseball and on January 16, 1997, the Diamondbacks were officially voted into the National League. The Diamondbacks' first major league game was played against the Colorado Rockies on March 31, 1998, at Bank One Ballpark. The ballpark was renamed Chase Field in 2005, as a result of Bank One Corporation's merger with JPMorgan Chase & Co.
Since their debut, the Diamondbacks have won five NL West division titles, one NL pennant, one Wild Card game, and the 2001 World Series.
Logos and uniforms.
1998–2006.
The Diamondbacks' original colors were purple, black, teal and copper. Their first logo was an italicized block letter "A" with a diamond pattern, and the crossbar represented by a snake's tongue. This period saw the Diamondbacks wear various uniform combinations.
At home, the Diamondbacks wore cream uniforms with purple pinstripes. The primary sleeved uniform, worn from 1998 to 2000, featured the full team name ("Diamond" and "Backs" stacked together) in front along with chest numbers. The alternate sleeveless version contained the "A" logo on the right chest and was paired with purple undershirts. Before the 2001 season, the sleeved uniform was changed to feature the "A" logo. In all three uniforms, player names were teal with purple trim, and numbers were purple with white and teal trim.
The Diamondbacks' primary road gray uniform also contained purple pinstripes. The first version featured "Arizona" in purple with white and teal trim along with black drop shadows. Chest numbers were also added. Player names were in purple with white trim, and numbers were teal with white and purple trim. In 2001, the uniform became sleeveless with black undershirts, and the lettering scheme was changed to purple with white, copper and black accents.
The alternate home purple uniform featured "Arizona" with black drop shadows. Originally the letters were rendered in teal with white and copper trim, but they were changed to copper with teal and white trim after only one season. This set was worn until 2002.
The alternate road black uniform initially featured the "A" logo on the right chest, while letters were in purple with white trim and numbers in teal with white and purple trim. A zigzag pattern of teal, copper and purple was also featured on the sleeves. In 2001, the uniform was changed to feature "Arizona" in front. Letters were now purple with white and copper trim.
The Diamondbacks initially wore four different cap versions. The primary home cap was all-purple, while the road cap was black with a teal brim. They also wore a cream cap with purple brim, and a teal cap with purple brim. All designs featured the primary "A" logo. In 1999, the road cap became all-black and contained the alternate "D-snake" logo rendered in copper. Also, the teal and cream alternate caps were dropped.
The left sleeve of all four uniforms initially contained the snake logo with the full team name; after the 2003 season, it remained only on the road black uniform.
2007–2015.
The franchise unveiled new uniforms and colors of Sedona red, Sonoran sand and black on November 8, 2006. The red shade is named for the sandstone canyon at Red Rock State Park near Sedona, while the beige (sand) shade is named for the Sonoran Desert. A sleeve patch was added featuring a lowercase "d" and "b" configured to look like a snake's head. The team also kept the "D" logo, which was slightly altered and put on an all-red cap to be used as their game cap. They also kept the "A" logo with the new colors applied to it, with a solid black cap used as the alternate cap. Arizona's updated color scheme bore a striking resemblance to the Houston Astros' then-current color scheme (brick red, sand and black), which the latter used until 2012, as well as to that of the NHL's Phoenix Coyotes, whose adoption of those colors predated the Diamondbacks' by four years.
The white home uniform featured "D-Backs" in red with sand and black trim. The road gray uniform featured "Arizona" in red with sand and black trim. Player names were red with black trim while numbers were black with red trim.
The alternate red uniform contained "D-Backs" in sand with red and black trim, with player names in sand with black trim and numbers in black with sand trim.
There were two versions of the alternate black uniform. One design has the alternate "A" logo on the right chest, while the other has "Arizona" written in red with black and sand trim. The latter was introduced in 2013 as a tribute to the victims of the Yarnell Hill Fire. On both uniforms, player names were sand with red trim, and numbers in red with sand trim.
2016–present.
Prior to the 2016 season, the Diamondbacks reincorporated teal into their color scheme while keeping Sedona red, Sonoran sand and black. They also unveiled eight different uniform combinations, including two separate home white and road gray uniforms. One major difference between the two sets is that the non-teal uniforms feature a snakeskin pattern on the shoulders, while the teal-trimmed uniforms include a charcoal/gray snakeskin pattern on the back. Arizona also kept the throwback pinstriped sleeveless uniforms from their 2001 championship season for use during Thursday home games.
Starting with the 2020 season, the Diamondbacks made slight redesigns to their uniforms. The snakeskin patterns were removed and the teal-trimmed gray uniforms were retired. The team also reverted to a standard gray uniform after wearing a darker shade on the previous set. Two home white uniforms remain in use: the primary Sedona red and the alternate teal. They also wear two black uniforms: one with the primary "A" logo on the left chest and the other with "Los D-Backs" trimmed in teal. Three cap designs were also unveiled, all with a black base: the primary "A" cap, the teal-trimmed "snake" cap (paired exclusively with the teal alternates), and the sand-trimmed "snake" cap with red brim (paired exclusively with the Sedona red alternates). The Nike swoosh logo is placed on the right chest near the shoulder. In 2022, the Diamondbacks introduced a red "A" cap with black brim.
In 2021, the Diamondbacks were one of seven teams to wear Nike "City Connect" uniforms. The design is primarily sand and has "Serpientes" emblazoned in front in black script lettering, with the first "S" shaped to resemble a rattlesnake. The right sleeve has a flag of Arizona patch recolored to the Diamondbacks' current red, sand and black scheme, and the left sleeve has the "A" logo recolored to black and sand. Numerals are in red. The cap is primarily sand with a black brim and has the "A" logo in black and sand; the regular batting helmet is used with the uniform. Initially, the Diamondbacks wore white pants with this uniform, but have since switched to sand pants.
Before the 2023 season, the Diamondbacks made the alternate white uniform with teal accents their primary home uniform and retired the previous Sedona red white uniform. The change was prompted by Nike's new rule limiting teams to four regular uniforms plus the "City Connect" uniform.
Radio and television.
The primary television play-by-play voice for the team's first nine seasons of play was Thom Brennaman, who also broadcast baseball and college football games nationally for Fox Television. Brennaman was the TV announcer for the Chicago Cubs and Cincinnati Reds (along with his father Marty Brennaman) before being hired by Diamondbacks founder Jerry Colangelo in 1996, two years before the team would begin play.
In October 2006, Brennaman left the Diamondbacks to call games with his father for the Reds beginning in 2007, signing a four-year deal (his Fox duties remained unchanged).
On November 1, 2006, the team announced that the TV voice of the Milwaukee Brewers since 2002, Daron Sutton, would be hired as the Diamondbacks primary TV play-by-play voice. Sutton was signed to a five-year contract with a team option for three more years. Sutton is considered one of the best of the younger generation of baseball broadcasters. His signature chants include "let's get some runs" when the D-backs trail in late innings. Sutton's father was Hall of Fame pitcher and Atlanta Braves broadcaster Don Sutton.
Former Diamondbacks and Chicago Cubs first baseman Mark Grace and former Major League knuckleball pitcher Tom Candiotti were the Diamondbacks primary color analysts for the 2006 and 2007 seasons. Former Diamondbacks third baseman Matt Williams also did color commentary on occasion, as did former Cardinals and NBC broadcast legend Joe Garagiola, Sr., a longtime Phoenix-area resident and father of Joe Garagiola, Jr., the first GM of the Diamondbacks (as head of the Maricopa County Sports Authority in the early 1990s, Garagiola, Jr. was one of the primary people involved in Phoenix obtaining a Major League Baseball franchise).
The Diamondbacks announced in July 2007 that for the 2008 season, all regionally broadcast Diamondbacks TV games would be shown exclusively on Fox Sports Arizona (now Bally Sports Arizona) and a few could possibly be shown on the national "MLB on Fox" telecasts. Bally Sports Arizona is currently seen in 2.8 million households in Arizona and New Mexico. The previous flagship station since the inaugural 1998 season was KTVK (Channel 3), a popular over-the-air independent station (and former longtime ABC affiliate) in Phoenix.
From 2009 to 2012, Mark Grace and Daron Sutton were tagged as the main broadcasters of the Diamondbacks with pre-game and postgame shows on Fox Sports Arizona, being hosted by former big-league closer Joe Borowski.
On June 21, 2012, Daron Sutton was suspended indefinitely, amid rumors of insubordination. Then on August 24, the team announced that Mark Grace had requested an indefinite leave of absence after being arrested for his second DUI in less than two years (Grace was later indicted on four DUI counts). For the remainder of the 2012 season, Sutton was replaced by Greg Schulte (Jeff Munn replaced Schulte on the radio broadcast) and Grace was replaced by Luis Gonzalez. At the end of the 2012 season, the team announced that neither Sutton nor Grace would be returning for the 2013 season.
On October 18, 2012, the team announced that Bob Brenly would be returning as a broadcaster to replace Grace and that he would be joined by then-ESPN personality Steve Berthiaume.
On July 18, 2023, a federal bankruptcy court granted Bally Sports' parent company Diamond Sports Group a motion to decline its contract with the Diamondbacks as part of its chapter 11 bankruptcy. As a result, Major League Baseball assumed production of the Diamondbacks' regional telecasts (maintaining staff such as commentators), and is distributing them via local television providers and MLB.tv.
The English language flagship radio station is KTAR. Greg Schulte is the regular radio play-by-play voice, a 25-year veteran of sports radio in the Phoenix market, also well known for his previous work on Phoenix Suns, Arizona Cardinals and Arizona State University (ASU) broadcasts. He calls games with analyst Tom Candiotti.
Jeff Munn served as a backup radio play-by-play announcer until 2016; he served as the regular public address announcer at Chase Field in the early days of the franchise. He is well known to many Phoenix area sports fans, having also served as the public address announcer for the Suns in the 1990s at what became Footprint Center. He is also the play-by-play radio voice for ASU women's basketball. Mike Ferrin served in the same role for six years before parting ways with the team, and he was replaced by Chris Garagiola in December 2021.
Spanish broadcasts.
The flagship Spanish language radio station is KHOV-FM 105.1 with Oscar Soria, Rodrigo López, and Richard Saenz.
Games were televised in Spanish on KPHE-LP—with Oscar Soria and Jerry Romo as the announcers—but this arrangement ended prior to the 2009 season due to the team switching fully to Fox Sports Arizona and the lack of carriage of KPHE-LP on the Cox cable system.
Rivalry with the Los Angeles Dodgers.
The rivalry between the Diamondbacks and the Los Angeles Dodgers has been one of the fiercest divisional matchups for several years. Animosity between the two teams escalated during the 2010s through multiple incidents in which one team threw pitches at the other or instigated large-scale brawls between both benches. Famously, after eliminating the Diamondbacks and clinching the division on September 19, 2013, multiple Dodgers players celebrated the win by jumping into the pool at Chase Field. The two sides met in the 2017 National League Division Series, in which the Dodgers swept the Diamondbacks 3–0 en route to their World Series appearance that season. The Dodgers lead the all-time series 257–191, with a 3–0 lead in the postseason.
Minor league affiliations.
The Arizona Diamondbacks farm system consists of eight minor league affiliates.
|
2130 | Aesthetics | Aesthetics (occasionally spelled esthetics in American English) is a branch of philosophy that deals with the nature of beauty and taste, as well as the philosophy of art, a related area of philosophy that grew out of aesthetics. It examines aesthetic values, often expressed through judgments of taste.
Aesthetics covers both natural and artificial sources of experiences and how we form a judgment about those sources. It considers what happens in our minds when we engage with objects or environments such as viewing visual art, listening to music, reading poetry, experiencing a play, watching a fashion show, a movie, or sports, or even exploring various aspects of nature. The philosophy of art specifically studies how artists imagine, create, and perform works of art, as well as how people use, enjoy, and criticize art. Aesthetics considers why people like some works of art and not others, as well as how art can affect moods or even our beliefs. Both aesthetics and the philosophy of art try to answer what exactly art and artworks are, and what makes good art.
Scholars in the field have defined aesthetics as "critical reflection on art, culture and nature". In modern English, the term "aesthetic" can also refer to a set of principles underlying the works of a particular art movement or theory (one speaks, for example, of a Renaissance aesthetic).
Etymology.
The word "aesthetic" is derived from the Ancient Greek (', "perceptive, sensitive, pertaining to sensory perception"), which in turn comes from (', "I perceive, sense, learn") and is related to ("", "perception, sensation"). Aesthetics in this central sense has been said to start with the series of articles on "The Pleasures of the Imagination", which the journalist Joseph Addison wrote in the early issues of the magazine The Spectator in 1712.
The term "aesthetics" was appropriated and coined with new meaning by the German philosopher Alexander Baumgarten in his dissertation "Meditationes philosophicae de nonnullis ad poema pertinentibus" () in 1735; Baumgarten chose "aesthetics" because he wished to emphasize the experience of art as a means of knowing. Baumgarten's definition of aesthetics in the fragment "Aesthetica" (1750) is occasionally considered the first definition of modern aesthetics.
The term was introduced into the English language by Thomas Carlyle in his "Life of Friedrich Schiller" (1825).
Aesthetics and the philosophy of art.
Some distinguish aesthetics from the philosophy of art, claiming that the former is the study of beauty and taste while the latter is the study of works of art. But aesthetics typically considers questions of beauty as well as of art. It examines topics such as art works, aesthetic experience, and aesthetic judgments. Aesthetic judgement refers to the sensory contemplation or appreciation of an object (not necessarily a work of art), while artistic judgement refers to the recognition, appreciation or criticism of art in general or a specific work of art. In the words of one philosopher, "Philosophy of art is about art. Aesthetics is about many things—including art. But it is also about our experience of breathtaking landscapes or the pattern of shadows on the wall opposite your office."
Philosophers of art weigh a culturally contingent conception of art versus one that is purely theoretical. They study the varieties of art in relation to their physical, social, and cultural environments. Aesthetic philosophers sometimes also refer to psychological studies to help understand how people see, hear, imagine, think, learn, and act in relation to the materials and problems of art. Aesthetic psychology studies the creative process and the aesthetic experience.
Aesthetic judgment, universals, and ethics.
Aesthetic judgment.
Aesthetics examines affective domain response to an object or phenomenon. Judgments of aesthetic value rely on the ability to discriminate at a sensory level. However, aesthetic judgments usually go beyond sensory discrimination.
For David Hume, delicacy of taste is not merely "the ability to detect all the ingredients in a composition", but also the sensitivity "to pains as well as pleasures, which escape the rest of mankind." Thus, sensory discrimination is linked to capacity for pleasure.
For Immanuel Kant ("Critique of Judgment", 1790), "enjoyment" is the result when pleasure arises from sensation, but judging something to be "beautiful" has a third requirement: sensation must give rise to pleasure by engaging reflective contemplation. Judgments of beauty are sensory, emotional and intellectual all at once. Kant (1790) observed of a man "If he says that canary wine is agreeable he is quite content if someone else corrects his terms and reminds him to say instead: It is agreeable to "me"," because "Everyone has his own (sense of) taste". The case of "beauty" is different from mere "agreeableness" because, "If he proclaims something to be beautiful, then he requires the same liking from others; he then judges not just for himself but for everyone, and speaks of beauty as if it were a property of things."
Viewer interpretations of beauty may on occasion be observed to possess two concepts of value: aesthetics and taste. Aesthetics is the philosophical notion of beauty. Taste is a result of an education process and awareness of elite cultural values learned through exposure to mass culture. Bourdieu examined how the elite in society define the aesthetic values like taste and how varying levels of exposure to these values can result in variations by class, cultural background, and education. According to Kant, beauty is subjective and universal; thus certain things are beautiful to everyone. In the opinion of Władysław Tatarkiewicz, there are six conditions for the presentation of art: beauty, form, representation, reproduction of reality, artistic expression and innovation. However, one may not be able to pin down these qualities in a work of art.
The question of whether there are facts about aesthetic judgments belongs to the branch of metaphilosophy known as meta-aesthetics.
Factors involved in aesthetic judgment.
Aesthetic judgement is closely tied to disgust. Responses like disgust show that sensory detection is linked in instinctual ways to facial expressions and physiological responses like the gag reflex. Disgust is triggered largely by dissonance; as Darwin pointed out, seeing a smear of soup in a man's beard is disgusting even though neither soup nor beards are themselves disgusting. Aesthetic judgments may be linked to emotions or, like emotions, partially embodied in physical reactions. For example, the awe inspired by a sublime landscape might physically manifest as an increased heart rate or pupil dilation.
Emotional responses are thus shaped by culture, and aesthetics is therefore always characterized by 'regional responses', as Francis Grose was the first to affirm in his 'Rules for Drawing Caricaturas: With an Essay on Comic Painting' (1788), published in W. Hogarth, The Analysis of Beauty, Bagster, London s.d. (1791? [1753]), pp. 1–24. Grose can therefore be claimed to be the first critical 'aesthetic regionalist', proclaiming the anti-universality of aesthetics against the perilous and always resurgent dictatorship of beauty. 'Aesthetic regionalism' can thus be seen as a political statement and stance that opposes any universal notion of beauty in order to safeguard the counter-tradition of aesthetics: that which has been dubbed un-beautiful just because one's culture does not contemplate it, e.g. Edmund Burke's sublime, what is usually defined as 'primitive' art, or un-harmonious, non-cathartic art and camp art. 'Beauty' posits and creates these, dichotomously, as its opposite, without even the need of formal statements, but they will be 'perceived' as ugly.
Likewise, aesthetic judgments may be culturally conditioned to some extent. Victorians in Britain often saw African sculpture as ugly, but just a few decades later, Edwardian audiences saw the same sculptures as beautiful. Evaluations of beauty may well be linked to desirability, perhaps even to sexual desirability. Thus, judgments of aesthetic value can become linked to judgments of economic, political, or moral value. In a current context, a Lamborghini might be judged to be beautiful partly because it is desirable as a status symbol, or it may be judged to be repulsive partly because it signifies over-consumption and offends political or moral values.
The context of its presentation also affects the perception of artwork; artworks presented in a classical museum context are liked more and rated more interesting than when presented in a sterile laboratory context. While specific results depend heavily on the style of the presented artwork, overall, the effect of context proved to be more important for the perception of artwork than the effect of genuineness (whether the artwork was being presented as original or as a facsimile/copy).
Aesthetic judgments can often be very fine-grained and internally contradictory. Likewise aesthetic judgments seem often to be at least partly intellectual and interpretative. What a thing means or symbolizes is often what is being judged. Modern aestheticians have asserted that will and desire were almost dormant in aesthetic experience, yet preference and choice have seemed important aesthetics to some 20th-century thinkers. The point is already made by Hume, but see Mary Mothersill, "Beauty and the Critic's Judgment", in "The Blackwell Guide to Aesthetics", 2004. Thus aesthetic judgments might be seen to be based on the senses, emotions, intellectual opinions, will, desires, culture, preferences, values, subconscious behaviour, conscious decision, training, instinct, sociological institutions, or some complex combination of these, depending on exactly which theory is employed.
A third major topic in the study of aesthetic judgments is how they are unified across art forms. For instance, the source of a painting's beauty has a different character from that of beautiful music, suggesting their aesthetics differ in kind. The apparent inability of language to fully express aesthetic judgment, and the role of social construction, further cloud this issue.
Aesthetic universals.
The philosopher Denis Dutton identified six universal signatures in human aesthetics:
Artists such as Thomas Hirschhorn have indicated that there are too many exceptions to Dutton's categories. For example, Hirschhorn's installations deliberately eschew technical virtuosity. People can appreciate a Renaissance Madonna for aesthetic reasons, but such objects often had (and sometimes still have) specific devotional functions. "Rules of composition" that might be read into Duchamp's "Fountain" or John Cage's "4′33″" do not locate the works in a recognizable style (or certainly not a style recognizable at the time of the works' realization). Moreover, some of Dutton's categories seem too broad: a physicist might entertain hypothetical worlds in his/her imagination in the course of formulating a theory. Another problem is that Dutton's categories seek to universalize traditional European notions of aesthetics and art forgetting that, as André Malraux and others have pointed out, there have been large numbers of cultures in which such ideas (including the idea "art" itself) were non-existent.
Aesthetic ethics.
Aesthetic ethics refers to the idea that human conduct and behaviour ought to be governed by that which is beautiful and attractive. John Dewey has pointed out that the unity of aesthetics and ethics is in fact reflected in our understanding of behaviour being "fair"—the word having a double meaning of attractive and morally acceptable. More recently, James Page has suggested that aesthetic ethics might be taken to form a philosophical rationale for peace education.
Beauty.
Beauty is one of the main subjects of aesthetics, together with art and taste. Many of its definitions include the idea that an object is beautiful if perceiving it is accompanied by aesthetic pleasure. Among the examples of beautiful objects are landscapes, sunsets, humans and works of art. Beauty is a positive aesthetic value that contrasts with ugliness as its negative counterpart.
Different intuitions commonly associated with beauty and its nature are in conflict with each other, which poses certain difficulties for understanding it. On the one hand, beauty is ascribed to things as an objective, public feature. On the other hand, it seems to depend on the subjective, emotional response of the observer. It is said, for example, that "beauty is in the eye of the beholder". It may be possible to reconcile these intuitions by affirming that it depends both on the objective features of the beautiful thing and the subjective response of the observer. One way to achieve this is to hold that an object is beautiful if it has the power to bring about certain aesthetic experiences in the perceiving subject. This is often combined with the view that the subject needs to have the ability to correctly perceive and judge beauty, sometimes referred to as "sense of taste". Various conceptions of how to define and understand beauty have been suggested. "Classical conceptions" emphasize the objective side of beauty by defining it in terms of the relation between the beautiful object as a whole and its parts: the parts should stand in the right proportion to each other and thus compose an integrated harmonious whole. "Hedonist conceptions", on the other hand, focus more on the subjective side by drawing a necessary connection between pleasure and beauty, e.g. that for an object to be beautiful is for it to cause disinterested pleasure. Other conceptions include defining beautiful objects in terms of their value, of a loving attitude towards them or of their function.
New Criticism and "The Intentional Fallacy".
During the first half of the twentieth century, a significant shift toward general aesthetic theory took place, which attempted to apply aesthetic theory across the various forms of art, relating the literary arts and the visual arts to each other. This resulted in the rise of the New Criticism school and debate concerning "the intentional fallacy". At issue was the question of whether the aesthetic intentions of the artist in creating the work of art, whatever its specific form, should be associated with the criticism and evaluation of the final product, or whether the work of art should be evaluated on its own merits independent of the intentions of the artist.
In 1946, William K. Wimsatt and Monroe Beardsley published a classic and controversial New Critical essay entitled "The Intentional Fallacy", in which they argued strongly against the relevance of an author's intention, or "intended meaning" in the analysis of a literary work. For Wimsatt and Beardsley, the words on the page were all that mattered; importation of meanings from outside the text was considered irrelevant, and potentially distracting.
In another essay, "The Affective Fallacy," which served as a kind of sister essay to "The Intentional Fallacy," Wimsatt and Beardsley also discounted the reader's personal/emotional reaction to a literary work as a valid means of analyzing a text. This fallacy would later be repudiated by theorists from the reader-response school of literary theory. One of the leading theorists from this school, Stanley Fish, was himself trained by New Critics. Fish criticizes Wimsatt and Beardsley in his essay "Literature in the Reader" (1970).
As summarized by Berys Gaut and Livingston in their essay "The Creation of Art": "Structuralist and post-structuralists theorists and critics were sharply critical of many aspects of New Criticism, beginning with the emphasis on aesthetic appreciation and the so-called autonomy of art, but they reiterated the attack on biographical criticisms' assumption that the artist's activities and experience were a privileged critical topic." These authors contend that: "Anti-intentionalists, such as formalists, hold that the intentions involved in the making of art are irrelevant or peripheral to correctly interpreting art. So details of the act of creating a work, though possibly of interest in themselves, have no bearing on the correct interpretation of the work."
Gaut and Livingston define the intentionalists as distinct from formalists stating that: "Intentionalists, unlike formalists, hold that reference to intentions is essential in fixing the correct interpretation of works." They quote Richard Wollheim as stating that, "The task of criticism is the reconstruction of the creative process, where the creative process must in turn be thought of as something not stopping short of, but terminating on, the work of art itself."
Derivative forms of aesthetics.
A large number of derivative forms of aesthetics have developed as contemporary and transitory forms of inquiry associated with the field of aesthetics which include the post-modern, psychoanalytic, scientific, and mathematical among others.
Post-modern aesthetics and psychoanalysis.
Early-twentieth-century artists, poets and composers challenged existing notions of beauty, broadening the scope of art and aesthetics. In 1941, Eli Siegel, American philosopher and poet, founded Aesthetic Realism, the philosophy that reality itself is aesthetic, and that "The world, art, and self explain each other: each is the aesthetic oneness of opposites."
Various attempts have been made to define Post-Modern Aesthetics. The challenge to the assumption that beauty was central to art and aesthetics, thought to be original, is actually continuous with older aesthetic theory; Aristotle was the first in the Western tradition to classify "beauty" into types as in his theory of drama, and Kant made a distinction between beauty and the sublime. What was new was a refusal to credit the higher status of certain types, where the taxonomy implied a preference for tragedy and the sublime to comedy and the Rococo.
Croce suggested that "expression" is central in the way that beauty was once thought to be central. George Dickie suggested that the sociological institutions of the art world were the glue binding art and sensibility into unities. Marshall McLuhan suggested that art always functions as a "counter-environment" designed to make visible what is usually invisible about a society. Theodor Adorno felt that aesthetics could not proceed without confronting the role of the culture industry in the commodification of art and aesthetic experience. Hal Foster attempted to portray the reaction against beauty and Modernist art in "The Anti-Aesthetic: Essays on Postmodern Culture". Arthur Danto has described this reaction as "kalliphobia" (after the Greek word for beauty, κάλλος "kallos"). André Malraux explains that the notion of beauty was connected to a particular conception of art that arose with the Renaissance and was still dominant in the eighteenth century (but was supplanted later). The discipline of aesthetics, which originated in the eighteenth century, mistook this transient state of affairs for a revelation of the permanent nature of art. Brian Massumi suggests to reconsider beauty following the aesthetical thought in the philosophy of Deleuze and Guattari. Walter Benjamin echoed Malraux in believing aesthetics was a comparatively recent invention, a view proven wrong in the late 1970s, when Abraham Moles and Frieder Nake analyzed links between beauty, information processing, and information theory. Denis Dutton in "The Art Instinct" also proposed that an aesthetic sense was a vital evolutionary factor.
Jean-François Lyotard re-invokes the Kantian distinction between taste and the sublime. Sublime painting, unlike kitsch realism, "... will enable us to see only by making it impossible to see; it will please only by causing pain."
Sigmund Freud inaugurated aesthetic thinking in psychoanalysis, mainly via the "Uncanny" as an aesthetic affect. Following Freud and Merleau-Ponty, Jacques Lacan theorized aesthetics in terms of sublimation and the Thing.
The relation of Marxist aesthetics to post-modern aesthetics is still a contentious area of debate.
Aesthetics and science.
The field of experimental aesthetics was founded by Gustav Theodor Fechner in the 19th century. In its early period, experimental aesthetics was characterized by a subject-based, inductive approach. The analysis of individual experience and behaviour based on experimental methods is a central part of experimental aesthetics. In particular, the perception of works of art, music, or modern items such as websites or other IT products is studied. Experimental aesthetics is strongly oriented towards the natural sciences. Modern approaches mostly come from the fields of cognitive psychology (aesthetic cognitivism) or neuroscience (neuroaesthetics).
In the 1970s, Abraham Moles and Frieder Nake were among the first to analyze links between aesthetics, information processing, and information theory.
In the 1990s, Jürgen Schmidhuber described an algorithmic theory of beauty which takes the subjectivity of the observer into account and postulates that among several observations classified as comparable by a given subjective observer, the most aesthetically pleasing is the one that is encoded by the shortest description. He uses the differences between these lengths to account for subjective differences between the aesthetic tastes of different observers, as one's ability to efficiently describe an observation is based on their particular mental method of encoding data and the proximity of the observation to the subject's prior knowledge. The theory is inspired by principles of algorithmic information theory, especially minimum description length, which prefers mathematical models that use the least information to describe data. As an example, Schmidhuber notes that mathematicians tend to aesthetically prefer simple proofs with a short description in their formal language. Another concrete example describes an aesthetically pleasing human face whose proportions can be described by very few bits of information, drawing inspiration from less detailed 15th century proportion studies by Leonardo da Vinci and Albrecht Dürer. Schmidhuber's theory explicitly distinguishes between that which is beautiful and that which is interesting, stating that interestingness corresponds to the first derivative of subjectively perceived beauty. He supposes that every observer continually tries to improve the predictability and compressibility of their observations by identifying regularities like repetition, symmetry, and fractal self-similarity. Whenever the observer's learning process (which may be a predictive artificial neural network) leads to improved data compression such that the observation sequence can be described by fewer bits than before, the temporary interestingness of the data corresponds to the number of saved bits. 
This compression progress is proportional to the observer's internal reward, also called curiosity reward. A reinforcement learning algorithm is used to maximize future expected reward by learning to execute action sequences that cause additional interesting input data with yet unknown but learnable predictability or regularity. The principles can be implemented on artificial agents which then exhibit a form of artificial curiosity.
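As an illustration only (not Schmidhuber's own implementation), the core quantities can be sketched with a general-purpose compressor standing in for the observer's internal model. The function names, and the use of zlib compression levels as a proxy for a model that improves over time, are assumptions of this sketch:

```python
import zlib

def description_length(data: bytes, level: int) -> int:
    """Bytes needed to describe the data at a given compression level.

    The level stands in for the observer's internal model: higher
    levels find more regularity, i.e. a better-trained observer.
    """
    return len(zlib.compress(data, level))

def beauty_score(data: bytes, level: int = 9) -> float:
    """Shorter description -> higher score (inverse description length)."""
    return 1.0 / description_length(data, level)

def interestingness(data: bytes, old_level: int, new_level: int) -> int:
    """Compression progress: bytes saved when the observer's model improves."""
    return description_length(data, old_level) - description_length(data, new_level)

regular = b"ABAB" * 256               # repetitive, highly compressible
less_regular = bytes(range(256)) * 4  # fewer exploitable regularities

# A regular pattern admits a shorter description, so it scores as "more
# beautiful" under this toy measure; the bytes saved as the model improves
# correspond to its temporary interestingness.
assert beauty_score(regular) > beauty_score(less_regular)
assert interestingness(less_regular, 1, 9) >= 0
```

The assertions only demonstrate the ordering the theory predicts for these two byte strings; a real implementation would replace the fixed compressor with a learned predictive model whose compression improves through observation.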
Truth in beauty and mathematics.
Mathematical considerations, such as symmetry and complexity, are used for analysis in theoretical aesthetics. This is different from the aesthetic considerations of applied aesthetics used in the study of mathematical beauty. Aesthetic considerations such as symmetry and simplicity are used in areas of philosophy, such as ethics, and in theoretical physics and cosmology, to define truth outside of empirical considerations. Beauty and truth have been argued to be nearly synonymous, as reflected in the statement "Beauty is truth, truth beauty" in the poem "Ode on a Grecian Urn" by John Keats, or by the Hindu motto "Satyam Shivam Sundaram" (Satya (Truth) is Shiva (God), and Shiva is Sundaram (Beautiful)). The fact that judgments of beauty and judgments of truth are both influenced by processing fluency, which is the ease with which information can be processed, has been presented as an explanation for why beauty is sometimes equated with truth. Recent research found that people use beauty as an indication of truth in mathematical pattern tasks. However, scientists including the mathematician David Orrell and physicist Marcelo Gleiser have argued that the emphasis on aesthetic criteria such as symmetry is equally capable of leading scientists astray.
Computational approaches.
Computational approaches to aesthetics emerged amid efforts to use computer science methods "to predict, convey, and evoke emotional response to a piece of art". In this field, aesthetics is not considered to be dependent on taste but is a matter of cognition and, consequently, learning. In 1928, the mathematician George David Birkhoff created an aesthetic measure "M = O/C" as the ratio of order to complexity.
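Birkhoff's ratio can be sketched for polygons. The tallies below are illustrative assumptions (symmetry-group size as order, side count as complexity), not Birkhoff's exact polygon scoring:

```python
def birkhoff_measure(order: float, complexity: float) -> float:
    """Birkhoff's aesthetic measure M = O / C (order over complexity)."""
    if complexity <= 0:
        raise ValueError("complexity must be positive")
    return order / complexity

def regular_polygon_measure(n_sides: int) -> float:
    # Assumed tally: order = size of the dihedral symmetry group (2n:
    # n rotations plus n reflections), complexity = number of sides.
    return birkhoff_measure(2 * n_sides, n_sides)

def irregular_polygon_measure(n_sides: int) -> float:
    # An irregular polygon has only the identity symmetry: order = 1.
    return birkhoff_measure(1, n_sides)

# Under these choices, a square outscores an irregular quadrilateral,
# and adding sides without adding symmetry lowers the measure.
assert regular_polygon_measure(4) > irregular_polygon_measure(4)
assert irregular_polygon_measure(8) < irregular_polygon_measure(4)
```

The point of the sketch is the shape of the formula, not the particular numbers: any scheme where order rewards regularity and complexity penalizes detail reproduces Birkhoff's preference for simple, symmetric forms.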
Since about 2005, computer scientists have attempted to develop automated methods to infer aesthetic quality of images. Typically, these approaches follow a machine learning approach, where large numbers of manually rated photographs are used to "teach" a computer about what visual properties are of relevance to aesthetic quality. A study by Y. Li and C.J. Hu employed Birkhoff's measurement in their statistical learning approach where order and complexity of an image determined aesthetic value. The image complexity was computed using information theory while the order was determined using fractal compression. There is also the case of the Acquine engine, developed at Penn State University, that rates natural photographs uploaded by users.
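The cited systems are not public, but the information-theoretic complexity term in a Li–Hu style measure can be crudely approximated with the Shannon entropy of a grayscale pixel histogram. This is an assumption-laden stand-in for illustration; it omits the fractal-compression order term entirely:

```python
import math
from collections import Counter

def pixel_entropy(pixels: list[int]) -> float:
    """Shannon entropy (bits per pixel) of a grayscale pixel sequence,
    a crude stand-in for an information-theoretic complexity term."""
    counts = Counter(pixels)
    total = len(pixels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

flat_patch = [128] * 64       # uniform patch: zero entropy
noisy_patch = list(range(64)) # every value distinct: maximal entropy

assert pixel_entropy(flat_patch) == 0.0
assert abs(pixel_entropy(noisy_patch) - 6.0) < 1e-9  # log2(64) = 6 bits
```

A learned scorer in this style would feed such features, computed over many manually rated photographs, into a standard regression or classification model.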
There have also been relatively successful attempts with regard to chess and music. Computational approaches have also been attempted in film making as demonstrated by a software model developed by Chitra Dorai and a group of researchers at the IBM T.J. Watson Research Center. The tool predicted aesthetics based on the values of narrative elements. A relation between Max Bense's mathematical formulation of aesthetics in terms of "redundancy" and "complexity" and theories of musical anticipation was offered using the notion of Information Rate.
Evolutionary aesthetics.
Evolutionary aesthetics refers to evolutionary psychology theories in which the basic aesthetic preferences of "Homo sapiens" are argued to have evolved in order to enhance survival and reproductive success. One example is that humans are argued to find beautiful, and to prefer, landscapes which were good habitats in the ancestral environment. Another example is that body symmetry and proportion are important aspects of physical attractiveness, which may be because they indicate good health during body growth. Evolutionary explanations for aesthetic preferences are important parts of evolutionary musicology, Darwinian literary studies, and the study of the evolution of emotion.
Applied aesthetics.
As well as being applied to art, aesthetics can also be applied to cultural objects, such as crosses or tools. For example, aesthetic coupling between art-objects and medical topics was made by speakers working for the US Information Agency. Art slides were linked to slides of pharmacological data, which improved attention and retention through simultaneous activation of the intuitive right brain and the rational left. It can also be used in topics as diverse as cartography, mathematics, gastronomy, fashion and website design.
Other approaches.
Guy Sircello has pioneered efforts in analytic philosophy to develop a rigorous theory of aesthetics, focusing on the concepts of beauty, love and sublimity. In contrast to romantic theorists, Sircello argued for the objectivity of beauty and formulated a theory of love on that basis.
British philosopher and theorist of conceptual art aesthetics, Peter Osborne, makes the point that "'post-conceptual art' aesthetic does not concern a particular type of contemporary art so much as the historical-ontological condition for the production of contemporary art in general ...". Osborne noted that contemporary art is 'post-conceptual' in a public lecture delivered in 2010.
Gary Tedman has put forward a theory of a subjectless aesthetics derived from Karl Marx's concept of alienation, and Louis Althusser's antihumanism, using elements of Freud's group psychology, defining a concept of the 'aesthetic level of practice'.
Gregory Loewen has suggested that the subject is key in the interaction with the aesthetic object. The work of art serves as a vehicle for the projection of the individual's identity into the world of objects, as well as being the irruptive source of much of what is uncanny in modern life. As well, art is used to memorialize individuated biographies in a manner that allows persons to imagine that they are part of something greater than themselves.
Criticism.
The philosophy of aesthetics as a practice has been criticized by some sociologists and writers of art and society. Raymond Williams, for example, argues that there is no unique and individual aesthetic object which can be extrapolated from the art world, but rather a continuum of cultural forms and experience, within which ordinary speech and experiences may signal as art. We may frame various "works" or "creations" as art, but this framing remains within the institution or special event which creates it, leaving some works, and other phenomena that might be considered art, outside the framework.
Pierre Bourdieu disagrees with Kant's idea of the "aesthetic". He argues that Kant's "aesthetic" merely represents an experience that is the product of an elevated class habitus and scholarly leisure as opposed to other possible and equally valid "aesthetic" experiences which lay outside Kant's narrow definition.
Timothy Laurie argues that theories of musical aesthetics "framed entirely in terms of appreciation, contemplation or reflection risk idealizing an implausibly unmotivated listener defined solely through musical objects, rather than seeing them as a person for whom complex intentions and motivations produce variable attractions to cultural objects and practices".
2134 | Ark of the Covenant | The Ark of the Covenant, also known as the Ark of the Testimony or the Ark of God, is an artifact believed to be the most sacred relic of the Israelites, which is described as a wooden chest, covered in pure gold, with an elaborately designed lid called the mercy seat. According to the Book of Exodus, the Ark contained the two stone tablets of the Ten Commandments. According to the New Testament Book of Hebrews, it also contained Aaron's rod and a pot of manna.
The biblical account relates that approximately one year after the Israelites' exodus from Egypt, the Ark was created according to the pattern given to Moses by God when the Israelites were encamped at the foot of Mount Sinai. Thereafter, the gold-plated acacia chest was carried by its staves by the Levites approximately 2,000 cubits in advance of the people when on the march. God spoke with Moses "from between the two cherubim" on the Ark's cover.
Biblical account.
Construction and description.
According to the Book of Exodus, God instructed Moses to build the Ark during his 40-day stay upon Mount Sinai. He was shown the pattern for the tabernacle and furnishings of the Ark, and told that it would be made of shittim wood (also known as acacia wood) to house the Tablets of Stone. Moses instructed Bezalel and Aholiab to construct the Ark.
The Book of Exodus gives detailed instructions on how the Ark is to be constructed. It is to be two and a half cubits in length, a cubit and a half in breadth, and a cubit and a half in height, of acacia wood. Then it is to be gilded entirely with gold, and a crown or molding of gold is to be put around it. Four rings of gold are to be attached to its four corners, two on each side—and through these rings staves of shittim wood overlaid with gold for carrying the Ark are to be inserted; and these are not to be removed. A golden lid, the "kapporet" (translated as "mercy seat" or "cover"), which is ornamented with two golden cherubim, is to be placed above the Ark. Missing from the account are instructions concerning the thickness of the mercy seat and details about the cherubim, other than that the cover be beaten out over the ends of the Ark and that the cherubim form the space where God will appear. The Ark is finally to be placed under a veil to conceal it.
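The approximate metric size follows from simple arithmetic. The sketch below assumes the commonly cited scholarly value of about 0.457 m per cubit (ancient cubit lengths varied, so this figure is an assumption, not a biblical datum) and takes the dimensions of two and a half by one and a half by one and a half cubits from Exodus 25:10:

```python
CUBIT_M = 0.457  # assumed cubit length in metres; ancient cubits varied

def cubits_to_metres(cubits: float) -> float:
    """Convert a length in cubits to metres under the assumed cubit."""
    return cubits * CUBIT_M

# Dimensions of the Ark in cubits, per Exodus 25:10.
length, breadth, height = 2.5, 1.5, 1.5

dims_m = [round(cubits_to_metres(c), 2) for c in (length, breadth, height)]
assert dims_m == [1.14, 0.69, 0.69]  # roughly 1.1 m by 0.7 m by 0.7 m
```

With a longer "royal" cubit of about 0.52 m, the same arithmetic gives roughly 1.3 m by 0.8 m by 0.8 m, which is why published approximations vary.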
Mobile vanguard.
The biblical account continues that, after its creation by Moses, the Ark was carried by the Israelites during their 40 years of wandering in the desert. Whenever the Israelites camped, the Ark was placed in a separate room in a sacred tent, called the Tabernacle.
When the Israelites, led by Joshua toward the Promised Land, arrived at the banks of the River Jordan, the Ark was carried in the lead, preceding the people, and was the signal for their advance. During the crossing, the river grew dry as soon as the feet of the priests carrying the Ark touched its waters, and remained so until the priests—with the Ark—left the river after the people had passed over. As memorials, twelve stones were taken from the Jordan at the place where the priests had stood.
During the Battle of Jericho, the Ark was carried around the city once a day for six days, preceded by the armed men and seven priests sounding seven trumpets of rams' horns. On the seventh day, the seven priests sounding the seven trumpets of rams' horns before the Ark compassed the city seven times and, with a great shout, Jericho's wall fell down flat and the people took the city.
After the defeat at Ai, Joshua lamented before the Ark. When Joshua read the Law to the people between Mount Gerizim and Mount Ebal, they stood on each side of the Ark. We next hear of the Ark in Bethel, where it was being cared for by the priest Phinehas, the grandson of Aaron. According to this verse, it was consulted by the people of Israel when they were planning to attack the Benjaminites at the Battle of Gibeah. Later the Ark was kept at Shiloh, another religious centre some north of Bethel, at the time of the prophet Samuel's apprenticeship, where it was cared for by Hophni and Phinehas, two sons of Eli.
Capture by the Philistines.
According to the biblical narrative, a few years later the elders of Israel decided to take the Ark out onto the battlefield to assist them against the Philistines, having recently been defeated at the battle of Eben-Ezer. They were again heavily defeated, with the loss of 30,000 men. The Ark was captured by the Philistines and Hophni and Phinehas were killed. The news of its capture was at once taken to Shiloh by a messenger "with his clothes rent, and with earth upon his head". The old priest, Eli, fell dead when he heard it; and his daughter-in-law, bearing a son at the time the news of the Ark's capture was received, named him Ichabod—explained as "The glory has departed Israel" in reference to the loss of the Ark. Ichabod's mother died at his birth.
The Philistines took the Ark to several places in their country, and at each place misfortune befell them. At Ashdod it was placed in the temple of Dagon. The next morning Dagon was found prostrate, bowed down, before it; and on being restored to his place, he was on the following morning again found prostrate and broken. The people of Ashdod were smitten with tumors; a plague of rodents was sent over the land. This may have been the bubonic plague. The affliction of tumours was also visited upon the people of Gath and of Ekron, whither the Ark was successively removed.
Return of the Ark to the Israelites.
After the Ark had been among them for seven months, the Philistines, on the advice of their diviners, returned it to the Israelites, accompanying its return with an offering consisting of golden images of the tumors and mice wherewith they had been afflicted. The Ark was set up in the field of Joshua the Beth-shemite, and the Beth-shemites offered sacrifices and burnt offerings. Out of curiosity the men of Beth-shemesh gazed at the Ark; and as a punishment, seventy of them (fifty thousand and seventy in some translations) were struck down by the Lord. The Bethshemites sent to Kirjath-jearim, or Baal-Judah, to have the Ark removed; and it was taken to the house of Abinadab, whose son Eleazar was sanctified to keep it. Kirjath-jearim remained the abode of the Ark for twenty years. Under Saul, the Ark was with the army before he first met the Philistines, but the king was too impatient to consult it before engaging in battle. In 1 Chronicles 13:3 it is stated that the people were not accustomed to consulting the Ark in the days of Saul.
In the days of King David.
In the biblical narrative, at the beginning of his reign over the United Monarchy, King David removed the Ark from Kirjath-jearim amid great rejoicing. On the way to Zion, Uzzah, one of the drivers of the cart that carried the Ark, put out his hand to steady the Ark, and was struck dead by God for touching it. The place was subsequently named "Perez-Uzzah", literally "the breach of Uzzah", as a result. David, in fear, carried the Ark aside into the house of Obed-edom the Gittite, instead of carrying it on to Zion, and it stayed there for three months.
On hearing that God had blessed Obed-edom because of the presence of the Ark in his house, David had the Ark brought to Zion by the Levites, while he himself, "girded with a linen ephod[...] danced before the Lord with all his might" and in the sight of all the public gathered in Jerusalem, a performance which caused him to be scornfully rebuked by his first wife, Saul's daughter Michal. In Zion, David put the Ark in the tent he had prepared for it, offered sacrifices, distributed food, and blessed the people and his own household. David used the tent as a personal place of prayer.
The Levites were appointed to minister before the Ark. David's plan of building a temple for the Ark was stopped on the advice of the prophet Nathan. The Ark was with the army during the siege of Rabbah; and when David fled from Jerusalem at the time of Absalom's conspiracy, the Ark was carried along with him until he ordered Zadok the priest to return it to Jerusalem.
In Solomon's Temple.
According to the Biblical narrative, when Abiathar was dismissed from the priesthood by King Solomon for having taken part in Adonijah's conspiracy against David, his life was spared because he had formerly borne the Ark. Solomon worshipped before the Ark after his dream in which God promised him wisdom.
During the construction of Solomon's Temple, a special inner room, named the "Kodesh HaKodashim" ('Holy of Holies'), was prepared to receive and house the Ark; and when the Temple was dedicated, the Ark—containing the original tablets of the Ten Commandments—was placed therein. When the priests emerged from the holy place after placing the Ark there, the Temple was filled with a cloud, "for the glory of the Lord had filled the house of the Lord".
When Solomon married Pharaoh's daughter, he caused her to dwell in a house outside Zion, as Zion was consecrated because it contained the Ark. King Josiah also had the Ark returned to the Temple, from which it appears to have been removed by one of his predecessors (cf. 2 Chronicles 33–34 and 2 Kings 21–23).
In the days of King Hezekiah.
King Hezekiah is the last biblical figure mentioned as having seen the Ark. Hezekiah is also known for protecting Jerusalem against the Assyrian Empire by improving the city walls and diverting the waters of the Gihon Spring through a tunnel known today as Hezekiah's Tunnel, which channeled the water inside the city walls to the Pool of Siloam.
In a noncanonical text known as the Treatise of the Vessels, Hezekiah is identified as one of the kings who had the Ark and the other treasures of Solomon's Temple hidden during a time of crisis. This text lists the following hiding places, which it says were recorded on a bronze tablet: (1) a spring named Kohel or Kahal with pure water in a valley with a stopped-up gate; (2) a spring named Kotel (or "wall" in Hebrew); (3) a spring named Zedekiah; (4) an unidentified cistern; (5) Mount Carmel; and (6) locations in Babylon.
Some scholars also credit Hezekiah with having written all or part of the Book of Kohelet (Ecclesiastes in the Christian tradition), in particular its famously enigmatic epilogue. Notably, the epilogue appears to refer to the Ark story with references to almond blossoms (i.e., Aaron's rod), locusts, silver, and gold. The epilogue then cryptically refers to a pitcher broken at a fountain and a wheel broken at a cistern.
Although scholars disagree on whether the Pool of Siloam's pure spring waters were used by pilgrims for ritual purification, many scholars agree that a stepped pilgrimage road between the pool and the Temple had been built in the first century CE. This roadway has been partially excavated, but the west side of the Pool of Siloam remains unexcavated.
The Babylonian conquest and aftermath.
In 587 BC, the Babylonians destroyed Jerusalem and Solomon's Temple. There is no record of what became of the Ark in the Books of Kings and Chronicles. An ancient Greek version of the biblical third Book of Ezra, 1 Esdras, suggests that the Babylonians took away the vessels of the ark of God, but does not mention taking away the Ark itself.
In Rabbinic literature, the final disposition of the Ark is disputed. Some rabbis hold that it must have been carried off to Babylon, while others hold that it must have been hidden lest it be carried off into Babylon and never brought back. A late 2nd-century rabbinic work known as the Tosefta states the opinions of these rabbis that Josiah, the king of Judah, stored away the Ark, along with the jar of manna, a jar containing the holy anointing oil, the rod of Aaron which budded, and a chest given to Israel by the Philistines. This was said to have been done in order to prevent their being carried off into Babylon, as had already happened to the other vessels. Rabbi Eliezer and Rabbi Shimon, in the same rabbinic work, state that the Ark was, in fact, taken into Babylon. Rabbi Yehudah, dissenting, says that the Ark was stored away in its own place, meaning somewhere on the Temple Mount.
Service of the Kohathites.
The Kohathites were one of the Levite houses from the Book of Numbers. Theirs was the responsibility to care for "the most holy things" in the tabernacle. When the camp, then wandering the Wilderness, set out, the Kohathites would enter the tabernacle with Aaron and cover the ark with the screening curtain and "then they shall put on it a covering of fine leather, and spread over that a cloth all of blue, and shall put its poles in place." The ark was one of the items of the tent of meeting that the Kohathites were responsible for carrying.
Samaritan tradition.
Samaritan tradition claims that the Ark of the Covenant had been kept at the sanctuary of YHWH on Mount Gerizim until the split between Samaritanism and Judaism, which arose when the priest Eli stole the Ark and established a rival cult at Shiloh.
Archaeology.
Archaeological evidence shows strong cultic activity at Kiriath-Jearim in the 8th and 7th centuries BC, well after the ark was supposedly removed from there to Jerusalem. In particular, archaeologists found a large elevated podium, associated with the Northern Kingdom and not the Southern Kingdom, which may have been a shrine. Thomas Römer suggests that this may indicate that the ark was not moved to Jerusalem until much later, possibly during the reign of King Josiah (reigned c. 640–609 BC). He notes that this might explain why the ark featured prominently in the history before Solomon, but not after. Additionally, 2 Chronicles 35:3 indicates that it was moved during King Josiah's reign.
Some scholars believe the story of the Ark was written independently around the 8th century in a text referred to as the "Ark Narrative" and then incorporated into the main biblical narrative just before the Babylonian exile.
Römer also suggests that the ark may have originally carried sacred stones "of the kind found in the chests of pre-Islamic Bedouins" and speculates that these may have been either a statue of Yahweh or a pair of statues depicting both Yahweh and his companion goddess Asherah. In contrast, Scott Noegel has argued that the parallels between the ark and these practices "remain unconvincing" in part because the Bedouin objects lack the ark's distinctive structure, function, and mode of transportation. Specifically, unlike the ark, the Bedouin chests "contained no box, no lid, and no poles," they did not serve as the throne or footstool of a god, they were not overlaid with gold, did not have kerubim figures upon them, there were no restrictions on who could touch them, and they were transported on horses or camels. Noegel suggests that the ancient Egyptian bark is a more plausible model for the Israelite ark, since Egyptian barks had all the features just mentioned. Noegel adds that the Egyptians also were known to place written covenants beneath the feet of statues, providing a further parallel to the placement of the covenantal tablets inside the ark.
References in Abrahamic religions.
Tanakh.
The Ark is first mentioned in the Book of Exodus and then numerous times in Deuteronomy, Joshua, Judges, I Samuel, II Samuel, I Kings, I Chronicles, II Chronicles, Psalms, and Jeremiah.
In the Book of Jeremiah, it is referenced by Jeremiah, who, speaking in the days of Josiah, prophesied a future time, possibly the end of days, when the Ark will no longer be talked about or be made use of again.
Rashi comments on this verse that "The entire people will be so imbued with the spirit of sanctity that God's Presence will rest upon them collectively, as if the congregation itself was the Ark of the Covenant."
Second Book of Maccabees.
According to Second Maccabees, at the beginning of chapter 2, the prophet Jeremiah hid the tabernacle and the Ark in a cave on "the mountain from the top of which Moses saw God's promised land". That mountain would be Mount Nebo, located in what is now Jordan.
New Testament.
In the New Testament, the Ark is mentioned in the Letter to the Hebrews and the Revelation to St. John. Hebrews 9:4 states that the Ark contained "the golden pot that had manna, and Aaron's rod that budded, and the tablets of the covenant." Revelation 11:19 says the prophet saw God's temple in heaven opened, "and the ark of his covenant was seen within his temple."
The contents of the ark are seen by theologians such as the Church Fathers and Thomas Aquinas as personified by Jesus Christ: the manna as the Holy Eucharist; Aaron's rod as Jesus' eternal priestly authority; and the tablets of the Law, as the Lawgiver himself.
Catholic scholars connect this verse with the Woman of the Apocalypse in Revelation 12:2, which immediately follows, and say that the Blessed Virgin Mary is identified as the "Ark of the New Covenant." Carrying the saviour of mankind within her, she herself became the Holy of Holies. This is the interpretation given in the third century by Gregory Thaumaturgus, and in the fourth century by Saint Ambrose, Saint Ephraem of Syria and Saint Augustine. The Catholic Church teaches this in the Catechism of the Catholic Church: "Mary, in whom the Lord himself has just made his dwelling, is the daughter of Zion in person, the Ark of the Covenant, the place where the glory of the Lord dwells. She is 'the dwelling of God[...] with men'."
In the Gospel of Luke, the author's accounts of the Annunciation and Visitation are constructed using eight points of literary parallelism to compare Mary to the Ark.
Saint Athanasius, the bishop of Alexandria, is credited with writing about the connections between the Ark and the Virgin Mary: "O noble Virgin, truly you are greater than any other greatness. For who is your equal in greatness, O dwelling place of God the Word? To whom among all creatures shall I compare you, O Virgin? You are greater than them all O (Ark of the) Covenant, clothed with purity instead of gold! You are the Ark in which is found the golden vessel containing the true manna, that is, the flesh in which Divinity resides" ("Homily of the Papyrus of Turin").
Quran.
The Ark is referred to in the Quran, in Surah 2 ("The Heifer"), verse 248.
The Ark in other faiths.
According to Uri Rubin, the Ark of the Covenant has a religious basis in Islam (and the Baha'i faith), which gives it special significance.
Whereabouts.
Since its disappearance from the Biblical narrative, there have been a number of claims of having discovered or of having possession of the Ark, and several possible places have been suggested for its location.
Maccabees.
2 Maccabees 2:4–10, written around 100 BC, says that the prophet Jeremiah, "being warned by God" before the Babylonian invasion, took the Ark, the Tabernacle, and the Altar of Incense, and buried them in a cave, informing those of his followers who wished to find the place that it should remain unknown "until the time that God should gather His people again together, and receive them unto mercy."
Ethiopia.
The Ethiopian Orthodox Tewahedo Church claims to possess the Ark of the Covenant in Axum. The Ark is currently kept under guard in a treasury near the Church of Our Lady Mary of Zion. Replicas of the tablets within the Ark, or "Tabots", are kept in every Ethiopian Orthodox Tewahedo Church, and kept in its own holy of holies, each with its own dedication to a particular saint; the most popular of these include Saint Mary, Saint George and Saint Michael.
The "Kebra Nagast" is often said to have been composed to legitimise the Solomonic dynasty, which ruled the Ethiopian Empire following its establishment in 1270, but this is not the case. It was originally composed in some other language (Coptic or Greek), then translated into Arabic, and translated into Ge'ez in 1321. It narrates how the real Ark of the Covenant was brought to Ethiopia by Menelik I with divine assistance, while a forgery was left in the Temple in Jerusalem. Although the "Kebra Nagast" is the best-known account of this belief, it predates the document. Abu al-Makarim, writing in the last quarter of the twelfth century, makes one early reference to this belief that they possessed the Ark. "The Abyssinians possess also the Ark of the Covenant", he wrote, and, after a description of the object, describes how the liturgy is celebrated upon the Ark four times a year, "on the feast of the great nativity, on the feast of the glorious Baptism, on the feast of the holy Resurrection, and on the feast of the illuminating Cross."
In his controversial 1992 book "The Sign and the Seal", British writer Graham Hancock reports on the Ethiopian belief that the ark spent several years in Egypt before it came to Ethiopia via the Nile River, where it was kept in the islands of Lake Tana for about four hundred years and finally taken to Axum. (Archaeologist John Holladay of the University of Toronto called Hancock's theory "garbage and hogwash"; Edward Ullendorff, a former professor of Ethiopian Studies at the University of London, said he "wasted a lot of time reading it.") In a 1992 interview, Ullendorff says that he personally examined the ark held within the church in Axum in 1941 while a British Army officer. Describing the ark there, he says, "They have a wooden box, but it's empty. Middle- to late-medieval construction, when these were fabricated ad hoc."
On 25 June 2009, the patriarch of the Orthodox Church of Ethiopia, Abune Paulos, said he would announce to the world the next day the unveiling of the Ark of the Covenant, which he said had been kept safe and secure in a church in Axum, Ethiopia. The following day, on 26 June 2009, the patriarch announced that he would not unveil the Ark after all, but that instead he could attest to its current status.
Southern Africa.
The Lemba people of South Africa and Zimbabwe have claimed that their ancestors carried the Ark south, calling it the "ngoma lungundu" or "voice of God", eventually hiding it in a deep cave in the Dumghe mountains, their spiritual home.
On 14 April 2008, in a UK Channel 4 documentary, Tudor Parfitt, taking a literalist approach to the Biblical story, described his research into this claim. He says that the object described by the Lemba has attributes similar to the Ark. It was of similar size, was carried on poles by priests, was not allowed to touch the ground, was revered as a voice of their God, and was used as a weapon of great power, sweeping enemies aside.
In his book "The Lost Ark of the Covenant" (2008), Parfitt also suggests that the Ark was taken to Arabia following the events depicted in the Second Book of Maccabees, and cites Arabic sources which maintain it was brought in distant times to Yemen. Genetic Y-DNA analyses in the 2000s have established a partially Middle-Eastern origin for a portion of the male Lemba population but no specific Jewish connection. Lemba tradition maintains that the Ark spent some time in a place called Sena, which might be Sena in Yemen. Later, it was taken across the sea to East Africa and may have been taken inland at the time of the Great Zimbabwe civilization. According to their oral traditions, some time after the arrival of the Lemba with the Ark, it self-destructed. Using a core from the original, the Lemba priests constructed a new one. This replica was discovered in a cave by a Swedish German missionary named Harald von Sicard in the 1940s and eventually found its way to the Museum of Human Science in Harare.
Europe.
Rome.
The Ark of the Covenant was said to have been kept in the Basilica of St. John Lateran, surviving the pillages of Rome by Alaric I and Gaiseric but lost when the basilica burned.
"Rabbi Eliezer ben José stated that he saw in Rome the mercy-seat of the temple. There was a bloodstain on it. On inquiry he was told that it was a stain from the blood which the high priest sprinkled thereon on the Day of Atonement."
Ireland.
Between 1899 and 1902, the British-Israel Association of London carried out excavations of the Hill of Tara in Ireland looking for the Ark of the Covenant. Irish nationalists such as Maud Gonne and the Royal Society of Antiquaries of Ireland (RSAI) campaigned successfully to have them stopped before they destroyed the hill. A non-invasive survey by archaeologist Conor Newman, carried out from 1992 until 1995, found no evidence of the Ark.
The British Israelites believed that the Ark was located at the grave of the Egyptian princess Tea Tephi, who according to Irish legend came to Ireland in the 6th century BC and married the Irish king Érimón. Because of the historical importance of Tara, Irish nationalists like Douglas Hyde and W. B. Yeats voiced their protests in newspapers, and in 1902 Maud Gonne led a protest against the excavations at the site.
In popular culture.
Philip Kaufman conceived of the Ark of the Covenant as the main plot device of Steven Spielberg's 1981 adventure film "Raiders of the Lost Ark", where it is found by Indiana Jones in the Egyptian city of Tanis in 1936. In early 2020, a prop version made for the film (which does not actually appear onscreen) was featured on "Antiques Roadshow".
In the Danish family film "The Lost Treasure of the Knights Templar" from 2006, the main part of the treasure found in the end is the Ark of the Covenant. The power of the Ark comes from static electricity stored in separated metal plates like a giant Leyden jar.
In Harry Turtledove's novel "Alpha and Omega" (2019) the ark is found by archeologists, and the characters have to deal with the proven existence of God.
Yom HaAliyah.
Yom HaAliyah (Aliyah Day) is an Israeli national holiday celebrated annually on the tenth of the Hebrew month of Nisan to commemorate the Israelites crossing the Jordan River into the Land of Israel while carrying the Ark of the Covenant.
Angles (tribe).
The Angles were one of the main Germanic peoples who settled in Great Britain in the post-Roman period. They founded several kingdoms of the Heptarchy in Anglo-Saxon England. Their name is the root of the name "England" ("land of Ængle"). According to Tacitus, writing around 100 AD, a people known as Angles (Anglii) lived east of the Lombards and Semnones, who lived near the Elbe river.
Etymology.
The name of the Angles may have been first recorded in Latinised form, as "Anglii", in the "Germania" of Tacitus. It is thought to derive from the name of the area they originally inhabited, the Anglia Peninsula ("Angeln" in modern German, "Angel" in modern Danish).
Multiple theories concerning the etymology of the name have been hypothesised.
During the fifth century, all Germanic tribes who invaded Britain were referred to as "Englisc", "Ængle" or "Engle", and all were speakers of Old English (which was known as "Englisc", "Ænglisc", or "Anglisc"). "Englisc" and its descendant, "English", also go back to Proto-Indo-European "*h₂enǵʰ-", meaning narrow.
Pope Gregory I, in an epistle, simplified the Latinised name "Anglii" to "Angli", the latter form developing into the preferred form of the word. The country remained "Anglia" in Latin. Alfred the Great's translation of Orosius's history of the world uses "Angelcynn" (-kin) to describe the English people; Bede uses "Angelfolc" (-folk); also such forms as "Engel", "Englan" (the people), "Englaland", and "Englisc" occur, all showing i-mutation.
Greco-Roman historiography.
Tacitus.
The earliest known mention of the Angles may be in chapter 40 of Tacitus's "Germania" written around AD 98. Tacitus describes the "Anglii" as one of the more remote Suebic tribes compared to the Semnones and Langobardi, who lived on the Elbe and were better known to the Romans. He grouped the Angles with several other tribes in that region, the Reudigni, Aviones, Varini, Eudoses, Suarines, and Nuithones. These were all living behind ramparts of rivers and woods, and therefore inaccessible to attack.
He gives no precise indication of their geographical situation but states that, together with the six other tribes, they worshipped Nerthus, or Mother Earth, whose sanctuary was located on "an island in the Ocean". The Eudoses are the Jutes; these names probably refer to localities in Jutland or on the Baltic coast. The coast contains sufficient estuaries, inlets, rivers, islands, swamps, and marshes to have been then inaccessible to those not familiar with the terrain, such as the Romans, who considered it unknown, inaccessible, with a small population and of little economic interest.
The majority of scholars believe that the Anglii lived on the coasts of the Baltic Sea, probably in the southern part of the Jutland peninsula. This view is based partly on Old English and Danish traditions regarding persons and events of the fourth century, and partly because striking affinities to the cult of Nerthus as described by Tacitus are to be found in pre-Christian Scandinavian religion.
Ptolemy.
Surviving versions of Ptolemy's atlas "Geography" (2.10), written around AD 150, describe the Angles in a confusing manner. In one passage, the "Sueboi Angeilloi" (in Greek, equivalent to the Latin spelling "Suevi Angili") live in a stretch of land between the northern Rhine and the central Elbe, apparently touching neither river, with the Suebic Langobardi on the Rhine to their west and the Suebic Semnones stretching along the Elbe to their east. This is unexpected. However, as pointed out by Gudmund Schütte, the Langobards also appear as the "Laccobardi" in another position near the Elbe and the Saxons, which is considered more likely to be correct, and the Angles probably lived in that region as well. Owing to the uncertainty of this passage, much speculation existed regarding the original home of the Anglii.
One theory is that they or part of them dwelt or moved among other coastal people, perhaps confederated up to the basin of the Saale (in the neighbourhood of the ancient canton of Engilin) on the Unstrut valleys below the Kyffhäuserkreis, from which region the "Lex Anglorum et Werinorum hoc est Thuringorum" is believed by many to have come. The ethnic names of Frisians and Warines are also attested in these Saxon districts.
A second possible solution is that these Angles of Ptolemy are not those of Schleswig at all. According to Julius Pokorny, the Angri- in Angrivarii, the -angr in Hardanger and the Angl- in Anglii all come from the same root meaning "bend", but in different senses. In other words, the similarity of the names is strictly coincidental and does not reflect any ethnic unity beyond Germanic. Gudmund Schütte, in his analysis of Ptolemy, believes that the Angles have simply been moved by an error coming from Ptolemy's use of imperfect sources. He points out that Angles are placed correctly just to the northeast of the Langobardi, but that these have been duplicated, so that they appear once, correctly, on the lower Elbe, and a second time, incorrectly, at the northern Rhine.
Medieval historiography.
Bede (died 735) stated that the Anglii, before coming to Great Britain, dwelt in a land called Angulus, "which lies between the province of the Jutes and the Saxons, and remains unpopulated to this day." Similar evidence is given by the 9th-century "Historia Brittonum". King Alfred the Great and the chronicler Æthelweard identified this place with Anglia, in the province of Schleswig (Slesvig; though it may then have been of greater extent), and this identification agrees with the indications given by Bede.
In the Norwegian seafarer Ohthere of Hålogaland's account of a two-day voyage from the Oslo fjord to Schleswig, he reported the lands on his starboard bow, and Alfred appended the note "on these islands dwelt the "Engle" before they came hither". Confirmation is afforded by English and Danish traditions relating to two kings named Wermund and Offa of Angel, from whom the Mercian royal family claimed descent and whose exploits are connected with Anglia, Schleswig, and Rendsburg.
Danish tradition has preserved record of two governors of Schleswig, father and son, in their service, Frowinus (Freawine) and Wigo (Wig), from whom the royal family of Wessex claimed descent. During the fifth century, the Anglii invaded Great Britain, after which time their name does not recur on the continent except in the title of the legal code issued to the Thuringians: "Lex Angliorum et Werinorum hoc est Thuringorum".
The Angles are the subject of a legend about Pope Gregory I, who happened to see a group of Angle children from Deira for sale as slaves in the Roman market. As the story was told by Bede, Gregory was struck by the unusual appearance of the slaves and asked about their background. When told they were called "Anglii" (Angles), he replied with a Latin pun that translates well into English: "Bene, nam et angelicam habent faciem, et tales angelorum in caelis decet esse coheredes" (It is well, for they have an angelic face, and such people ought to be co-heirs of the angels in heaven). Supposedly, this encounter inspired the pope to launch a mission to bring Christianity to their countrymen.
Archaeology.
The province of Schleswig has proved rich in prehistoric antiquities that date apparently from the fourth and fifth centuries. A large cremation cemetery has been found at Borgstedt, between Rendsburg and Eckernförde, and it has yielded many urns and brooches closely resembling those found in pagan graves in England. Of still greater importance are the great deposits at Thorsberg moor (in Anglia) and Nydam, which contained large quantities of arms, ornaments, articles of clothing, agricultural implements, etc., and in Nydam, even ships. By the help of these discoveries, Angle culture in the age preceding the invasion of Britannia can be pieced together.
Anglian kingdoms in England.
According to sources such as the "History" of Bede, after the invasion of Britannia, the Angles split up and founded the kingdoms of Northumbria, East Anglia, and Mercia. H. R. Loyn has observed in this context that "a sea voyage is perilous to tribal institutions", yet apparently tribe-based kingdoms were formed in England. Early times had two northern kingdoms (Bernicia and Deira) and two midland ones (Middle Anglia and Mercia), which by the seventh century had resolved themselves into two Angle kingdoms, viz., Northumbria and Mercia.
Northumbria held suzerainty amidst the Teutonic presence in the British Isles in the 7th century, but was eclipsed by the rise of Mercia in the 8th century. Both kingdoms fell in the great assaults of the Danish Viking armies in the 9th century. Their royal houses were effectively destroyed in the fighting, and their Angle populations came under the Danelaw. Further south, the Saxon kings of Wessex withstood the Danish assaults. Then in the late 9th and early 10th centuries, the kings of Wessex defeated the Danes and liberated the Angles from the Danelaw.
They united their house in marriage with the surviving Angle royalty, and were accepted by the Angles as their kings. This marked the passing of the old Anglo-Saxon world and the dawn of the "English" as a new people. The regions of East Anglia and Northumbria are still known by their original titles. Northumbria once stretched as far north as what is now southeast Scotland, including Edinburgh, and as far south as the Humber estuary and even the river Witham.
The rest of that people stayed at the centre of the Angle homeland in the northeastern portion of the modern German "Bundesland" of Schleswig-Holstein, on the Jutland Peninsula. There, a small peninsular area is still called Anglia today and is formed as a triangle drawn roughly from modern Flensburg on the Flensburger Fjord to the City of Schleswig and then to Maasholm, on the Schlei inlet.
References.
Sources
Attribution:
Aster CT-80.
The Aster CT-80, an early (1982) home/personal computer developed by the small Dutch company MCP (later renamed Aster Computers), was sold in its first incarnation as a kit for hobbyists and later as a ready-to-use machine. It consisted of several Eurocard PCBs with DIN 41612 connectors and a backplane, all based on a 19-inch rack configuration. It was the first commercially available Dutch personal/home computer. The Aster could run software written for the popular Tandy TRS-80 while fixing many of that computer's problems, but it could also run CP/M software, with a large Transient Program Area (TPA) of free memory and a full 80×25 display, and it could be used as a Videotext terminal. Although the Aster was a clone of the TRS-80 Model I, it was in fact more compatible with the TRS-80 Model III, and it ran all the software of these systems, including games. It also had a built-in speaker compatible with such game software.
Models.
Three models were sold. The first model (launched June 1982) looked like the IBM PC, a rectangular base unit with two floppy drives on the front, and a monitor on top with a separate detachable keyboard. The second incarnation was a much smaller unit the width of two 5" floppy drives stacked on top of each other, and the third incarnation looked like a flattened Apple with a built-in keyboard.
All units ran much faster than the original TRS-80, at 4 MHz (with a software-selectable throttle to the original speed for compatibility purposes), and the display supported upper and lower case, hardware snow suppression (video RAM bus arbitration logic), and an improved character font set. The floppy disk interface supported dual density and disk capacities up to 800 KB, more than four times the capacity of the original TRS-80. A special version of NewDos/80 (an improved TRS-DOS-compatible disk operating system) was used to support these disk capacities when using the TRS-80 compatibility mode.
For the educational market, a version of the first model was produced with a new plastic enclosure (the first Asters had an all-metal enclosure) that also had an opening on the top in which a cassette recorder could be placed. This model was used in a cluster with one Aster (with disk drives) for the teacher and eight diskless versions for the pupils. The pupils could download software from the teacher's computer through a network based on a fast serial connection, and could send their work back the same way. There was also hardware in place through which the teacher could see each pupil's screen on his own monitor.
Working modes.
The Aster used 64 KB of RAM and had the unique feature of supporting two fundamentally different internal architectures: when turned on without a boot floppy or with a TRS-DOS floppy, the Aster would be fully TRS-80 compatible, with 48 KB of RAM. When the boot loader detected a CP/M floppy, the Aster would reconfigure its internal memory architecture on the fly to optimally support CP/M with 60 KB free RAM for programs (TPA) and an 80 x 25 display. This dual-architecture capability only existed on one other TRS-80 clone, the LOBO Max-80.
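The dual-boot decision described above can be sketched as follows. This is a minimal illustration only: the boot-sector signature check and the exact memory-map values are invented assumptions, not the documented behaviour of the Aster's BIOS.

```python
# Hypothetical sketch of the Aster CT-80's boot-time mode selection.
# The signature check and memory-map values are illustrative
# assumptions, not the actual BIOS logic.

TRS80_MODE = {"user_ram_kb": 48, "display": "64x16", "compat": "TRS-80"}
CPM_MODE = {"user_ram_kb": 60, "display": "80x25", "compat": "CP/M"}

def select_memory_map(boot_sector: bytes) -> dict:
    """Return the memory configuration implied by the inserted floppy:
    a recognised CP/M boot disk gets the large-TPA layout, anything
    else falls back to full TRS-80 compatibility."""
    # Assume a CP/M disk begins with a Z80 JP opcode (0xC3) into its
    # boot loader -- an invented marker used here for illustration.
    if boot_sector[:1] == b"\xC3":
        return CPM_MODE
    return TRS80_MODE
```

At reset the choice is then a single dispatch, e.g. `select_memory_map(read_boot_sector())`; the real hardware additionally had to remap RAM banks and switch the video timing on the fly.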
With a special configuration tool, the CT-80 could reconfigure its floppy drivers to read and write the floppies of about 80 other CP/M systems.
A third mode was entered with a special boot floppy which turned the Aster into a Videotex terminal with a 40×25 display and a Videotex character set. The software used the Aster's built-in RS-232 interface to control a modem through which it could contact a Prestel service provider.
Sales.
Most Aster CT-80s (about 10,000 of them) were sold to schools for computer education, in a project first known as the "honderd scholen project" (one hundred schools project), which later grew to involve many more than one hundred schools. MCP received this order from the Dutch government because their computer met all the technical and other demands, including the demands that the computers be of Dutch origin, be built in the Netherlands, and be usable in a network (Aster developed special software and hardware for that). Later, however, the government turned around and gave 50% of the order to Philips and its P2000 home computer, even though the P2000 did not meet all the technical demands, was made in Austria, and had neither network hardware nor software.
Company.
Aster computers was based in the small town of Arkel near the town of Gorinchem.
Initially Aster computer b.v. was called MCP (Music print Computer Product), because it specialized in computer-assisted printing of sheet music. The director of the company was interested in microprocessor technology and noticed there was a market for selling kits to computer-building amateurs, so the firm, which then employed four people, started selling electronic kits to hobbyists. They also assembled kits for people without soldering skills, especially the "Junior Computer" from Elektor (a copy of the KIM-1) and the ZX80 from Sinclair. Among the kits sold were alternative floppy disk drives for TRS-80 computers. These needed the infamous TRS-80 expansion interface, which was very expensive and had a very unreliable floppy disk controller because it used the WD1771 floppy disk controller chip without an external "data separator". To fix this problem, MCP developed a small plug-in board that could be placed in the socket for the WD1771 and that contained a data separator and a socket for the WD1791 to support dual-density operation. Still, the expansion interface was expensive and, due to its design, unreliable, so they decided to develop their own alternative in the form of an improved floppy disk controller and printer interface that could be built right into a floppy disk enclosure. The lack of RAM expansion offered by this solution was addressed by a service in which the 16 KB RAM chips inside the base unit were replaced by 64 KB RAM chips.
While this went on MCP renamed itself to "MCP CHIP" but ran into problems with the German computer magazine "CHIP", and had to return to its former name. At that time MCP did also sell imported home computers like the TRS-80, the Video Genie (another TRS-80 clone), the Luxor ABC 80 and the Apple II.
They also sold the exotic Olivetti M20, a very early 16-bit personal computer that was one of the very few systems to use a Z8000 CPU.
After designing their own fully functional replacement for the TRS-80 expansion interface (which was never commercialized), the company realized that it could do better than just redesigning the expansion interface. The TRS-80 was a great computer, but it was lacking in several areas: the display logic and the resulting display 'snow' were irritating, as was the missing lower-case support; the CPU speed could be improved; the quality and layout of the keyboard were bothersome; and the floppy disk capacity and reliability were low. Also, the more interesting software offered for CP/M systems could not run well on a TRS-80. So they decided to design a TRS-80 and CP/M software-compatible computer system, which (following the lead of Apple Computer) they decided to name after a "typical Dutch flower": the Aster CT-80 (CP/M/Tandy-1980). Why they went with the aster rather than the better-known tulip is unknown; perhaps they thought that would be too presumptuous, or perhaps the fact that "Aster" is also a Dutch girl's name had something to do with it. Remarkably, "Aster" was also the name given to a Dutch supercomputer much later, in 2002.
The first version of the Aster consisted of four Eurocards: a Z80 CPU card with 64 KB of memory, a Motorola MC6845-based video card, a double-density floppy disk controller card, and a "keyboard/RS-232/cassette interface" card, plus a backplane card (which connected all the other cards) and a keyboard. It was intended for hobbyists and sold as a kit consisting of the parts and the PCBs for the computer and attached keyboard. After selling a few kits, MCP became convinced there was a much bigger market for an improved model sold as a complete working system. However, the original kit version lacked many features that prevented its use as a serious computer system. Because the original designer had left the company, another employee completely redesigned most of the system. He added a display snow-remover circuit; true 80/64-column text mode support, with different-size letters for TRS-80 and CP/M modes, so that in TRS-80 mode the full screen was used rather than just a 64×16 portion of the 80×25 screen; an improved font set, adding "grey-scale" versions of the TRS-80 mosaic graphics and many special PETSCII-like characters; a more flexible and reliable floppy disk controller and keyboard interface; and many other small improvements. An enclosure was also developed for the main computer system (in the form of a 19-inch rack for the Eurocards) and for two floppy disk drives and the power supply. A software engineer was hired to write the special "dual boot mode" BIOS and the special CP/M BIOS. The dual-boot BIOS detected whether a TRS-DOS or an Aster CP/M disk had been placed in the drive and, depending on the type of disk, reorganised the internal memory architecture of the system either to be 100% TRS-80 compatible or to optimally support CP/M, with as much "workspace" as possible and the 80×25 video mode.
The BIOS was also responsible for switching to ROM BASIC when the system was turned on with the break key pressed, and it later supported a primitive LAN system using the RS-232 port with modified cabling. The very first of the ready-made computers were sold with the "kit" versions of the Eurocards; the version with redesigned cards came a month or so later.
Soon the little shop became much too small, and the company moved to a much larger factory building nearby (formerly a window-glass factory) and mass-produced the Aster for a period of a few years, during which its staff grew twentyfold.
After the Aster had been on the market for a few years, Tandy released its own improved model, the TRS-80 Model III, which solved many of the same problems the Aster had solved; but the Model III still did not fully support CP/M as the Aster did. In the meantime, IBM had released its original IBM PC, which incidentally looked remarkably like the Aster's set-up of a base unit with floppy drives plus a separate keyboard.
The Aster was chosen for Dutch schools by the Dutch ministry of education, in a set-up with eight diskless Asters and one Aster with high-capacity floppy drives, all connected by a LAN based on the Aster's high-speed serial port hardware and special cables that allowed any single computer on the LAN to broadcast to all the others. The floppy-based system was operated by the teacher, who could send programs and data from his floppy disk to the students' diskless systems thanks to the special BIOS in those systems. The students could send programs and data back to the teacher through the same LAN, or could save to a cassette recorder built into the diskless units. Through a special "video switch" the teacher was also able to see a copy of each student's display on his own screen. About a thousand such systems were sold to many hundreds of Dutch schools.
Because of cash flow problems (resulting from growing too fast, insufficient financial backing, technical problems, and a sudden problem with Z80 processor deliveries) the company suddenly folded even before it came to full fruition.
Perhaps the Aster computer inspired another Dutch computer firm to name its computer after another typical Dutch flower: Tulip's Tulip System-1, which appeared at about the same time Aster folded.
Most of the engineers who designed the hardware and software of the Aster went on to design hardware and software for the (then new) MSX system for a company called "Micro Technology b.v.".
Unreleased add ons.
To enhance and modernize the Aster CT-80, the company also designed three alternative video display adapters to supplement or replace the TRS-80-compatible video card (due to the modular nature of the Aster, upgrading the system was simply a matter of changing the video card and/or CPU card).
A hard disk interface was also in the works, which would add a SCSI interface and the necessary software. A working prototype was developed that added a 40 MB hard disk.
On the software front, work was being done to replace the aging user interface of CP/M, the Console Command Processor (CCP), with the more modern ZCPR.
Finally, a replacement for the aging Z80 processor was being developed in the form of an Intel 8086 board, together with additional 512 KB 16-bit memory boards. Such replacements of CPU and memory components were possible because the Aster CT-80's backplane was designed to support both 8- and 16-bit processors, and the machine used a modular Eurocard-based design with slots to spare for expansion. In theory, the system could support the Z80 and the 8086 simultaneously. Plans were formulated to support CP/M-86 and even MS-DOS.
Unfortunately none of these extensions to the system became available because the company folded before any of them could be released.
Arthur Wellesley.
Arthur Wellesley may refer to:
Lists of animated television series.
These are lists of animated television series. Animated television series are television programs produced by means of animation. Animated series produced for theaters are not included in these lists; for those, see the List of animated short film series. These lists include compilation series of theatrical shorts, such as "The Bugs Bunny Show", since they often feature some new wrap-around animation.
Atlanta Braves.
The Atlanta Braves are an American professional baseball team based in the Atlanta metropolitan area. The Braves compete in Major League Baseball (MLB) as a member club of the National League (NL) East division. The Braves were founded in Boston, Massachusetts, in 1871, as the Boston Red Stockings. The club was known by various names until the franchise began operating as the Boston Braves in 1912. The Braves are the oldest continuously operating professional sports franchise in America.
After 81 seasons and one World Series title in Boston, the club moved to Milwaukee, Wisconsin in 1953. With a roster of star players like Hank Aaron, Eddie Mathews, and Warren Spahn, the Milwaukee Braves won the World Series in 1957. Despite the team's success, fan attendance declined, and the club's owners moved the team to Atlanta, Georgia in 1966.
The Braves did not find much success in Atlanta until 1991. From 1991 to 2005, the Braves were one of the most successful teams in baseball, winning an unprecedented 14 consecutive division titles, making an MLB record eight consecutive National League Championship Series appearances, and producing one of the greatest pitching rotations in the history of baseball including Hall of Famers Greg Maddux, John Smoltz, and Tom Glavine.
The Braves are one of the two remaining National League charter franchises that debuted in 1876. The club has won an MLB record 22 divisional titles, 18 National League pennants, and four World Series championships. The Braves are the only Major League Baseball franchise to have won the World Series in three different home cities. At the end of the 2022 season, the Braves' overall win–loss record is .
History.
Boston (1871–1952).
1871–1913.
The Cincinnati Red Stockings, established in 1869 as the first openly all-professional baseball team, voted to dissolve after the 1870 season. Player-manager Harry Wright, with brother George and two other Cincinnati players, then went to Boston, Massachusetts at the invitation of Boston Red Stockings founder Ivers Whitney Adams to form the nucleus of the "Boston Red Stockings", a charter member of the National Association of Professional Base Ball Players (NAPBBP).
The original Boston Red Stockings team and its successors can lay claim to being the oldest continuously playing team in American professional sports. The only other team that has been organized as long, the Chicago Cubs, did not play for the two years following the Great Chicago Fire of 1871. Two young players hired away from the Forest City club of Rockford, Illinois, turned out to be the biggest stars during the NAPBBP years: pitcher Al Spalding, founder of Spalding sporting goods, and second baseman Ross Barnes.
Led by the Wright brothers, Barnes, and Spalding, the Red Stockings dominated the National Association, winning four of that league's five championships. The team became one of the National League's charter franchises in 1876, sometimes called the "Red Caps" (as a new Cincinnati Red Stockings club was another charter member).
The Boston Red Caps played in the first game in the history of the National League, on Saturday, April 22, 1876, defeating the Philadelphia Athletics, 6–5.
Although somewhat stripped of talent in the National League's inaugural year, Boston bounced back to win the 1877 and 1878 pennants. The Red Caps/Beaneaters were one of the league's dominant teams during the 19th century, winning a total of eight pennants. For most of that time, their manager was Frank Selee. Boston came to be called the "Beaneaters" in 1883 while retaining red as the team color. The 1898 team finished 102–47, a club record for wins that would stand for almost a century. Stars of those 1890s Beaneater teams included the "Heavenly Twins", Hugh Duffy and Tommy McCarthy, as well as "Slidin'" Billy Hamilton.
The team was decimated when the American League's new Boston entry set up shop in 1901. Many of the Beaneaters' stars jumped to the new team, which offered contracts that the Beaneaters' owners did not even bother to match. The Beaneaters managed only one winning season from 1900 to 1913 and lost 100 games five times. In 1907, the Beaneaters temporarily eliminated the last bit of red from their stockings because their manager thought the red dye could cause wounds to become infected, as noted in "The Sporting News Baseball Guide" in the 1940s.
The American League club's owner, Charles Taylor, wasted little time in adopting Red Sox as his team's first official nickname; up to that point, they had been called by the generic "Americans". Media-driven nickname changes to the "Doves" in 1907 and the "Rustlers" in 1911 did nothing to change the National League club's luck. The team became the "Braves" for the first time before the 1912 season, when the club's president, John M. Ward, named it after the owner, James Gaffney. Gaffney was called one of the "braves" of New York City's political machine, Tammany Hall, which used an Indian chief as its symbol.
1914: Miracle.
Two years later, the Braves put together one of the most memorable seasons in baseball history. After a dismal 4–18 start, the Braves seemed to be on pace for a last-place finish. On July 4, 1914, the Braves lost both games of a doubleheader to the Brooklyn Dodgers. The consecutive losses put their record at 26–40 and the Braves were in last place, 15 games behind the league-leading New York Giants, who had won the previous three league pennants. After a day off, the Braves started to put together a hot streak, and from July 6 through September 5, the Braves went 41–12.
On September 7 and 8, the Braves took two of three games from the New York Giants and moved into first place. The Braves tore through September and early October, closing with 25 wins against six losses, while the Giants went 16–16. They were the only team, under the old eight-team league format, to win a pennant after being in last place on the Fourth of July. They were in last place as late as July 18, but were close to the pack, moving into fourth on July 21 and second place on August 12.
Despite their amazing comeback, the Braves entered the World Series as a heavy underdog to Connie Mack's Philadelphia A's. Nevertheless, the Braves swept the Athletics to win the world championship, the first unqualified sweep in the young history of the modern World Series (the 1907 Series had included one tied game). Meanwhile, Johnny Evers won the Chalmers Award.
The Braves played the World Series (as well as the last few games of the 1914 season) at Fenway Park, since their normal home, the South End Grounds, was too small. However, the Braves' success inspired owner Gaffney to build a modern park, Braves Field, which opened in August 1915. It was the largest park in the majors at the time, with 40,000 seats and a very spacious outfield. The park was novel for its time; public transportation brought fans right to the park.
1915–1953.
After contending for most of 1915 and 1916, the Braves only twice posted winning records from 1917 to 1932. The lone highlight of those years came when Judge Emil Fuchs bought the team in 1923 to bring his longtime friend, pitching great Christy Mathewson, back into the game. However, Mathewson died in 1925, leaving Fuchs in control of the team.
Fuchs was committed to building a winner, but the damage from the years prior to his arrival took some time to overcome. The Braves finally managed to be competitive in 1933 and 1934 under manager Bill McKechnie, but Fuchs' revenue was severely depleted due to the Great Depression.
Looking for a way to get more fans and more money, Fuchs worked out a deal with the New York Yankees to acquire Babe Ruth, who had started his career with the Red Sox. Fuchs made Ruth team vice president, and promised him a share of the profits. He was also granted the title of assistant manager, and was to be consulted on all of the Braves' deals. Fuchs even suggested that Ruth, who had long had his heart set on managing, could take over as manager once McKechnie stepped down—perhaps as early as 1936.
At first, it appeared that Ruth was the final piece the team needed in 1935. On opening day, he had a hand in all of the Braves' runs in a 4–2 win over the Giants. However, that proved to be the only time the Braves were over .500 all year, and events went downhill quickly. While Ruth could still hit, he could do little else: he could not run, and his fielding was so poor that three of the Braves' pitchers threatened to go on strike if he remained in the lineup. It soon became obvious that he was vice president and assistant manager in name only, and that Fuchs' promise of a share of team profits was empty. In fact, Ruth discovered that Fuchs expected him to invest some of "his" money in the team.
Seeing a franchise in complete disarray, Ruth retired on June 1, only six days after he hit what turned out to be the last three home runs of his career. He had wanted to quit as early as May 12, but Fuchs persuaded him to hang on so he could appear in every National League park. The Braves finished 38–115, the worst season in franchise history; their .248 winning percentage is the second-worst of the modern era and the second-worst in National League history, ahead of only the 1899 Cleveland Spiders (.130).
Fuchs lost control of the team in August 1935, and the new owners tried to change the team's image by renaming it the Boston Bees. This did little to change the team's fortunes. After five uneven years, a new owner, construction magnate Lou Perini, changed the nickname back to the Braves. He immediately set about rebuilding the team. World War II slowed things down a little, but the team rode the pitching of Warren Spahn to impressive seasons in 1946 and 1947.
In 1948, the team won the pennant, behind the pitching of Spahn and Johnny Sain, who won 39 games between them. The remainder of the rotation was so thin that in September, "Boston Post" writer Gerald Hern wrote this poem about the pair:
The poem received such a wide audience that the sentiment, usually now paraphrased as "Spahn and Sain and pray for rain", entered the baseball vocabulary. However, in the 1948 season the Braves won games started by Spahn and Sain at the same rate as their games overall.
The 1948 World Series, which the Braves lost in six games to the Indians, turned out to be the Braves' last hurrah in Boston. In 1950, Sam Jethroe became the team's first African American player, making his major league debut on April 18. Amid four mediocre seasons, attendance steadily dwindled until, on March 13, 1953, Perini, who had recently bought out his original partners, announced he was moving the team to Milwaukee, where the Braves had their top farm club, the Brewers. Milwaukee had long been a possible target for relocation. Bill Veeck had tried to return his St. Louis Browns there earlier the same year (Milwaukee was the original home of that franchise), but his proposal had been voted down by the other American League owners.
Milwaukee (1953–1965).
Milwaukee went wild over the Braves, drawing a then-NL record 1.8 million fans. The Braves finished 92–62 in their first season in Milwaukee. The success of the relocated team showed that baseball could succeed in new markets, and the Philadelphia Athletics, St. Louis Browns, Brooklyn Dodgers, and New York Giants left their hometowns within the next five years.
As the 1950s progressed, the reinvigorated Braves became increasingly competitive. Sluggers Eddie Mathews and Hank Aaron drove the offense (they hit a combined 1,226 home runs as Braves, with 850 of those coming while the franchise was in Milwaukee and 863 coming while they were teammates), often aided by another power hitter, Joe Adcock, while Warren Spahn, Lew Burdette, and Bob Buhl anchored the rotation. The 1956 Braves finished second, only one game behind the Brooklyn Dodgers.
In 1957, the Braves celebrated their first pennant in nine years spearheaded by Aaron's MVP season, as he led the National League in both home runs and RBI. Perhaps the most memorable of his 44 round-trippers that season came on September 23, a two-run walk-off home run that gave the Braves a 4–2 victory over the St. Louis Cardinals and clinched the League championship. The team then went on to its first World Series win in over 40 years, defeating the powerful New York Yankees of Berra, Mantle, and Ford in seven games. One-time Yankee Burdette, the Series MVP, threw three complete-game victories against his former team, giving up only two earned runs.
In 1958, the Braves again won the National League pennant and jumped out to a three-games-to-one lead in the World Series, once more against the New York Yankees, thanks in part to the strength of Spahn's and Burdette's pitching. But the Yankees stormed back to take the last three games, due in large part to World Series MVP Bob Turley's pitching.
The 1959 season saw the Braves finish in a tie with the Los Angeles Dodgers, both with 86–68 records. Many residents of Chicago and Milwaukee were hoping for a Sox–Braves Series, as the two cities sit only a short distance apart, but it was not to be: Milwaukee dropped a best-of-three playoff in two straight games to the Dodgers, who went on to defeat the Chicago White Sox in the World Series.
The next six years were up-and-down for the Braves. The 1960 season featured two no-hitters, by Burdette and Spahn, but Milwaukee finished second, seven games behind the Pittsburgh Pirates, who went on to win the World Series that year. (One year earlier, the Braves had been on the winning end of the 13-inning near-perfect game by Pirates pitcher Harvey Haddix.) The 1961 season saw the Braves drop to fourth in the standings, despite Spahn recording his 300th victory and pitching another no-hitter that year.
Aaron hit 45 home runs in 1962, a Milwaukee career high for him, but this did not translate into wins for the Braves, as they finished fifth. The next season, Aaron again hit 44 home runs and notched 130 RBI, and 42-year-old Warren Spahn was once again the ace of the staff, going 23–7. However, none of the other Braves produced at that level, and the team finished in the "second division" for the first time in its short history in Milwaukee.
The Braves were mediocre as the 1960s began, with an inflated win total fed by the expansion New York Mets and Houston Colt .45s. To this day, the Milwaukee Braves are the only major league team that played more than one season and never had a losing record.
Perini sold the Braves to a Chicago-based group led by William Bartholomay in 1962, and almost immediately Bartholomay began shopping the Braves to a larger television market. Keen to attract them, the fast-growing city of Atlanta, led by Mayor Ivan Allen Jr., constructed a new $18 million, 52,000-seat ballpark, Atlanta Stadium, in less than one year; it officially opened in 1965 in hopes of luring an existing major league baseball and/or NFL/AFL team. After the city failed to lure the Kansas City A's (who ultimately moved to Oakland in 1968), the Braves announced their intention to move to Atlanta for the 1965 season. However, an injunction filed in Wisconsin kept the Braves in Milwaukee for one final year, and the Braves completed the move to Atlanta in 1966.
Eddie Mathews is the only Braves player to have played for the organization in all three cities that they have been based in. Mathews played with the Braves for their last season in Boston, the team's entire tenure in Milwaukee, and their first season in Atlanta.
Atlanta (1966–present).
1966–1974.
The Braves were a .500 team in their first few years in Atlanta; 85–77 in 1966, 77–85 in 1967, and 81–81 in 1968. The 1967 season was the Braves' first losing season since 1952, their last year in Boston. In 1969, with the onset of divisional play, the Braves won the first-ever National League West Division title, before being swept by the "Miracle Mets" in the National League Championship Series. They would not be a factor during the next decade, posting only two winning seasons between 1970 and 1981 – in some cases, fielding teams as bad as the worst Boston teams.
In the meantime, fans had to be satisfied with the achievements of Hank Aaron. In the relatively hitter-friendly confines and higher-than-average altitude of Atlanta Stadium ("The Launching Pad"), he actually increased his offensive production. Atlanta also produced batting champions in Rico Carty (in 1970) and Ralph Garr (in 1974). Overshadowed by Aaron's historic home run pursuit was the fact that three Atlanta sluggers hit 40 or more home runs in 1973: Darrell Evans and Davey Johnson, along with Aaron himself.
By the end of the 1973 season, Aaron had hit 713 home runs, one short of Ruth's record. Throughout the winter he received racially motivated death threats, but stood up well under the pressure. On April 4, opening day of the next season, he hit No. 714 in Cincinnati, and on April 8, in front of his home fans and a national television audience, he finally beat Ruth's mark with a home run to left-center field off left-hander Al Downing of the Los Angeles Dodgers. Aaron spent most of his career as a Milwaukee and Atlanta Brave before being traded to the Milwaukee Brewers on November 2, 1974.
Ted Turner era.
1976–1977: Ted Turner buys the team.
In 1976, the team was purchased by media magnate Ted Turner, owner of superstation WTBS, as a means to keep the team (and one of his main programming staples) in Atlanta. The financially strapped Turner used money already paid to the team for their broadcast rights as a down-payment. It was then that Atlanta Stadium was renamed Atlanta–Fulton County Stadium. Turner quickly gained a reputation as a quirky, hands-on baseball owner. On May 11, 1977, Turner appointed himself manager, but because MLB passed a rule in the 1950s barring managers from holding a financial stake in their teams, Turner was ordered to relinquish that position after one game (the Braves lost 2–1 to the Pittsburgh Pirates to bring their losing streak to 17 games).
Turner used the Braves as a major programming draw for his fledgling cable network, making the Braves the first franchise to have a nationwide audience and fan base. WTBS marketed the team as "The Atlanta Braves: America's Team", a nickname that still sticks in some areas of the country, especially the South. Among other things, in 1976 Turner suggested that pitcher Andy Messersmith, who wore jersey number 17, take the field with the nickname "Channel" on his back, so that his jersey would read "Channel 17" and promote the television station that aired Braves games. Major League Baseball quickly nixed the idea.
1978–1990.
After three straight losing seasons, Bobby Cox was hired for his first stint as manager for the 1978 season. He promoted 22-year-old slugger Dale Murphy into the starting lineup. Murphy hit 77 home runs over the next three seasons, but he struggled on defense, unable to adeptly play either catcher or first base. In 1980, Murphy was moved to center field and demonstrated excellent range and throwing ability, while the Braves earned their first winning season since 1974. Cox was fired after the 1981 season and replaced with Joe Torre, under whose leadership the Braves attained their first divisional title since 1969.
Strong performances from Bob Horner, Chris Chambliss, pitcher Phil Niekro, and short relief pitcher Gene Garber helped the Braves, but no Brave was more acclaimed than Murphy, who won both a Most Valuable Player and a Gold Glove award. Murphy also won an MVP award the following season, but the Braves began a period of decline that defined the team throughout the 1980s. Murphy, excelling in defense, hitting, and running, was consistently recognized as one of the league's best players, but the Braves averaged only 65 wins per season between 1985 and 1990. Their lowest point came in 1988, when they lost 106 games. The 1986 season saw the return of Bobby Cox as general manager. Also in 1986, the team stopped using their Indian-themed mascot, Chief Noc-A-Homa.
1991–1994: From worst to first.
From 1991 to 2005 the Braves were one of the most consistently winning teams in baseball. The Braves won a record 14 straight division titles, five National League pennants, and one World Series title in 1995. Bobby Cox returned to the dugout as manager in the middle of the 1990 season, replacing Russ Nixon. The Braves finished the year with the worst record in baseball, at 65–97. They traded Dale Murphy to the Philadelphia Phillies after it was clear he was becoming a less dominant player. Pitching coach Leo Mazzone began developing young pitchers Tom Glavine, Steve Avery, and John Smoltz into future stars. That same year, the Braves used the number one overall pick in the 1990 MLB draft to select Chipper Jones, who became one of the best hitters in team history. Perhaps the Braves' most important move was not on the field, but in the front office. Immediately after the season, John Schuerholz was hired away from the Kansas City Royals as general manager.
The following season, Glavine, Avery, and Smoltz would be recognized as the best young pitchers in the league, winning 52 games among them. Meanwhile, behind position players David Justice, Ron Gant and unexpected league Most Valuable Player and batting champion Terry Pendleton, the Braves overcame a 39–40 start, winning 55 of their final 83 games over the last three months of the season and edging the Los Angeles Dodgers by one game in one of baseball's more memorable playoff races.
The "Worst to First" Braves, who had not won a divisional title since 1982, captivated the city of Atlanta and the entire southeast during their improbable run to the flag. They defeated the Pittsburgh Pirates in a very tightly contested seven-game NLCS only to lose the World Series, also in seven games, to the Minnesota Twins. The series, considered by many to be one of the greatest ever, was the first time a team that had finished last in its division one year went to the World Series the next; both the Twins and Braves accomplished the feat.
Despite the 1991 World Series loss, the Braves' success would continue. In 1992, the Braves returned to the NLCS and once again defeated the Pirates in seven games, culminating in a dramatic game seven win. Francisco Cabrera's two-out single that scored David Justice and Sid Bream capped a three-run rally in the bottom of the ninth inning that gave the Braves a 3–2 victory. It was the first time in post-season history that the tying and winning runs had scored on a single play in the ninth inning. The Braves lost the World Series to the Toronto Blue Jays, however.
In 1993, the Braves signed Cy Young Award-winning pitcher Greg Maddux from the Chicago Cubs, leading many baseball insiders to declare the team's pitching staff the best in baseball. The 1993 team posted a franchise-best 104 wins after a dramatic pennant race with the San Francisco Giants, who won 103 games. The Braves needed a stunning 55–19 finish to edge out the Giants, who led the Braves by nine games in the standings as late as August 11. However, the Braves fell in the NLCS to the Philadelphia Phillies in six games.
In 1994, in a realignment of the National League's divisions following the 1993 expansion, the Braves moved to the Eastern Division. This realignment was the main cause of the team's heated rivalry with the New York Mets during the mid-to-late 1990s.
The players' strike cut short the 1994 season before the division championships, with the Braves six games behind the Montreal Expos and 48 games left to play.
1995–2005: World Series champs and 14 straight division titles.
The Braves returned strong in the following strike-shortened season (144 games instead of the customary 162) and beat the Cleveland Indians in the 1995 World Series, squelching claims by many Braves critics that they were the "Buffalo Bills of Baseball" (January 1996 issue of "Beckett Baseball Card Monthly"). With this World Series victory, the Braves became the first team in Major League Baseball to win world championships in three different cities. With their strong pitching as a constant, the Braves appeared in the 1996 and 1999 World Series, losing both to the New York Yankees, managed by Joe Torre, a former Braves manager.
In October 1996, Time Warner acquired Ted Turner's Turner Broadcasting System and all of its assets, including its cable channels and the Atlanta Braves. Over the next few years, Ted Turner's presence as the owner of the team would diminish.
A 95–67 record in 2000 produced a ninth consecutive division title. However, a sweep by the St. Louis Cardinals in the National League Division Series prevented the Braves from reaching the NL Championship Series.
They had a streak of 14 division titles from 1991 to 2005, three in the Western Division and eleven in the Eastern, interrupted only in 1994 when the strike ended the season early. Pitching was not the only constant in the Braves organization: Cox was the Braves' manager throughout, while Schuerholz remained the team's GM until after the 2007 season, when he was promoted to team president. Terry Pendleton finished his playing career elsewhere but returned to the Braves system as the hitting coach.
Liberty Media era.
Liberty Media buys the team.
In December 2005, team owner Time Warner, which had inherited the Braves when it purchased Turner Broadcasting System in 1996, announced it was putting the team up for sale. Liberty Media began negotiations to purchase the team.
In February 2007, after more than a year of negotiations, Time Warner agreed to a deal to sell the Braves to Liberty Media, which owned a large amount of stock in Time Warner, pending approval by 75 percent of MLB owners and the Commissioner of Baseball, Bud Selig. The deal included the exchange of the Braves, valued in the deal at $450 million, a hobbyist magazine publishing company, and $980 million cash, for 68.5 million shares of Time Warner stock held by Liberty, worth approximately $1.48 billion. Team President Terry McGuirk anticipated no change in the front office structure, personnel, or day-to-day operations of the Braves, and Liberty did not participate in day-to-day operations. On May 16, 2007, Major League Baseball's owners approved the sale. The Braves are one of only two Major League Baseball teams under majority corporate ownership (and the only NL team with this distinction); the other team is the Toronto Blue Jays (owned by Canadian media conglomerate Rogers Communications).
2010: Cox's final season.
The 2010 Braves' season featured an attempt to reclaim a postseason berth for the first time since 2005. The Braves were once again skippered by Bobby Cox, in his 25th and final season managing the team. The Braves started the 2010 season slowly, enduring a nine-game losing streak in April, but then reeled off a nine-game winning streak from May 26 through June 3, their longest since winning 16 in a row in 2000. On May 31, the Braves defeated the then-first-place Philadelphia Phillies at Turner Field to take sole possession of first place in the National League East, a position they maintained through the middle of August.
The last time the Atlanta Braves led the NL East on August 1 was in 2005. On July 13, 2010, at the 2010 MLB All-Star Game in Anaheim, Braves catcher Brian McCann was awarded the All-Star Game MVP Award for his clutch two-out, three-run double in the seventh inning to give the National League its first win in the All-Star Game since 1996. He became the first Brave to win the All-Star Game MVP Award since Fred McGriff did so in 1994. The Braves made two deals before the trade deadline to acquire Álex González, Rick Ankiel and Kyle Farnsworth from the Toronto Blue Jays and Kansas City Royals, giving up shortstop Yunel Escobar, pitchers Jo-Jo Reyes and Jesse Chavez, outfielder Gregor Blanco and three minor leaguers. On August 18, 2010, they traded three pitching prospects for first baseman Derrek Lee from the Chicago Cubs.
On August 22, 2010, against the Chicago Cubs, Mike Minor struck out 12 batters across six innings, an Atlanta Braves single-game rookie strikeout record. The Braves dropped to second in the NL East in early September but won the NL Wild Card. They lost to the San Francisco Giants in the National League Division Series in four games, every one of which was decided by one run. After the series-clinching victory for the Giants in Game 4, Bobby Cox was given a standing ovation by the fans, as well as by the players and coaches of both the Braves and the Giants.
2012: Chipper's last season.
In 2012, the Braves began their 138th season after a disappointing end to the 2011 season. On March 22, the Braves announced that third baseman Chipper Jones would retire following the 2012 season, after 19 Major League seasons with the team. The Braves had also lost several key players through trades or free agency, including pitcher Derek Lowe, shortstop Alex González, and outfielder Nate McLouth. To compensate, the team brought in key players such as outfielder Michael Bourn, along with shortstops Tyler Pastornicky and Andrelton Simmons.
Washington ended up winning the first division title in its franchise's history, but the Braves held on to the top spot in the NL wild-card race. Under a new MLB rule for the 2012 season, the top two wild-card teams in each league play each other in a one-game playoff before the Division Series, and the Braves faced the St. Louis Cardinals in the first-ever Wild Card Game. The Braves lost 6–3, ending their season.
2017: Front office changes.
After the 2016 season was over the Braves promoted interim manager Brian Snitker to full-time manager. On October 2, 2017, John Coppolella resigned as general manager of the Braves amid a Major League Baseball investigation into Atlanta's international signings, having committed what the Braves termed "a breach of MLB rules regarding the international player market". On November 13, 2017, the Braves announced Alex Anthopoulos as the new general manager and executive vice president. John Hart was removed as team president and assumed a senior adviser role with the organization.
Braves chairman Terry McGuirk apologized to fans "on behalf of the entire Braves family" for the scandal. McGuirk described Anthopoulos as "a man of integrity" and said that "he will operate in a way that will make all of our Braves fans proud." On November 17, 2017, the Braves announced that John Hart had stepped down as senior advisor for the organization. Hart said in a statement that "with the hiring of Alex Anthopoulos as general manager, this organization is in great hands."
MLB investigation and penalties.
On November 21, 2017, Major League Baseball Commissioner Rob Manfred announced the findings of the MLB investigation into Atlanta's international signings. Manfred ruled that the Braves must forfeit 13 international prospects, including highly touted Kevin Maitan, an infielder from Venezuela who signed for $4.25 million in 2016.
The team also forfeited a third-round draft pick in the 2018 draft. Former Braves general manager John Coppolella was placed on baseball's permanently ineligible list. Additionally, the Braves were prohibited from signing any international player for more than $10,000 during the 2019–20 signing period, and their international signing bonus pool for the 2020–21 signing period was reduced by 50%.
2018–2022: Return to the postseason and World Series title.
The Braves introduced a new mascot named Blooper on January 27, 2018, at the Atlanta Braves fan fest. Blooper succeeded the Braves' previous mascot, "Homer of the Brave", who had been retired. The Braves began a new streak of NL East division titles in 2018, when they went 90–72, and in 2019 their 97–65 record was their best since 2003. However, in neither season did the Braves advance past the Division Series. In the 2020 National League Championship Series against the Dodgers, the Braves led three games to one before the Dodgers came back to win the series and advance to the World Series.
The Braves returned to the NLCS in 2021 after beating the Milwaukee Brewers 3–1 in the 2021 NLDS, on the heels of a go-ahead home run by Freddie Freeman in the bottom of the eighth inning of Game 4. With the score tied at 4, Freeman delivered a blast to left-center field to give the Braves a 5–4 lead heading into the top of the ninth. After allowing a leadoff single to Eduardo Escobar, Will Smith retired the next three batters to secure the Braves' berth in the NLCS.
On October 23, 2021, the Braves defeated the Dodgers in the National League Championship Series, a rematch of the 2020 NLCS, in six games to advance to the World Series for the first time since 1999, thereby securing their first pennant in 22 years. They defeated the Houston Astros in six games to win their fourth World Series title.
Logos.
From 1945 to 1955, the Braves' primary logo consisted of the head of an Indian warrior; from 1956 to 1965, it was a laughing Indian with a mohawk and one feather in his hair. When the Braves moved to Atlanta in 1966, the "Braves" script was added underneath the laughing Indian, and in 1985 the Braves made a small script change to the logo. The modern logo, the word "Braves" in cursive with a tomahawk below it, debuted in 1987. In 2018, the Braves made a subtle color change to the primary logo.
World Series championships.
In the 120 years since the inception of the World Series (118 total World Series played), the Braves franchise has won four World Series championships, with at least one in each of the three cities in which they have played.
Uniforms.
The Braves updated their uniform set in 1987, returning to buttoned jerseys and belted pants and restoring the classic look the team wore in the 1950s. For the 2023 season, the Braves have four uniform combinations. The white home uniform features red and navy piping, the "Braves" script and tomahawk in front, and radially arched (vertically arched until 2005; sewn onto a nameplate until 2012) navy letters and red numbers with navy trim on the back. The gray road uniforms are identical to the white home uniforms save for the "Atlanta" script in front. The cap worn with both uniforms was initially the red-brimmed navy cap with the script "A" in front; in 2008, an all-navy cap was introduced and became the primary road cap the following season.
The Braves' alternate navy blue road jersey features red lettering, a red tomahawk, and silver piping. Unlike the home uniforms, which are worn based on a schedule, the road uniforms are chosen on game day by the starting pitcher. However, they are also subject to Major League Baseball rules requiring the road team to wear uniforms that contrast with those of the home team; due to this rule, the gray uniforms are worn when the home team chooses to wear navy blue, and sometimes when the home team chooses to wear black.
For home games, the Braves also have two alternate uniforms. The Friday night red alternate home uniform features navy piping, a navy "Braves" script and tomahawk in front, and white letters and navy numbers with white trim on the back; it is paired with the Braves' normal home cap. For Saturday games, the Braves wear the City Connect uniforms in honor of Hank Aaron. The jersey is inspired by the 1974 Braves home uniform, reimagined with "The A" emblazoned across the chest, and the cap features the "A" logo in the colors of the 1974 uniform.
Ballparks.
Truist Park.
The Atlanta Braves' home ballpark has been Truist Park since 2017. Truist Park is located approximately 10 miles (16 km) northwest of downtown Atlanta in the unincorporated community of Cumberland, in Cobb County, Georgia. The team played its home games at Atlanta–Fulton County Stadium from 1966 to 1996, and at Turner Field from 1997 to 2016. The Braves opened Truist Park on April 14, 2017, with a four-game sweep of the San Diego Padres. The park received positive reviews: Woody Studenmund of the Hardball Times called it a "gem", saying he was impressed with "the compact beauty of the stadium and its exciting approach to combining baseball, business and social activities", while J.J. Cooper of Baseball America praised the "excellent sight lines for pretty much every seat."
CoolToday Park.
Since 2019, the Braves have played spring training games at CoolToday Park in North Port, Florida. The ballpark opened on March 24, 2019, with the Braves' 4-2 win over the Tampa Bay Rays. The Braves left Champion Stadium, their previous spring training home near Orlando, to reduce travel times and to be closer to other teams' facilities. CoolToday Park also serves as the Braves' year-round rehabilitation facility.
Home attendance.
Truist Park.
(*) – There were no fans allowed in any MLB stadium in 2020 due to the COVID-19 pandemic.
Major rivalries.
New York Mets.
The Braves–Mets rivalry pits the Braves against the New York Mets, fellow members of the National League East.
Although their first major confrontation came when the Mets swept the Braves in the 1969 NLCS en route to the Mets' first World Series championship (both the first playoff appearance and the first playoff series win by an expansion team), the rivalry did not become especially heated until the 1994 season, when division realignment put both the Mets and the Braves in the NL East. During this period the Braves became one of the most dominant teams in professional baseball, earning 14 straight division titles through 2005, including five World Series berths and a World Series championship in 1995. The rivalry remained heated through the early 2000s.
Philadelphia Phillies.
While the rivalry with the Philadelphia Phillies lacks the history and hatred of the Mets rivalry, it has been the more important one over the last decade. Between 1993 and 2013, the two teams reigned almost exclusively as NL East champions, the exceptions being 2006, when the Mets won their first division title since 1988 (no division titles were awarded in 1994 due to the players' strike), and 2012, when the Washington Nationals claimed their first division title since 1981, when the franchise played as the Montreal Expos. The Phillies' 1993 title was also part of a four-year stretch in which the division was won exclusively by the Phillies and their in-state rivals, the Pittsburgh Pirates.
While rivalries are generally characterized by mutual hatred, the Braves and Phillies are often described as deeply respecting each other. Each game played (18 games in 2011) carries great weight between these two NL East powers, but the two are in many ways similar organizations. Overall, the Braves have five more National League East division titles than the Phillies, having won 16 since 1995, including 11 consecutive titles from 1995 through 2005. (The Braves also won five NL West titles between 1969 and 1993.)
Nationwide fanbase.
In addition to having strong fan support in the Atlanta metropolitan area and the state of Georgia, the Braves are often referred to as "America's Team" in reference to the team's games being broadcast nationally on TBS from the 1970s until 2007, giving the team a nationwide fan base.
The Braves boast heavy support within the Southeastern United States particularly in states such as Mississippi, Alabama, South Carolina, North Carolina, Tennessee and Florida.
Tomahawk chop.
The tomahawk chop was adopted by Atlanta Braves fans in 1991. Carolyn King, the Braves' organist, had played the "tomahawk song" during most at-bats for a few seasons, but it caught on with fans only when the team started winning. The use of foam tomahawks drew criticism from Native American groups, who called the gesture "demeaning" and asked for the tomahawks to be banned. In response, the Braves' public relations director called it "a proud expression of unification and family". King, who did not understand the sociopolitical ramifications, approached one of the protesting Native American chiefs. The chief told her that leaving her job as organist would not change anything and that if she left "they'll find someone else to play."
The controversy has persisted since and became national news again during the 2019 National League Division Series. During the series, St. Louis Cardinals relief pitcher and Cherokee Nation member, Ryan Helsley was asked about the chop and chant. Helsley said he found the fans' chanting and arm-motions insulting and that the chop depicts natives "in this kind of caveman-type people way who aren't intellectual." The relief pitcher's comments prompted the Braves to stop handing out foam tomahawks, playing the chop music or showing the chop graphic when the series returned to Atlanta for Game 5. The Braves released a statement saying they would "continue to evaluate how we activate elements of our brand, as well as the overall in-game experience" and that they would continue a "dialogue with those in the Native American community after the postseason concludes." The heads of the Muscogee (Creek) Nation and Cherokee Nation both publicly condemned the chop and chant.
During the off-season, the Braves met with the National Congress of American Indians to start discussing a path forward. In July 2020, the team faced mounting pressure to change their name after the Cleveland Indians and Washington Redskins announced they were discussing brand change. The Braves released a statement announcing that discussions were still ongoing about the chop, but the team name would not be changed.
Achievements.
Retired numbers.
The Braves have retired eleven numbers in franchise history, most recently Chipper Jones' number 10 in 2013, following John Smoltz's number 29 in 2012, Bobby Cox's number 6 in 2011, Tom Glavine's number 47 in 2010, and Greg Maddux's number 31 in 2009. Hank Aaron's 44, Dale Murphy's 3, Phil Niekro's 35, Eddie Mathews' 41, Warren Spahn's 21, and Jackie Robinson's 42 (retired throughout baseball, with the exception of Jackie Robinson Day) are also retired. The color and design of each retired number reflect the uniform design at the time the honoree was on the team, excluding Robinson.
Of the ten Braves whose numbers have been retired, all who are eligible for the National Baseball Hall of Fame have been elected with the exception of Murphy.
On April 3, 2023, the Braves announced that they would retire number 25 in honor of former center fielder Andruw Jones on September 9.
Minor league affiliates.
The Atlanta Braves farm system consists of six minor league affiliates.
Radio and television.
The Braves' regional games are exclusively broadcast on Bally Sports Southeast. Brandon Gaudin is the play-by-play announcer for Bally Sports Southeast, joined in the booth primarily by Jeff Francoeur. Tom Glavine joins the broadcast for 35 to 40 games, while Peter Moylan and Nick Green appear in the booth for select games as in-game analysts.
The radio broadcast team is led by the tandem of play-by-play announcer Ben Ingram and analyst Joe Simpson. They work the bulk of the games, with Jim Powell joining Simpson or Ingram throughout the season. Braves games are broadcast across Georgia and seven other states on at least 172 radio affiliates, including flagship station 680 The Fan in Atlanta and stations as far away as Richmond, Virginia; Louisville, Kentucky; and the US Virgin Islands. The games are carried on at least 82 radio stations in Georgia.
Atari ST.
The Atari ST is a line of personal computers from Atari Corporation and the successor to the Atari 8-bit family. The initial model, the Atari 520ST, had a limited release in April–June 1985 and was widely available in July. It was the first personal computer with a bitmapped color GUI, using a version of Digital Research's GEM released in February 1985. The Atari 1040ST, released in 1986 with 1 MB of RAM, was the first home computer with a cost per kilobyte of RAM under US$1.
After Jack Tramiel purchased the assets of the Atari, Inc. consumer division to create Atari Corporation, the 520ST was designed in five months by a small team led by Shiraz Shivji. Alongside the Macintosh, Amiga, Apple IIGS, and Acorn Archimedes, the ST is part of a mid-1980s generation of computers with 16- or 32-bit processors, 256 KB or more of RAM, and mouse-controlled graphical user interfaces. "ST" officially stands for "Sixteen/Thirty-two", referring to the Motorola 68000's 16-bit external bus and 32-bit internals.
The ST was sold with either Atari's color monitor or less expensive monochrome monitor. Color graphics modes are available only on the former while the highest-resolution mode requires the monochrome monitor. Some models can display the color modes on a TV. In Germany and some other markets, the ST gained a foothold for CAD and desktop publishing. With built-in MIDI ports, it was popular for music sequencing and as a controller of musical instruments among amateur and professional musicians. The primary competitor of the Atari ST was the Amiga from Commodore.
The 520ST and 1040ST were followed by the Mega series, the STE, and the portable STacy. In the early 1990s, Atari released three final evolutions of the ST with significant technical differences from the original models: Atari TT (1990), Mega STE (1991), and Falcon (1992). Atari discontinued the entire ST computer line in 1993, shifting the company's focus to the Jaguar video game console.
Development.
The Atari ST was born from the rivalry between home computer makers Atari, Inc. and Commodore International. Jay Miner, one of the designers of the custom chips in the Atari 2600 and Atari 8-bit family, tried to convince Atari management to create a new chipset for a video game console and computer. When his idea was rejected, he left Atari to form a small think tank called Hi-Toro in 1982 and began designing the new "Lorraine" chipset.
Hi-Toro, by then renamed Amiga Corporation, ran out of capital to complete Lorraine's development, and Atari, at that time owned by Warner Communications, paid Amiga to continue the work. In return, Atari received exclusive use of the Lorraine design for one year as a video game console. After that time, Atari had the right to add a keyboard and market the complete computer, designated the 1850XLD.
Tramel Technology.
After leaving Commodore International in January 1984, Jack Tramiel formed Tramel (without an "i") Technology, Ltd. with his sons and other ex-Commodore employees and, in April, began planning a new computer. Interested in Atari's overseas manufacturing and worldwide distribution network, Tramiel negotiated with Warner in May and June 1984. He secured funding and bought Atari's consumer division, which included the console and home computer departments, in July. As executives and engineers left Commodore to join Tramel Technology, Commodore responded by filing lawsuits against four former engineers for infringement of trade secrets. The Tramiels did not purchase the employee contracts with the assets of Atari, Inc. and re-hired approximately 100 of the 900 former employees. Tramel Technology soon changed its name to Atari Corporation.
Commodore and Amiga.
Amid rumors that Tramiel was negotiating to buy Atari, Amiga Corp. entered discussions with Commodore. This led to Commodore wanting to purchase Amiga Corporation outright, which Commodore believed would cancel any outstanding contracts, including Atari's. Instead of Amiga Corp. delivering Lorraine to Atari, Commodore delivered a check of $500,000 on Amiga's behalf, in effect returning the funds Atari invested in Amiga for the chipset. Tramiel countered by suing Amiga Corp. on August 13, 1984, seeking damages and an injunction to bar Amiga (and effectively Commodore) from producing anything with its technology.
The lawsuit left the Amiga team in limbo during mid-1984. Commodore eventually moved forward with plans to improve the chipset and develop an operating system. Commodore announced the Amiga 1000 with the Lorraine chipset in July 1985, but it was not available in quantity until 1986. The delay gave Atari time to deliver the Atari 520ST in June 1985. In March 1987, the two companies settled the dispute out of court in a closed decision.
ST hardware.
The lead architect of the new computer project at Tramel Technology and Atari Corporation was ex-Commodore employee Shiraz Shivji, who had previously worked on the Commodore 64's development. Several CPUs were investigated, including the 32-bit National Semiconductor NS32000, but engineers were disappointed with its performance and moved to the Motorola 68000. The Atari ST design was completed in five months in 1984, culminating in a showing at the January 1985 Consumer Electronics Show.
A custom sound processor called AMY had been in development at Atari, Inc. and was considered for the new ST computer design. The chip needed more time to complete, so AMY was dropped in favor of a commodity Yamaha YM2149F variant of the General Instrument AY-3-8910.
Operating system.
Soon after the Atari buyout, Microsoft suggested to Tramiel that it could port Windows to the platform, but the delivery date was two years away. Another possibility was Digital Research, which was working on a new GUI-based system then known as Crystal, soon to become GEM. A third option was to write a new operating system in-house, but this was rejected because Atari management was unsure whether the company had the required expertise.
Digital Research was fully committed to the Intel platform, so a team from Atari was sent to the Digital Research headquarters to work with the "Monterey Team", which comprised a mixture of Atari and Digital Research engineers. Atari's Leonard Tramiel was the Atari person overseeing "Project Jason" (also known as The Operating System) for the Atari ST series, named for designer and developer Jason Loveman.
GEM is based on CP/M-68K, a direct port of CP/M to the 68000. By 1985, CP/M was becoming increasingly outdated; it did not support subdirectories, for example. Digital Research was also in the process of building GEMDOS, a disk operating system for GEM, and debated whether a port of it could be completed in time for product delivery in June. The decision was eventually taken to port it, resulting in a GEMDOS file system which became part of Atari TOS (for "The Operating System", colloquially known as the "Tramiel Operating System"). This gave the ST a fast, hierarchical file system, essential for hard drives, and provided programmers with function calls similar to MS-DOS. The Atari ST character set is based on codepage 437.
Release.
After six months of intensive effort following Tramiel's takeover, Atari announced the 520ST at the Winter Consumer Electronics Show in Las Vegas in January 1985. "InfoWorld" assessed the prototypes shown at computer shows as follows: "Pilot production models of the Atari machine are much slicker than the hand-built models shown at earlier computer fairs; it doesn't look like a typical Commodore 64-style, corner-cutting, low-cost Jack Tramiel product of the past." Atari unexpectedly displayed the ST at Atlanta COMDEX in May. Similarities to the original Macintosh and Tramiel's role in its development earned it the nickname "Jackintosh". Atari's rapid development of the ST amazed many, but others were skeptical, citing its "cheap" appearance, Atari's uncertain financial health, and the poor relations between Commodore under Tramiel and software developers.
Atari ST print advertisements stated, "America, We Built It For You", and quoted Atari president Sam Tramiel: "We promised. We delivered. With pride, determination, and good old ATARI know how". But Atari was out of cash, Jack Tramiel admitted that sales of its 8-bit family were "very, very slow", and employees feared that he would shut the company down.
In early 1985, the 520ST shipped to the press, developers, and user groups, with general retail sales beginning in early July 1985. It saved the company. By November, Atari stated that more than 50,000 520STs had been sold, "with U.S. sales alone well into five figures". The machine had gone from concept to store shelves in a little under a year.
Atari had intended to release the 130ST with 128 KB of RAM and the 260ST with 256 KB. However, the ST initially shipped without TOS in ROM and required booting TOS from floppy disk, which consumed 206 KB of RAM that would otherwise be available to applications and made the smaller memory configurations impractical. The 260ST was launched in Europe on a limited basis. Early models have six ROM sockets for easy upgrades to TOS. New ROMs were released a few months later and were included in new machines and as an upgrade for older ones.
Atari originally intended to include GEM's GDOS (Graphical Device Operating System), which allows programs to send GEM VDI (Virtual Device Interface) commands to drivers loaded by GDOS. This lets developers send VDI instructions to other devices simply by pointing to them. However, GDOS was not ready when the ST started shipping; it was later included with software packages and with later ST machines. Later versions of GDOS support vector fonts.
A limited set of GEM fonts were included in the ROMs, including the ST's standard 8x8 pixel graphical character set. It contains four characters which can be placed together in a square, forming the face of J. R. "Bob" Dobbs (the figurehead of the Church of the SubGenius).
The ST was less expensive than most contemporaries, including the Macintosh Plus, and is faster than many. Largely as a result of its price and performance factor, the ST became fairly popular, especially in Europe where foreign-exchange rates amplified prices. The company's English advertising slogan of the era was "Power Without the Price". An Atari ST and terminal emulation software was much cheaper than a Digital VT220 terminal, commonly needed by offices with central computers.
In late 1985, Atari released the 520STM, which adds an RF modulator for TV display.
Industry reaction.
"Computer Gaming World" stated that Tramiel's poor pre-Atari reputation would likely make computer stores reluctant to deal with the company, hurting its distribution of the ST. One retailer said, "If you can believe Lucy when she holds the football for Charlie Brown, you can believe Jack Tramiel"; another said that because of its experience with Tramiel, "our interest in Atari is zero, zilch". Neither Atari nor Commodore could persuade large chains like ComputerLand or BusinessLand to sell its products. Observers criticized Atari's erratic discussion of its stated plans for the new computer, as it shifted between using mass merchandisers, specialty computer stores, and both. When asked at COMDEX, Atari executives could not name any computer stores that would carry the ST. After a meeting with Atari, one analyst said, "We've seen marketing strategies changed before our eyes".
Tramiel's poor reputation influenced potential software developers. One said, "Dealing with Commodore is like dealing with Attila the Hun. I don't know if Tramiel will be following his old habits ... I don't see a lot of people rushing to get software on the machine." Large business-software companies like Lotus, Ashton-Tate, and Microsoft did not promise software for either the ST or Amiga, and the majority of software companies were hesitant to support another platform beyond the IBM PC, Apple, and Commodore 64. Philippe Kahn of Borland said, "These days, if I were a consumer, I'd stick with companies [such as Apple and IBM] I know will be around".
At Las Vegas COMDEX in November 1985, the industry was surprised by more than 30 companies exhibiting ST software while the Amiga had almost none. After Atlanta COMDEX, "The New York Times" reported that "more than 100 software titles will be available for the [ST], most written by small software houses that desperately need work", and contrasted the "small, little-known companies" at Las Vegas with the larger ones like Electronic Arts and Activision, which planned Amiga applications.
Trip Hawkins of Electronic Arts said, "I don't think Atari understands the software business. I'm still skeptical about its resources and its credibility." Although Michael Berlyn of Infocom promised that his company would quickly publish all of its games for the new computer, he doubted many others would soon do so. Spinnaker and Lifetree were more positive, both promising to release ST software. Spinnaker said that "Atari has a vastly improved attitude toward software developers. They are eager to give us technical support and machines". Lifetree said, "We are giving Atari high priority". Some, such as Software Publishing Corporation, were unsure of whether to develop for the ST or the Amiga. John C. Dvorak wrote that the public saw both Commodore and Atari as selling "cheap disposable" game machines, in part because of their computers' sophisticated graphics.
Design.
The original 520ST case design was created by Ira Velinsky, Atari's chief Industrial Designer. It is wedge-shaped, with bold angular lines and a series of grilles cut into the rear for airflow. The keyboard has soft tactile feedback and rhomboid-shaped function keys across the top. It is an all-in-one unit, similar to earlier home computers like the Commodore 64, but with a larger keyboard with cursor keys and a numeric keypad. The original has an external floppy drive (SF354) and AC adapter. Starting with the 1040ST, the floppy drive and power supply are integrated into the base unit.
Ports.
The ports on the 520ST remained largely unchanged over its history.
Standard.
Because the Centronics printer port is bi-directional, it can be used for joystick input; several games supported adaptors that plug into the printer socket to provide two additional 9-pin joystick ports.
Monitor.
The ST supports either a monochrome or a colour monitor. The colour hardware supports two resolutions: 320 × 200 pixels with 16 of 512 colours, and 640 × 200 with 4 of 512 colours. The monochrome monitor was less expensive and supports a single resolution of 640 × 400 at 71.25 Hz. The attached monitor determines the available resolutions, so each application either supports both monitor types or only one. Most ST games require colour, while productivity software tends to favour monochrome.
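The 512-colour figure follows directly from the palette hardware: each ST palette entry holds three bits per RGB channel, commonly written as a 16-bit word of the form 0x0RGB. A short Python sketch (illustrative only, not part of any Atari software) decodes such a word into conventional 8-bit-per-channel values:

```python
def st_palette_to_rgb(word: int) -> tuple:
    """Decode a 16-bit Atari ST palette word (0x0RGB, three bits
    per channel) into an 8-bit-per-channel RGB tuple.

    Each channel holds a value 0-7; scaling by 255/7 spreads the
    eight levels over the full 8-bit range."""
    r = (word >> 8) & 0x7
    g = (word >> 4) & 0x7
    b = word & 0x7
    scale = lambda c: round(c * 255 / 7)
    return (scale(r), scale(g), scale(b))

# Three bits per channel gives 8 levels each: 8 ** 3 = 512 colours.
assert 8 ** 3 == 512
print(st_palette_to_rgb(0x0777))  # brightest white -> (255, 255, 255)
```

With 8 levels per channel the full palette is 8 × 8 × 8 = 512 colours, of which the low-resolution mode can display 16 at once.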
Floppy drive.
Atari initially used single-sided 3.5 inch floppy disk drives that could store up to 360 KB. Later drives were double-sided and stored 720 KB. Some commercial software, particularly games, shipped by default on single-sided disks, even supplying two 360 KB floppies instead of a single double-sided one, to avoid alienating early adopters.
Some software uses formats that allow the full disk to be read by double-sided drives while still letting single-sided drives access side A. Many magazine coverdisks (such as those of the first 30 issues of "ST Format") were designed this way, as were a few games. The music in "Carrier Command" and the intro sequence in "Populous", for example, are not accessible to single-sided drives.
STs with double-sided drives can read disks formatted by MS-DOS, but IBM PC compatibles cannot read Atari disks because of differences in the layout of data on track 0.
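The data-level compatibility rests on ST floppies carrying the same FAT-style BIOS parameter block (BPB) in the boot sector as MS-DOS disks. A hedged Python sketch reads the standard BPB geometry fields from a boot-sector image; the field offsets are the usual FAT ones, and the fake boot sector built here is purely for demonstration:

```python
import struct

def st_disk_geometry(boot_sector: bytes) -> tuple:
    """Read geometry fields from the FAT-style BIOS parameter block
    that Atari ST floppies share with MS-DOS disks. All values are
    little-endian 16-bit words at the standard FAT BPB offsets."""
    bytes_per_sector, = struct.unpack_from("<H", boot_sector, 11)
    total_sectors, = struct.unpack_from("<H", boot_sector, 19)
    sectors_per_track, = struct.unpack_from("<H", boot_sector, 24)
    sides, = struct.unpack_from("<H", boot_sector, 26)
    capacity_kb = bytes_per_sector * total_sectors // 1024
    return (sides, sectors_per_track, capacity_kb)

# Build a minimal fake boot sector for a double-sided 720 KB disk:
# 512-byte sectors, 1440 sectors total, 9 sectors per track, 2 sides.
bs = bytearray(512)
struct.pack_into("<H", bs, 11, 512)
struct.pack_into("<H", bs, 19, 1440)
struct.pack_into("<H", bs, 24, 9)
struct.pack_into("<H", bs, 26, 2)
print(st_disk_geometry(bytes(bs)))  # (2, 9, 720)
```

Because these fields line up on both platforms, an ST can interpret MS-DOS-formatted disks; the incompatibility in the other direction comes from the track 0 differences noted above, not from the BPB.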
Later systems.
1040ST.
Atari upgraded the basic design in 1986 with the 1040STF: essentially a 520ST with twice the RAM and with the power supply and a double-sided floppy drive built in instead of external. This adds to the size of the machine but reduces cable clutter. The joystick and mouse ports, formerly on the right side of the machine, moved to a niche underneath the keyboard. An "FM" variant includes an RF modulator, allowing a television to be used instead of a monitor.
The trailing "F" and "FM" were often dropped in common usage. In "BYTE" magazine's March 1986 cover photo of the system, the name plate reads 1040STFM, but the headline and article call it simply the "1040ST".
The 1040ST is one of the earliest personal computers shipped with a base RAM configuration of 1 MB. With a US list price under $1,000, "BYTE" hailed it as the first computer to break the $1,000-per-megabyte price barrier. "Compute!" noted that the 1040ST was the first computer with one megabyte of RAM to sell for less than $2,500.
A limited number of 1040STFs shipped with a single-sided floppy drive.
Mega.
Initial sales were strong, especially in Europe, where Atari sold 75% of its computers. West Germany became Atari's strongest market, with small business owners using them for desktop publishing and CAD.
To address this growing market segment, Atari introduced the ST1 at COMDEX in 1986. Renamed the Mega, it includes a high-quality detached keyboard, a stronger case to support the weight of a monitor, and an internal bus expansion connector. An optional 20 MB hard drive can be placed below or above the main case. Initially equipped with 2 or 4 MB of RAM (a 1 MB version, the Mega 1, followed), the Mega machines can be combined with Atari's laser printer for a low-cost desktop publishing package.
A custom blitter coprocessor improved some graphics performance, but was not included in all models. Developers wanting to use it had to detect its presence in their programs. Properly written applications using the GEM API automatically make use of the blitter.
STE.
In late 1989, Atari Corporation released the 520STE and 1040STE (also written STE), enhanced versions of the ST with improvements to the multimedia hardware and operating system. The STE features an increased palette of 4,096 colors, up from the ST's 512 (though the maximum displayable palette without programming tricks is still 16 in the lowest 320 × 200 resolution, and fewer in higher resolutions), genlock support, and a blitter coprocessor (stylized as "BLiTTER") which can quickly move large blocks of data, particularly graphics data, around in RAM. The STE is the first Atari computer with PCM audio; using a new chip, it can play back 8-bit (signed) samples at 6258 Hz, 12517 Hz, 25033 Hz, or even 50066 Hz via direct memory access (DMA). The channels are arranged either as a mono track or as a stereo track of interleaved LRLRLRLR... bytes. RAM is also much more simply upgradable via SIMMs.
Two enhanced joystick ports were added (two normal joysticks can be plugged into each port with an adapter), with the new connectors placed in more easily accessed locations on the side of the case. The enhanced joystick ports were re-used in the Atari Jaguar console and are compatible.
The STE models initially had software and hardware conflicts resulting in some applications and video games written for the ST line being unstable or even completely unusable, primarily caused by programming direct hardware calls which bypassed the operating system. Furthermore, even having a joystick plugged in would sometimes cause strange behavior with a few applications (such as the WYSIWYG word-processor application 1st Word Plus). Very little use was made of the extra features of the STE: STE-enhanced and STE-only software was rare.
The last STE machine, the Mega STE, is an STE in a grey Atari TT case with a switchable 16 MHz, dual-bus design (16-bit external, 32-bit internal), optional Motorola 68881 FPU, built-in 1.44 MB "HD" 3½-inch floppy disk drive, VME expansion slot, a network port (very similar to that used by Apple's LocalTalk), and an optional built-in 3½-inch hard drive. It also shipped with TOS 2.00 (better support for hard drives, an enhanced desktop interface, memory test, 1.44 MB floppy support, and bug fixes). It was marketed as more affordable than a TT but more powerful than an ordinary ST.
Atari TT.
In 1990, Atari released the high-end workstation-oriented Atari TT030, based on a 32 MHz Motorola 68030 processor. The "TT" name ("Thirty-two/Thirty-two") continued the nomenclature because the 68030 chip has 32-bit buses both internally and externally. Originally planned with a 68020 CPU, the TT has improved graphics and more powerful support chips. The case has a new design with an integrated hard-drive enclosure.
Falcon.
The final model of ST computer is the Falcon030. Like the TT, it is 68030-based, at 16 MHz, but with improved video modes and an on-board Motorola 56001 audio digital signal processor. Like the Atari STE, it supports sampling frequencies above 44.1 kHz; the sampling master clock is 98340 Hz (which can be divided by a number between 2 and 16 to get the actual sampling frequencies). It can play the STE sample frequencies (up to 50066 Hz) in 8 or 16 bit, mono or stereo, all by using the same DMA interface as the STE, with a few additions. It can both play back and record samples, with 8 mono channels and 4 stereo channels, allowing musicians to use it for recording to hard drive. Although the 68030 microprocessor can use 32-bit memory, the Falcon uses a 16-bit bus, which reduces performance and cost. In another cost-reduction measure, Atari shipped the Falcon in an inexpensive case much like that of the STF and STE. Aftermarket upgrade kits allow it to be put in a desktop or rack-mount case, with the keyboard separate.
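From the figures given above, the Falcon's available sample rates can be enumerated by dividing the 98340 Hz sampling master clock by each divisor from 2 to 16. A small Python sketch (integer division here is an approximation of the hardware dividers, not a claim about the exact silicon):

```python
master_hz = 98340  # Falcon sampling master clock, per the text

# Divisors 2 through 16 give the selectable sample rates.
falcon_rates = [master_hz // d for d in range(2, 17)]

# The fastest rate, 98340 / 2 = 49170 Hz, is what puts the Falcon
# above the 44.1 kHz CD-quality threshold mentioned in the text.
print(falcon_rates[0], falcon_rates[-1])  # 49170 6146
```

Note that the STE-compatible rates (up to 50066 Hz) are not in this series; as the text says, playing those back required "a few additions" to the STE-style DMA interface.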
Released in 1992, the Falcon was discontinued by Atari the following year. In Europe, C-Lab licensed the Falcon design from Atari and released the C-Lab Falcon Mk I, identical to Atari's Falcon except for slight modifications to the audio circuitry. The Mk II added an internal 500 MB SCSI hard disk; and the Mk X further added a desktop case. C-Lab Falcons were also imported to the US by some Atari dealers.
Software.
As with the Atari 8-bit computers, software publishers attributed their reluctance to produce Atari ST products in part to the belief, as "Compute!" reported in 1988, in a "higher-than-normal amount of software piracy". That year, WordPerfect threatened to discontinue the Atari ST version of its word processor after discovering that pirate bulletin board systems (BBSs) were distributing it, causing "ST-Log" to warn that "we had better put a stop to piracy "now" ... it can have harmful effects on the longevity and health of your computer". In 1989, magazines published a letter by Gilman Louie, head of Spectrum HoloByte, who stated that competitors had warned him that releasing a game like "Falcon" on the ST would fail because BBSs would widely disseminate it. Within 30 days of the release of the non-copy-protected ST version, the game was available on BBSs, complete with maps and code wheels. Because the ST market was smaller than the IBM PC's, it was more vulnerable to piracy, which, Louie said, seemed better organized and more widely accepted on the ST. He reported that the Amiga version sold twice as many copies in six weeks as the ST version did in nine, and that the Mac and PC versions had four times the ST's sales. "Computer Gaming World" stated, "This is certainly the clearest exposition ... we have seen to date" of why software companies produced less software for the ST than for other computers.
Several third-party OSes were developed for, or ported to, the Atari ST. Unix clones include Idris, Minix, and the MiNT OS which was developed specifically for the Atari ST.
Audio.
A great deal of professional-quality MIDI software was released for the ST. The popular Windows and Macintosh applications Cubase and Logic Pro originated on the Atari ST (the latter as Creator, Notator, Notator-SL, and Notator Logic). Another popular and powerful ST music sequencer, KCS, contains a "Multi-Program Environment" that allows users to run other applications, such as the synthesizer patch editor XoR (now known as Unisyn on the Macintosh), from within the sequencer.
Music tracker software such as the TCB Tracker also became popular on the ST, aiding the production of quality music from the Yamaha sound chip, now called chiptunes.
Because the ST had comparatively large amounts of memory for its time, sound sampling packages became feasible. Replay Professional includes a sound sampler that reads samples in parallel from an ADC through the cartridge port. For output of digital sound, it sets the sound chip's frequency output to an inaudible 128 kHz and then modulates the amplitude of that carrier.
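The amplitude-modulation trick can be illustrated with a minimal Python sketch: with the square-wave carrier set far above audible range, each 8-bit sample is reduced to one of the YM2149's 16 volume levels and written to the volume register at the sample rate. The linear mapping below is a deliberate simplification; real playback routines used lookup tables matched to the chip's roughly logarithmic volume curve:

```python
def sample_to_ym_volume(sample: int) -> int:
    """Map an unsigned 8-bit sample (0-255) onto one of the
    YM2149's 16 channel volume levels (0-15).

    A simple right shift collapses 256 input levels into 16.
    Actual players used tuned lookup tables because the chip's
    volume steps are approximately logarithmic, not linear."""
    return sample >> 4

# With the tone frequency parked at an inaudible 128 kHz, writing
# these values to the volume register at the sample rate makes the
# sample waveform's envelope audible as sound.
print([sample_to_ym_volume(s) for s in (0, 127, 255)])  # [0, 7, 15]
```

The audible result is 4-bit-quality audio from a chip never designed for PCM playback, at the cost of the CPU time spent feeding the volume register.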
MasterTracks Pro originated on the Macintosh, followed by ST and then IBM PC versions. It continued on Windows and macOS, alongside the original company's notation application Encore.
Applications.
Professional desktop publishing software includes PageStream and Calamus. Word processors include WordPerfect, Microsoft Write, AtariWorks, Signum, Script and First Word (bundled with the machine). Spreadsheets include 3D-Calc, and databases include Zoomracks. Graphics applications include NEOchrome, DEGAS & DEGAS Elite, Deluxe Paint, STAD, and Cyber Paint (which author Jim Kent would later evolve into Autodesk Animator) with advanced features such as 3D design and animation. The Spectrum 512 paint program uses rapid palette switching to expand the on-screen color palette to 512 (up to 46 colors per scan line).
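Spectrum 512's palette-switching idea can be sketched in a toy model: the ST's hardware holds only 16 palette registers, but rewriting them several times per scan line gives each line its own larger color set, so the frame as a whole can cover the full 512-color space. The figures below (three assumed reloads per line on a 200-line frame) are illustrative only; the real program's timing yields up to 46 usable colors per line, not a clean 48:

```python
# Toy model of per-scanline palette switching. The ST reads 16 palette
# registers; LOADS_PER_LINE is an assumption for illustration, not
# Spectrum 512's real timing.
LINES = 200
LOADS_PER_LINE = 3
REGS = 16

def frame_colors():
    """Assign distinct 9-bit RGB values round-robin across scanline palettes."""
    all_colors = [(r, g, b) for r in range(8) for g in range(8) for b in range(8)]
    screen, i = [], 0
    for _ in range(LINES):
        line = []
        for _ in range(LOADS_PER_LINE * REGS):   # 48 register slots per line
            line.append(all_colors[i % len(all_colors)])
            i += 1
        screen.append(line)
    return screen

screen = frame_colors()
distinct = {color for line in screen for color in line}
```

Counting the distinct colors across the simulated frame shows all 512 combinations of 3-bit RGB appearing, even though no single line ever exceeds its register slots.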
3D computer graphics applications (like Cyber Studio CAD-3D, which author Tom Hudson later developed into Autodesk 3D Studio) brought 3D modelling, sculpting, scripting, and computer animation to the desktop. Video capture and editing applications used dongles connected to the cartridge port; early versions captured low-frame-rate, mainly silent, monochrome video, later progressing to sound and basic color in still frames. Late in the platform's life, Spectrum 512 and CAD-3D were combined to produce realistic 512-color textured 3D renderings, but processing was slow, and Atari's failure to deliver a machine with a math coprocessor had Hudson and Yost looking toward the PC as the future before a finished product could be delivered to consumers.
Graphical touchscreen point of sale software for restaurants was originally developed for Atari ST by Gene Mosher under the ViewTouch copyright and trademark. Instead of using GEM, he developed a GUI and widget framework for the application using the NEOchrome paint program.
Software development.
The 520ST was bundled with both Digital Research Logo and Atari ST BASIC. Third-party BASIC systems with better performance were eventually released: HiSoft BASIC, GFA BASIC, FaST BASIC, DBASIC, LDW BASIC, Omikron BASIC, BASIC 1000D and STOS. In the later years of the Atari ST, Omikron Basic was bundled with it in Germany.
Atari's initial development kit consisted of a computer and manuals, and its cost discouraged development. The later Atari Developer's Kit consists of software and manuals. It includes a resource kit, a C compiler (first Alcyon C, then Mark Williams C), a debugger, a 68000 assembler, and a non-disclosure agreement. A third-party alternative was the Megamax C development package.
Other development tools include 68000 assemblers (MadMac from Atari, HiSoft Systems's Devpac, TurboAss, GFA-Assembler), Pascal (OSS Personal Pascal, Maxon Pascal, PurePascal), Modula-2, C compilers (Lattice C, Pure C, Megamax C, GNU C, Aztec C, AHCC), LISP, and Prolog.
Games.
The ST had success in gaming due to its low cost, fast performance, and colorful graphics. ST game developers included Peter Molyneux, Doug Bell, Jeff Minter, Éric Chahi, Jez San, and David Braben.
The realtime pseudo-3D role-playing video game "Dungeon Master" was developed and released first on the ST and became the best-selling software ever produced for the platform. Simulation games like "Falcon" and "Flight Simulator II" use the ST's graphics hardware, as do many arcade ports. The 1987 first-person shooter "MIDI Maze" uses the MIDI ports to connect up to 16 machines for networked deathmatch play. The critically acclaimed "Another World" was originally released for the ST and Amiga in 1991, with its engine developed on the ST and the rotoscoped animation created on the Amiga. Games released simultaneously on the Amiga that did not use the Amiga's superior graphics and sound capabilities were often accused by video game magazines of simply being ST ports.
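MIDI has no shared bus, so machines playing "MIDI Maze" were daisy-chained in a ring, each machine's MIDI OUT feeding the next machine's IN. The toy simulation below shows how such a ring can broadcast; the node and packet structure are assumptions for illustration, not the game's actual protocol. Each node records and forwards packets from other nodes, and the originator removes its own packet once it has traveled the full circle:

```python
from collections import deque

class Node:
    """One machine in the ring; its OUT feeds the next machine's IN."""
    def __init__(self, node_id):
        self.node_id = node_id
        self.inbox = deque()
        self.seen = []                  # payloads received from other nodes

    def pump(self, next_node):
        """Record foreign packets and forward them; drop packets that have
        traveled the full ring back to their originator."""
        while self.inbox:
            origin, payload = self.inbox.popleft()
            if origin == self.node_id:
                continue                # came full circle: remove from ring
            self.seen.append((origin, payload))
            next_node.inbox.append((origin, payload))

def broadcast(ring, sender, payload):
    """Inject a packet onto the ring and pump until every inbox drains."""
    n = len(ring)
    ring[(sender + 1) % n].inbox.append((sender, payload))
    for _ in range(n + 1):              # enough passes for a full circuit
        for i, node in enumerate(ring):
            node.pump(ring[(i + 1) % n])
```

With four nodes, a packet injected by node 0 reaches nodes 1, 2, and 3 in order and then disappears when it arrives back at node 0, so every machine sees every other machine's events exactly once.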
Garry Kasparov became the first chess player to register a copy of ChessBase, a popular commercial database program for storing and searching records of chess games. The first version was built for the Atari ST with his collaboration in January 1987. In his autobiography "Child of Change", he regards this facility as "the most important development in chess research since printing".
Emulators.
Spectre GCR emulates the Macintosh. MS-DOS emulators were released in the late 1980s. PC-Ditto has a software-only version and a hardware version that plugs into the cartridge slot or can be installed internally. After running the software, an MS-DOS boot disk is required to load the system. Both run MS-DOS programs in CGA mode, though much more slowly than on an IBM PC. Other options are the PC-Speed (NEC V30), AT-Speed (Intel 80286), and ATonce-386SX (Intel 80386SX) hardware emulator boards.
Music industry.
The ST's low cost, built-in MIDI ports, and fast, low-latency response times made it a favorite with musicians.
Technical specifications.
All STs are built from a combination of custom and off-the-shelf commercial chips.
ST/STF/STM/STFM.
As originally released in the 520ST:
Very early machines shipped with TOS on a floppy disk before the final version was burned into ROM; this disk-based TOS was bootstrapped from a small core boot ROM.
In 1986, most production models became STFs, with an integrated single-sided (520STF) or double-sided (1040STF) double-density floppy disk drive built in, but no other changes. Also in 1986, the "520STM" added an RF modulator that allows the low- and medium-resolution color modes to be displayed on a TV. Later "F" and "FM" models of the 520 had a built-in double-sided disk drive instead of a single-sided one.
STE.
As originally released in the 520STE/1040STE:
Models.
The members of the ST family are listed below, in roughly chronological order:
Unreleased.
The 130ST was intended to be a 128 KB variant. It was announced at the 1985 CES alongside the 520ST but never produced. The 4160STE was a 1040STE with 4 MB of RAM. A small quantity of development units were produced, but the system was never officially released. Atari did produce a quantity of 4160STE metallic case badges, which found their way to dealers, so it is not uncommon to find one attached to a system that was originally a 520STE or 1040STE. No such labels were produced for the base of the systems.
Related systems.
The Atari Transputer Workstation is a standalone machine developed in conjunction with Perihelion Hardware, containing modified ST hardware and up to 17 transputers capable of massively parallel operation for tasks such as ray tracing.
Clones.
Following Atari's departure from the computer market, both Medusa Computer Systems and Milan Computer manufactured Atari Falcon/TT-compatible machines with 68040 and 68060 processors. The FireBee is an Atari ST/TT clone based on the Coldfire processor. The GE-Soft Eagle is a 32 MHz TT clone.