11. THE KOREAN WAR NEVER ENDED

Better not tell Hawkeye Pierce and the rest of the gang from M*A*S*H, but the Korean War is technically still happening. This comes to us from no less an authority than Howard S. Levie, the man who drafted the Korean Armistice Agreement. At the time, the future law professor was a captain in the Office of the Judge Advocate General (JAG). He explains:

An armistice is not a peace treaty. While its main objective is to bring about a cease-fire, a halt to hostilities, that halt may be indefinite or for a specified period of time only. An armistice agreement does not terminate the state of war between the belligerents. A state of war continues to exist with all of its implications for the belligerents and for the neutrals.

The Korean Armistice itself even specifies that it is only a stop-gap measure "until a final peaceful settlement is achieved." To date, this settlement — otherwise known as a peace treaty — has never occurred. One attempt was made, at the Geneva Conference of 1954, but nothing came of it.

Interestingly, the Armistice wasn't signed at all by South Korea but rather by the head honchos in the United Nations Command, North Korea's army, and China's army. It should also be noted that the conflict in Korea wasn't technically a "war," because — like so many other post-WWII hostilities — there was no formal declaration of war. As The Korean War: An Encyclopedia trenchantly observes: "Since the war had never been declared, it was fitting that there should be no official ending, merely a suspension of hostilities."

North Korea has more than once denounced the Armistice, threatening to press the "play" button on the long-paused Korean War. Most recently, in February 2003, Kim Jong-il's government said that because of repeated US violations, the Armistice is merely "a blank piece of paper without any effect or significance."

12. AGENT ORANGE WAS USED IN KOREA

"Agent Orange" is practically synonymous with the Vietnam War. The Dow Chemical defoliant was used to de-junglize large areas, exposing enemy troops, supplies, and infiltrators. It has been linked, though never definitively, to a number of nasty health problems such as Hodgkin's disease and adult-onset diabetes, plus spina bifida in offspring. The Veterans Administration compensates sick veterans who were exposed in Vietnam.

But it turns out that 'Nam wasn't the only place to get doused with this super-herbicide. From April 1968 to July 1969, 21,000 gallons of Agent Orange were sprayed along a strip of land abutting the southern border of the Demilitarized Zone between the two Koreas. During that time period, around 80,000 US military personnel served in South Korea, although not all of them would've been in the vicinity of the DMZ. The VA contradicts itself regarding who did the spraying, claiming at one point that it was South Korea but saying at another that the Department of Defense did it.

In September 2000, the VA quietly sent letters to veterans who served in Korea during the spraying, letting them know that they may have been dosed with Agent Orange. Since these letters were sent over 30 years after the exposure, the Pentagon must've just found out about it, right? Actually, even if you buy the story that the South Koreans were responsible, the US military knew about the spraying at the time it happened but kept quiet about it for decades. It was only when news reports began citing declassified documents in 1999 that the government decided to do something.

Possibly exposed vets can get tested for free by the Veterans Administration. The catch is, if they're sick with Hodgkin's or some other horrible disease, they — unlike their Vietnam compatriots — aren't eligible for compensation or additional health care. However, for their agony, Korean vets will receive a free newsletter, the same one that Vietnam vets get.

13. KENT STATE WASN'T THE ONLY — OR EVEN THE FIRST — MASSACRE OF COLLEGE STUDENTS DURING THE VIETNAM ERA

It's one of the defining moments of the Vietnam era and, more than that, twentieth-century US history in general. On May 4, 1970, the Ohio National Guard opened fire on unarmed Kent State University students protesting the war. Four were killed, eight were wounded, and another was left paralyzed. It's so ingrained in the country's psyche that it even appears in American history textbooks, and the anniversary is noted each year by the major media. Yet this wasn't the only time the authorities slaughtered unarmed college kids during this time period. It happened on at least two other occasions, which have been almost completely forgotten.

A mere ten days after the Kent State massacre, students at the historically black Jackson State University in Mississippi were protesting not only the Vietnam War and the recent killings at Kent, but racism as well. On the night of May 14, 1970, during the protests, a small riot broke out when a false rumor swept the campus: The black mayor of Fayette, Mississippi, was said to have been assassinated. As at Kent State, some students or provocateurs threw bricks and stones and set fires. Firefighters trying to put out a blaze in a men's dorm were hassled by an angry crowd, so they called for police protection. The campus was cordoned off. Jackson State's Website devoted to the incident says: "Seventy-five city policemen and Mississippi State Police officers armed with carbines, submachine guns, shotguns, service revolvers and some personal weapons, responded to the call." After the fire had been extinguished, the heavily armed cops marched down the street, herding students towards a women's dorm. As the site notes: "No one seems to know why."

Seventy-five to 100 students were pushed back until they were in front of the dorm, where they began yelling and throwing things at the police. "Accounts disagree as to what happened next. Some students said the police advanced in a line, warned them, then opened fire. Others said the police abruptly opened fire on the crowd and the dormitory. Other witnesses reported that the students were under the control of a campus security officer when the police opened fire. Police claimed they spotted a powder flare in the Alexander West Hall third floor stairwell window and fired in self-defense on the dormitory only. Two local television news reporters present at the shooting agreed that a shot was fired, but were uncertain of the direction. A radio reporter claimed to have seen an arm and a pistol extending from a dormitory window." Two people — both outside the dorm — were killed in over 30 seconds of sustained gunfire from the cops. Jackson student Phillip Lafayette Gibbs was shot in the head, and a bystander — high-school senior James Earl Green — took it in the chest. A dozen students were nonfatally shot, and many more were injured by flying glass. Over 460 rounds had hit the dorm. No member of law enforcement was injured.

Amid the carnage, Inspector "Goon" Jones radioed the dispatcher, saying that "nigger students" had been killed. When the dispatcher asked him about the injured, he said: "I think there are about three more nigger males there.... There were two nigger gals — two more nigger gals from over there shot in the arm, I believe."

Even less known is the Orangeburg massacre, which took place two years earlier. Students at South Carolina State University in Orangeburg — joined by students from another black college, Claflin University — were protesting the failure of the town's only bowling alley to racially integrate. February 8, 1968, was the fourth night of demonstrations, and students had lit a bonfire on campus. Police doused it, but a second one was started. When the cops tried to extinguish this one, the crowd — in a scene to be replayed at Kent and Jackson — started throwing things at them. One highway patrolman fired warning shots into the air, and all hell broke loose as the assembled police opened fire on the unarmed crowd.

After a barrage of weapons-fire, three people were dead — eighteen-year-olds Henry Smith and Samuel Hammond, and high-school student Delano Middleton. Twenty-seven other demonstrators were wounded. The vast majority of them had been shot in the back as they ran away.

South Carolina's Governor praised the police for their handling of the situation, giving all of them promotions. Nine patrolmen were eventually tried on federal charges, and all were acquitted. It was only 33 years later — on the 2001 anniversary of the carnage — that a Governor of the state admitted the heinous nature of what happened that night. Governor Jim Hodges said, "We deeply regret" the mass-shooting, but he stopped short of apologizing for it.

14. WINSTON CHURCHILL BELIEVED IN A WORLDWIDE JEWISH CONSPIRACY

Like Henry Ford, Britain's larger-than-life wartime Prime Minister, Winston Churchill, believed that a group of "international Jews" was striving to take over the world. On February 8, 1920, the Illustrated Sunday Herald (published in London) ran an article by Churchill. Its title: "Zionism Versus Bolshevism: A Struggle for the Soul of the Jewish People." At the time, Winnie was Secretary of State for War and Air and had already been a prominent Member of Parliament. Churchill didn't slam all Jews; rather, he painted them as a people of two extremes. "The conflict between good and evil which proceeds unceasingly in the breast of man nowhere reaches such an intensity as in the Jewish race. The dual nature of mankind is nowhere more strongly or more terribly exemplified.... It would almost seem as if the gospel of Christ and the gospel of Antichrist were destined to originate among the same people; and that this mystic and mysterious race had been chosen for the supreme manifestations, both of the divine and the diabolical." He identifies three strains of political thought among the world's Jews: Nationalism, in which a Jewish person identifies first and foremost with the country in which he or she lives. Zionism, in which a Jewish person wants a country specifically for Jews (Israel would be formed 28 years after Winnie's essay). These are both honorable, says Churchill, unlike the third option — the terrorism and atheistic communism of "International Jews." He writes:

International Jews
In violent opposition to all this sphere of Jewish effort rise the schemes of the International Jews. The adherents of this sinister confederacy are mostly men reared up among the unhappy populations of countries where Jews are persecuted on account of their race. Most, if not all, of them have forsaken the faith of their forefathers, and divorced from their minds all spiritual hopes of the next world. This movement among the Jews is not new. From the days of Spartacus-Weishaupt to those of Karl Marx, and down to Trotsky (Russia), Bela Kun (Hungary), Rosa Luxembourg (Germany), and Emma Goldman (United States), this world-wide conspiracy for the overthrow of civilization and for the reconstitution of society on the basis of arrested development, of envious malevolence, and impossible equality, has been steadily growing. It played, as a modern writer, Mrs. Webster, has so ably shown, a definitely recognizable part in the tragedy of the French Revolution. It has been the mainspring of every subversive movement during the Nineteenth Century; and now at last this band of extraordinary personalities from the underworld of the great cities of Europe and America have gripped the Russian people by the hair of their heads and have become practically the undisputed masters of that enormous empire.

Terrorist Jews
There is no need to exaggerate the part played in the creation of Bolshevism and in the actual bringing about of the Russian Revolution, by these international and for the most part atheistical Jews. It is certainly a very great one; it probably outweighs all others. With the notable exception of Lenin, the majority of the leading figures are Jews. Moreover, the principal inspiration and driving power comes from the Jewish leaders. Thus Tchitcherin,
a pure Russian, is eclipsed by his nominal subordinate Litvinoff, and the influence of Russians like Bukharin or Lunacharski cannot be compared with the power of Trotsky, or of Zinovieff, the Dictator of the Red Citadel (Petrograd) or of Krassin or Radek — all Jews. In the Soviet institutions the predominance of Jews is even more astonishing. And the prominent, if not indeed the principal, part in the system of terrorism applied by the Extraordinary Commissions for Combating Counter-Revolution has been taken by Jews, and in some notable cases by Jewesses. The same evil prominence was obtained by Jews in the brief period of terror during which Bela Kun ruled in Hungary. The same phenomenon has been presented in Germany (especially in Bavaria), so far as this madness has been allowed to prey upon the temporary prostration of the German people. Although in all these countries there are many non-Jews every whit as bad as the worst of the Jewish revolutionaries, the part played by the latter in proportion to their numbers in the population is astonishing.


Naturally, Churchill's admirers aren't exactly proud of this essay, which has led some of them to question its authenticity. However, the leading Churchill bibliographer, Frederick Woods, has pronounced the article genuine, listing it in his authoritative A Bibliography of the Works of Sir Winston Churchill.

15. THE AUSCHWITZ TATTOO WAS ORIGINALLY AN IBM CODE NUMBER

The tattooed numbers on the forearms of people held and killed in Nazi concentration camps have become a chilling symbol of hatred. Victims were stamped with the indelible number in a dehumanizing effort to keep track of them like widgets in the supply chain.
These numbers obviously weren't chosen at random. They were part of a coded system, with each number tracked as the unlucky person who bore it was moved through the system. Edwin Black made headlines in 2001 when his painstakingly researched book, IBM and the Holocaust, showed that IBM machines were used to automate the "Final Solution" and the
jackbooted takeover of Europe. Worse, he showed that the top levels of the company either knew or willfully turned a blind eye.
A year and a half after that book gave Big Blue a black eye, the author made more startling discoveries. IBM equipment was on-site at the Auschwitz concentration camp. Furthermore:

Thanks to the new discoveries, researchers can now trace how Hollerith numbers assigned to inmates evolved into the horrific tattooed numbers so symbolic of the Nazi era. (Herman Hollerith was the German American who first automated US census information in the late 19th century and founded the company that became IBM. Hollerith's name became synonymous with the machines and the Nazi "departments" that operated them.) In one case, records show, a timber merchant from Bendzin, Poland, arrived at Auschwitz in August 1943 and was assigned a characteristic five-digit IBM Hollerith number, 44673. The number was part of a custom punch-card system devised by IBM to track prisoners in all Nazi concentration camps, including the slave labor at Auschwitz. Later in the summer of 1943, the Polish timber merchant's same five-digit Hollerith number, 44673, was tattooed on his forearm. Eventually, during the summer of 1943, all non-Germans at Auschwitz were similarly tattooed.

The Hollerith numbering system was soon scrapped at Auschwitz because so many inmates died. Eventually, the Nazis developed their own haphazard system.

16. ADOLPH HITLER'S BLOOD RELATIVES ARE ALIVE AND WELL IN NEW YORK STATE

Adolph Hitler never had kids, so we tend to take for granted the idea that no one alive is closely related to him. But historians have long known that he had a nephew who was born in Britain and moved to the United States. Alois Hitler, Jr., was Adolph's older half-brother (their common parent was Alois, Sr.). Alois Jr. — a waiter in Dublin — married an Irish woman, and, after moving to Liverpool, they had a son, William Patrick Hitler.
Pat, as he was called, moved to Germany as a young adult to take advantage of his uncle's rising political stature, but Adolph just gave him minor jobs and kept him out of the limelight. After being subtly threatened by Rudolph Hess to become a German citizen, and having gotten tired of being dissed by Adolph, Pat came to America in 1939 and went on a lecture tour around the US, denouncing his uncle. (For his part, Adolph referred to his nephew as "loathsome.") While
World War II was raging, Pat joined the US Navy, so he could fight against Uncle Adolph. Afterwards, he changed his last name, and this is where the trail goes cold. That is, until US-based British reporter David Gardner was assigned to track down and interview William Patrick. Originally given two weeks to file the story, Gardner realized that finding
Hitler's long-lost nephew was tougher than it first appeared. He worked on the story during his spare time for several years, unearthing old news clippings, filing requests for government documents, interviewing possible relatives, and chasing a lot of dead ends.
He finally discovered that William Patrick had ended up in a small town in Long Island, New York. Pat had died in 1987, but Gardner showed up unannounced on the doorstep of his widow, Phyllis, who confirmed that her late husband was Adolph Hitler's nephew. She also mentioned that she and Pat had sons, but she quickly clammed up and asked Gardner to leave. The two never spoke again.
After more legwork, Gardner found that Pat and Phyllis produced four children, all sons. The eldest, born in 1949, is named Alexander Adolph. (Just why Pat would name his firstborn after his detested uncle is one of many mysteries still surrounding the Hitler kin.) Then came Louis in 1951, Howard (1957), and Brian (1965). Howard — a fraud investigator for the IRS — died in a car crash in 1989, and Louis and Brian continue to run a landscaping business in the small New
York community. Alex lives in a larger Long Island city. He twice spoke to Gardner but didn't reveal very much, saying that the family's ancestry is "a pain in the ass." Alex said that his brothers made a pact never to have children, in order to spare their progeny the burden of being related to a monster. He denied having made such a vow himself, despite the fact that he is still childless.
Gardner sums it up: "Although there are some distant relations living equally quiet lives in Austria, the three American sons are the only descendants of the paternal line of the family. They are, truly, the last of the Hitlers."

17. AROUND ONE QUARTER OF "WITCHES" WERE MEN

The word "witch" has become synonymous with "woman accused of working magic," and the consensus tells us that the witch trials in Europe and Colonial America were simply a war against women (ie, "gendercide"). Most popular works on the subject ignore the men who were accused and executed for supp-osedly practicing witchcraft. Academic works that don't omit male witches usually explain them away, as if they were just a few special cases that don't really count.
Into this gap step Andrew Gow, an associate professor of history at the University of Alberta, and one of his grad students, Lara Apps. Their book Male Witches in Early Modern Europe scours the literature and finds that, of the 110,000 people tried for witchcraft and the 60,000 executed from 1450 to 1750, somewhere between 20 and 25 percent were men.
This is an average across Europe, the British Isles, and the American Colonies; the gender ratios vary widely from place to place. The lowest percentages of male victims were found in the Basel region of Switzerland (5 percent) and in Hungary (10 percent). Places that hovered around the 50/50 mark were Finland (49 percent) and Burgundy (52 percent). Men were the clear majority of "witches" in Estonia (60 percent) and Norway (73 percent). During Iceland's witch craze, from 1625 to 1685, an amazing 110 out of 120 "witches" were men, for a percentage of 92. As for America, almost a third of those executed during the infamous Salem witch trials (six out of nineteen) were men.
Besides bringing these numbers to light, Professor Gow and pupil Apps present serious challenges to the attempts to erase male witches from the picture. For example, some writers claim that the men were caught up in the hysteria solely because they were related to accused women. In this scenario, the men were only "secondary targets" ("collateral damage," perhaps?). But in numerous instances men were persecuted by themselves. In other cases, a woman became a secondary target after her husband had been singled out as a witch.
Although women were the overall majority of victims, the "burning times" were pretty rough for men, too.

18. THE VIRGINIA COLONISTS PRACTICED CANNIBALISM

During the harsh winter of 1609-1610, British subjects in the famous colony of Jamestown, Virginia, ate their dead and their shit. This fact doesn't make it into very many US history textbooks, and the state's official Website apparently forgot to mention it in its history section. When you think about it rationally, this fact should be a part of mainstream history. After all, it demonstrates the strong will to survive among the colonists. It shows the mind-boggling hardships they endured and overcame. Yet the taboo against eating these two items is so overpowering that this episode can't be mentioned in conventional history.
Luckily, an unconventional historian, Howard Zinn, revealed this fact in his classic A People's History of the United States. Food was so nonexistent during that winter that only 60 out of 500 colonists survived. A government document from that time gives the gruesome details:

Driven thru insufferable hunger to eat those things which nature most abhorred, the flesh and excrements of man as well of our own nation as of an Indian, digged by some out of his grave after he had lain buried three days and wholly devoured him; others, envying the better state of body of any whom hunger has not yet so much wasted as their own, lay wait and threatened to kill and eat them; one among them slew his wife as she slept in his bosom, cut her in pieces, salted her and fed upon her till he had clean devoured all parts saving her head.

19. MANY OF THE PIONEERING FEMINISTS OPPOSED ABORTION

The idea that feminism equals the right to an abortion has become so ingrained that it seems ludicrous to think otherwise. "Prolife feminism" appears to be an inherent contradiction in terms. Yet more than 20 founding mothers of the feminist movement — who helped secure women's rights to vote, to own property, to use contraception, to divorce abusive husbands — were adamantly opposed to abortion.
The most famous nineteenth-century feminist — Susan B. Anthony, she of the ill-fated dollar coin — referred to abortion as "the horrible crime of child-murder." And that's just for starters. She also called it "infanticide," "this most monstrous crime," "evil," and a "dreadful deed." Surprisingly, given that unsparing language, she didn't believe that it should be made illegal. Responding to an article in which a man called for the outlawing of abortion, Anthony writes: "Much as I deplore the horrible crime of child-murder, earnestly as I desire its suppression, I cannot believe with the writer of the above-mentioned article, that such a law would have the desired effect. It seems to be only mowing off the top of the noxious weed, while the root remains."
The root, she believed, was the horrible way in which women (and children) were treated. As summed up in the book Prolife Feminism, these pioneering women felt that "abortion was the product of a social system that compelled women to remain ignorant about their bodies, that enabled men to dominate them sexually without taking responsibility for the consequences, that denied women support during and after the resulting pregnancies, and that placed far more value on a child's 'legitimacy' than on his or her life and well-being."
Indeed, while Anthony gave women a lot of grief for ending a pregnancy, she reserved the most vitriol for the men who knocked them up:

Guilty? Yes, no matter what the motive, love of ease, or a desire to save from suffering the unborn innocent, the woman is awfully guilty who commits the deed. It will burden her conscience in life, it will burden her soul in death; but oh! thrice guilty is he who, for selfish gratification, heedless of her prayers, indifferent to her fate, drove her to the desperation which impelled her to her crime.

Elizabeth Cady Stanton, Anthony's best friend for life, resented society's dictate that all women must become mothers. She did believe that "maternity is grand," but only on the woman's own terms. Even so, she railed against abortion. Like her pal, she referred to abortions as "murder," "a crying evil," "abominations," and "revolting outrages against the laws of nature and our common humanity." Also like Anthony, Stanton laid the blame for abortion at
the feet of men. Dr. Elizabeth Blackwell, lionized as the first US woman to become a medical doctor (in 1849), wrote in her diary:

The gross perversion and destruction of motherhood by the abortionist filled me with indignation, and awakened active antagonism. That the honorable term "female physician" should be exclusively applied to those women who carried on this shocking trade seemed to me a horror. It was an utter degradation of what might and should become a noble position for women.

Another prolife feminist was Victoria Woodhull, best known for being the first female candidate for US President (way back in 1870). Radical even by early feminist standards, she and her sister, Tennessee Claflin, declared that children had rights which began at conception. Their essay "The Slaughter of the Innocents" first discusses the abominable death rate of children under five, then turns its sights on abortion:

We are aware that many women attempt to excuse themselves for procuring abortions, upon the ground that it is not murder. But the fact of resort to so weak an argument only shows the more palpably that they fully realize the enormity of the crime. Is it not equally destroying the would-be future oak, to crush the sprout before it pushes its head above the sod, as it is to cut down the sapling, or cut down the tree? Is it not equally to destroy life, to crush it in its very germ, and to take it when the germ has evolved to any given point in its line of development? Let those who can see any difference regarding the time when life, once begun, is taken, console themselves that they are not murderers having been abortionists.

20. BLACK PEOPLE SERVED IN THE CONFEDERATE ARMY

Like "prolife feminist," the phrase "black Confederate" seems like an oxymoron. But the record shows that many slaves and free blacks were a part of the South's military during the US Civil War.
None other than abolitionist Frederick Douglass, a former slave and one of the most prominent African Americans in history, declared:

There are at the present moment [autumn 1861], many colored men in the Confederate Army doing duty not only as cooks, servants, and laborers, but as real soldiers, having muskets on their shoulders and bullets in their pockets, ready to shoot down loyal troops and do all that soldiers may do to destroy the Federal government and build up that of the traitors and rebels.

In Black Confederates and Afro-Yankees in Civil War Virginia, Professor Ervin L. Jordan, Jr., writes:

Numerous black Virginians served with Confederate forces as soldiers, sailors, teamsters, spies, and hospital personnel.... I know of black Confederate sharpshooters who saw combat during the 1862 Seven Days Campaign and [of] the existence of black companies [which] organized and drilled in Richmond in March-April 1865. Integrated companies of black and white hospital workers fought against the Union army in the Petersburg trenches during March 1865. There were several recruitment campaigns and charity balls held in Virginia on behalf of black soldiers and special camps of instruction were established to train them.

The book Black Confederates contains loads of primary documents testifying to the role of African Americans: letters, military documents, tributes, obituaries, contemporaneous newspaper articles, and more. In an 1862 letter to his uncle, a soldier at Camp Brown in Knoxville, Tennessee, wrote that his company had recently gunned down six Union soldiers and that "Jack Thomas a colored person that belongs to our company killed one of them."
An 1861 article in the Montgomery Advertiser says:
"We are informed that Mr. G.C. Hale, of Autauga County, yesterday tendered to Governor Moore the services of a company of negroes, to assist in driving back the horde of abolition sycophants who are now talking so flippantly of reducing to a conquered province the Confederate States of the South."
The obituary of black South Carolinian Henry Brown states that he had never been a slave and had served in three wars: the Mexican, the Spanish-American, and the Civil (on the side of the South). He was given a 21-gun salute at his funeral.
In 1890, black Union veteran Joseph T. Wilson wrote in his book, The Black Phalanx: A History of the Negro Soldiers of the United States, that New Orleans was home to two Native Guard regiments, which comprised 3,000 "colored men." Referring to these regiments in an 1898 book, Union Captain Dan Matson said: "Here is a strange fact. We find that the Confederates themselves first armed and mustered the Negro as a soldier in the late war."
Most blacks in the Confederate Army, though, were in supporting roles such as cook, musician, nurse, and the catch-all "servant." However, a lot of them ended up fighting on the battlefield, even though the South didn't officially induct black soldiers until late in the conflict. And all of them — whether inducted or not, whether soldier or some other position — were eligible for military pensions from several Southern states (including Tennessee and Mississippi), and records show that many of them signed up for these benefits.

A follow-up volume, Black Southerners in Confederate Armies, presents even more source documents. A book from 1866 contains the recollection of a Union man whose compatriot killed a black Confederate sniper "who, through his skill as a marksman, had done more injury to our men than any dozen of his white compeers..." Union documents show Henry Marshall, a black soldier with the 14th Kentucky Cavalry, being held in Northern prisoner of war camps. A pension document from South Carolina reveals that "a free Negro who volunteered" for the army served from August 1861 to the end of the war — over three and a half years. An obituary for George Mathewson says that the former slave received "a Cross of Honor for bravery in action," based on his role as standard-bearer.
The New York Tribune noted "that the Rebels organized and employed 'Negro troops' a full year before our government could be persuaded to do any thing of the sort." After the Battle of Gettysburg, the New York Herald reported: "Among the rebel prisoners who were marched through Gettysburg there were observed seven negroes in uniform and fully accoutered as soldiers."
An article from Smithsonian magazine relates: "A New York Times correspondent with Grant in 1863 wrote: 'The guns of the rebel battery were manned almost wholly by Negroes, a single white man, or perhaps two, directing operations.'"
While it certainly couldn't be said that African Americans played a major military role in the Southern army, they were definitely there. And some of them had even volunteered.

21. ELECTRIC CARS HAVE BEEN AROUND SINCE THE 1880s

The car of the future runs completely on electricity. No more dependence on gas. No more choking the atmosphere with fumes. Whenever the possibility of electric cars is raised, the media and other commentators ooh and ahh over the potential. But this technology isn't futuristic — it's positively retro. Cars powered by electricity have been on the scene since the 1800s and actually predate gas-powered cars.
A blacksmith in Vermont — Thomas Davenport — built the first rotary electric motor in 1833 and used it to power a model train the next year. In the late 1830s, Scottish inventor Robert Davidson rigged a carriage with an electric motor powered by batteries. In his Pulitzer-nominated book Taking Charge, archaeology professor and technology historian Michael Brian Schiffer writes that this "was perhaps the first electric car."
After this remarkable achievement, the idea of an electric car languished for decades. In 1881, a French experimenter debuted a personal vehicle that ran on electricity, a tricycle (ie, three wheels and a seat) for adults. In 1888, many inventors in the US, Britain, and Europe started creating three- and four-wheel vehicles — which could carry two to six people — that ran on electricity. These vehicles remained principally curiosities until May 1897, when the Pope Manufacturing Company — the country's most successful bicycle manufacturer — started selling the first commercial electric car: the Columbia Electric Phaeton, Mark III. It topped out at fifteen miles per hour, and had to be recharged every 30 miles. Within two years, people could choose from an array of electrical carriages, buggies, wagons, trucks, bicycles, tricycles, even buses and ambulances made by numerous manufacturers.
New York City was home to a fleet of electric taxi cabs starting in 1897. The Electric Vehicle Company eventually had over 100 of them ferrying people around the Big Apple. Soon it was unleashing electric taxis in Chicago, Philadelphia, Boston, and Washington DC. By 1900, though, the company was in trouble, and seven years later it sputtered out. As for cars powered by dead dinosaurs, Austrian engineer Siegfried Marcus attached a one-cylinder motor to a cart in 1864, driving it 500 feet and thus creating the first vehicle powered by gas (this was around 25 years after Davidson had created the first electro-car). It wasn't until 1895 that gas autos — converted carriages with a two-cylinder engine — were commercially sold (and then only in microscopic numbers).
Around the turn of the century, the average car buyer had a big choice to make: gas, electric, or steam? When the auto industry took form around 1895, nobody knew which type of vehicle was going to become the standard. During the last few years of the nineteenth century and the first few of the twentieth, over 100 companies placed their bets on electricity. According to Schiffer, "Twenty-eight percent of the 4,192 American automobiles produced in 1900 were electric. In the New York automobile show of that year more electrics were on display than gasoline or steam vehicles."
In the middle of the first decade of the 1900s, electric cars were on the decline, and their gas-eating cousins were surging ahead. With improvements in the cars and their batteries, though, electrics started a comeback in 1907, which continued through 1913. The downhill slide started the next year, and by the 1920s the market for electrics was "minuscule," to use Schiffer's word.
Things never got better.
Many companies tried to combine the best of both approaches, with cars that ran on a mix of electricity and gas. The Pope Manufacturing Company, once again in the vanguard, built a working prototype in 1898. A Belgian company and a French company each brought out commercial models the next year, beating the Toyota Prius and the Honda Insight to the market by over a century. Even Ferdinand Porsche and the Mercedes Company got in on the act. Unfortunately, these hybrids never really caught on.
Didik Design — which manufactures several vehicles which run on various combinations of electricity, solar power, and human power — maintains an extensive archive on the history of electric and electro-fuel cars. According to their research, around 200 companies and individuals have manufactured electric cars. Only a few familiar names are on the list (although some of them aren't familiar as car manufacturers): Studebaker (1952-1966), General Electric (1901-1904), Braun (1977), Sears, Roebuck, and Company (1978), and Oldsmobile (1896 to the present). The vast majority have long been forgotten: Elecctra, Pfluger, Buffalo Automobile Company, Hercules, Red Bug, and Nu-Klea Starlite, to name a few. Henry Ford and Thomas Edison teamed up on an electric car, but, although some prototypes were built, it never was
commercially produced. Though they have faded from mass cultural memory, electric cars have never been completely out of production. The reasons why electrics faded into obscurity while gas cars and trucks became 99.999 percent dominant are complex and are still being debated. If only they hadn't been sidelined and had continued to develop apace, the world would be a very different place.

22. JURIES ARE ALLOWED TO JUDGE THE LAW, NOT JUST THE FACTS

In order to guard citizens against the whims of the King, the right to a trial by jury was established by the Magna Carta in 1215, and it has become one of the most sacrosanct legal aspects of British and American societies. We tend to believe that the duty of a jury is solely to determine whether someone broke the law. In fact, it's not unusual for judges to instruct juries that they are to judge only the facts in a case, while the judge will sit in judgment of the law itself. Nonsense. Juries are the last line of defense against the power abuses of the authorities. They have the right to judge the law. Even if a defendant committed a crime, a jury can refuse to render a guilty verdict. Among the main reasons why this might happen, according to attorney Clay S. Conrad:

When the defendant has already suffered enough, when it would be unfair or against the public interest for the defendant to be convicted, when the jury disagrees with the law itself, when the prosecution or the arresting authorities have gone "too far" in the single-minded quest to arrest and convict a particular defendant, when the punishments to be imposed are excessive or when the jury suspects that the charges have been brought for political reasons or to make an unfair example of the hapless defendant...

Some of the earliest examples of jury nullification from Britain and the American Colonies were refusals to convict people who had spoken ill of the government (they were prosecuted under "seditious libel" laws) or who were practicing forbidden religions, such as Quakerism. Up to the time of the Civil War, American juries often refused to convict the brave souls who helped runaway slaves. In the 1800s, jury nullifications saved the hides of union organizers who were being prosecuted for conspiracy to restrain trade. Juries used their power to free people charged under the anti-alcohol laws of Prohibition, as well as antiwar protesters during the Vietnam era.
Today, juries sometimes refuse to convict drug users (especially medical marijuana users), tax protesters, abortion protesters, gun owners, battered spouses, and people who commit "mercy killings."
Judges and prosecutors will often outright lie about the existence of this power, but centuries of court decisions and other evidence prove that jurors can vote their consciences.
When the US Constitution was created, with its Sixth Amendment guarantee of a jury trial, the most popular law dictionary of the time said that juries "may not only find things of their own knowledge, but they go according to their consciences." The first edition of Noah Webster's celebrated dictionary (1828) said that juries "decide both the law and the fact in criminal prosecutions."
Jury nullification is specifically enshrined in the constitutions of Pennsylvania, Indiana, and Maryland. The state codes of Connecticut and Illinois contain similar provisions. The second US President, John Adams, wrote: "It is not only [the juror's] right, but his duty...to find the verdict according to his own best understanding, judgment, and conscience, though in direct opposition to the direction of the court." Similarly, Founding Father Alexander Hamilton declared: "It is essential to the security of personal rights and public liberty, that the jury should have and exercise the power to judge both of the law and of the criminal intent."
Legendary Supreme Court Chief Justice John Jay once instructed a jury:

It may not be amiss, here, Gentlemen, to remind you of the good old rule, that on questions of fact, it is the province of the jury, on questions of law, it is the province of the court to decide. But it must be observed that by the same law, which recognizes this reasonable distribution of jurisdiction, you have nevertheless the right to take upon yourselves to judge of both, and to determine the law as well as the fact in controversy.

The following year, 1795, Justice James Iredell declared: "[T]hough the jury will generally respect the sentiment of the court on points of law, they are not bound to deliver a verdict conformably to them." In 1817, Chief Justice John Marshall said that "the jury in a capital case were judges, as well of the law as the fact, and were bound to acquit where either was doubtful." In more recent times, the Fourth Circuit Court of Appeals unanimously held in 1969:

If the jury feels that the law under which the defendant is accused is unjust, or that exigent circumstances justified the actions of the accused, or for any reason which appeals to their logic and passion, the jury has the power to acquit, and the courts must abide that decision.

Three years later, the DC Circuit Court of Appeals noted: "The pages of history shine on instances of the jury's exercise of its prerogative to disregard uncontradicted evidence and instructions of the judge."
In a 1993 law journal article, federal Judge Jack B. Weinstein wrote: "When juries refuse to convict on the basis of what they think are unjust laws, they are performing their duties as jurors."
Those who try to wish away the power of jury nullification often point to cases in which racist juries have refused to convict white people charged with racial violence. As attorney Conrad shows in his book, Jury Nullification: The Evolution of a Doctrine, this has occurred only in very rare instances. Besides, it's ridiculous to try to stamp out or deny a certain power just because it can be used for bad ends as well as good. What form of power hasn't been misused at least once in a while?
The Fully Informed Jury Association (FIJA) is the best-known organization seeking to tell all citizens about their powers as jurors. People have been arrested for simply handing out FIJA literature in front of courthouses. During jury selections, FIJA members have been excluded solely on the grounds that they belong to the group.
FIJA also seeks laws that would require judges to tell jurors that they can and should judge the law, but this has been an uphill battle, to say the least. In a still-standing decision (Sparf and Hansen v. US, 1895), the Supreme Court ruled that judges don't have to let jurors know their full powers. In cases where the defense has brought up jury nullification during the proceedings, judges have sometimes held the defense attorney in contempt. Still, 21 state legislatures have introduced informed-jury legislation, with three of them passing it through one chamber (ie, House or Senate).
Quite obviously, the justice system is terrified of this power, which is all the more reason for us to know about it.

23. THE POLICE AREN'T LEGALLY OBLIGATED TO PROTECT YOU

Without even thinking about it, we take it as a given that the police must protect each of us. That's their whole reason for existence, right? While this might be true in a few jurisdictions in the US and Canada, it is actually the exception, not the rule. In general, court decisions and state laws have held that cops don't have to do a thing to help you when you're in danger.
In the only book devoted exclusively to the subject, Dial 911 and Die, attorney Richard W. Stevens writes:

It was the most shocking thing I learned in law school. I was studying Torts in my first year at the University of San Diego School of Law, when I came upon the case of Hartzler v. City of San Jose. In that case I discovered the secret truth: the government owes no duty to protect individual citizens from criminal attack. Not only did the California courts hold to that rule, the California legislature had enacted a statute to make sure the courts couldn't change the rule.

But this doesn't apply to just the wild, upside down world of Kalifornia. Stevens cites laws and cases for every state — plus Washington DC, Puerto Rico, the Virgin Islands, and Canada — which reveal the same thing. If the police fail to protect you, even through sheer incompetence and negligence, don't expect that you or your next of kin will be able to sue. Even in the nation's heartland, in bucolic Iowa, you can't depend on 911. In 1987, two men broke into a family's home, tied up the parents, slit the mother's throat, raped the 16-year-old daughter, and drove off with the 12-year-old daughter (whom they later murdered). The emergency dispatcher couldn't be bothered with immediately sending police to chase the kidnappers/murderers/rapists while the abducted little girl was still alive. First he had to take calls about a parking violation downtown and a complaint about harassing phone calls. When he got around to the kidnapping, he didn't issue an all-points bulletin but instead told just one officer to come back to the police station, not even mentioning that it was an emergency. Even more blazing negligence ensued, but suffice it to say that when the remnants of the family sued the city and the police, their case was summarily dismissed before going to trial. The state appeals court upheld the decision, claiming that the authorities have no duty to protect individuals.

Similarly, people in various states have been unable to successfully sue over the following situations:
  1. when 911 systems have been shut down for maintenance
  2. when a known stalker kills someone
  3. when the police pull over but don't arrest a drunk driver who runs over someone later that night
  4. when a cop known to be violently unstable shoots a driver he pulled over for an inadequate muffler
  5. when authorities know in advance of a plan to commit murder but do nothing to stop it
  6. when parole boards free violent psychotics, including child rapist-murderers
  7. when felons escape from prison and kill someone
  8. when houses burn down because the fire department didn't respond promptly
  9. when children are beaten to death in foster homes

A minority of states do offer a tiny bit of hope. In eighteen states, citizens have successfully sued over failure to protect, but even here the grounds have been very narrow. Usually, the police and the victim must have had a prior "special relationship" (for example, the authorities must have promised protection to this specific individual in the past). And, not surprisingly, many of these states have issued contradictory court rulings, or a conflict exists between state law and the rulings of the courts.
Don't look to the Constitution for help. "In its landmark decision of DeShaney v. Winnebago County Department of Social Services," Stevens writes, "the US Supreme Court declared that the Constitution does not impose a duty on the state and local governments to protect the citizens from criminal harm."
All in all, as Stevens says, you'd be much better off owning a gun and learning how to use it. Even in those cases where you could successfully sue, this victory comes only after years (sometimes more than a decade) of wrestling with the justice system and only after you've been gravely injured or your loved one has been snuffed.

24. THE GOVERNMENT CAN TAKE YOUR HOUSE AND LAND, THEN SELL THEM TO PRIVATE CORPORATIONS

It's not an issue that gets much attention, but the government has the right to seize your house, business, and/or land, forcing you into the street. This mighty power, called "eminent domain," is enshrined in the US Constitution's Fifth Amendment: "...nor shall private property be taken for public use without just compensation." Every single state constitution also stipulates that a person whose property is taken must be justly compensated and that the property must be put to public use. This should mean that if your house is smack-dab in the middle of a proposed highway, the government can take it, pay you market value, and build the highway. Whether or not this is a power the government should have is very much open to question, but what makes it worse is the abuse of this supposedly limited power. Across the country, local governments are stealing their citizens' property, then turning around and selling it to corporations for the construction of malls, condominiums, parking lots, racetracks, office complexes, factories, etc.
The Institute for Justice — the country's only nonprofit, public-interest law firm with a libertarian philosophy — spends a good deal of time protecting individuals and small businesses from greedy corporations and their partners in crime: bureaucrats armed with eminent domain. In 2003, it released a report on the use of "governmental condemnation" (another name for eminent domain) for private gain. No central data collection for this trend exists, and only one state (Connecticut) keeps statistics on it. Using court records, media accounts, and information from involved parties, the Institute found over 10,000 such abuses in 41 states from 1998 through 2002. Of these, the legal process had been initiated against 3,722 properties, and condemnation had been threatened against 6,560 properties. (Remember, this is condemnation solely for the benefit of private parties, not for so-called legitimate reasons of "public use.")
In one instance, the city of Hurst, Texas, condemned 127 homes so that a mall could expand. Most of the families moved under the pressure, but ten chose to stay and fight. The Institute writes:

A Texas trial judge refused to stay the condemnations while the suit was ongoing, so the residents lost their homes. Leonard Prohs had to move while his wife was in the hospital with brain cancer. She died only five days after their house was demolished. Phyllis Duval's husband also was in the hospital with cancer at the time they were required to move. He died one month after the demolition. Of the ten couples, three spouses died and four others suffered heart attacks during the dispute and litigation. In court, the owners presented evidence that the land surveyor who designed the roads for the mall had been told to change the path of one road to run through eight of the houses of the owners challenging the condemnations.

In another case, wanting to "redevelop" Main Street, the city of East Hartford, Connecticut, used eminent domain to threaten a bakery/deli that had been in that spot for 93 years, owned and operated by the same family during that whole time. Thus coerced, the family sold the business for $1.75 million, and the local landmark was destroyed. But the redevelopment fell through, so the lot now stands empty and the city is in debt.
The city of Cypress, California, wanted Costco to build a retail store on an 18-acre plot of land. Trouble was, the Cottonwood Christian Center already owned the land fair and square, and was planning to build a church on it. The city council used eminent domain to seize the land, saying that the new church would be a "public nuisance" and would "blight" the area (which is right beside a horse-racing track). The Christian Center got a federal injunction to stop the
condemnation, and the city appealed this decision. To avoid further protracted legal nightmares, the church group consented to trade its land for another tract in the vicinity. But all of this is small potatoes compared to what's going on in Riviera Beach, Florida:

City Council members voted unanimously to approve a $1.25 billion redevelopment plan with the authority to use eminent domain to condemn at least 1,700 houses and apartments and dislocate 5,100 people. The city will then take the property and sell the land to commercial yachting, shipping, and tourism companies.

If approved by the state, it will be one of the biggest eminent domain seizures in US history. In 1795, the Supreme Court referred to eminent domain as "the despotic power." Over two centuries later, they continue to be proven right.

25. THE SUPREME COURT HAS RULED THAT YOU'RE ALLOWED TO INGEST ANY DRUG, ESPECIALLY IF YOU'RE AN ADDICT

In the early 1920s, Dr. Linder was convicted of selling one morphine tablet and three cocaine tablets to a patient who was addicted to narcotics. The Supreme Court overturned the conviction, declaring that providing an addicted patient with a fairly small amount of drugs is an acceptable medical practice "when designed temporarily to alleviate an addict's pains." (Linder v. United States)
In 1962, the Court heard the case of a man who had been sent to the clink under a California state law that made being an addict a criminal offense. Once again, the verdict was tossed out, with the Supremes saying that punishing an addict for being an addict is cruel and unusual and, thus, unconstitutional. (Robinson v. California)
Six years later, the Supreme Court reaffirmed these principles in Powell v. Texas. A man who was arrested for being drunk in public said that, because he was an alcoholic, he couldn't help it. He invoked the Robinson decision as precedent. The Court upheld his conviction because it had been based on an action (being wasted in public), not on the general condition of his addiction to booze. Justice White supported this decision, yet for different reasons than the others. In his concurring opinion, he expanded Robinson:

If it cannot be a crime to have an irresistible compulsion to use narcotics,... I do not see how it can constitutionally be a crime to yield to such a compulsion. Punishing an addict for using drugs convicts for addiction under a different name. Distinguishing between the two crimes is like forbidding criminal conviction for being sick with flu or epilepsy, but permitting punishment for running a fever or having a convulsion. Unless Robinson is to be abandoned, the use of narcotics by an addict must be beyond the reach of the criminal law. Similarly, the chronic alcoholic with an irresistible urge to consume alcohol should not be punishable for drinking or for being drunk.

Commenting on these cases, Superior Court Judge James P. Gray, an outspoken critic of drug prohibition, has recently written:

What difference is there between alcohol and any other dangerous and sometimes addictive drug? The primary difference is that one is legal while the others are not. And the US Supreme Court has said as much on at least two occasions, finding both in 1925 and 1962 that to punish a person for the disease of drug addiction violated the Constitution's prohibition on cruel and unusual punishment. If that is true, why do we continue to prosecute addicted people for taking these drugs, when it would be unconstitutional to prosecute them for their addiction?

Judge Gray gets right to the heart of the matter: "In effect, this 'forgotten precedent' says that one can only be constitutionally punishable for one's conduct, such as assaults, burglary, and driving under the influence, and not simply for what one puts into one's own body." If only the Supreme Court and the rest of the justice/law-enforcement complex would apply these decisions, we'd be living in a saner society.

26. THE AGE OF CONSENT IN MOST OF THE US IS NOT EIGHTEEN

The accepted wisdom tells us that the age at which a person can legally consent to sex in the US is eighteen. Before this line of demarcation, a person is "jailbait" or "chicken." On their eighteenth birthday, they become "legal." But in the majority of states, this isn't the case. It's up to each state to determine its own age of consent. Only fifteen states have put theirs at eighteen, with the rest going lower. Eight have set the magic point at the seventeenth birthday. The most popular age is sixteen, with 27 states and Washington DC setting the ability to sexually consent there. (Hawaii's age of consent had been fourteen until mid-2001, when it was bumped to sixteen.)
Of course, as with anything regarding the law, there are considerable shades of gray. For one thing, these laws don't apply if the lovers are married. The age of consent for marriage, especially with parental permission, is usually lower than the age of sexual consent. The Constitution of the State of South Carolina says that females aged fourteen and up can consent to sex, but state law appears to set the age at sixteen.
In a lot of states, the age of the older partner is a consideration. For example, Tennessee doesn't consider sex with someone aged thirteen to seventeen to be statutory rape if the elder partner is less than four years older. So a nineteen-year-old could get it on with a sixteen-year-old without breaking the law. The most extreme example of this rule is in Delaware. If you're 30 or older, boffing a sixteen- or seventeen-year-old is a felony. But if you're 29 or younger, it's perfectly legal.
And let's not even get into Georgia's Public Law 16-6-18, which outlaws sex between anyone who isn't married, no matter what their ages or genders.
Then, of course, we have the laws regarding same-sex relations, which are completely illegal in fifteen or so states. In almost all of the other states, the age of consent for gay sex is the same as that for het-sex. Two exceptions are Nevada and New Hampshire, which both allow sixteen-year-olds to consent to a member of the opposite sex, but set the limit at eighteen for those who go the other way. Somewhat startlingly, even though New Mexico's age of consent for straights is seventeen, it's thirteen for gays and lesbians.
The situation around the world varies even more than within the US. The age of consent in the UK is sixteen, except in Northern Ireland, where it's a year older. Various territories in Australia set the age at sixteen or seventeen, and in Canada it's universally fourteen. The lowest age — in a few countries, such as Chile and Mexico — is twelve. Only one country is known to have set the age above eighteen — Tunisia, which feels that twenty is the acceptable age.

27. MOST SCIENTISTS DON'T READ ALL OF THE ARTICLES THEY CITE

Every scientific discovery builds on what came before. Because of this, research papers are chock-full of references to previous papers, leading you to believe that those older studies actually have been read and digested and are now being expanded upon.
After noticing that a lot of citations with identical mistakes were showing up in various papers, two researchers at the University of California, Los Angeles, set out to study the problem. They looked at the way well-known, heavily-cited papers had been referenced in subsequent papers.
Regarding an influential paper on crystals published in 1973, New Scientist explains:

They found it had been cited in other papers 4300 times, with 196 citations containing misprints in the volume, page or year. But despite the fact that a billion different versions of erroneous reference are possible, they counted only 45. The most popular mistake appeared 78 times.

Obviously, these pursuers of scientific truths hadn't actually read the original paper, but had just clipped the reference from another paper, a trick they probably learned in college and never stopped using. Of course, some of the scientists who got the citation right hadn't read the paper, either. In the final analysis:

The model shows that the distribution of misprinted citations of the 1973 paper could only have arisen if 78 percent of all the citations, including the correct ones, were "cut and pasted" from a secondary source. Many of those who got it right were simply lucky.
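
To see how a number like that 78 percent emerges, here's a toy simulation in Python. This is my own sketch, not the UCLA researchers' actual model: the fresh-typo probability is invented, and only the copying rate is borrowed from their conclusion.

import random

# Toy model of citation copying. Each new paper either cites the 1973
# original directly (occasionally introducing a fresh misprint) or
# pastes the reference from a randomly chosen earlier paper,
# inheriting whatever misprint it carries. Version 0 is the correct
# reference; each new misprint gets its own version number.
random.seed(42)
P_COPY = 0.78   # share of cut-and-pasted citations (the study's estimate)
P_TYPO = 0.01   # chance of a fresh misprint when citing directly (invented)

citations = []
next_version = 1
for _ in range(4300):
    if citations and random.random() < P_COPY:
        citations.append(random.choice(citations))  # paste, typos and all
    elif random.random() < P_TYPO:
        citations.append(next_version)  # brand-new misprint
        next_version += 1
    else:
        citations.append(0)  # cited correctly from the source

wrong = [v for v in citations if v != 0]
print(len(wrong), "misprints,", len(set(wrong)), "distinct versions,",
      max(wrong.count(v) for v in set(wrong)), "copies of the most popular")

Exact counts vary from run to run, but the shape matches the real data: the misprints cluster into a few dozen versions at most, with one early typo copied far more often than the rest.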

20090819

28. LOUIS PASTEUR SUPPRESSED EXPERIMENTS THAT DIDN'T SUPPORT HIS THEORIES

One of the greatest scientific duels in history occurred between those who believed that microorganisms spontaneously generate in decaying organic matter and those who believed that the tiny creatures migrated there from the open air. From the late 1850s to the late 1870s, the eminent French chemist and microbiologist Louis Pasteur was locked in a death-match with opponents of spontaneous generation, especially Felix Pouchet.
The two camps performed experiments one after the other, both to prove their own pet theories and to disprove the opponent's. As we know, Pasteur won the debate: The fact that microbes travel through the air is now accepted as a given, with spontaneous generation relegated to the slagheap of quaint, discarded scientific ideas. But Pasteur didn't win fair and square.
It turns out that some of Pasteur's experiments gave strong support to the notion that rotting organic matter produces life. Of course, years later those experiments were shown to have been flawed, but at the time they buttressed the position of Pasteur's enemies. So he kept them secret. In his myth-busting book Einstein's Luck, medical and scientific historian John Waller writes:

"In fact, throughout his feud with Pouchet, Pasteur described in his notebooks as 'successful' any Experiment that seemed to disprove spontaneous generation and 'unsuccessful' any that violated. His own private beliefs and experimental expectations."

When Pasteur's rivals performed experiments that supported their theory, Pasteur would not publicly replicate those studies. In one case, he simply refused to perform the experiment or even discuss it. In another, he hemmed and hawed so long that his rival gave up in exasperation. Waller notes: "Revealingly, although Pasteur publicly ascribed Bastian's results to sloppy methodology, in private he and his team took them rather more seriously. As Gerald Geison's study of Pasteur's notebooks has recently revealed, Pasteur's team spent several weeks secretly testing Bastian's findings and refining their own ideas on the distribution of germs in the environment."
Pasteur would rail at his rivals and even his mentor when he thought they weren't scrupulously following the scientific method, yet he had no qualms about trashing it when doing so suited his aims. Luckily for him, he was on the right side of the debate. And just why was he so cocksure that spontaneous generation was wrong? It had nothing to do with science. "In his notes he repeatedly insisted that only the Creator-God had ever exercised the power to convert the inanimate into the living," writes Waller. "The possibility that life could be created anew without man first discovering the secrets of the Creator was rejected without any attempt at scientific justification."

29. THE CREATOR OF THE GAIA HYPOTHESIS SUPPORTS NUCLEAR POWER

James Lovelock is one of the icons of the environmental movement. His idea that the Earth is a self-regulating, living organism (the Gaia hypothesis, first expounded in his 1979 book Gaia: A New Look at Life on Earth) provides the philosophical underpinning of environmentalism. So it may be surprising that Lovelock is an enthusiastic supporter of nuclear energy, which he says has "great benefits and small risks." In the preface to the seemingly paradoxical book Environmentalists for Nuclear Energy, he writes: "I want to put it to you that the dangers of continuing to burn fossil fuels (oil, gas, coal) as our main energy source are far greater and they threaten not just individuals but civilization itself." The answer, he maintains, is the clean energy from nuke plants, which produce almost nothing that clogs up the atmosphere. As for what to do with all that radioactive waste, Lovelock has a shocking answer:

Natural ecosystems can stand levels of continuous radiation that would be intolerable in a city. The land around the failed Chernobyl power station was evacuated because its high radiation intensity made it unsafe for people, but this radioactive land is now rich in wildlife, much more so than neighboring populated areas. We call the ash from nuclear power nuclear waste and worry about its safe disposal. I wonder if instead we should use it as an incorruptible guardian of the beautiful places of the Earth. Who would dare cut down a forest in which was the storage place of nuclear ash?

Lovelock does admit that nuclear power is "potentially harmful to people," something that his brethren in the group Environmentalists for Nuclear Energy often try to downplay. Truthfully, some of their points are good ones. More people have been killed by coal-mining than by nuclear power, even when you factor in the shorter time that nuclear power has existed. Most of the radiation we get zapped with comes from outer space (around two-thirds) and medical procedures (around a third), with only a smidgen from nuke plants. Still, when you know about all the unpublicized accidents and near-meltdowns that have occurred, it's hard to be quite so blasé about the dangers. After all, the group's own literature says, "Nuclear energy is a very clean energy if it is well designed, well-built, well operated, and well managed." Trouble is, it's often none of those things. Design flaws, human error, corruption, incompetence, greed, and toothless oversight mean that in the real world, nuke plants often don't work as advertised.

30. GENETICALLY-ENGINEERED HUMANS HAVE ALREADY BEEN BORN

The earthshaking news appeared in the medical journal Human Reproduction under the impenetrable headline: "Mitochondria in Human Offspring Derived from Ooplasmic Transplantation." The media put the story in heavy rotation for one day, and then forgot about it. We all forgot about it.
But the fact remains that the world is now populated by dozens of children who were genetically engineered. It still sounds like science fiction, yet it's true.
In the first known application of germ line gene therapy — in which an individual's genes are changed in a way that can be passed to offspring — doctors at a reproductive facility in New Jersey announced in March 2001 that nearly 30 healthy babies had been born with DNA from three people: dad, mom, and a second woman. Fifteen were the product of the fertility clinic, with the other fifteen or so coming from elsewhere.
The doctors believe that one cause for failure of women to conceive is that their ova contain old mitochondria (if you don't remember your high school biology class, mitochondria are the part of cells that provides energy). These sluggish eggs fail to attach to the uterine wall when fertilized.
In order to soup them up, scientists injected them with mitochondria from a younger woman. Since mitochondria contain DNA, the kids have the genetic material of all three parties. The DNA from the "other woman" can even be passed down along the female line. The big problem is that no one knows what effects this will have on the children or their progeny. In fact, this substitution of mitochondria hasn't been studied extensively on animals, never mind Homo sapiens. The doctors reported that the kids are healthy, but they neglected to mention something crucial. Although the fertility clinic's technique resulted in fifteen babies, a total of seventeen fetuses had been created. One of them had been aborted, and the other miscarried. Why? Both of them had a rare genetic disorder, Turner syndrome, which only strikes females. Ordinarily, just one in 2,500 females is born with this condition, in which one of the X chromosomes is incomplete or totally missing. Yet two out of these seventeen fetuses had developed it.
If we assume that nine of the fetuses were female (around 50 percent), then two of the nine female fetuses had this rare condition. Internal documents from the fertility clinic admit that this amazingly high rate might be due to the ooplasmic transfer.
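
Just how amazingly high is easy to check with back-of-the-envelope binomial arithmetic, using the 1-in-2,500 background rate and the nine presumed female fetuses (and assuming, as a simplification, that cases strike independently). A quick Python check:

from math import comb

p = 1 / 2500   # background rate of Turner syndrome among females
n = 9          # estimated number of female fetuses in the group
p_two_or_more = 1 - (1 - p) ** n - comb(n, 1) * p * (1 - p) ** (n - 1)
print(f"chance of 2 or more cases by luck alone: {p_two_or_more:.1e}")
# prints roughly 5.8e-06, i.e. about 1 in 170,000
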
Even before the revelation about Turner syndrome became known, many experts were appalled that the technique had been used. A responding article in Human Reproduction said, in a dry understatement: "Neither the safety nor efficacy of this method has been adequately investigated." Ruth Deech, chair of Britain's Human Fertilisation and Embryology Authority, told the BBC: "There is a risk, not just to the baby, but to future generations which we really can't assess at the moment." The number of children who have been born as a result of this technique is unknown. The original article gave the number as "nearly thirty," but this was in early 2001. At that time, at least two of the mutant children were already one year old.
Dr. Joseph Cummins, professor emeritus of biology at the University of Western Ontario, says that no further information about these 30 children has appeared in the medical literature or the media. As for additional children born with two mommies and a daddy, Cummins says that a report out of Norway in 2003 indicated that ooplasmic transfer has been used to correct mitochondrial disease. He opines: "It seems likely that the transplants are going on, but very, very quietly in a regulatory vacuum, perhaps."

31. THE INSURANCE INDUSTRY WANTS TO GENETICALLY TEST ALL POLICY HOLDERS

The insurance industry's party line is that it doesn't want to genetically test people who sign up for policies, a practice that would detect a predisposition to develop cancer, multiple sclerosis, and other diseases and disorders. The industry's internal documents tell a completely different story, though.
While researching War against the Weak — his sweeping history of eugenics (and its successor, genetics) in the United States and Germany — Edwin Black found two reports written by insurers for insurers. "Genetic Information and Medical Expense" — published in June 2000 by the American Academy of Actuaries — intones that an inability to ask for genetic tests "would have a direct impact on premium rates, ultimately raising the cost of insurance for everyone." A paper issued by the same group in spring 2002 goes further, envisioning a nightmare scenario in which the entire insurance industry collapses. The genetically impure can't be weeded out, which means that more of them get covered. Because of this, the insurers have to pay out more benefits, which drives up premiums for everybody. This causes some people with perfect chromosomes to be unable to afford insurance, which means a higher percentage of the insured are chromosomally challenged. A downward spiral has started, with more benefits paid out, higher premiums charged, fewer healthy people covered, more benefits, higher premiums, fewer healthy people, etc. This, the report warns, "could eventually cause the insurers to become insolvent."
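
That nightmare scenario is the classic adverse-selection spiral, and you can watch the mechanics in a toy model. All numbers below are invented for illustration; this sketches the feedback loop the actuaries describe, not their actual model.

# Each person knows their own expected annual claims and buys coverage
# only if the premium is no more than 1.5 times that. The insurer sets
# next year's premium at the average cost of this year's buyers. The
# cheapest risks drop out first, so costs and premiums ratchet upward.
expected_costs = [500 * (i + 1) for i in range(20)]  # $500 up to $10,000

premium = 2000.0
for year in range(1, 6):
    buyers = [c for c in expected_costs if premium <= 1.5 * c]
    if not buyers:
        print(f"year {year}: everyone priced out; the insurer is insolvent")
        break
    premium = sum(buyers) / len(buyers)
    print(f"year {year}: insured={len(buyers)}, premium=${premium:,.0f}")

With these numbers the pool shrinks from 18 buyers to 11 while the premium nearly quadruples before leveling off; squeeze the buyers' willingness to pay a little harder and the pool unravels completely.
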
In the UK, insurance companies were widely screening applicants for genetic red flags until Parliament slapped a moratorium on the practice in 2001, allowing only one type of test to be used. British companies argue that they will go belly-up if the ban isn't lifted soon. Based on this alone, it's ridiculous for the US insurance industry to claim it isn't hoping to use these tests. With the fate of the insurance racket supposedly hanging in the balance, how long can it be before genetic screening is mandatory when applying for health or life coverage?

32. SMOKING CAUSES PROBLEMS OTHER THAN LUNG CANCER AND HEART DISEASE

The fact that smoking causes lung disease and oral cancer isn't exactly news, and only tobacco industry executives would express (feigned) shock at being told. But cigarettes can lead to a whole slew of problems involving every system of your tar-filled body, and most people aren't aware of this.
The American Council on Science and Health's book Cigarettes: What the Warning Label Doesn't Tell You is the first comprehensive look at the medical evidence of all types of harm triggered by smoking. It references over 450 articles from medical journals and was reviewed by 45 experts, mainly medical doctors and PhDs. If this book doesn't convince you to quit, nothing will.
Among some of the things that cancer sticks do:
  1. Besides cancers of the head, neck, and lungs, ciggies are especially connected to cancers of the bladder, kidney, pancreas, and cervix. Newer evidence is adding leukemia and colorectal cancer to the list. Recent studies have also found at least a doubling of risk among smokers for cancers of the vulva and penis, as well as an eight-fold risk of anal cancer for men and a nine-fold risk for women.
  2. Smoking trashes the ability of blood to flow, which results in a sixteen-fold greater risk of peripheral vascular disease. This triggers pain in the legs and arms, which often leads to an inability to walk and, in some instances, gangrene and/or amputation. Seventy-six percent of all cases are caused by smoking, more than for any other factor, including diabetes, obesity, and high blood pressure.
  3. Smokers are at least two to three times more likely to develop the heartbreak of psoriasis. Even if that doesn't happen, they'll look old before their time. The American Council tells us, "Smokers in their 40s have facial wrinkles similar to those of nonsmokers in their 60s."
  4. Smokers require more anesthesia for surgery, and they recover much more slowly. In fact, wounds of all kinds take longer to heal for smokers.
  5. Puffing helps to weaken bones, soft tissue, and spinal discs, causing all kinds of musculoskeletal pain, more broken bones and ruptured discs, and longer healing time. "A non-smoker's leg heals an average of 80 percent faster than a smoker's broken leg."
  6. Smoking is heavily related to osteoporosis, the loss of bone mass, which results in brittle bones and more breaks.
  7. Cigarettes interfere with your ability to have kids. "The fertility rates of women who smoke are about 30 percent lower than those of nonsmokers." If you're an idiot who continues to smoke while you're expecting — even in this day and age, some people, including stars Catherine Zeta-Jones and Courtney Love, do this — you increase the risks of miscarriage, stillbirth, premature birth, low birth weight, underdevelopment, and cleft palate. If your child is able to survive outside the womb, it will have a heavily elevated risk of crib death (SIDS), allergies, and intellectual impairment.
  8. Smoking also does a serious number on sperm, resulting in more deformed cells, less ability of them to swim, smaller loads, and a drastic decrease in overall number of the little fellas. The larger population of misshapen sperm probably increases the risk of miscarriages and birth defects, so even if mommy doesn't smoke, daddy could still cause problems. What's more, because smoking hurts blood flow, male smokers are at least twice as likely to be unable to get it up.
  9. Besides shutting down blood flow to the little head, smoking interferes with the blood going to the big head in both sexes. This causes one quarter of all strokes. It also makes these strokes more likely to occur earlier in life and more likely to be fatal.
  10. "Depression — whether viewed as a trait, a symptom or a diagnosable disorder — is overrepresented among smokers." Unfortunately, it's unclear how the two are related. Does smoking cause depression, or does depression lead to smoking? Or, most likely, do the two feed on each other in a vicious cycle?
  11. "Smokers experience sudden hearing loss an average of 16 years earlier than do never smokers."
  12. Smokers and former smokers have an increased risk of developing cataracts, abnormal eye movements, inflammation of the optic nerve, permanent blindness from lack of blood flow, and the most severe form of macular degeneration.
  13. Lighting up increases plaque, gum disease, and tooth loss.
  14. It also makes it likelier that you'll develop diabetes, stomach ulcers, colon polyps, and Crohn's disease.
  15. Smoking trashes the immune system in myriad ways, with the overall result being that you're more susceptible to disease and allergies.

And let's not forget that second-hand smoke has horrible effects on the estimated 42 percent of toddlers and infants who are forced to inhale it in their homes:

According to the Environmental Protection Agency (EPA), children's "passive smoking," as it is called, results in hundreds of thousands of cases of bronchitis, pneumonia, ear infections, and worsened asthma. Worse yet, the Centers for Disease Control and Prevention estimates that 702 children younger than one year die each year as a result of sudden infant death syndrome (SIDS), worsened asthma and serious respiratory infections.

It's very surprising to note that smoking can have a few health benefits. Because they zap women's estrogen levels, cigarettes can lead to less endometriosis and other conditions related to the hormone. Smoking also decreases the risk of developing osteoarthritis in the knees, perhaps because the pliability of thin bones takes some pressure off of the cartilage. And because it jacks up dopamine levels, it helps ward off Parkinson's disease. Of course, these benefits seem to be side effects of the hazards of smoking, so the trade-off hardly seems worth it.

33. HERDS OF MILK-PRODUCING COWS ARE RIFE WITH BOVINE LEUKEMIA VIRUS

Bovine leukemia virus is a cancer-causing microbe in cattle. Just how many cows have it? The US Department of Agriculture reports that nationwide, 89 percent of herds contain cows with BLV. The most infected region is the Southeast, where 99 percent of herds have the tumor-causing bug. In some herds across the country, almost every single animal is infected. A 1980 study across Canada uncovered a lower but none-too-reassuring rate of 40 percent.
BLV is transmitted through milk. Since the milk from all cows in a herd is mixed before processing, if even a single cow is infected, all milk from that herd will have BLV swimming in it. Citing an article in Science, oncologist Robert Kradjian, MD, warns that 90 to 95 percent of milk starts out tainted. Of course, pasteurization — when done the right way — kills BLV, but the process isn't perfect. And if you drink raw milk, odds are you're gulping down bovine leukemia virus.
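
The pooling is what does the damage: one infected cow taints a herd's entire output, so the share of tainted milk tracks the share of infected herds rather than the much lower share of infected cows. A quick sketch in Python, with hypothetical per-cow rates:

# One BLV-positive cow taints the whole pool, so even modest per-cow
# infection rates make nearly every herd's milk positive. The per-cow
# rates and herd size here are hypothetical illustrations.
herd_size = 100
for per_cow_rate in (0.02, 0.05, 0.10):
    p_pool_tainted = 1 - (1 - per_cow_rate) ** herd_size
    print(f"per-cow rate {per_cow_rate:.0%}: "
          f"pool tainted {p_pool_tainted:.1%} of the time")
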
Between dairy cows and their cousins that are used for meat (who tend to be infected at lower rates), it appears that a whole lot of BLV is getting inside us. A 2001 study in Breast Cancer Research detected antibodies to the bovine leukemia virus in blood samples from 77 out of 100 volunteers. Furthermore, BLV showed up more often in breast tissue from women with breast cancer than in the tissue from healthy women. Several medical studies have found positive correlations between higher intake of milk/beef and increased incidence of leukemia or lymphoma in humans, although other studies haven't found a correlation. No hard evidence has yet linked BLV to diseases in humans, but do you feel comfortable knowing that cow cancer cells are in your body?

34. MOST DOCTORS DON'T KNOW THE RADIATION LEVEL OF CAT SCANS

Using extended doses of encircling X-rays, CAT scans give a detailed look inside your body, revealing not only bones but soft tissue and blood vessels, as well. According to the health site Imaginis.com, over 70,000 places around the world offer CAT scans to detect and diagnose tumors, heart disease, osteoporosis, blood clots, spinal fractures, nerve damage, and lots of other problems. Because it can uncover so much, its use has become widespread and continues to rise. In fact, healthy people are getting scans just to see if anything might be wrong, kind of like a routine check-up.
The downside, and it's a doozy, is that a CAT scan jolts you with 100 to 250 times the dose of radiation that you get from a chest X-ray. What's even more alarming is that most doctors apparently don't know this.
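
To put those multiples in absolute terms: a single chest X-ray delivers a typical effective dose of about 0.02 millisieverts. That figure is a commonly cited ballpark, not one from the survey discussed below, so treat this as rough arithmetic:

# Converting the article's 100-to-250x ratio into absolute doses,
# assuming ~0.02 mSv per chest X-ray (a typical textbook value).
CHEST_XRAY_MSV = 0.02
for ratio in (100, 250):
    print(f"{ratio} chest X-rays is about {CHEST_XRAY_MSV * ratio:.0f} mSv")
# prints 2 mSv and 5 mSv, the range a single CAT scan can deliver
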
An emergency physician from the Yale School of Medicine surveyed 45 of his colleagues about the pros and cons of CAT scans. A mere nine of them said that they tell patients about the radiation. This might be just as well, in a weird way, since most of them had absolutely no clue about how much radiation CAT scans give off. When asked to compare the blast from a chest X-ray to the blast from a CAT scan, only 22 percent of the docs got it right. As for the other three-quarters, The Medical Post relates:

Three of the doctors said the dose was either less than or equal to a chest X-ray. Twenty (44%) of the doctors said the dose was greater than a chest X-ray, but less than 10 times the dose. Just over one-fifth of the doctors (22%) said the radiation dose from a CT was more than 10 times that of an X-ray but less than 100 times the dose.

Only ten of them knew that a single CAT scan equals 100 to 250 chest X-rays, while two thought that the scans were even worse than that. Feel free to give your doc a pop quiz during your next office visit.

35. MEDICATION ERRORS KILL THOUSANDS EACH YEAR

Next time you get a prescription filled, look at the label very carefully. Getting the wrong drug or the wrong dosage kills hundreds or thousands of people each year, with many times that number getting injured.
Renegade health reporter Nicholas Regush — a self-imposed exile from ABC News — provides a long list of specific problems:

Poor handwriting. Verbal orders. Ambiguous orders. Prescribing errors. Failure to write orders. Unapproved uses. When the order is not modified or cancelled. Look-alike and sound-alike drug names. Dangerous abbreviations. Faulty drug distribution systems in hospital. Failure to read the label or poor labeling. Lack of knowledge about drugs. Lack of knowledge concerning proper dose. Lack of knowledge concerning route of administration. Ad nauseam.

After poring over death certificates, sociology professor David Phillips — an expert in mortality statistics — determined that drug errors kill 7,000 people each year in the US. His study was published in The Lancet, probably the most prestigious medical journal in the world. The Institute of Medicine, a branch of the National Academies of Science, also estimated 7,000. Interestingly, the Food and Drug Administration published the lowball figure of 365 annually (one per day). But even the FDA admits that such bungling injures 1.3 million people each year. New York Newsday cited several specific cases, such as: "In 1995, a Texas doctor wrote an illegible prescription causing the patient to receive not only the wrong medication, but at eight times the drug's usually recommended strength. The patient, Ramon Vasquez, died. In 1999, a court ordered the doctor and pharmacy to pay the patient's family a total of $450,000, the largest amount ever awarded in an illegible prescription case."
Besides doctors' indecipherable chicken scratch, similar-sounding drug names are another big culprit. Pharmaceutical companies have even started warning medical professionals to be careful with the cookie-cutter names of their products. In a typical example, Celebrex, Cerebyx, Celexa, and Zyprexa sometimes get confused. (Respectively, they're used to treat arthritis, seizures, depression, and psychosis.) According to WebMD: "Bruce Lambert, an assistant professor of pharmacy administration at the University of Illinois at Chicago, says there are 100,000 potential pairings of drug names that could be confused."

20090818

36. PRESCRIPTION DRUGS KILL OVER 100,000 ANNUALLY

Even higher than the number of people who die from medication errors is the number of people who die from medication, period. Even when a prescription drug is dispensed properly, there's no guarantee it won't end up killing you.
A remarkable study in the Journal of the American Medical Association revealed that prescription drugs kill around 106,000 people in the US every year, which ranks prescription drugs as the fourth leading cause of death. Furthermore, each year sees 2,216,000 serious adverse drug reactions (defined as "those that required hospitalization, were permanently disabling, or resulted in death").
The authors of this 1998 study performed a meta-analysis on 39 previous studies covering 32 years. They factored out such things as medication errors, abuse of prescription drugs, and adverse reactions not considered serious. Plus, the study involved only people who had either been hospitalized due to drug reactions or who experienced reactions while in the hospital. People who died immediately (and, thus, never went to the hospital) and those whose deaths weren't realized to be due to prescription drugs were not included, so the true figure is probably higher.
Four years later, another study in the JAMA warned:

Patient exposure to new drugs with unknown toxic effects may be extensive. Nearly 20 million patients in the United States took at least 1 of the 5 drugs withdrawn from the market between September 1997 and September 1998. Three of these 5 drugs were new, having been on the market for less than 2 years. Seven drugs approved since 1993 and subsequently withdrawn from the market have been reported as possibly contributing to 1002 deaths.

Examining warnings added to drug labels through the years, the study's authors found that of the new chemical entities approved from 1975 to 1999, 10 percent "acquired a new black box warning or were withdrawn from the market" by 2000. Using some kind of high-falutin' statistical process, they estimate that the "probability of a new drug acquiring black box warnings or being withdrawn from the market over 25 years was 20%."
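
That high-falutin' process is, in all likelihood, a survival analysis of the Kaplan-Meier variety, which also explains how a 25-year probability of 20 percent can come out of a raw event rate of 10 percent: recently approved drugs simply haven't had 25 years in which to earn their warnings. A toy version with invented follow-up data:

# (years_on_market, got_black_box_or_withdrawn) for ten invented drugs
drugs = [(25, True), (25, False), (25, False), (25, False),
         (10, True), (10, False), (10, False),
         (3, False), (3, False), (2, True)]

# Raw rate: 3 of 10 drugs had an event. But the drugs followed only a
# few years haven't had time to accumulate risk, so Kaplan-Meier
# multiplies survival across event times, counting only the drugs
# still under observation ("at risk") at each one.
surv = 1.0
for t in sorted({t for t, event in drugs if event}):
    at_risk = sum(1 for years, _ in drugs if years >= t)
    events = sum(1 for years, event in drugs if event and years == t)
    surv *= 1 - events / at_risk
print(f"raw rate: 30%, Kaplan-Meier estimate by year 25: {1 - surv:.0%}")

If the JAMA data behave anything like this sketch, the drugs watched for only a few years dilute the raw percentage without saying much about year 25, which is why the lifetime estimate runs higher.
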
A statement released by one of the study's coauthors — Sidney Wolfe, MD, Director of Public Citizen's Health Studies Group — warned:

In 1997, 39 new drugs were approved by the FDA. As of now [May 2002], five of them (Rezulin, Posicor, Duract, Raxar and Baycol) have been taken off the market and an additional two (Trovan, an antibiotic and Orgaran, an anticoagulant) have had new box warnings. Thus, seven drugs approved that year (18% of the 39 drugs approved) have already been withdrawn or had a black box warning in just four years after approval. Based on our study, 20% of drugs will be withdrawn or have a black box warning within 25 years of coming on the market. The drugs approved in 1997 have already almost "achieved" this in only four years — with 21 years to go.

How does this happen? Before the FDA approves a new drug, it must undergo clinical trials. These trials aren't performed by the FDA, though — they're done by the drug companies themselves. These trials often use relatively few patients, and they usually select patients most likely to react well to the drug. On top of that, the trials are often for a short period of time (weeks), even though real-world users may be on a drug for months or years at a time. Dr. Wolfe points out that even when adverse effects show up during clinical trials, the drugs are sometimes released anyway, and they end up being taken off the market because of those same adverse effects.

Postmarketing reporting of adverse effects isn't much better. The FDA runs a program to collect reports of problems with drugs, but compliance is voluntary. The generally accepted estimate in the medical community is that a scant 10 percent of individual instances of adverse effects are reported to the FDA, which would mean that the problem is ten times worse than we currently believe.
Drugs aren't released when they've been proven safe; they're released when enough FDA bureaucrats — many of whom have worked for the pharmaceutical companies or will work for them in the future — can be convinced that it's kinda safe. Basically, the use of prescription drugs by the general public can be seen as widespread, long-term clinical trials to determine their true safety. We are all guinea pigs.

37. WORK KILLS MORE PEOPLE THAN WAR

The United Nations' International Labor Organization has revealed some horrifying stats:

The ILO estimates that approximately two million workers lose their lives annually due to occupational injuries and illnesses, with accidents causing at least 350,000 deaths a year. For every fatal accident, there are an estimated 1,000 non-fatal injuries, many of which result in lost earnings, permanent disability and poverty. The death toll at work, much of which is attributable to unsafe working practices, is the equivalent of 5,000 workers dying each day, three persons every minute.

This is more than double the figure for deaths from warfare (650,000 deaths per year). According to the ILO's SafeWork programme, work kills more people than alcohol and drugs together and the resulting loss in Gross Domestic Product is 20 times greater than all official development assistance to the developing countries.
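
The per-day and per-minute figures are straight division from the two-million annual estimate, rounded down conservatively:

deaths_per_year = 2_000_000
per_day = deaths_per_year / 365       # about 5,479
per_minute = per_day / (24 * 60)      # about 3.8
print(f"{per_day:,.0f} deaths per day, {per_minute:.1f} per minute")
# the ILO rounds this down to 5,000 a day and three a minute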


Each year, 6,570 US workers die because of injuries at work, while 60,225 meet their maker due to occupational diseases. (Meanwhile, 13.2 million get hurt, and 1.1 million develop illnesses that don't kill them.) On an average day, two or three workers are fatally shot, two fall to their deaths, one is killed after being smashed by a vehicle, and one is electrocuted. Each year, around 30 workers die of heat stroke, and another 30 expire from carbon monoxide.
Although blue collar workers face a lot of the most obvious dangers, those slaving in offices or stores must contend with toxic air, workplace violence, driving accidents, and (especially for the health-care workers) transmissible diseases. The Occupational Safety and Health Administration warns that poisonous indoor air in nonindustrial workplaces causes "[t]housands of heart disease deaths [and] hundreds of lung cancer deaths" each year.
But hey, everybody has to go sometime, right? And since we spend so much of our lives in the workplace, it's only logical that a lot of deaths happen — or at least are set into motion — on the job. This explanation certainly is true to an extent, but it doesn't excuse all such deaths. The International Labor Organization says that half of workplace fatalities are avoidable. In A Job to Die For, Lisa Cullen writes:

In the workplace, few real accidents occur because the surroundings and operations are known; therefore, hazards can be identified. When harm from those hazards can be foreseen, accidents can be prevented....
Most jobs have expected, known hazards. Working in and near excavations, for example, poses the obvious risks of death or injury from cave-in.... When trenches or excavations collapse because soil was piled right up to the edge, there is little room to claim it was an accident.

38. THE SUICIDE RATE IS HIGHEST AMONG THE ELDERLY

If you judge by the media and the public education programs, you might be inclined to think that teenagers and young adults (aged 15 to 24) are the age group most likely to kill themselves. Actually, they have the second-lowest rate of suicide. (The absolute lowest rate is among kids aged 5 to 14; children younger than that are apparently deemed incapable of consciously choosing to end their lives.) It is the elderly, by far, who have the highest rate of suicide. In the US, of every 100,000 people aged 75 to 79, 16.5 kill themselves. For those 80 and over, the rate is 19.43. This compares to a rate of 8.15 per 100,000 for people between the ages 15 and 19, and 12.84 for people aged 20 to 24.
As with every age group, men are far more likely to kill themselves, but among the elderly this trend reaches extreme proportions. Of people 65 and older, men comprise a staggering 84 percent of suicides.
Because men commit the vast majority of hara-kiri among old people, looking at these male suicide rates makes for extremely depressing reading. For guys aged 75 to 79, the suicide rate is 34.26 per 100,000. In the 80 to 84 group, men's suicide rate is 44.12. When you look at men 85 and older, the suicide rate is a heart-breaking 54.52. Compare this to the suicide rate for dudes in their mid to late teens: 13.22 per 100,000.
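
Expressed as multiples of that teenage rate, the numbers quoted above climb steadily; this is simple division on the figures in this paragraph:

teen_male_rate = 13.22  # suicides per 100,000 males, mid-to-late teens
for label, rate in (("men 75-79", 34.26),
                    ("men 80-84", 44.12),
                    ("men 85 and older", 54.52)):
    print(f"{label}: {rate / teen_male_rate:.1f}x the teen male rate")
# prints 2.6x, 3.3x, and 4.1x
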
It is true that suicide ranks as the second or third most common cause of death in young people (depending on age group), while it ranks fifteenth or lower for various groups of the elderly. Still, the young account for a share of suicides roughly equal to their share of the population, while the elderly are far overrepresented. And old people are cut down by a great many diseases and disorders virtually unknown to the young, which naturally pushes suicide down in the rankings.
The reasons why this suicide epidemic is ignored are highly speculative and would be too lengthy to get into here. However, we can rule out one seemingly likely explanation — suicide among the aged is invisible because they usually O.D. on prescription drugs or kill themselves in other ways that could easily be mistaken for natural death in someone of advanced years. This doesn't wash, primarily because guns are the most common method of dispatch. Of suicides over 65, men used a gun 79.5 percent of the time, while women shot themselves 37 percent of the time. It's hard to mistake that for natural causes.
The sky-high suicide rate among the elderly applies to the entire world, not just the US. Plotted in a graph, suicide rates by age group around the globe gently curve upward as age increases. When the graph reaches the final age group, the line suddenly spikes, especially for men.
Worldwide, men 75 and over have a suicide rate of 55.7 per 100,000, while women in the same age group have a rate of 18.8. This rate for old men is almost three times the global rate for guys aged 15 to 24, while the rate for old women is well over three times the rate for young gals in that age group.