Age Eleven: Never Again
Abdullah bin Abdulaziz Al Saud (1924–2015) was the King of Saudi Arabia from 2005 until his death in 2015. He was the son of Ibn Saud, the founder (and namesake) of modern Saudi Arabia. He was named crown prince when his brother Fahd became king in 1982. When King Fahd suffered a debilitating stroke in 1995, Abdullah became the de facto ruler and then formally became king when Fahd died a decade later.
Although Saudi Arabia ranks very low on the democracy-authoritarianism and human rights scales (see entry for Saudi Arabia), King Abdullah was a relatively moderating influence. Late in his reign he gave women the right to vote for municipal councils and to compete in the Olympics. He encouraged high-quality, Western-style education for at least some citizens, both men and women. Although he maintained Sharia law (a strict legal system based on Islamic principles), he introduced professional training for judges and a review system for judicial decisions. He also took steps to moderate the conduct of the Mutaween, the religious police who enforce public dress codes, attendance at prayer, and other behaviors.
He fostered an entrepreneurial sector by reforming what had been a stifling process for starting new businesses. He founded King Abdullah University of Science and Technology, with a relatively modern curriculum and equal opportunity for men and women. He also modernized the education system at all grade levels to prepare the population for STEM (science, technology, engineering, and math) fields.
Abdullah maintained a positive political and military relationship with the United States. The US needs Saudi Arabian oil and Saudi Arabia’s moderating influence in OPEC (the Organization of Petroleum Exporting Countries, which influences world oil prices), while Saudi Arabia in turn needs US military and intelligence support to survive in a volatile region of the world.
Abdullah married about 30 times and had more than 30 children. His personal fortune was estimated at $18 billion, although there is no clear division between assets of the royal family and the assets of the nation.
Abdullah took steps towards interfaith dialogue. He was the first Saudi Arabian leader to meet with the Pope (Benedict XVI) and led a number of interfaith meetings calling for tolerance in interfaith relations.
In the alternative reality of Danielle: Chronicles of a Superheroine, King Abdullah is a fan of eleven-year-old Danielle and asks for her guidance to move his nation toward more modern values, including equality of men and women.
Saudi Arabia is the second biggest Arab nation by land mass. It is adjacent to eight other Arab nations and has coasts on both the Red Sea and the Persian Gulf. It has a population of about 20 million Saudi citizens plus eight million foreigners who service the country’s leadership and its oil industry. Most of its land consists of barren desert.
It houses the two holiest places in Islam: Al-Masjid al-Haram in Mecca, and Al-Masjid an-Nabawi in Medina. Founded by the nation’s namesake Ibn Saud in 1932, it is an absolute hereditary monarchy. All Saudi citizens are required to be Muslims and the primary form of Islam practiced in the country is Wahhabism, also called Salafism, a very strict form of Islam.
As of 2015, Saudi Arabia was the world’s largest oil producer. However, the United States has increasingly perfected the extraction of oil from shale, rock containing very low concentrations of oil and natural gas. This has led to a dramatic increase in American oil production, a substantial reduction in oil prices, and the United States overtaking Saudi Arabia in overall oil production in 2016. The Saudi economy is almost exclusively based on oil production, although under King Abdullah (see entry for King Abdullah), major investments were made in preparing the populace for careers in STEM (science, technology, engineering, and mathematics) fields. Saudi Arabia is the only Arab nation among the 20 largest economies in the world, a position based on its fossil fuel exports.
Freedom House, founded in 1941, is a nongovernmental research organization, largely funded by the US government, that evaluates democracy, political freedom, and human rights. It gives Saudi Arabia its lowest rating, “Not Free.” The Economist ranks the Saudi government as the fifth most authoritarian of the 167 governments it evaluates.
There were attempts by King Abdullah to reform the authoritarian form of Saudi government including encouragement of public policy debates, municipal elections, the appointment of the first woman to a ministerial post, and other changes. Abdullah transferred control of girls’ education from the Ulema (an organization of Islamic religious leaders) to the Ministry of Education. Critics have complained that these modernizations have been too slow and ineffective.
Capital punishment, usually by beheading, is carried out at a high rate (345 times between 2007 and 2010) for such offenses as murder, robbery, drug use, apostasy (the abandonment or renunciation of Islam), adultery, homosexuality, witchcraft, and sorcery. This record has been condemned by Amnesty International, Human Rights Watch, and other human rights organizations.
Saudi Arabia has been a long-term political and military ally of the United States, which has inspired terrorist groups such as al-Qaeda to oppose the Saudi government and its leaders.
In the alternative reality of Danielle: Chronicles of a Superheroine, King Abdullah seeks Danielle’s help to accelerate the modernization and reform of Saudi Arabia with varying levels of success.
A sovereign wealth fund (SWF) is a financial fund that manages the accumulated wealth of a nation. These funds are generally very diversified with investments in many types of assets including stocks, bonds, real estate, commodities, and other funds such as private equity funds, hedge funds, and venture capital funds. At the end of 2014, sovereign wealth funds totaled $7.11 trillion worldwide, of which $4.29 trillion was from countries whose wealth came primarily from oil and gas exports.
In the alternative reality of Danielle: Chronicles of a Superheroine, Danielle discusses with King Abdullah the disposition of Saudi Arabia’s sovereign wealth fund, which at the end of 2014 totaled $672 billion. As part of its economic and strategic alliance with the United States, the Saudi SWF invests most of its money in US Treasury bonds which are safe but pay very low interest rates. Other Arab SWFs generally invest in higher-risk and higher-yield securities. With the collapse of oil prices in 2015, Saudi Arabia has been running a budget deficit. One of its principal investors, Prince Alwaleed bin Talal, called for a change in investment strategy toward higher yield investments that he argues would cover most of the budget deficit. However, Finance Minister Ibrahim Alassaf has rejected this advice.
The word Madrassa simply means “school of learning” in Arabic. However, the term usually refers to Islamic education institutions.
The Madrassa schools that Danielle discusses with Saudi King Abdullah are those funded by Saudi Arabia that teach Wahhabism, the dominant faith in Saudi Arabia and an austere and severe form of Islam. Wahhabis believe that it is not sufficient for a person to simply practice Islam, but that anyone who does not practice the Wahhabist form of Islam should be regarded as an enemy. Many Wahhabists believe such heathens should be destroyed.
Wahhabist schools represent the primary source of education in Saudi Arabia and since the 1970s Saudi Arabia and affiliated charities have promoted and supported Wahhabist Madrassa schools across the world, including in the United States. Critics estimate that the level of Saudi funding for Madrassa schools outside Saudi Arabia over the past twenty years has been over $100 billion.
A major part of the curriculum in many Madrassa schools consists of rote reading of the Qur’ān (Koran), the holy book of Islam. The belief is that simply reading the words of the Qur’ān, even without understanding them, will impart wisdom. This reflects, in part, Islam’s emphasis on preserving the exact words of the Qur’ān without modification since its inception, which contrasts with the Jewish and Christian scriptures, which have been repeatedly translated into modern languages.
Radical Islamic terrorist groups such as al-Qaeda, the Taliban, and ISIS have their roots in the Wahhabist faith and many of their leaders and members have been educated in Wahhabist Madrassa schools. Bin Laden appears to have been educated in schools organized by the Muslim Brotherhood, a radical Islamic group in Egypt, although Saudi King Faisal gave support to the Muslim Brotherhood and recruited their teachers for the Saudi Madrassa schools. Although the Saudi government claims to be opposed to al-Qaeda, ISIS, and affiliated terrorist groups, critics say that their opposition is tepid and that they continue to provide major funding for the Madrassa schools that support the ideology of these groups.
In the alternative reality of Danielle: Chronicles of a Superheroine, Danielle encourages King Abdullah to reform the Saudi supported Madrassa schools to provide a more diverse and tolerant education.
Note: Parental Guidance is suggested for children under 13 for this entry.
In 1997 the World Health Organization defined Female Genital Mutilation (FGM) as the “partial or total removal of the external female genitalia or other injury to the female genital organs for non-medical reasons.”
These “non-medical reasons” include religious (primarily Islamic), cultural, and aesthetic traditions that go back hundreds, and in some cases thousands of years, and remain very active in today’s world. It is often carried out as part of a girl’s coming of age ceremony but may be done as early as infancy or after puberty.
FGM is an extremely painful procedure and is typically performed in unsterile conditions using razor blades, knives, pieces of glass, scissors and/or sharp stones, and with the girl forcibly restrained.
The procedure differs somewhat from country to country depending on the cultural traditions governing the FGM ritual. It generally involves removal of the clitoral hood and clitoral glans and removal of the inner labia. To bind the resulting wounds, such materials as agave or acacia thorns are typically used instead of surgical thread.
About 20 percent of FGM procedures are of the most severe form, known as infibulation, which includes removal of the inner and outer labia and closing of the vulva with a small hole left open for the passage of urine and menstrual fluid. In this case, the vagina is subsequently opened with another painful procedure for intercourse and for childbirth.
Somali poet Dahabo Musa describes the “three feminine sorrows” associated with infibulation (the original FGM procedure, the wedding night, and childbirth) in a 1988 poem (See femaleintegrity.se/poem.htm).
Effects may include swelling, excessive bleeding, chronic pain, infections, anemia, tetanus, gangrene, damage to surrounding organs, necrotizing fasciitis (flesh-eating disease), and others. The crude instruments that are typically reused between different girls may aid in the transmission of a variety of diseases, including hepatitis B, hepatitis C, and possibly HIV.
Numerous studies have shown that FGM is extremely harmful to women’s physical and emotional health throughout their lives. Long-term effects include chronic infections, pain during menstruation and sexual intercourse, HIV, and many others.
UNICEF estimated that in 2014 there were 133 million girls and women who had undergone FGM earlier in their lives with three million girls experiencing it each year. The 2008 Demographic Health Survey found that 91 percent of women aged 15 to 49 in Egypt had undergone FGM. FGM is prevalent in Saudi Arabia although exact numbers are not available.
The numbers are significant even in western countries due to large diaspora populations from the cultures that practice it. An unpublished 2015 study by the Centers for Disease Control estimated that a half million women and girls in the United States had undergone FGM or were likely to have the procedure. The European Parliament estimated in 2009 that half a million women in Europe had undergone it.
Although the practice is generally carried out by women, their motivation is often to spare the girls from social exclusion by men who will otherwise reject and ostracize them.
There has been an international movement to end FGM since the 1970s. In 2003, the United Nations designated every February 6th as an “International Day of Zero Tolerance to Female Genital Mutilation.” In 2012 the United Nations General Assembly voted unanimously to recognize FGM as a violation of human rights. As of 2015, the practice is nominally outlawed (or in some cases merely restricted) in most countries. However, these laws are rarely enforced and have had limited effect on reducing the practice. The incidence of FGM each year as a percentage of the population has declined slightly in recent decades, but the total number of new procedures each year is actually climbing, because population growth is outpacing the small percentage reduction in the practice.
Remarkably, there has been a reaction against the FGM eradication movement led by anthropologists who accuse the anti-FGM movement of cultural colonialism for placing its cultural ideas above the centuries-old traditions of the cultures that practice FGM. For example, Sylvia Tamale, a Ugandan law professor, writes that western opposition to FGM stems from “a Judeo-Christian judgment that African sexual and family practices are primitive and require correction.” She goes on to write that African feminists “do not condone the negative aspects of the practice but take strong exception to the imperialist, racist, and dehumanizing infantilization of African women.” Even those who oppose FGM commonly encourage a high degree of sensitivity to the cultures that practice it. This pushback by factions of anthropologists has largely succeeded in quieting the anti-FGM movement.
On a personal note, I saw the wildly successful musical The Book of Mormon, written by Trey Parker and Matt Stone, creators of the animated comedy South Park, along with Robert Lopez, co-composer and co-lyricist of Avenue Q and Frozen. I did find some of the intentionally offensive, irreverent and politically incorrect dialogue and lyrics humorous, but the musical lost me in its treatment of FGM. In one of the songs, “Hasa Diga Eebowai” (which translated literally into English from Ugandan means “F*** you God!”), the cast enthusiastically sings “All the young girls here get circumcised! Their clits get cut right off! Dayyyyyo!!” The musical has been praised by some for bringing the issue of FGM to a wider audience, but in my view this effort was unsuccessful judging by the unrestrained laughter that these lyrics (and other lyrics about FGM) engendered. I do think it is possible to create humor from serious issues, but my sense in this case was that the audience was laughing as a result of their uncorrected ignorance of how severe an issue FGM is in the real world.
In the alternative reality of Danielle: Chronicles of a Superheroine, in her speech to the United Nations General Assembly (having been invited by King Abdullah to take the Saudi Arabian speaking spot), eleven-year-old Danielle begins by citing a reason that people do not speak about FGM other than the multiculturalist pushback from anthropologists and some feminists. She describes FGM as an egregious form of female and child abuse and the fight to end FGM as a matter of fundamental human rights. She decries the failure of the feminist movement to give a priority to ending it. Danielle points out that there is essentially no public discussion of this issue despite the fact that millions of girls undergo this degrading and barbaric procedure each year.
She also condemns the phrase “female circumcision” saying that this is not a benign synonym for FGM but rather a “grotesque and inexcusable distortion of language.” She goes on to say that “The proper analogy to male anatomy would be female castration.”
Cholera is a bacterial infection caused by the bacterium Vibrio cholerae. If treated appropriately, it can be readily cured with antibiotics, electrolyte rebalancing, and oral rehydration, but in poor areas without basic healthcare, its effects can be severe with death rates over 50 percent. With proper treatment, death rates can be held to less than one percent.
Cholera is spread by infected seafood, water and food contaminated with human feces, and poor sanitation. Prevention efforts generally focus on establishing proper sanitation.
World Health Organization figures indicate that there are 1.3 to 4.0 million cases of cholera each year, with 21,000 to 143,000 deaths each year from the disease. This is a big improvement compared to the 1980s, when cholera deaths were estimated at over three million per year.
The worst epidemic of cholera in recent history was in Haiti after the devastating earthquake there in 2010. The epidemic started at a United Nations peacekeepers’ base and quickly spread to all regions. By 2015, over 700,000 Haitians had become ill with cholera. Due to intensive treatment efforts by Haitian and international health organizations, the death rate was held to fairly low levels, with an estimated 9,000 deaths.
In the alternative reality of Danielle: Chronicles of a Superheroine, after losing her Mum in the Haitian earthquake, Danielle’s sister Claire becomes heavily involved in successful efforts to combat cholera in Haiti.
The General Assembly (GA) is the one organ of the United Nations in which all member nations participate. Its one-vote-per-nation rule means that nations representing as little as five percent of the world’s population can pass a resolution by a two-thirds margin.
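The two-thirds arithmetic can be illustrated with a quick calculation (a minimal sketch: the 193-member figure appears later in this entry, and the five-percent population claim is as stated above, not computed here):

```python
import math

# With 193 member states and one vote per nation, a two-thirds
# majority requires:
members = 193
votes_needed = math.ceil(members * 2 / 3)  # 129 nations

# Because votes are counted per nation rather than per capita, those
# 129 votes could in principle come from the least populous members,
# which collectively hold only a small share of world population
# (about five percent, per the text).
print(votes_needed)  # → 129
```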
The General Assembly has little formal power in that its resolutions are not binding on its members. Binding resolutions on significant security issues must be passed by the Security Council (which has five permanent members who wield a veto and 10 non-permanent members). The General Assembly is influential, however, in that speeches to the General Assembly receive worldwide attention, and the GA is often a focal point for diplomacy on a wide range of world issues.
The most important formal function of the GA is that it appoints the non-permanent members of the Security Council and is responsible for the overall budget of the United Nations.
The first session of the General Assembly was in 1946 and included 51 nations. Today there are 193 nations represented.
In the alternative reality of Danielle: Chronicles of a Superheroine, the eleven-year-old Danielle addresses the General Assembly, taking the Saudi Arabian speaking spot at the invitation of King Abdullah. Her speech, focusing on Female Genital Mutilation (see the entry for Female Genital Mutilation) and the rights of girls, results in controversy and a backlash.
“Female circumcision” is a phrase often used to refer to Female Genital Mutilation.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle condemns the phrase female circumcision in her speech to the United Nations General Assembly as a “grotesque and inexcusable distortion of language.”
See entry for Female genital mutilation.
Realpolitik is a term for politics, diplomacy, and strategic thinking based primarily on considerations of military and financial power rather than moral, ethical, or ideological factors. This approach can be considered to be simply pragmatic, or more pejoratively coercive, intimidating, and amoral.
The word comes from Grundsätze der Realpolitik angewendet auf die staatlichen Zustände Deutschlands, an 1853 book by German writer and politician Ludwig von Rochau, who wrote:
The study of the powers that shape, maintain and alter the state is the basis of all political insight and leads to the understanding that the law of power governs the world of states just as the law of gravity governs the physical world.
In the alternative reality of Danielle: Chronicles of a Superheroine, Claire comments on Danielle’s relationship to considerations of Realpolitik in her interactions with world issues.
The National Organization for Women (NOW) is a prominent US based feminist organization founded in 1966.
The precursors to the founding of NOW include the President’s Commission on the Status of Women, organized by President Kennedy in 1961 with Eleanor Roosevelt as the head. This was followed by the publication of American author and activist Betty Friedan’s (1921–2006) highly influential book The Feminine Mystique in 1963. The Civil Rights Act, which prohibited sexual discrimination, was passed in 1964, but quickly led to frustration by feminists due to the lack of enforcement of its gender provisions.
In an interview after publishing The Feminine Mystique, Betty Friedan said
… I realized that it was not enough just to write a book. There had to be social change. And I remember somewhere in that period coming off an airplane [and] some guy was carrying a sign … It said, “The first step in revolution is consciousness.” Well, I did the consciousness with The Feminine Mystique. But then there had to be organization and there had to be a movement.
Friedan became the cofounder and first president of NOW at its organizing conference in Washington, DC on October 29, 1966. The first paragraph of the NOW Statement of Purpose, which Friedan drafted on a napkin, reads:
We, men and women, who hereby constitute ourselves as the National Organization for Women, believe that the time has come for a new movement toward true equality for all women in America, and toward a fully equal partnership of the sexes, as part of the worldwide revolution of human rights now taking place within and beyond our national borders.
My family has been involved with the feminist movement going back to 1868, when Regina Stern, my mother’s mother’s mother, founded the Stern Schule in Vienna. It was the first school in Europe to provide higher education for girls. If a girl was lucky enough to get an education at all in mid-nineteenth-century Europe, it went only through ninth grade. The Stern Schule went through fourteenth grade. Her daughter, my grandmother Lillian Bader, became the first woman in Europe to be awarded a PhD in chemistry and subsequently took over the school (see entry for The Stern Schule).
In the alternative reality of Danielle: Chronicles of a Superheroine, in her speech to the United Nations General Assembly, the eleven-year-old Danielle criticizes NOW and other American feminist organizations for their lack of emphasis and priority on ending Female Genital Mutilation (see entry for Female genital mutilation).
China’s one-child policy refers to a set of laws introduced between 1978 and 1980 that made it illegal in most circumstances for Chinese parents to have more than one child. There were exceptions for certain minority groups, and, in later years, some parents were allowed to have a second child if their first child was a girl.
The early history of communist China saw the population almost double. Under Mao Tse-tung, infant mortality declined from 227 of every 1,000 births (nearly a quarter) to 53 per 1,000 in 1981, life expectancy increased from 35 to 66, and the government encouraged large families because it believed this would make the nation stronger. As a result, China’s population went from 540 million in 1949 to 940 million in 1976.
In the late 1970s, western writers and organizations such as the Club of Rome and the Sierra Club were predicting a worldwide catastrophe caused by overpopulation. The thinking of Chinese leaders was heavily influenced by these views, especially the ideas contained in two popular books, The Limits to Growth by Dennis Meadows, Donella Meadows, Jørgen Randers, and William W. Behrens III, and A Blueprint for Survival by Edward Goldsmith, which were widely read by the Chinese leadership. The one child policy was adopted by the Chinese Communist Party in 1979.
The policy has been heavily criticized for the resulting forced sterilizations, forced abortions, and strict control over birth permits. The biggest controversy concerns the fate of tens of millions of “missing” girls. The ratio of boys to girls at birth in China has been around 117 to 100 since the one-child policy was instituted. According to China’s National Population and Family Planning Commission, there will be 30 million more men than women in China by 2020, which will lead to problems of social instability. In a report by the Canadian Broadcasting Corporation, “the [Chinese] government … has expressed concern about the tens of millions of young men who won’t be able to find brides and may turn to kidnapping women, sex-trafficking, other forms of crime or social unrest.”
Although sex-specific abortion, child abandonment, and infanticide are illegal in China, the US State Department, the British Parliament, and the human rights organization Amnesty International have each claimed that China’s one child policy has contributed to infanticide. Amartya Sen, quoted in 2013 in the Georgetown Journal’s Guide to the One-Child Policy said, “The one-child policy has led to the ‘Missing Women,’ or the 100 million girls ‘missing’ from the populations of China and other developing countries as a result of female infanticide, abandonment and neglect.” University of California at Davis scientist G. William Skinner and researcher Yuan Jianhua have written that infanticide occurred fairly frequently in China before the 1990s.
American social scientist and China expert Steven W. Mosher wrote in his 1993 book A Mother’s Ordeal, that it was not uncommon for Chinese women to have their children killed right before or after birth.
In Mosher’s 2008 book, Population Control, Real Costs, Illusory Benefits, Mosher summarizes that
… those who argue for [overpopulation’s] premises conjure up images of poverty—low incomes, poor health, unemployment, malnutrition, overcrowded housing to justify antinatal programs. The irony is that such policies have in many ways caused what they predicted—a world which is poorer materially, less diverse culturally, less advanced economically, and plagued by disease. The population controllers have not only studiously ignored mounting evidence of their multiple failures, they have avoided the biggest story of them all. Fertility rates are in free fall around the globe.
About half of the “missing women” in China in the 1980s and 1990s are accounted for by the common practice of Chinese female babies and young children being adopted by parents in other countries. Another common phenomenon, called “heihaizi” (meaning “black child”), refers to children (mostly girls) born outside of the one-child policy who are not registered with the Chinese National Household Registration System. These children do not legally exist and thus are not eligible for such services as education, health care, and protection under the law.
Recent research studies indicate that the abandonment and killing of baby girls has become rare compared to earlier periods of time due to stricter enforcement of laws against the practice and increasing numbers of exceptions to the one-child policy.
In 2015, the Chinese government announced the end of the one-child policy, replacing it with a two-child policy. Parents still need to obtain a license for each child. The government offered no reasons for the change and claimed that the one-child policy was successful, having prevented 400 million births. The validity of this claim has been widely disputed. For example, Mei Fong, a former Wall Street Journal journalist and Pulitzer Prize winner, writes in her 2015 book, One Child: The Past and Future of China’s Most Radical Experiment:
The reason China is doing this right now is because they have too many men, too many old people, and too few young people. They have this huge crushing demographic crisis as a result of the one-child policy. And if people don’t start having more children, they’re going to have a vastly diminished workforce to support a huge aging population.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle criticizes China’s one-child policy as having been a disaster for girls. Claire debates Danielle’s message with the president of the National Organization for Women.
Benjamin Netanyahu (born in 1949), nicknamed “Bibi,” was elected as the 9th and 13th prime minister of Israel and is serving as prime minister as of 2015. He has been elected prime minister four times, matching the record of David Ben-Gurion, Israel’s first prime minister.
Between 1967 and 1972, Netanyahu served in the Israeli military, became a team leader of a special forces unit, and was injured in 1972.
He attended the Massachusetts Institute of Technology in the 1970s and received Bachelor of Science and Master of Science degrees. He was in an MIT class that I lectured to in the 1970s on emerging technologies.
After graduating from MIT, he was a consultant for the Boston Consulting Group. In 1978 he returned to Israel where he founded the Yonatan Netanyahu Anti-Terror Institute. It was named after his brother, Yonatan Netanyahu, who, in 1976, led Operation Entebbe, in which 100 Israeli Commandos carried out a successful and daring rescue in Uganda to free 102 mostly Israeli hostages. Yonatan Netanyahu was the only Israeli death resulting from the mission.
Benjamin Netanyahu became the leader of the Likud party, the major center-right secular party in Israel, and was elected prime minister in 1996 serving for three years.
He became the Foreign Affairs Minister and Finance Minister in Ariel Sharon’s Likud led government between 2002 and 2005. As Minister of Finance, he led a major reform effort away from Israel’s (democratic) socialist orientation and towards the creation of a robust entrepreneurial sector. He was widely credited with having dramatically boosted the Israeli economy which cemented his leadership of the conservative political and economic factions in Israel.
In 2005 Sharon left Likud to form Kadima, a new center-right party, and Netanyahu again became the leader of the Likud party. In 2009 he again was elected prime minister and has served in that position since that time.
Part of his coalition includes smaller parties pressing for expanded settlements in the West Bank, and his concessions to these parties have created tension with the United States and Europe. He has taken the position that in any peace settlement with the Palestinians, Israeli settlers in the West Bank should be allowed to stay in their settlements under Palestinian rule.
Netanyahu’s second and third terms as prime minister have been characterized by unwavering opposition to Iran obtaining a nuclear weapon. In speeches in 2009, he said, “Iran is seeking to obtain a nuclear weapon and constitutes the gravest threat to our existence since the war of independence.” He added, “The Iranian regime is motivated by fanaticism … They want to see us go back to medieval times. The struggle against Iran pits civilization against barbarism. This Iranian regime is fueled by extreme fundamentalism.”
The Obama administration’s negotiation of an agreement with Iran to place restrictions on Iran’s nuclear program, in exchange for relief from western sanctions estimated at $150 billion, has led to significant tension between the Netanyahu and Obama governments. A poll of the Israeli public in late 2014 showed that the overwhelming majority of Israelis believe the relationship between Israel and the US has been hurt by the poor personal relationship between Obama and Netanyahu.
In 2009, I met with Prime Minister Netanyahu in his office after having engaged in an onstage dialogue with Israeli President Shimon Peres at the Israeli Presidential Conference in Jerusalem. Netanyahu remembered having attended my class in the 1970s. I reported to him on a study that Larry Page, cofounder of Google, and I had conducted on emerging energy technologies for the US National Academy of Engineering. Our study had concluded that the number of watts of solar energy produced worldwide was doubling every two years (and had been for twenty-five years) and was only eight more doublings from meeting all of the world’s energy needs. Netanyahu asked, “But do we have enough sunlight to do this with?” I responded that we have 10,000 times more than we need. In other words, after doubling eight more times and meeting all of the world’s energy needs (in less than twenty years), we would be using only one part in 10,000 of the free sunlight that falls on the Earth. Based on our discussion, he announced at the Israeli Presidential Conference the next day the organization of an Israeli institute to accelerate the use of solar energy and other emerging renewable energy sources.
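The doubling argument in that conversation can be checked with back-of-the-envelope arithmetic (a sketch using only the figures stated above: doubling every two years, eight doublings remaining, and a 10,000-fold surplus of sunlight):

```python
# Figures from the text, not independently sourced:
doublings_remaining = 8     # doublings left to meet all world energy needs
years_per_doubling = 2      # solar output has been doubling every 2 years

# Eight doublings is a 2**8 = 256-fold increase in solar output,
# reached in 8 * 2 = 16 years ("less than twenty years").
growth_factor = 2 ** doublings_remaining
years_needed = doublings_remaining * years_per_doubling

# Sunlight falling on the Earth exceeds total human energy demand
# roughly 10,000-fold, so meeting all demand would still use only
# one part in 10,000 of available sunlight.
fraction_of_sunlight_used = 1 / 10_000

print(growth_factor, years_needed, fraction_of_sunlight_used)  # → 256 16 0.0001
```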
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle meets with Prime Minister Netanyahu in his office. This is their first conversation:
“What is a nice Jewish girl from Los Angeles doing running Saudi Arabia?” he asked her.
“I’m not exactly running it,” she replied.
“Looks like that to me.”
“Yeah, well what is a nice engineer from MIT doing running Israel?”
“Ha, have you seen my coalition? You have more influence on Saudi Arabia than I do on Israel.”
Danielle continues to negotiate with Netanyahu as part of her effort to promote her peace plan for the Middle East. She agrees with Bibi that Israeli settlers should be allowed to stay in the West Bank under Palestinian rule.
Yad Vashem, Israel’s official memorial to the Holocaust and its victims, was opened in 1953, five years after the founding of the State of Israel. It is located on the Mount of Remembrance in Jerusalem. The primary feature of Yad Vashem is the Holocaust History Museum, but it also houses memorial sites including the Children’s Memorial, the Hall of Remembrance, the Museum of Holocaust Art, and an educational center called the International Institute for Holocaust Studies.
About one million people visit Yad Vashem each year, making it second in popularity only to the Western Wall, a small surviving section of the western retaining wall of the Second Jewish Temple and the holiest site in Judaism. The Western Wall sits on a hill known as the Temple Mount, which is holy to Jews, Christians, and Muslims.
A key objective of Yad Vashem has been to honor gentiles who, “at great personal risk” and without a selfish motive, saved Jewish lives from the Holocaust genocide.
These heroes are honored in a section of Yad Vashem called the Garden of the Righteous Among the Nations.
Yad Vashem seeks to maintain the memory and history of the six million Jews murdered in the Holocaust and holds ceremonies of remembrance, supports research on Holocaust history, and preserves a vast number of memoirs, documents, photographs, artifacts, and diaries.
The original museum in 1957 focused on uprisings in the Warsaw Ghetto and the Sobibor and Treblinka death camps. In 2005, a new and far more comprehensive museum opened replacing the original, after twelve years of development. It contains 10 linked exhibition halls, each devoted to a chapter of the Holocaust story. Avner Shalev, Yad Vashem curator, puts it this way:
[Yad Vashem] looks into the eyes of the individuals. There weren’t six million victims, there were six million individual murders … The museum serves as an important signpost to all of humankind, a signpost that warns how short the distance is between hatred and murder, between racism and genocide.
I visited the old Yad Vashem museum in 1985 and 1995 and the new museum in 2009 and found each visit to be a deeply informative and moving experience.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle and nineteen-year-old Claire both wear black dresses when they visit Yad Vashem, and Danielle shares her perspective on the Holocaust with a world audience.
The Holocaust refers to the organized Nazi program to systematically exterminate the Jews during World War II. This act of genocide resulted in the killing of six million Jews, which was two thirds of the Jewish population in Europe, plus an additional five to nine million non-Jews.
See entry for Hitler’s Final Solution.
The Eternal Flame at the Yad Vashem Holocaust Memorial contains an inextinguishable flame in what appears to be a broken bronze goblet. The flame provides the light for the Hall of Remembrance. Next to it is a stone crypt with the ashes of Holocaust victims from the death camps, and arrayed around the flame are the names of the concentration and death camps in Hebrew and English.
In the alternative reality of Danielle: Chronicles of a Superheroine, Danielle and Claire place 15 stones by the Eternal Flame commemorating the 15 million victims of the Holocaust.
The Hall of Remembrance at the Yad Vashem Holocaust Memorial, dedicated in 1961, has become the focal point for commemoration and prayer for the victims of the Holocaust.
In the alternative reality of Danielle: Chronicles of a Superheroine, it is at the Hall of Remembrance that the eleven-year-old Danielle shares her thoughts on the Holocaust with a world audience.
Shoah is a biblical word that means “destruction.” As early as the 1940s, it became the Hebrew term to refer to the Holocaust.
The Hall of Names in the Yad Vashem Holocaust Museum is a memorial devoted to each individual Jewish victim. It symbolizes their individuality by displaying 600 individual photographs in a large cone above the hall. There is a lower cone containing water that reflects these images representing victims whose names are unknown.
The Hall of Names also contains an archive with over two million pages of written testimony, over 100,000 testimonials in audio, video, and other forms, as well as a computerized data bank.
In the alternative reality of Danielle: Chronicles of a Superheroine, during her speech to a world audience, the eleven-year-old Danielle shares her reaction to seeing a woman in the Hall of Names who is still grieving for her lost child.
Johanna “Hannah” Arendt (1906–1975) was a German-born American social philosopher although she preferred the description “political theorist.”
In the 1920s, Arendt, while a student at the University of Freiburg, began a romantic relationship with Martin Heidegger, a highly influential (and married) German philosopher and rector at the university. She became troubled by reports that he was giving speeches at meetings of the National Socialist (Nazi) party. She asked him to deny these reports which he declined to do, replying instead that he still had romantic feelings for her. She ended their relationship shortly after that in 1932.
Arendt was denied a professorship in Germany because she was Jewish. She conducted research on anti-Semitism and was arrested by the Nazi Gestapo in 1933, and released after a brief imprisonment. She fled Germany that year for Czechoslovakia, then moved to Geneva, Switzerland, finally settling in Paris, France, where she helped Jewish refugees. After Germany conquered France, she was deported by the Vichy Regime (the government of France after its defeat by the Nazis and widely regarded as a Nazi puppet government) to Camp Gurs in southwest France, an internment camp for “enemy aliens.” She managed to be released after a few weeks, obtained an illegal visa to travel to Portugal, and ended up settling in the United States. After World War II, she returned temporarily to Germany and worked for the Zionist organization Youth Aliya, which saved thousands of children from the Holocaust and helped settle them in the British Mandate of Palestine (a portion of which became Israel in 1948).
She became a naturalized citizen of the United States in 1950 and rekindled her romance with Heidegger for two years. At that time, she defended Heidegger from severe criticism for his role as a supporter of the Nazis during World War II saying that he had been naïve, and that in any event his philosophy was not related to that of the Nazis. Her relationship with and defense of Heidegger earned her considerable criticism.
Arendt’s first major book, The Origins of Totalitarianism, published in 1951, took equal aim at Stalinism and Nazism and was opposed by the American Left for portraying the two movements as moral equivalents.
She was chosen to report on the 1961 trial of Adolf Eichmann for The New Yorker. Eichmann, who has been called the architect of the Holocaust, was captured in Argentina in 1960 by Mossad, Israel’s intelligence service, put on trial in Israel, found guilty of war crimes, and hanged in 1962. Her report was expanded into her most influential book, Eichmann in Jerusalem: A Report on the Banality of Evil, published in 1963.
The primary message of the book is summed up in the last three words of the title. Arendt was examining the nature of human evil and concludes that evil can result from a failure to question the ideas and values that are prevalent in one’s world.
Arendt has been closely associated with this emphasis on the moral importance of critical thinking, of taking personal responsibility for one’s actions and not falling back, as many people do, on the attitudes and blind spots of one’s community. It is a powerful idea, but I would point out that evil is not always banal; that adjective would hardly be an apt description of Hitler.
Arendt ends the book with this passage:
Just as you [Eichmann] supported and carried out a policy of not wanting to share the earth with the Jewish people and the people of a number of other nations—as though you and your superiors had any right to determine who should and who should not inhabit the world—we find that no one, that is, no member of the human race, can be expected to want to share the earth with you. This is the reason, and the only reason, you must hang.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle refers to Arendt in her speech at Yad Vashem, “[Arendt] expected to descend into the bowels of human loathing [when she interviewed Eichmann]. Instead she encountered an ordinary and prosaic bureaucrat whose malevolence resulted from his failure to question the values in his midst. The Shoah resulted at least in part from this failure of critical thinking, from this ‘banality of evil,’ to quote her deservedly famous phrase.”
Otto Adolf Eichmann (1906–1962), a German Nazi lieutenant colonel, was regarded as the leading architect and organizer of the Holocaust, the systematic Nazi program to exterminate the Jews during World War II. This act of genocide resulted in the killing of six million Jews, which was two thirds of the Jewish population in Europe, plus an additional five to nine million non-Jews. (See the entry for Hitler’s Final Solution for a more complete history.)
In 1941, as the Nazis began their invasion of the Soviet Union, the Nazi policy for the Jewish populations under their control began to switch from exile to extermination. To formulate a definitive plan, Reinhard Heydrich (born in 1904, assassinated in 1942 in Prague), one of the highest-ranking Nazi officials during World War II, with the titles of SS-Obergruppenführer, General der Polizei (police), and Chief of the Reich Main Security Office, hosted a meeting called the Wannsee Conference on January 20, 1942, held in a villa in a Berlin suburb. The Nazi regime’s administrative leaders including Eichmann and Heydrich attended. Eichmann was responsible for collecting the background information for Heydrich, preparing the recommendations and writing the minutes. The decision reached at the Wannsee Conference was for the Jews of Europe to be transported to death camps, primarily in Poland. Immediately after the Wannsee Conference, Eichmann was put in charge of implementing the plan. He determined that the primary method of killing would be to gas the victims and then burn their bodies.
When the Nazis were defeated by the Allies in 1945, Eichmann escaped to Austria where he lived for five years, then emigrated to Argentina using forged papers and lived there with a false identity. In 1960, he was captured by Mossad, Israel’s intelligence service, and brought to Israel to stand trial on 15 criminal charges including war crimes and crimes against humanity.
Eichmann’s defense was the same as those used by many of the Nazi defendants accused of war crimes, such as those who stood trial at the Nuremberg trials in 1945–1946: he had no choice, was bound by his oath of loyalty, and was just following orders. Eichmann said the decision to annihilate the Jews was not made by him, but by Müller, Heydrich, Himmler, and ultimately Hitler (Heinrich Müller, known as Gestapo Müller, was Chief of the Gestapo, the secret police of Nazi Germany; Heinrich Himmler had the title Reichsführer and was regarded as second in command to Hitler; Adolf Hitler was the Chancellor of Germany from 1933 to 1945 and Führer, or ultimate leader, of Nazi Germany from 1934 until their defeat in 1945). “I was one of the many horses pulling the wagon,” Eichmann said in his defense, “and I couldn’t escape left or right because of the will of the driver.”
The social philosopher Hannah Arendt reported on Eichmann’s 1961 trial for The New Yorker. She noted his flat affect and apparent ordinary demeanor. He did not appear to be a monster, but rather a banal bureaucrat, hence her famous phrase “the banality of evil,” which became part of the title of her 1963 book on the trial, Eichmann in Jerusalem: A Report on the Banality of Evil.
Eichmann was found guilty of several criminal counts including war crimes and crimes against humanity and was hanged in 1962.
The true nature of Eichmann and his massive crimes has been the subject of considerable debate given its importance in understanding the nature of human evil. Several historians and observers have criticized Arendt’s characterization of Eichmann as a banal bureaucrat, saying this was historically incorrect. German journalist Elke Schmitter wrote in the German magazine Der Spiegel that Eichmann’s “performance in Jerusalem was a successful deception,” and that evidence that came out after the trial showed he was a fanatical anti-Semite. Film historian Saul Austerlitz wrote in a review of Arendt’s book in The New Republic that her “book makes for good philosophy, but shoddy history.” University of Southampton professor of social and political philosophy David Owen wrote that “while Arendt’s thesis concerning the banality of evil is a fundamental insight for moral philosophy, she is almost certainly wrong about Eichmann.”
Arendt defended her description of Eichmann and wrote that she was aware of much of the “new” material that purportedly surfaced after the trial, such as an interview of Eichmann conducted by the Nazi journalist and war criminal Willem Sassen while Eichmann was in hiding in Argentina. Arendt acknowledged that Eichmann was indeed a rabid anti-Semite, but that this was not inconsistent with his being “banal,” by which she meant that he was incapable of original thought, of questioning assumptions. Arendt wrote, “whether writing his memoirs in Argentina or in Jerusalem, he always sounded and spoke the same. The longer one listened to him, the more obvious it became that his inability to speak was closely connected with an inability to think, namely, to think from the standpoint of someone else.”
It is clear that Eichmann’s defense that he was just following orders was not correct. For example, in 1944, Eichmann’s superior Heinrich Himmler, hoping to obtain leniency given his perception that defeat was inevitable, issued orders to “take good care of the Jews, act as their nursemaid.” Eichmann disobeyed these orders and instead organized Jewish death marches. According to Arendt, Eichmann “did his best to make the Final Solution final.” Eichmann was quoted near the end of the war as saying that he “would leap laughing into the grave because the feeling that he had five million people on his conscience would be for him a source of extraordinary satisfaction.”
While in Israeli custody, he was quoted as saying, “To sum it all up, I must say that I regret nothing … We shall meet again. I have believed in God. I obeyed the laws of war and was loyal to my flag.”
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle discusses Hannah Arendt, Adolf Eichmann, and the nature of evil in her speech at Yad Vashem, watched by a world audience.
“Banality of Evil” is a portion of the title of a 1963 book by Hannah Arendt (Eichmann in Jerusalem: A Report on the Banality of Evil). The phrase summarizes Arendt’s philosophy of the importance of critical thinking.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle cites this famous phrase in her speech at Yad Vashem, Israel’s memorial and museum about the Holocaust.
See entry for Hannah Arendt.
According to Raul Hilberg, the noted Holocaust historian, shortly after US forces liberated the Nazi death camp Buchenwald in April of 1945, inmates held up handmade signs with the phrase “Never again.” The truth of this report has not been confirmed, but the phrase has become associated with the Holocaust. It has also been associated with the founding of Israel as a means by which the Jewish people would assure the sentiment expressed by the phrase.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle uses these two words in her speech at Yad Vashem, Israel’s memorial and museum about the Holocaust.
See entry for Hitler’s Final Solution.
For years, Israeli politics was split between the Labor party, which favored liberal economic policies along with support for negotiating with the Palestinians a peace treaty that would give the Palestinians control of the West Bank, and the Likud party, which, in contrast, favored smaller government and lower taxes and was skeptical of calls for Israel to relinquish control of the West Bank.
Ariel Sharon (1928–2014), who had led the assault on the Sinai in Israel’s successful “Six Day War” in 1967 and the encirclement of Egypt’s Third Army in the 1973 Yom Kippur War, was regarded as perhaps the greatest Israeli general, and was widely known by his nickname, “The King of Israel.” As leader of the Likud party, he became Israel’s prime minister in 2001. By 2003, Sharon became concerned that if Israel held onto the West Bank, it would lead either to Israel no longer being a Jewish state (given the size of the Arab population in the West Bank) or an undemocratic state (if Israel decided not to give citizenship and voting rights to the Palestinian population under its control). Based on this reasoning, he decided on a policy of unilateral disengagement from the Palestinian population in the Gaza Strip and West Bank.
In 2003, Prime Minister Sharon proposed that Israel simply vacate the Gaza Strip to be followed by a similar move in the West Bank. The proposal for leaving Gaza was adopted by the Israel government in June of 2004, approved by the Knesset (Israeli parliament) in February of 2005, and was implemented in August of 2005. Some Israeli citizens living in the Gaza Strip accepted payments from the Israeli Government to leave Gaza whereas others were forcibly evicted by the Israeli military. The Likud party under Benjamin Netanyahu criticized the withdrawal saying that the result would be Gaza becoming a threat to Israel.
Indeed, Hamas, which was designated by the US and Europe as a terrorist organization, won the Palestinian elections in both Gaza and the West Bank in 2006, and in 2007, in a military clash with the Fatah-controlled Palestinian government, took over Gaza. This led to Hamas launching missile attacks against Israel from the Gaza Strip and wars between Israel and Hamas-controlled Gaza in 2008 and 2014.
On November 21, 2005, Prime Minister Sharon resigned as head of the Likud party, formed a new party called Kadima (“Forward”) with the stated policy of unilateral Israeli disengagement from the West Bank, dissolved Parliament and called for new elections. Polls indicated that Sharon would win, but in January of 2006 he suffered a debilitating stroke that left him in a coma. Kadima won the election anyway and Ehud Olmert, who had taken over leadership of Kadima, became Prime Minister. Benjamin Netanyahu became leader of Likud.
In the 2009 elections, Kadima, under the leadership of Tzipi Livni, won the most seats (29 out of 120), but was unable to form a governing coalition and Netanyahu and the Likud party returned to power. Shortly after this, I met with Livni, then Israel’s opposition leader, in her office in 2009. She supported Sharon’s original view that Israel could not keep control of the West Bank indefinitely, but also felt that Israel had no responsible negotiating partner in the West Bank.
In 2012, Livni was defeated for continued leadership of Kadima and she formed a new center-left party called Hatnuah. With this splintering of the party, Kadima won only two seats in the Knesset in the 2013 elections, and did not participate in the 2015 elections.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle manages to politically box in both the Kadima and Likud parties with her Middle East peace proposal.
Likud, a major center-right and secular political party in Israel, was founded in 1973 by Ariel Sharon and Menachem Begin, who had been the leader of Irgun, a Zionist militant group which had fought British control in Palestine from 1944 to the establishment of the Israeli state in 1948. Likud, allied with both right-wing and liberal parties, won a landslide victory in 1977 making Begin prime minister.
In 1979, Prime Minister Begin signed a peace treaty with Egypt and shared the Nobel Peace Prize with Anwar Sadat, leading to Israel withdrawing from the Sinai desert. Israel’s peace accord with Egypt has remained in force, although that has largely been a cold peace. Although Likud has traditionally been skeptical of peace agreements with its Arab neighbors, it has been said that it is only right-wing parties such as Likud that can make such peace agreements and deliver Israeli popular support.
Begin’s government was also responsible for the successful destruction of the Osirak nuclear plant in Iraq on June 7, 1981 and is credited with having destroyed Saddam Hussein’s nascent program to build an atomic bomb.
After leading the country for much of the 1980s, Likud lost control of the government in 1992. Likud again became the ruling party under the leadership of Benjamin Netanyahu in 1996. Likud lost again in 1999 but regained control of the government under Ariel Sharon in 2001. In 2003, Prime Minister Sharon changed his position with regard to negotiating with the Arabs and advocated unilateral withdrawal from the Gaza Strip and the West Bank. This was strongly criticized by Netanyahu, and while still prime minister, Sharon left Likud to form the Kadima party.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle uses a unique strategy to promote her Middle East peace plan through the intricacies of Israeli politics.
Palestine may refer to any of the following: (i) a geographic region in the Middle East located between the Mediterranean Sea and the Jordan River, (ii) the geographic area currently comprising Israel, the West Bank, and the Gaza Strip, (iii) a proto-state located on the West Bank and the Gaza Strip recognized by some nations, or (iv) the Palestinian National Authority, a government established in 1994 to govern the West Bank and the Gaza Strip.
The name goes back to the Ancient Greeks, and was also used by the Roman Empire to refer to a region that includes modern Syria and Israel. Over the millennia, it has been controlled variously by the Ancient Egyptians, Persians, Romans, Arabs, Christian Crusaders, and others. The borders of the region have also changed throughout history.
After a series of military campaigns, the British were awarded a mandate to govern Palestine by the League of Nations in 1922. Non-Jewish inhabitants of the region mounted revolts in 1920, 1929, and 1936 but Britain maintained control.
At Britain’s request in 1947, after the conclusion of World War II and the Holocaust, the United Nations General Assembly adopted Resolution 181(II) recommending Palestine be partitioned into an Arab State and a Jewish state with a “Special International Regime” to govern the city of Jerusalem.
The British mandate was scheduled to terminate at midnight at the end of May 14, 1948. David Ben Gurion, the Executive Head of the World Zionist Organization, declared the establishment of the State of Israel to take effect at midnight. Eleven minutes after midnight, the Truman Administration announced de facto recognition by the United States of the new nation which was followed by official recognition by the United States on January 31, 1949. The first nation to officially recognize the State of Israel was the Soviet Union on May 17, 1948. Israel was admitted to membership of the United Nations on May 11, 1949.
On May 15, 1948, hours after the creation of the State of Israel, a coalition of Arab states declared war on Israel and attacked it in what became the first Arab-Israeli war. Many Arabs have referred to the creation of Israel as “Nakba” (the Catastrophe). The war lasted ten months with Israel capturing an additional 26 percent of the original British Mandate territory.
The subsequent period of over half a century has witnessed frequent wars and military conflicts between Israel and its Arab neighbors which continue to the present time.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle presents a unique plan for a Middle East peace settlement concerning Palestine that differs in a significant way from attempted peace plans of the past.
In the 1960s, the fields of computation and communications relied on electronic clocks running at megahertz frequencies, meaning a million pulses or cycles per second. This meant that an operation such as a calculation was measured in millionths of a second.
Around the end of the twentieth century, these clocks were replaced with ones using gigahertz frequency, meaning a billion cycles per second, and calculation speeds began to be measured in billionths of a second.
Scientists are now experimenting with terahertz clocks meaning a trillion (million million) cycles per second. The promise is computation and communication speeds that are thousands of times faster than those from around 2015.
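The relationship between clock frequency and operation time described above is simply the reciprocal: a clock running at f cycles per second completes one cycle in 1/f seconds. A brief illustrative sketch in Python:

```python
# A clock running at f hertz (cycles per second) completes
# one cycle in 1/f seconds.
def cycle_time_seconds(hertz: float) -> float:
    return 1.0 / hertz

MEGAHERTZ = 1e6   # 1960s-era clocks
GIGAHERTZ = 1e9   # clocks from around the end of the twentieth century
TERAHERTZ = 1e12  # experimental clocks

print(cycle_time_seconds(MEGAHERTZ))  # 1e-06: millionths of a second
print(cycle_time_seconds(GIGAHERTZ))  # 1e-09: billionths of a second
print(cycle_time_seconds(TERAHERTZ))  # 1e-12: trillionths of a second

# A terahertz clock runs a thousand times faster than a gigahertz clock.
print(TERAHERTZ / GIGAHERTZ)  # 1000.0
```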
Terahertz frequencies also have significance for noninvasive security scanning, in that terahertz electromagnetic waves can travel long distances and reveal hidden dangers. Terahertz scanning can see through clothing and detect concealed weapons, including non-metallic ones, as well as substances such as explosives and biological and chemical weapons. This form of detection can potentially be done from significant distances, so it could be used to scan people in a crowd, such as a crowded marketplace. The resulting interference patterns could be automatically analyzed by artificial intelligence systems to detect security hazards.
It is widely believed that terahertz scanning of people is safe although some scientists have expressed concern that it could disrupt DNA replication.
In the alternative reality of Danielle: Chronicles of a Superheroine, part of eleven-year-old Danielle’s security plan for the Middle East is for Israel to use terahertz frequency scanning to prevent terrorist attacks.
A microbot is a robot in which the key features are measured in microns (millionths of a meter) and the overall dimensions of the robot are measured in millimeters (thousandths of a meter). Such a robot would contain a computer, sensors to detect its environment, such as a video camera and pressure sensors, and actuators so that it can move and manipulate its surroundings.
Microbots emerged in the first decade of the twenty-first century with the advent of single chip computers and microelectronic mechanical systems (MEMS), which are mechanical systems built using chip technology. Research on microbots goes back to the 1970s.
A microbot swarm is a large number of coordinated microbots intended to perform a single mission. Examples include exploring a dangerous terrain or searching for survivors in a building that is on fire or in danger of collapse.
Microbot swarms can also be used for security and military missions. For example, an enclosure containing a swarm of microbots could be dropped from a vehicle, such as a plane, which would release its cargo of microbots to perform a reconnaissance or offensive mission. The target of a microbot attack would find it difficult to defend against, given that it involves hundreds of small robots.
In the alternative reality of Danielle: Chronicles of a Superheroine, microbot swarms are part of eleven-year-old Danielle’s plan for Israeli security.
Terahertz scanning is an emerging security technology using electromagnetic waves that are in the terahertz (trillions of cycles per second) frequency range.
See entry for Terahertz frequency.
Winston Churchill (1874–1965) is regarded by many (including this author) as the greatest European leader of the twentieth century. He led Britain to withstand an overwhelming assault from Nazi Germany and thereby enabled the Allied invasion of Normandy to retake Western Europe from Nazi control.
Churchill was a commoner, although he was the grandson of the Duke of Marlborough. His father, Lord Randolph, had been elected to the House of Commons (the Lower House of Parliament) representing the Conservative (Tory) party and was the Chancellor of the Exchequer in the 1880s. Churchill was first elected to the House of Commons in 1900 and served as a cabinet member and as president of the Board of Trade in 1908.
During much of the 1930s Churchill was an isolated voice in the House of Commons warning about the need to re-arm against Germany and the danger of the Nazi party. The prevailing opinion in England, and in Parliament, during this time was one of accommodation to the rise of Fascism in Germany.
On September 30, 1938, Prime Minister Neville Chamberlain (1869–1940) met with Reichskanzler (chancellor) and Führer (absolute leader) Adolf Hitler at Hitler’s private mountain retreat in Berchtesgaden to discuss reports that Hitler intended to invade Czechoslovakia.
The resulting “Munich Agreement” provided for Germany to annex and re-arm the Sudetenland, which was part of Czechoslovakia but had been part of Germany prior to it being given to Czechoslovakia as part of the Treaty of Versailles that ended World War I. Chamberlain returned to England, and in a famous speech said, “The settlement of the Czechoslovakian problem … is, in my view, only the prelude to a larger settlement in which all Europe may find peace … a British Prime Minister has returned from Germany bringing peace with honour. I believe it is peace for our time. … Go home and get a nice quiet sleep.”
On October 5, 1938, Churchill gave a blistering speech in the House of Commons condemning the Munich pact, saying:
I will begin by saying what everybody would like to ignore or forget but which must nevertheless be stated, namely, that we have sustained a total and unmitigated defeat. … We in this country, as in other Liberal and democratic countries, have a perfect right to exalt the principle of self-determination, but it comes ill out of the mouths of those in totalitarian states who deny even the smallest element of toleration to every section and creed within their bounds. … It is the most grievous consequence of which we have yet experienced of what we have done and of what we have left undone in the last five—five years of futile good intention, five years of eager search for the line of least resistance, five years of uninterrupted retreat of British power, five years of neglect of our air defences. … [T]here can never be friendship between the British democracy and the Nazi power, that Power … which vaunts the spirit of aggression and conquest, which derives strength and perverted pleasure from persecution, and uses, as we have seen, with pitiless brutality the threat of murderous force. That Power cannot ever be the trusted friend of the British democracy.
The “peace for our time” that Chamberlain promised was short-lived, as Nazi Germany went on to invade Czechoslovakia and Poland, leading Britain to declare war on Germany on September 3, 1939, with Chamberlain still in his position as prime minister. However, the period that followed was called the “Phony War” by critics, as Chamberlain did essentially nothing to prosecute the war aside from some minor naval attacks.
When Germany invaded France on May 10, 1940, Chamberlain was stunned. Confidence in him instantly evaporated, and he resigned that day. Chamberlain himself advised King George VI to ask Churchill to organize the next government, and on that same day Churchill became prime minister of Great Britain. His first official act was to thank Chamberlain for his support.
Neville Chamberlain and the Munich Pact that he negotiated with Hitler have become known as the ultimate symbols of appeasement. Although Churchill was nearly alone in his recognition of Hitler’s true face and his condemnation of England’s lack of defiance to signs of Nazi aggressiveness in the 1930s, he quickly rallied his countrymen and women after becoming prime minister to stand up to the Nazi onslaught in Europe.
France fell quickly. On July 10, 1940, the French parliament granted full powers to Marshal Philippe Pétain, who relocated the administration of France from Paris to the town of Vichy as head of what was widely regarded as a puppet regime controlled by the Nazis.
The French Navy was docked at its base at Mers-el-Kébir on the coast of French Algeria. Concerned that this naval force would come under control of the Nazis, and despite assurances from French leaders that this would not happen, Churchill ordered an attack on the French fleet on July 3, 1940, resulting in the deaths of 1,297 French sailors. It was the first military action by England against France in 125 years. The Mers-el-Kébir attack immediately changed world opinion regarding the resolve of Britain, and of Churchill in particular, to resist the Nazis. Prior to the attack, it was generally expected that England would quickly fall to the Nazis as had the rest of continental Europe. It was this act that convinced American President Franklin Roosevelt that Churchill was a man he could trust. He praised Churchill’s courageous action and lauded it as helping American security. This began a close and crucial relationship between the two leaders.
On October 28, 1941, British mathematician and computer pioneer Alan Turing (1912–1954) and his colleagues at Hut 8 of the British Government Code and Cypher School at Bletchley Park (Hut 8 being the section responsible for breaking German codes) wrote a letter to Churchill asking for greatly increased resources for their code breaking efforts. The letter bypassed Turing’s chain of command and was resented by his superiors, but according to Turing’s biographer Andrew Hodges, the letter had a dramatic and immediate effect on Churchill who wrote a letter to British General Ismay, “ACTION THIS DAY. Make sure [Turing and his colleagues] have all they want on extreme priority and report to me that this has been done.” The result was a series of code breaking computers which were among the first computers ever successfully constructed. They provided a continuous transcription of encrypted Nazi military messages to Churchill and the British leadership, including messages to the Luftwaffe (the German Air Force) and the German fleet of U-boats. Churchill had to use this information with extreme discretion lest it tip off the Germans that their Enigma code had been cracked. Turing and his fellow mathematicians assisted Churchill in this task by computing what fraction of such messages they could actually use without creating suspicion. Historians credit the code breaking efforts by Turing and his colleagues and Churchill’s support of their efforts with having enabled the British Royal Air Force to resist the far greater numerical strength of the Luftwaffe, and otherwise prevail in the Battle of Britain.
Starting with Roosevelt’s positive reaction to Churchill’s destruction of the French fleet at Mers-el-Kébir, the relationship between Churchill and Roosevelt became one of the great personal and military alliances of twentieth-century history. Between 1939 and 1945 they exchanged about 1,700 letters and spent 120 days in close personal contact. Roosevelt provided extensive war supplies and financial support for England’s war against Germany via a new form of financing called Lend-Lease, under which repayment took the form of assistance in defending the United States rather than money.
Starting in 1939, German forces used a military strategy that became known as Blitzkrieg (“lightning war”), consisting of quick thrusts by tank divisions coordinated with overwhelming air power. This strategy, combined with the strength of German rearmament, quickly overran mainland Europe, including Czechoslovakia, Poland, and, in June 1940, France. Churchill rallied the morale of his countrymen and women with his defiant and inspired oratory. On the eve of the French capitulation, he spoke to the British people and the world in a speech to the House of Commons:
… [T]he Battle of France is over … the Battle of Britain is about to begin. Upon this battle depends the survival of Christian civilization. Upon it depends our own British life, and the long continuity of our institutions and our Empire. The whole fury and might of the enemy must very soon be turned on us.
Hitler knows that he will have to break us in this island or lose the war. If we can stand up to him, all Europe may be freed and the life of the world may move forward into broad, sunlit uplands. But if we fail, then the whole world, including the United States, including all that we have known and cared for, will sink into the abyss of a new Dark Age made more sinister, and perhaps more protracted, by the lights of perverted science.
Let us therefore brace ourselves to our duties, and so bear ourselves that if the British Empire and its Commonwealth last for a thousand years, men will still say, “This was their finest hour.”
The air assault on Britain, known as the Blitz, began on September 7, 1940 and lasted nine months. All major British cities sustained enormous aerial raids, including 71 major attacks on London, which the Luftwaffe bombed for 57 consecutive nights at the start of the campaign. More than one million London homes were destroyed or damaged.
Londoners spent days, and in some cases weeks, at a time in underground shelters, some improvised in subway stations. Churchill again used his oratory to keep British morale and resolve high, and credited the shelters with limiting casualties:
We were told also, on last Thursday week, that 251 tons of explosives were thrown upon London in a single night, that is to say, only a few tons less than the total dropped on the whole country throughout the last war. Now we know exactly what our casualties have been. On that particular Thursday night 180 persons were killed in London as a result of 251 tons of bombs. That is to say it took one ton of bombs to kill three-quarters of a person. … In [World War I] the small bombs of early patterns which were used killed ten persons for every ton discharged in the built-up areas. … That is, the mortality is now less than one-tenth of the mortality attaching to the German bombing attacks in the last war. …
What is the explanation? There can only be one, namely the vastly improved methods of shelter which have been adopted.
The Nazi focus on bombing British cities diverted German attention from the British war economy. This lopsided strategy, together with effective British defenses, prevented significant damage to the British economy and war production.
When Japan attacked Pearl Harbor on December 7, 1941, Churchill was quoted as saying, “We have won the war!,” realizing that the attack would bring the United States into the war against both Japan and Germany. In an address to a joint meeting of the US Congress on December 26, 1941, Churchill asked rhetorically, “What kind of people do they [the German and Japanese leadership] think we are?” Churchill became known in the American and Russian press as “the British Bulldog.”
On June 6, 1944, Allied forces under the command of US General Dwight Eisenhower invaded Normandy, deploying from bases on the southern coast of England in the largest seaborne invasion in history. Known as “D-Day,” the invasion marked a turning point in the war against Nazi Germany. Germany surrendered on May 7, 1945.
Winston Churchill died on January 24, 1965 at the age of 90. More than 300,000 people filed past his coffin, flags were lowered, the dockers’ cranes along the Thames dipped their jibs, and the Queen and leaders from around the world attended his funeral, a tribute to a fallen hero of a kind not previously seen in England (for other than a king or queen) and not seen since. He was among the first British commoners of the twentieth century to appear on a British postage stamp.
Key to Churchill’s impact on history was his resoluteness, reflected in his formidable oratory and evidenced by the many powerful quotations attributed to him, a small sample of which includes:
“Success consists of going from failure to failure without loss of enthusiasm.”
“If you’re going through hell, keep going.”
“You have enemies? Good. That means you’ve stood up for something, sometime in your life.”
“History will be kind to me for I intend to write it.”
“A lie gets halfway around the world before the truth has a chance to get its pants on.”
“We shall defend our island, whatever the cost may be, we shall fight on the beaches, we shall fight on the landing grounds, we shall fight in the fields and in the streets, we shall fight in the hills; we shall never surrender.”
“A pessimist sees the difficulty in every opportunity; an optimist sees the opportunity in every difficulty.”
“It is no use saying, ‘We are doing our best.’ You have got to succeed in doing what is necessary.”
“It has been said democracy is the worst form of government except all the others that have been tried.”
My grandfather Edwin Bader took the middle name Winston to honor Churchill for his role in saving Europe from Hitler and, in so doing, saving his own (and my!) family. My grandfather was able to flee with his wife and two daughters (my mother and my aunt) in the summer of 1938 from Vienna to London. He kept a statue of Churchill on his desk.
Finally, there is the quote that begins the companion novel to this book, Danielle: Chronicles of a Superheroine:
“Never give in. Never give in. Never, never, never, never—in nothing, great or small, large or petty—never give in, except to convictions of honour and good sense.”
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle cites Churchill while defending her Middle East peace plan.
The Knesset is the unicameral national legislature of Israel. Its 120 members are responsible for passing all laws and for electing the president and prime minister. Parties put up slates of candidates, and voters vote for a party rather than for individual candidates. Parties receive seats in the Knesset in proportion to their share of the vote but must surpass an election threshold to receive any seats. The threshold was originally set at 1 percent in 1949, then raised to 1.5 percent in 1992, 2 percent in 2003, and 3.25 percent in March 2014. The increases were considered reforms to prevent very small parties from gaining undue influence, but the threshold is still low enough that the Knesset typically includes on the order of 10 different parties. No single party has ever held a majority, so the government (and the selection of the prime minister) has always been based on a coalition of parties. The primary responsibility of the president is to ask the leader of the party most likely to assemble a governing coalition to do so. The president usually asks the party with the largest number of seats to form a government, but this is not required. For example, after the 2009 elections the governing coalition was assembled by the Likud party under the leadership of Benjamin Netanyahu even though the Kadima party had won a larger number of seats.
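The mechanics of threshold-plus-proportional allocation described above can be sketched in a few lines of code. The following is a simplified illustration using the D’Hondt highest-averages method; Israel’s actual Bader-Ofer procedure is a variant of this approach with additional rules for surplus-vote agreements, and the party names and vote totals below are purely hypothetical.

```python
def allocate_seats(votes, total_seats, threshold):
    """Allocate seats proportionally (D'Hondt highest averages),
    excluding parties below the electoral threshold."""
    total_votes = sum(votes.values())
    # Parties below the threshold receive no seats at all.
    eligible = {p: v for p, v in votes.items()
                if v / total_votes >= threshold}
    seats = {p: 0 for p in eligible}
    for _ in range(total_seats):
        # Award each successive seat to the party with the
        # highest quotient votes / (seats already won + 1).
        winner = max(eligible, key=lambda p: eligible[p] / (seats[p] + 1))
        seats[winner] += 1
    return seats

# Hypothetical vote totals for illustration only.
votes = {"A": 500_000, "B": 300_000, "C": 150_000, "D": 30_000}
print(allocate_seats(votes, total_seats=120, threshold=0.0325))
# → {'A': 63, 'B': 38, 'C': 19}  (party D falls below 3.25% and wins nothing)
```

Note how the threshold redistributes party D’s share to the larger parties, which is exactly the “undue influence” trade-off the reforms were weighing.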
From the founding of the Likud party in 1973 through much of the 1990s, Israeli politics was split between the Labor and Likud parties. This gave outsized influence to small parties, especially the religious parties controlled by Orthodox and Hasidic communities. For example, Rabbi Menachem Mendel Schneerson (who lived in Brooklyn) was credited with preventing the Labor Party, under the leadership of Shimon Peres, from forming a government in 1990.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle gains influence on Israeli politics using this insight about the role of the religious parties to advance her Middle East peace plan.
The Quartet is a group of four nations and institutions that was formed in 2002 to assist in negotiating peace in the Israel-Arab conflict, especially the Israeli-Palestinian conflict. It consists of the United States, the United Nations, the European Union and Russia.
The idea to form this diplomatic group emerged after the outbreak of the “Second Intifada” (the second uprising of Palestinians against Israel) in September 2000. The formation of the Quartet formalized the fact that these four nations and institutions were the ones primarily involved in mediating the conflict. Tony Blair was appointed the official envoy of the Quartet on the same day that he resigned as prime minister of the United Kingdom, June 27, 2007.
The Quartet was effective in negotiating cease-fires during the Second Intifada and in two wars between Hamas-controlled Gaza and Israel; however, it has made only limited progress toward an overall Israeli-Palestinian settlement.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle gives a role to the Quartet in her settlement proposal for the Middle East conflict.
Alfred Nobel (1833–1896) invented dynamite in 1867 and went on to become wealthy as an industrialist perfecting and manufacturing weapons of war based on explosives. In 1888, his brother Ludvig died, and a newspaper mistakenly printed an obituary for Alfred Nobel. It read, “Dr. Alfred Nobel, who became rich by finding ways to kill more people faster than ever before, died yesterday. … The merchant of death is dead.”
Nobel was alarmed by this description, and seeking to leave a better legacy, he wrote a will in 1895 leaving 94 percent of his assets (about $250 million in today’s dollars) to create five Nobel Prizes: in physics, chemistry, physiology or medicine, literature, and peace. The Nobel Prizes have become the most prestigious prizes in the world in their fields.
Notable recipients of the Nobel Peace Prize have included:
- Albert Schweitzer, the missionary surgeon, in 1952.
- Dag Hammarskjöld, secretary-general of the United Nations, in 1961.
- Linus Pauling, the chemist, in 1962, for his campaign against nuclear weapons testing. He is the only person to have won two unshared Nobel Prizes, having also received the Nobel Prize in Chemistry in 1954.
- Martin Luther King, the civil rights leader, in 1964.
- Mohamed Anwar Al-Sadat, the president of Egypt, and Menachem Begin, the prime minister of Israel, for negotiating peace between Egypt and Israel, in 1978.
- Mother Teresa, the founder of the Missionaries of Charity, in 1979.
- Lech Wałęsa, founder of the Solidarity labor movement in Poland, which challenged Soviet domination and contributed to the eventual dissolution of the Soviet Union, in 1983.
- Desmond Tutu, the Bishop of Johannesburg, in 1984.
- Elie Wiesel, Chairman of the President’s Commission on the Holocaust, in 1986.
- Tenzin Gyatso (The Dalai Lama), the exiled religious and political leader of Tibet, in 1989.
- Mikhail Gorbachev, president of the Soviet Union, in 1990.
- Nelson Mandela and Frederik Willem de Klerk, for their peaceful resolution of the conflict over South Africa’s apartheid regime, in 1993.
- Barack Obama in 2009 for the hope that he brought the world with his election.
- Malala Yousafzai, the teenage girl shot by the Taliban for her efforts to promote the education of girls in Pakistan, in 2014.
The issue of peace is inherently political, and the Nobel Peace Prize has been criticized for controversial recipients. In 1994, Yasser Arafat shared the Peace Prize with Yitzhak Rabin and Shimon Peres for the Oslo Accords, which provided a road map for peace between Israel and the Palestinians. Arafat’s prize was criticized both for the incomplete nature of the agreement and because he was accused of involvement in a number of terrorist attacks, including the killing of eleven Israeli athletes at the 1972 Summer Olympics.
In 1973, Henry Kissinger, the US secretary of state, and Le Duc Tho, a leader of the Communist Party of North Vietnam, shared the Nobel Peace Prize for negotiating a peace agreement in the Vietnam War. The award was criticized both because the war continued and because of war crime accusations against both recipients. Le Duc Tho refused the prize, saying the war had not been resolved.
Barack Obama, who had been elected president of the United States less than a year earlier, received the prize in 2009 based on his electoral platform of achieving peace in the Iraq and Afghanistan wars. This prize was criticized because Obama’s peace plan was just that, a plan that had not yet been implemented. The award to Obama has been described as part of a pattern of the Nobel Peace Prize being used to attempt to influence future events rather than simply to recognize past achievements.
The Nobel Peace Prize has also been criticized for omissions, in particular for never having been awarded to Mahatma Gandhi, who successfully led the peaceful resistance to British colonial rule in India and who became a symbol of nonviolent resistance. Gandhi was nominated five times but never received the award.
In 2015 I gave the opening keynote address in Sweden on the “Future of Intelligence” for the “Nobel Dialogues” which was convened by the Nobel Prize Committee and took place the day before the Nobel Prizes for Physics, Chemistry, Medicine, Literature, and Economics were bestowed in Stockholm. I spoke about how we will expand our biological human intelligence by merging with the artificial intelligence we are creating. I described how this development provides a unique opportunity to solve the major problems that humanity has struggled with for millennia including the issue of war and peace, but it also presents a corresponding challenge to control the associated perils.
In the alternative reality of Danielle: Chronicles of a Superheroine, the Nobel Peace Prize Committee makes an unusually fast decision to award the Nobel Peace Prize to eleven-year-old Danielle for her Middle East peace efforts.
Shiite or Shia, short for Shiatu Ali (followers of Ali), is a branch of Islam that believes that the successor to Muhammad as Caliph (the religious and political leader of the Muslim world) was Muhammad’s son-in-law and cousin Ali ibn Abi Talib. This contrasts with the Sunni Islam belief that the proper successor was Muhammad’s father-in-law Abu Bakr.
Shia Islam is the second largest branch of Islam, constituting about 10 to 13 percent of the world’s Muslim population, compared to 87 to 90 percent for the Sunni branch. Some 70 to 80 percent of Shias live in Iran, Iraq, Pakistan, and India. A third of all Shias live in Iran, where over 90 percent of the Muslim population is Shia. About a quarter of all Shias live in India, where about a quarter of the Muslim population is Shia.
Many of the fundamental tenets of Islam are common to the Shiite and Sunni branches. For example, both believe in the authority of the scrolls of Abraham (the first of the Biblical patriarchs, whose story is told in chapters 11 through 25 of the Book of Genesis), the scrolls of Moses (the most important prophet in the Jewish religion, who led the Jewish people out of slavery in Egypt and for forty years through the Sinai Desert), the Torah (the principal religious text of the Jewish people), the Psalms, the Gospel, and the Qur’ān (the principal religious text of Islam, which Muslims believe was revealed by God to Muhammad through the angel Jibril over a period of 23 years).
Muslims of both sects pray five times a day. Both regard Mecca, Medina, and Jerusalem to be the holiest sites in Islam. Details of how to pray and other rituals and beliefs have evolved in a somewhat different fashion.
Relations between the Shiite and Sunni sects were peaceful for the most part through the fifteenth century. This ended with a massacre of 40,000 Shias in 1514 ordered by the Ottoman Sultan Selim I. From the sixteenth century to the present day, Sunni leaders have often regarded the minority Shiite population as a religious and political threat and have sought to marginalize, and often to persecute, them. A recent example is Saddam Hussein’s persecution of the Shiite population in Iraq and his invasion of Shiite-dominated Iran in 1980.
Tension between the Sunni and Shiite branches of Islam continues to this day with the Sunni branch led by Saudi Arabia and the Shiite branch led by Iran.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle cites the continued tension between the Shiite and Sunni branches of Islam as one reason for her only receiving a B+ in her Social Studies class despite her achieving an Israeli-Palestinian settlement.
Sunni Islam is the largest denomination of Islam. Sunnis believe that the first Caliph (the religious and political leader of the Muslim world) after Muhammad was Muhammad’s father-in-law, Abu Bakr, in contrast to the Shiite belief that it was Muhammad’s son-in-law and cousin, Ali ibn Abi Talib. Sunni Muslims constitute 87 to 90 percent of the world’s Muslim population.
See entry for Shiite for more details on the differences and relations between the Sunni and Shiite sects of Islam.
Wahhabism, an austere and severe form of Islam, is an influential sect of Sunni Islam and is the primary form of Islam in Saudi Arabia.
The Sunni terrorist group that first captured the world’s attention was al-Qaeda, founded by Osama bin Laden, with its attacks on the United States on September 11, 2001. The “9/11” attacks, carried out by 19 al-Qaeda terrorists, 15 of whom were from Saudi Arabia, caused the collapse of both 110-story towers of the World Trade Center, significant destruction to the Pentagon, and the deaths of 2,996 people. The attacks led US President George W. Bush to declare the War on Terror and to initiate the Afghanistan War to remove the Taliban, who had provided a haven for al-Qaeda, from power in Afghanistan. They played a more controversial role in initiating the 2003 Iraq War, in which a coalition headed by the US toppled the regime of Saddam Hussein.
A success of the War on Terror has been that, up through the writing of this book, al-Qaeda has been unable to carry out another terrorist attack anywhere near the scale of the 9/11 attacks, although the group and its affiliates have carried out numerous smaller attacks around the world. With the weakening of al-Qaeda itself, other Sunni terrorist groups have emerged with even more ruthless tactics. In particular, the Islamic State (known variously as the Islamic State of Iraq and Syria (ISIS), the Islamic State of Iraq and the Levant (ISIL), the Islamic State, or Daesh, an Arabic acronym for al-Dawla al-Islamiya al-Iraq al-Sham) took over a region of Syria. With the withdrawal of American forces from Iraq, along with the weakness of the Iraqi government and military, ISIS took control of an adjacent region of Iraq including Mosul, a city of 2.5 million people. Like al-Qaeda, ISIS follows the Wahhabist doctrine of Sunni Islam.
ISIS has been effective in using social media to recruit, organize, and inspire terrorists around the world. Although ISIS has not been able to carry out a terrorist attack on the scale of 9/11, there have been a substantial number of both coordinated and lone wolf attacks around the world organized or inspired by ISIS, including 130 people killed in a series of attacks in Paris on November 13, 2015, and 14 people killed in San Bernardino, California on December 2, 2015. ISIS has been noted for especially horrific and graphic atrocities including killing captives by beheading them or burning them alive and using captive women and girls as sex slaves.
Estimates in 2015 are that about one third of Syria and one third of Iraq are under ISIS control or influence and that ISIS commands about 200,000 fighters.
In the alternative reality of Danielle: Chronicles of a Superheroine, the emergence of ISIS negatively affects eleven-year-old Danielle’s grade on her Middle East Social Studies project.
Hyde Park is one of the largest public parks in England comprising 350 acres in the middle of London. It is contiguous with Kensington Gardens which comprises another 270 acres. Kensington Gardens is generally considered part of Hyde Park, but technically they are two different parks. Hyde Park was originally created by King Henry VIII in 1536 for his hunting parties.
Hyde Park is an important venue for public demonstrations, meetings of various kinds, and cultural events. The Great Exhibition (the first world’s fair) opened in Hyde Park in May 1851. A famous march by the Suffragettes, called the Mud March due to the inclement weather that day, started in Hyde Park on February 7, 1907. Pope Benedict XVI held a prayer vigil at Hyde Park on September 18, 2010 for about 80,000 people. Many famous music performers, such as the Rolling Stones, Pink Floyd, Queen, Bruce Springsteen, and Paul McCartney, have held large outdoor concerts there.
Whenever I am in London, I take my daily walks in Hyde Park.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle and nineteen-year-old Claire, along with Danielle’s backup band, put on an impromptu “flash mob” concert at Hyde Park to celebrate her Middle East peace settlement. Covered live by social media and including instantly assembled crowds in major squares around the world, such as Times Square in New York City and Tiananmen Square in Beijing, an estimated two billion people dance to Danielle and Claire’s live music from Hyde Park.
See entry for Flash mob.
To define Artificial Intelligence, we first need to define “intelligence.” In The Age of Spiritual Machines (Viking, 1999), I wrote:
A goal may be survival—evade a foe, forage for food, find shelter. Or it might be communication—relate an experience, evoke a feeling. Or perhaps it is to partake in a pastime—play a board game, solve a puzzle, catch a ball. Sometimes it is to seek transcendence—create an image, compose a passage. A goal may be well defined and unique, as in the solution to a math problem. Or it may be a personal expression with no clearly right answer. My view is that intelligence is the ability to optimally use limited resources—including time—to achieve such goals.
Artificial Intelligence, then, is a system that exhibits intelligence by “artificial” means, that is, a nonbiological system.
The intellectual roots of artificial intelligence go back millennia. Plato quotes Socrates as saying that intelligence must be based on principles, and Plato goes on to write that these principles of intelligence must be based on strict rules, rules that could in theory be followed by a machine.
Gottfried Wilhelm Leibniz (1646–1716) wrote,
What if … we were magically shrunk and put into someone’s brain while he was thinking. We would see all the pumps, pistons, gears and levers working away, and we would be able to describe their workings completely, in mechanical terms, thereby completely describing the thought processes of the brain. But that description would nowhere contain any mention of thought! It would contain nothing but descriptions of pumps, pistons, levers!
We might assume that a machine exhibiting artificial intelligence would be electronic, but the first computer ever designed, Charles Babbage’s Analytical Engine of 1837, was entirely mechanical. Although Babbage never got his computer to work, he and his collaborator Ada Lovelace, the world’s first computer programmer (and daughter of the poet Lord Byron), contemplated the idea that it could, in theory, be programmed to perform intelligent and even creative tasks. Babbage wrote:
It is not a bad definition of man to describe him as a tool-making animal. His earliest contrivances to support uncivilized life were tools of the simplest and rudest construction. His latest achievements in the substitution of machinery, not merely for the skill of the human hand, but for the relief of the human intellect, are founded on the use of tools of a still higher order.
In her Notes, which accompanied her description of Babbage’s Analytical Engine, Lovelace speculates on the feasibility of artificial intelligence, the first such speculation ever written. Around the same time, she wrote in a letter to her mother that she was working on “certain productions” investigating the relationship between music and mathematics. While she writes about the similarity of computation to thinking, she concludes in her Notes that “The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform.” However, she also anticipates that in the future computers such as the Analytical Engine will collaborate with individuals and society. (See entries for Analytical Engine and Ada Lovelace.)
As an aside, computers today are indeed electronic, but the ultimate computer is likely to use mechanical parts built at the atomic level as contemplated in Eric Drexler’s 1986 book Engines of Creation, so we are likely to go back to mechanical computing in the future.
The first person to define the field of Artificial Intelligence in detail was Alan Turing, whose heroic efforts to build the world’s first working computers and to crack the Nazi Enigma code enabled the Allies to win the Battle of Britain in World War II. (See entry for Winston Churchill.) In his classic 1950 paper, “Computing Machinery and Intelligence,” Turing described an agenda that would in fact occupy the next half century of advanced computer research: game playing, decision making, natural language understanding, translation, theorem proving, and, of course, encryption and the cracking of codes. He wrote (with his friend David Champernowne) the first chess-playing program.
Some of Turing’s early pronouncements about Artificial Intelligence around 1950 include the following:
In the 1950s, progress in AI seemed to come quickly. In 1955 and 1956, Allen Newell, J.C. Shaw, and Herbert Simon created a program called Logic Theorist (and a later version called General Problem Solver in 1959) which used recursive search techniques to solve problems in mathematics. Recursion, as I discuss below, is a powerful method of defining a solution in terms of itself. Logic Theorist and General Problem Solver were able to find proofs for many of the key theorems in Russell and Whitehead’s seminal work on mathematical logic, Principia Mathematica, including a completely original proof of an important theorem. These early successes led Simon and Newell to say in a 1958 paper, entitled “Heuristic Problem Solving: The Next Advance in Operations Research,” “There are now in the world machines that think, that learn, and that create. Moreover, their ability to do these things is going to increase rapidly until—in a visible future—the range of problems they can handle will be coextensive with the range to which the human mind has been applied.” The paper goes on to predict that within ten years (that is, by 1968) a digital computer would be the world chess champion. A decade later, an unrepentant Simon predicted that by 1985 “machines will be capable of doing any work that a man can do.” Perhaps Simon was intending a favorable comment on the capabilities of women, but these predictions, decidedly more optimistic than Turing’s, were an embarrassment to the new AI field. The resulting disappointment ushered in what became known as the “AI Winter,” as AI optimism swung to pessimism.
In the mid-1980s I predicted that a computer would defeat the world chess champion by 1998. When Deep Blue defeated Garry Kasparov, then the reigning human world chess champion, in 1997 (using the recursive method I describe below), one prominent professor commented that all we had learned was that playing a championship game of chess does not require intelligence after all. The implication was that capturing real intelligence in our machines still remained far beyond our grasp. While I don’t wish to overstate the significance of Deep Blue’s victory, I believe that from this perspective we will ultimately find that there are no human activities that require “real” intelligence.
During the 1960s, the academic field of AI began to work on the agenda that Turing had described in 1950, with encouraging or frustrating results, depending on your point of view. Daniel G. Bobrow’s program Student could solve algebra problems from natural English language stories and reportedly did well on high school math tests. The same performance was reported for Thomas G. Evans’ Analogy program for solving IQ test geometric-analogy problems. The field of expert systems was initiated with Edward A. Feigenbaum’s DENDRAL, which could answer questions about chemical compounds. And natural language understanding got its start with Terry Winograd’s SHRDLU, which could understand any meaningful English sentence, so long as you talked about colored blocks.
It turned out that the problems we thought were difficult—solving mathematical theorems, playing respectable games of chess, reasoning within domains such as chemistry and medicine—were easy, and the multithousand-instructions-per-second computers of the 1950s and 1960s were often adequate to provide satisfactory results. What proved elusive were the skills that any five-year-old child possesses, such as telling the difference between a dog and a cat, or understanding an animated cartoon.
Turing offered an explanation of why we would fail to acknowledge intelligence in our machines. In 1947, he wrote:
The extent to which we regard something as behaving in an intelligent manner is determined as much by our own state of mind and training as by the properties of the object under consideration. If we are able to explain and predict its behavior … we have little temptation to imagine intelligence. With the same object, therefore, it is possible that one man would consider it as intelligent and another would not; the second man would have found out the rules of its behavior.
I am also reminded of Elaine Rich’s definition of artificial intelligence as the “study of how to make computers do things at which, at the moment, people are better.” I would reword that as follows: Artificial intelligence is inherently defined as the pursuit of hard computer science problems that have not yet been solved.
I made the point in my 1999 book The Age of Spiritual Machines, and again in my 2005 book The Singularity Is Near, that as long as there is anything that AI cannot do, some observers will dismiss what it can do and focus on the remaining unsolved problems.
Here is a cartoon I designed for my 1999 book The Age of Spiritual Machines, and revised for my 2005 book The Singularity Is Near, that attempts to make this point.
In the cartoon, a defensive “human race” is seen writing out signs that state what only people (and not machines) can do. Littered on the floor are the signs the human race has already discarded, because machines can now perform these functions: diagnose an electrocardiogram, compose in the style of Bach, recognize faces, guide a missile, play Ping-Pong, play master chess, pick stocks, improvise jazz, prove important theorems, and understand continuous speech. As time goes on, more and more signs have fallen (and will continue to fall) to the floor.
The artificial intelligence field has two competing methodologies. The “symbolic” school treats intelligence as the manipulation of symbols according to logical rules. The “connectionist” school, on the other hand, builds models of “neural” networks that learn how to solve problems from examples. These networks are regarded as approximate models of how the human neocortex solves problems.
When I was 14, in 1962, I met with the leading figures of these two schools of thought. One was Marvin Minsky, professor at the Massachusetts Institute of Technology and cofounder (with John McCarthy) of the MIT Artificial Intelligence Laboratory. Minsky was regarded as the leader of the symbolic school (even though he had done pioneering work on neural nets in the 1950s). The other was Frank Rosenblatt, professor at Cornell University, who invented the “Perceptron,” an early neural net that could recognize printed letters and spoken words. I went on to attend MIT with Minsky as my mentor, a relationship that lasted for over half a century, until Minsky’s death at the age of 88 on January 24, 2016.
There are several techniques used in the symbolic school. One of the most interesting is the “recursive” method which uses a well-defined description of a problem to produce a solution to the problem.
The recursive formula starts with a careful statement of the problem. A recursive procedure is one that calls itself. For example, in the game of chess the statement of the problem consists of stating the rules and goal of the game. We construct a program called “Pick Best Move” to select each move. Pick Best Move starts by listing all of the possible moves from the current state of the board. This is where the careful statement of the problem comes in, because in order to generate all of the possible moves we need to precisely consider the rules of the game. For each move, the program constructs a hypothetical board that reflects what would happen if we made this move. For each such hypothetical board, we now need to consider what our opponent would do if we made this move. This is where recursion comes in, because Pick Best Move simply calls Pick Best Move (that is, itself) to pick the best move for our opponent. In calling itself, Pick Best Move then lists all of the legal moves for our opponent.
The program keeps calling itself, looking ahead as many moves as we have time to consider, which results in the generation of a huge move-countermove tree. This is an example of exponential growth, because to look ahead an additional half-move requires multiplying the amount of available computation by about five in the game of chess.
Key to the recursive formula is pruning this huge tree of possibilities, ultimately stopping the recursive growth of the tree and returning from these nested calls to “Pick Best Move.” In the game context, if a board looks hopeless for either side, the program can stop the expansion of the move-countermove tree from that point (called a “terminal leaf” of the tree) and score that position as a likely win or loss.
When all of these nested program calls are completed, the program will have determined the best possible move for the current actual board, within the limits of the depth of recursive expansion that it had time to pursue.
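The nested calls described above can be sketched in a few lines of code. The sketch below is a hypothetical illustration, not any actual chess program: the chess move generator is replaced by a tiny hand-built game tree, so that the recursive structure of “Pick Best Move” is the whole point.

```python
# A minimal sketch of the recursive "Pick Best Move" formula (minimax).
# The game here is deliberately tiny: a hand-built tree of positions
# standing in for the chess move generator described in the text.

# Hypothetical game tree: each position lists its legal successor
# positions; leaves carry a score from the first player's point of view.
GAME_TREE = {
    "start": ["a", "b"],
    "a": ["a1", "a2"],
    "b": ["b1", "b2"],
}
LEAF_SCORES = {"a1": 3, "a2": 5, "b1": -4, "b2": 9}

def pick_best_move(position, our_turn=True):
    """Return (score, move) for the best move from `position`.

    The recursion mirrors the text: to evaluate a move, we call
    pick_best_move for the *opponent* on the resulting position,
    assuming each side plays the move best for itself.
    """
    if position in LEAF_SCORES:          # terminal leaf: stop expanding
        return LEAF_SCORES[position], None
    best_score, best_move = None, None
    for move in GAME_TREE[position]:     # list all of the legal moves
        score, _ = pick_best_move(move, not our_turn)  # the recursive call
        better = best_score is None or (
            score > best_score if our_turn else score < best_score)
        if better:
            best_score, best_move = score, move
    return best_score, best_move

score, move = pick_best_move("start")
print(move, score)   # prints: a 3
```

Deep Blue’s actual search added extensive pruning and hand-tuned board evaluation, but the skeleton, a procedure that evaluates a move by calling itself for the opponent, is the same.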
The recursive formula was good enough to build a machine—a specially designed IBM supercomputer—that defeated the world chess champion (although Deep Blue does augment the recursive formula with databases of moves from most of the grandmaster games of the twentieth century). In The Age of Intelligent Machines (MIT Press, 1990), I noted that while the best chess computers were gaining in chess rating by 45 points a year (as a result of the exponential increase in computation), the best humans were advancing by closer to zero points. That put the year in which a computer would beat the world chess champion at 1998, which turned out to be overly pessimistic by one year.
Our simple recursive rule plays a world-class game of chess. A reasonable question, then, is: what else can it do? We can certainly replace the module that generates chess moves with a module programmed with the rules of another game. Stick in a module that knows the rules of checkers, and you can also beat just about any human. Recursion is also very good at backgammon: Hans Berliner’s program defeated the human backgammon champion in 1979, with the slow computers we had back then.
The recursive formula is also a rather good mathematician. Here the goal is to solve a mathematical problem, such as proving a theorem. The rules then become the axioms of the field of math being addressed, as well as previously proved theorems. The expansion at each point is the possible axioms (or previously proved theorems) that can be applied to a proof at each step. This was the approach used by Allen Newell, J.C. Shaw, and Herbert Simon for their 1959 General Problem Solver.
The connectionist school, epitomized by neural nets, works a little differently, although the basic building block is also simple. The neural net paradigm is an attempt to emulate the computing structure of neurons in the human brain. We start with a set of inputs that represent a problem to be solved; for example, the input may be a set of pixels representing an image that needs to be identified. Each point of the input (e.g., each pixel in an image) is randomly connected to the inputs of the first layer of simulated neurons. Each connection has an associated “synaptic strength” that represents its importance; these strengths are initially set to random values. Each neuron adds up the signals coming into it. If the combined signal exceeds a threshold, the neuron “fires” and sends a signal to its output connection; if not, the neuron does not fire and its output is zero. The output of each neuron is randomly connected to the inputs of the neurons in the next layer. At the top layer, the output of one or more neurons, also randomly selected, provides the answer. A problem, such as an image of a printed character to be identified, is presented to the input layer, and the output neurons provide their answer. And the answers are remarkably accurate for a wide range of problems.
In actuality, the answers are not accurate at all. Not at first, anyway. Initially, the output is completely random, so the answers are random. For a neural net to show intelligence, it needs to learn its subject matter, just like the human brains it is modeled on.
The neural net’s teacher, which may be a human, a computer program, or perhaps another, more mature neural net that has already learned its lessons, rewards the student neural net when it is right and punishes it when it is wrong. This feedback is used by the student neural net to adjust the strengths of each interneuronal connection. Connections that were consistent with the right answer are made stronger. Those that advocated a wrong answer are weakened. This method is called “back propagation.” Over time, the neural net organizes itself to provide the right answers without coaching. Experiments have shown that neural nets can learn their subject matter even with unreliable teachers. For example, if the teacher is correct only 60 percent of the time, the student neural net will still learn its lessons.
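The reward-and-punish loop just described can be sketched for the simplest possible case: a single simulated neuron learning the logical OR of two inputs. This is a toy illustration of my own devising (for one neuron, the adjustment is Rosenblatt’s perceptron rule; full back propagation extends the same strengthen-and-weaken idea through multiple layers), and the task, learning rate, and threshold are arbitrary choices.

```python
import random

# A single simulated neuron with randomly initialized connection
# strengths, trained by a "teacher" that knows the right answers.
# The task (the logical OR of two inputs) is a toy stand-in for
# recognizing letters or images.

random.seed(1)

# Training examples: inputs and the teacher's correct answer (OR).
EXAMPLES = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

weights = [random.uniform(-1, 1) for _ in range(2)]  # synaptic strengths
THRESHOLD = 0.5

def fire(inputs):
    """The neuron fires (outputs 1) if its combined signal exceeds the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > THRESHOLD else 0

# The teacher rewards and punishes: connections that argued for the
# right answer are strengthened; those that argued for a wrong one
# are weakened.
for _ in range(20):
    for inputs, correct in EXAMPLES:
        error = correct - fire(inputs)        # +1, 0, or -1
        for i, x in enumerate(inputs):
            weights[i] += 0.2 * error * x     # adjust the strengths

print([fire(x) for x, _ in EXAMPLES])   # prints: [0, 1, 1, 1]
```

At the start the weights are random and one of the answers is wrong; after a handful of passes through the examples, the neuron answers correctly without coaching.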
When I visited Frank Rosenblatt at Cornell in 1962, his Perceptron (the first practical neural net) was able to recognize images of printed letters, but only in a single typestyle and size. He said to me, “… but don’t worry, if we feed the output of a Perceptron to another Perceptron and keep doing that, it will be able to generalize and recognize the essence of each letter shape.” Rosenblatt died in 1971, before he had a chance to prove that his conjecture was true.
In the decades since Rosenblatt’s Perceptron, multilayer neural nets (with the output of one layer feeding into the next) were created, and indeed they were able to generalize to some degree. Each deeper layer is able to recognize more abstract features than the layers below it. Until recently, however, we could not go beyond three or four layers. Between 2008 and 2015, mathematical advances made it possible to build neural nets with hundreds of layers, and indeed we find that the deeper layers capture ever more abstract insights.
Using these deeper neural nets, Google created a program that can now distinguish between cats and dogs (a task that AI could not do just a few years earlier) as well as thousands of other categories, even writing captions for complicated images. Thus Rosenblatt’s conjecture to me in 1962 has been proven correct a half-century later.
For example, the Google caption writing program accurately describes the image below as “Two pizzas sitting on top of a stovetop oven.”
Another very interesting experiment was conducted by researchers at the University of Tübingen in Germany. They built a deep neural net to analyze the essence of the style of famous painters in terms of color, technique, and brush strokes. They scanned this photograph of buildings:
They then asked the program to recreate the picture in the style of different artists it had analyzed. In each case below, the inset picture is the painting by the original artist.
The neural net recreated the picture of the buildings in the style of Vincent van Gogh’s painting The Starry Night as follows:
The neural net recreated the same image of the buildings in the style of Pablo Picasso’s Femme Nue Assise this way:
It recreated the picture of the buildings in the style of Edvard Munch’s painting The Scream this way:
There have been many other impressive achievements in the field of AI in the years around 2015. The Google self-driving cars have gone over a million miles without any incident (other than being pulled over by the police once for going too slowly, but the officer subsequently realized that there was no minimum speed on that street, so the Google driverless car did not get a ticket).
In 2011, IBM’s Watson computer won a televised Jeopardy! tournament against Brad Rutter and Ken Jennings, the best two human players in the world. It got this query correct in the rhyme category: “A long tiresome speech delivered by a frothy pie topping.” Watson correctly responded, “What is a meringue harangue?” Rutter and Jennings were not able to respond to this query and Watson got a higher score than the two of them combined.
What is not widely appreciated is that Watson’s knowledge was not hand coded by the engineers; rather, Watson obtained the knowledge on its own as a result of having read 200 million pages of documents, including all of Wikipedia.
After the match, Ken Jennings commented,
The computer’s techniques for unraveling Jeopardy! clues sounded just like mine. That machine zeroes in on key words in a clue, then combs its memory (in Watson’s case, a 15-terabyte data bank of human knowledge) for clusters of associations with these words. It rigorously checks the top hits against all the contextual information it can muster: the category name; the kind of answer being sought; the time, place, and gender hinted at in the clue; and so on. And when it feels ‘sure’ enough, it decides to buzz. This is all an instant, intuitive process for a human Jeopardy! player, but I felt convinced that under the hood my brain was doing more or less the same thing.
He added, “I, for one, welcome our new computer overlords.”
These accomplishments in the AI field around 2015 have fueled an unprecedented level of optimism. Concern among the general public has shifted from skepticism that AI would ever achieve human levels to worry about the implications of what will happen when it does.
In my 2012 book How to Create a Mind (Viking Penguin), I describe my “Pattern Recognition Theory of Mind” (PRTM) which has several key differences from the deep neural nets (DNNs) I described above. In the book I provide the evidence that the human brain operates on PRTM principles. One of the key challenges PRTM seeks to overcome is a current weakness of DNNs, which is that they need enormous amounts of data to learn their lessons. The human neocortex (the region of the brain we use for our thinking) can learn from relatively few examples.
In the PRTM, I describe the brain as a series of modules, each of which can recognize a pattern. These modules are organized in an elaborate hierarchy and we create that hierarchy with our own thinking.
Each pattern (which is recognized by one of the 300 million pattern recognizers I estimate are in the neocortex) is composed of three parts. Part one is the input, which consists of the lower-level patterns that comprise the main pattern. The descriptions for each of these lower-level patterns do not need to be repeated for each higher-level pattern that references them. For example, many of the patterns for words will include the letter “A.” Each of these patterns does not need to repeat the description of the letter “A” but will use the same description. Think of it as being like a web pointer. There is one web page (that is, one pattern) for the letter “A,” and all of the web pages (patterns) for words that include “A” will have a link to the “A” page (to the “A” pattern). Instead of web links the neocortex uses actual neural connections. There is an axon from the “A” pattern recognizer that connects to multiple dendrites, one for each word that uses “A.” Keep in mind also the redundancy factor: there is more than one pattern recognizer for the letter “A.” Any of these multiple “A” pattern recognizers can send a signal up to the pattern recognizers that incorporate “A.”
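The web-pointer analogy can be made concrete in a few lines of code. This is a hypothetical sketch of mine (the class and names are not from the book): each pattern stores references to its lower-level patterns, so the description of “A” exists once and is shared by every word pattern that links to it.

```python
# One pattern object per letter is shared (referenced, not copied)
# by every word pattern that contains that letter, just as many web
# pages can all link to a single "A" page.

class Pattern:
    def __init__(self, name, parts=None):
        self.name = name
        self.parts = parts or []   # lower-level patterns (the inputs)

# One pattern per letter...
letters = {c: Pattern(c) for c in "APLE"}

# ...and word patterns that link to the same letter patterns.
apple = Pattern("APPLE", [letters[c] for c in "APPLE"])
ape   = Pattern("APE",   [letters[c] for c in "APE"])

# Both words reference the *same* "A" pattern object; its description
# is stored once, not repeated inside each word.
print(apple.parts[0] is ape.parts[0])   # prints: True
```

In the neocortex the “links” are actual neural connections: an axon from an “A” recognizer fans out to a dendrite of each word recognizer that uses it.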
In a different part of the cortex is a comparable hierarchy of pattern recognizers processing actual images of objects (as opposed to printed letters). If you are looking at an actual apple, low-level recognizers will detect curved edges and surface color patterns leading up to a pattern recognizer firing its axon and saying in effect “Hey guys, I just saw an actual apple.” Yet other pattern recognizers will detect combinations of frequencies of sound leading up to a pattern recognizer in the auditory cortex that might fire its axon indicating “I just heard the spoken word ‘apple.’”
The neocortex uses a great deal of redundancy—we don’t just have a single pattern recognizer for “apple” in each of its forms (written, spoken, visual). There are likely to be hundreds of such recognizers firing, if not more. The redundancy not only increases the likelihood that you will successfully recognize each instance of an apple but also deals with the variations in real-world apples. For apple objects, there will be pattern recognizers that deal with the many varied forms of apples: different views, colors, shadings, shapes, and varieties.
Also keep in mind that this hierarchy is a hierarchy of concepts. These recognizers are not physically placed above each other; because of the thin construction of the neocortex, it is physically only one pattern recognizer thick. The conceptual hierarchy is created by the connections between the individual pattern recognizers.
Getting back to the flow of recognition from one level of pattern recognizers to the next, in the above example, we see the information flow up the conceptual hierarchy from basic letter features to letters to words. Recognitions will continue to flow up from there to phrases and then more complex language structures. If we go up several dozen more levels we get to higher level concepts like irony, envy, and humor.
A very important point to note here is that information flows down the conceptual hierarchy as well as up. If anything, this downward flow is even more significant. If, for example, we are reading from left to right and have already seen and recognized the letters “A,” “P,” “P,” and “L,” the “APPLE” recognizer will predict that it is likely to see an “E” in the next position. It will send a signal down to the “E” recognizers saying, in effect, “please be aware that there is a high likelihood that you will see your ‘E’ pattern very soon, so be on the lookout for it.” The “E” recognizers then adjust their thresholds such that they are more likely to recognize an “E.” So if an image appears next that is vaguely like an “E,” but is perhaps smudged such that it would not have been recognized as an “E” under “normal” circumstances, an “E” recognizer may nonetheless indicate that it has indeed seen an “E,” since it was expected.
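Here is a minimal sketch of that downward, predictive flow. The threshold numbers are invented for illustration; the point is only the mechanism: an expected pattern lowers the recognizer’s threshold, so weak evidence that would normally be rejected is accepted.

```python
# A recognizer normally demands strong evidence before firing. When a
# higher-level pattern ("APPLE") predicts this letter is coming, it
# lowers the threshold, so a smudged "E" is accepted after all.

NORMAL_THRESHOLD = 0.8     # illustrative values, not brain measurements
EXPECTED_THRESHOLD = 0.5

class Recognizer:
    def __init__(self):
        self.threshold = NORMAL_THRESHOLD

    def expect(self):
        """A higher-level pattern signals: 'be on the lookout for your pattern.'"""
        self.threshold = EXPECTED_THRESHOLD

    def sees(self, signal_strength):
        return signal_strength >= self.threshold

e = Recognizer()
smudged_e = 0.6                  # a blurry "E": weak evidence
print(e.sees(smudged_e))         # prints: False (rejected under normal conditions)

e.expect()                       # "APPLE" sends its prediction down
print(e.sees(smudged_e))         # prints: True (accepted because expected)
```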
The neocortex is, therefore, predicting what it expects to encounter. Envisaging the future is one of the primary reasons we have a neocortex. At the highest conceptual level, we are continually making predictions—who is going to walk through the door next, what someone is likely to say next, what we expect to see when we turn the corner, the likely results of our own actions, and so on. These predictions are constantly occurring at every level of the neocortex hierarchy. We often misrecognize people and things and words because our threshold for confirming an expected pattern is too low.
With this organization of the neocortex, we can gain insight into how human creativity works. A key aspect of creativity is the process of finding great metaphors—symbols that represent something else. The neocortex is a great metaphor machine, which accounts for why we are a uniquely creative species. Every one of the approximately 300 million pattern recognizers in our neocortex is recognizing and defining a pattern and giving it a name, which in the case of the neocortical pattern recognition modules is simply the axon emerging from that pattern recognizer which will fire when that pattern is found. That symbol in turn then becomes part of another pattern. Each one of these patterns is essentially a metaphor. The recognizers can fire up to 100 times a second, so we have the potential of recognizing up to 30 billion metaphors a second. Of course, not every module is firing in every cycle—but it is fair to say that we are indeed recognizing millions of metaphors a second. At the highest levels of the neocortical hierarchy, these metaphors can be the key to creativity.
Ultimately, we will create an artificial neocortex that has the full range and flexibility of its human counterpart. Consider the benefits: Electronic circuits are millions of times faster than our biological circuits. At first, we will have to devote all of this speed increase into compensating for the relative lack of parallelism in our computers, but ultimately the digital neocortex will be much faster than the biological variety and will only continue to increase in speed.
When we augment our own neocortex with a synthetic version, we won’t have to worry about how much additional neocortex can physically fit into our bodies and brains, as most of it will be in the Cloud (computation accessible through wireless communication), like most of the computing we use today. I estimated above that we have on the order of 300 million pattern recognizers in our biological neocortex. That’s as much as could be squeezed into our skulls even with the evolutionary innovation of a large forehead (which we evolved about two million years ago with the emergence of the genus Homo) and with the neocortex taking about 80 percent of the available space in our skulls. As soon as we start thinking in the Cloud, there will be no natural limits—we will be able to use billions or trillions of pattern recognizers, basically whatever we need, and whatever the law of accelerating returns (my thesis that information technology progresses at an exponential rate) can provide at each point in time.
In order for a digital neocortex to learn a new skill, it will still require many iterations of education, just as a biological neocortex does, but once a single digital neocortex somewhere and at some time learns something it can share that knowledge with every other digital neocortex without delay. We can each have our own private neocortex extenders in the Cloud just as we have our own private stores today of personal data.
Last but not least, we will be able to back up the digital portion of our intelligence. As we have seen, it is not just a metaphor to state that there is information contained in our neocortex, and it is frightening to contemplate that none of this information is backed up today. There is, of course, one way in which we do back up some of the information in our brains—by writing it down. The ability to transfer at least some of our thinking to a medium that can outlast our biological bodies was a huge step forward, but a great deal of data in our brains continues to remain vulnerable.
Three hundred million pattern processors may sound like a large number, and indeed it was sufficient to enable Homo sapiens to develop verbal and written language, all of our tools, and our other diverse creations. These inventions have built upon themselves, giving rise to the exponential growth of the information content of technologies described in my law of accelerating returns. No other species has achieved this. A few other species, such as chimpanzees, do appear to have a rudimentary ability to understand and form language and to use primitive tools. They do, after all, have a neocortex, but their abilities are limited by its smaller size, particularly that of the frontal cortex. The size of our own neocortex has exceeded a threshold that has enabled our species to build ever more powerful tools, including tools that now allow us to understand our own intelligence. Ultimately our brains, combined with the technologies they have fostered, will permit us to create a synthetic neocortex that will contain well beyond a mere 300 million pattern processors for each of us. Why not a billion? Or a trillion?
In the alternative reality of Danielle: Chronicles of a Superheroine, Danielle uses artificial intelligence for many of her endeavors, including her Libyan campaign to overthrow Qaddafi when she is ten, providing security (combined with terahertz scanning; see entry for terahertz scanning) for Israel as part of her Middle East settlement, and her worldwide celebrations.
A flash mob is a group of people who seemingly spontaneously appear in a public setting to perform an organized activity, such as a music or dance performance, or to demonstrate for a cause. Flash mobs are usually organized via social media.
The concept of the flash mob originated in 2003 in New York City with Bill Wasik, senior editor of Harper’s Magazine, as a playful social experiment on the issue of conformity. On June 17, 2003, 130 people suddenly assembled in the rug department of Macy’s department store, saying that they all lived together, were shopping for a “love rug,” and always made these types of decisions as a group.
Examples of flash mobs since the concept’s origin in 2003 have included celebrations of Star Wars with lightsabers, ballet performances, public singing, mime performances, and a demonstration by and for cancer survivors in a train station in New Haven.
According to YouTube statistics, interest in flash mobs peaked in 2010.
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle and nineteen-year-old Claire organize a synchronized flash mob celebration across the world, of two billion people, involving music and dance to celebrate Danielle’s peace agreement in the Middle East. One thing she does during this event causes concern to the premier of China.
Tiananmen Square is the largest city square in China and one of the five largest in the world. It is named after Tiananmen Gate (the Gate of Heavenly Peace) on its northern edge, which leads toward the Forbidden City (the palace of Chinese emperors from 1420 to 1912) to the square’s north.
Mao Tse-tung announced the founding of the People’s Republic of China in Tiananmen Square on October 1, 1949. His embalmed body lies in a mausoleum in the square.
Tiananmen Gate was built in 1417, during the Ming Dynasty. The square was laid out in 1651 and was enlarged during the 1950s to its present size of 109 acres.
Tiananmen Square is perhaps best known for the pro-democracy protests that took place there in 1989. Hu Yaobang, an ally of Deng Xiaoping (the paramount leader of China from 1978 to 1992), was the General Secretary of the Communist Party of China from 1980 to 1987. Hu pursued economic and political reforms under Deng’s leadership but was ousted in 1987 by party conservatives after nationwide student protests erupted in late 1986. The day after Hu died in 1989, a small student demonstration gathered to celebrate and support the reforms he had pursued years earlier. A week later these demonstrations had grown to 100,000 students marching on Tiananmen Square. The immediate government response was conciliatory, but the students staged a hunger strike and the demonstrations spread to 400 cities. At that point, on May 20, 1989, the government declared martial law and mobilized 300,000 troops, sending them to Beijing.
These troops violently suppressed the demonstrations, using assault rifles and tanks on unarmed protestors and causing hundreds to thousands of deaths. The precise number of casualties is unknown because Chinese authorities suppressed information about the event. The government also censored information about Hu’s life, though it lifted those restrictions in 2005.
When people refer to “Tiananmen Square,” they are often referring to these 1989 Tiananmen Square protests and to the government’s violent reaction. Information about this confrontation remains suppressed in China. In general, China continues to censor information it deems unfriendly to the government.
I traveled to China in January of 2015 to speak to thousands of entrepreneurs in Beijing at a conference called “Geek Park.” What I saw there was contradictory. On the one hand, there was a great explosion of entrepreneurial energy with thousands of startup companies with creative ideas and hundreds of Chinese venture capital firms eager to support them; it seemed like Silicon Valley on steroids. On the other hand, there was pervasive censorship with Google, Facebook, YouTube, Wikipedia, The New York Times, The Wall Street Journal, and many other sites being banned. There are hundreds of millions of blogs in China which are relatively freewheeling in their content so long as they do not deal with subjects that are prohibited. For example, if you live in China, you might not want to title your blog, “Remember Tiananmen Square.”
In the alternative reality of Danielle: Chronicles of a Superheroine, eleven-year-old Danielle organizes hundreds of thousands of people in Tiananmen Square in a flash mob and communicates something to them that disquiets Chinese leaders.
The “Zhongshan suit” was introduced by Sun Yat-sen around 1912. Sun was a Chinese revolutionary who established the Republic of China on January 1, 1912. He promoted the “Three Principles of the People,” which included Chinese nationalism with ethnic tolerance, Western-style democracy, and capitalism with free trade. His rule was tumultuous, with various warlords challenging his leadership. Leadership of the Kuomintang, the Nationalist Party that Sun had founded, passed after his death to Chiang Kai-shek, whose forces were ultimately defeated by Mao’s communists.
When Mao overthrew the Kuomintang and proclaimed the People’s Republic of China on October 1, 1949, he adopted the Zhongshan suit as his signature style of clothing. Its simple construction and four front pockets, which workers could use to hold tools, made its design a symbol of the proletariat (the working class). He wore it so often that it became known as the Mao suit or Mao jacket. Other Chinese leaders at the time followed Mao’s example, and it was also popular in the 1960s and 1970s among European and American intellectuals.
After Mao’s death in 1976, the style lost popularity, and today Chinese leaders generally wear Western-style suits, although some Chinese still wear the Mao jacket to identify with the communists of the Mao era.
In the alternative reality of Danielle: Chronicles of a Superheroine, one such Chinese leader who is wearing a Mao jacket discusses eleven-year-old Danielle’s communication with the Chinese people in Tiananmen Square with a comrade.
Charismatic leadership refers to a leader’s use of personal charisma to bolster his or her rule. Democratic examples include Franklin D. Roosevelt and John F. Kennedy. Totalitarian examples include Hitler and Muammar Qaddafi. The example discussed here is the totalitarian leader Mao Tse-tung.
In the alternative reality of Danielle: Chronicles of a Superheroine, after eleven-year-old Danielle organizes hundreds of thousands of people in Tiananmen Square, Chinese leaders compare this to the mass gatherings Mao organized there. They allude to the Chinese leadership having avoided charismatic leadership since Mao. Left unsaid are the reasons: Mao’s 1958 “Great Leap Forward,” a campaign of forced collectivization of agriculture that is blamed for the deaths of between 18 and 45 million people, and Mao’s 1966 “Cultural Revolution,” which led to widespread persecution, millions of deaths, and the destruction of many of the artifacts of China’s five-thousand-year cultural history.