French Lessons in Londonistan

MUSLIMS HAVE been landing on the shores of Britain and France for decades. And, as these populations arrived and settled in the Republic, Paris pursued a policy it believed would eventually lead immigrants to full cultural integration into French society. Meanwhile, London, facing a similar influx of foreigners, attempted to create a full-fledged multicultural polity. The former emphasized that what was shared between the new arrivals and their native hosts was crucial, their differences secondary. The latter argued that the British needed to respect the uniqueness of their immigrant neighbors—whether national, religious or ethnic—and that such a stance was at the core of a harmonious political system. In color-blind France, built on a long tradition of a strong, centralized state and the successful assimilation of southern and eastern Europeans—who have been migrating to the country since the nineteenth century—religious identity was not to interfere in public life. Under the French tricolor, state and nation were fused into the cradle of the one and indivisible Republic. In race-aware Britain, with Anglicanism as its established church, there was always room for different nationalities—English, Welsh, Scottish, Irish—under the Union Jack.

The French and British experience as colonizers—and the ways in which those under imperial rule came to see their occupiers—haunt the place of Muslim immigrants on both sides of the Channel. The Muslims of the British Raj lived as a minority among Hindus and struggled to maintain a separate identity through religious movements like the Deobandis (founded in India in 1867 and ancestors of the present-day Taliban). The political economy of the Raj was based on communalism, with Hindus, Sikhs and Muslims (and Sunni and Shia) pitted against one another. London fanned the embers of religious discord to keep military expenses low and the number of redcoats at a minimum. Divide and conquer.

At the end of the day, the British approach led to the bloody partition of the Raj between India and Pakistan: Karachi was homogeneously Muslim (though sectarian strife would soon rise between Sunni and Shia, and civil war would pit liberals against extremists), while New Delhi became multicultural with a caste flavor.

The French colonies were something altogether different. Unlike the Deobandis of India, North and West Africa possessed no similar religious movements that struggled to maintain a separate Islamic identity in the face of a hostile non-Muslim majority. The French policed, at a high cost, every village of Algeria and Senegal, just as the gendarmes did in Provence and Corsica. Thus, France’s immigrants were ignorant of the kind of self-imposed apartheid that could be transported and implemented on French soil. The North and West Africans who migrated to France after World War II came from Muslim-majority countries and felt no need to enhance their religious peculiarities. Bachelors perceived themselves as temporary migrants. Families, most of them coming from Algeria, had no special claim to be religiously different. And after the end of the Algerian War in 1962, immigrants quietly and smoothly acquired the French citizenship to which they were entitled—to the fury of their leaders back home. For the musulmans who comprised a majority of the French colonial empire, the best possible future, according to the dominant French narrative, was to become French one day.

Such a grand récit was, of course, not implemented in colonial days—for the promise of citizenship was part and parcel of a workable imperial dominion. But in the end, as soon as the former colonized set foot on French soil in their new migrant-worker garb, they took Paris at its word, and France paid its colonial debt through a process of cultural and political integration that ran parallel to the process of turning earlier immigrants—Italians, Spaniards, Portuguese, Poles, et al.—into members of the Republic.

No such transformation was possible, however, for those British subjects moving from the peripheries of the empire to its island center. In Britain, one is born English, end of story. When Muslims started to migrate en masse from the former colonies, they became Commonwealth subjects with voting rights, and their “Islamness” turned out to be a kind of nationality of its own, albeit under the umbrella of what would later become British citizenship. Clearly, one could never hope to become English.

America—immigrant nation extraordinaire—is facing its first experience with homegrown Islamist extremism. How the United States conceives of and approaches the threat on its shores will do much to shape the future of its relationship with its Muslim population in all its complexity. Washington has much to learn from its European ancestors, who have struggled with, fallen victim to and at times overcome jihadists in their own lands. At its core, this is a question of culture—the approach to the “other.”

THE IMPERIAL experience serves as a backdrop to the markedly contrasting ways that London and Paris have approached the immigration dilemma. France has created an intermingled culture, forged daily between the native Gaul and the immigrant Arab and Berber. It revolves around two French obsessions: the bed and the dinner table. Your average young Muslim girl is interested in living with, and having children with, a French gouer, a North African colloquial term meaning “infidel”—i.e., non-Muslim. (Gouer is itself a corruption of the classical Arabic kuffar, used in immigrant slang to designate a French native. The natives are also known as fromage, or “cheese”—ironically the same synecdoche deployed in the “cheese-eating surrender monkeys” gibe popularized by American neocons.) These women would loathe the very idea of an arranged marriage to a fellah (peasant) cousin from the faraway bled (North Africa) with his unrefined manners and pedestrian French. By the same token, the most popular national dish of France—the country of gastronomy par excellence—regularly confirmed by opinion polls, is couscous, the semolina-based traditional dish of North Africa, now fully assimilated by French palates. And even beyond the confines of cuisine and marriage, what is Catholic France’s holy trinity of most popular heroes, in survey after survey? The soccer player Zinedine Zidane (of Algerian-Berber descent), tennis player Yannick Noah (of mixed Cameroonian-Alsatian descent) and filmmaker Dany Boon (of North African Muslim descent), who converted to Judaism at the time of his wedding to his Sephardic wife.

For the most part, this emphasis on integration—though not without its faults—has worked pretty well in France. Western Europe’s biggest “Muslim country” (the current numbers hover around 6 million people) has not seen a successful terrorist attack on its territory since 1996. All plots were uncovered; their perpetrators jailed or deported. An efficient intelligence service, well trained in Arabic and Muslim politics, played an important role, and special legal rules—such as the ability to hold terror suspects in extended pre-charge custody—allowed for great ad hoc efficiency.1 This successful counterterrorism policy could never have worked without the cultural acquiescence of the vast majority of French citizens and residents of Muslim descent. They cooperate because they would simply never trade their decades-long effort and investment in becoming full-fledged French citizens—even in the face of latent xenophobia and social discrimination—for the vagaries of Islamist radicalism, which would make all of them suspect and offer a political space for the extreme Right.

Much of this French success has to do with how the term “Muslim” is used in political parlance, where the preference is for expressions like “of Muslim descent” or “from Muslim culture.” This stems from the French notion of laïcité—loosely translated as “secularism”—which has been a backbone of French culture ever since its implementation under the Third Republic in the early twentieth century. To resist the overwhelming influence of a Vatican-aligned, reactionary Catholic church that interfered in both education and politics, the French government passed a law separating church and state in 1905, severing the historic link between Paris and Rome. The French conception of religion in the public sphere is thus quite different from the ascriptive understanding of religion found in Britain or America—a difference illustrated by the fact that the British national census asks respondents to define themselves in religious terms. Its French laïque counterpart, by contrast, defines religion only in sociological and cultural terms, and only insofar as the individuals concerned accept that identity—an identity to which they are, moreover, entitled to be indifferent, even hostile.

Thus, in France, a community that would encompass all “Muslims” a priori is politically impossible—and without one, there can be no political brokers or “community leaders” who monopolize the representation of “Muslims” (or at least pretend to do so). This was nowhere more evident than in the French government’s attempt to reconcile the differences between Islamic factions by creating the French Council of the Muslim Faith (CFCM) in 2003. The hope was to make peace between different Islamic groupings so as to facilitate the free exercise of the Muslim religion, organize pilgrimages to Mecca, ensure access to halal foodstuffs in the army, corporations and restaurants, and build mosques, so that practicing Muslims would have the same rights and advantages as believers in other faiths. At the same time, then–Interior Minister Nicolas Sarkozy, who professed an “open” understanding of laïcité that relied more on religious leaders as role models, wanted to use the CFCM as a go-between with practicing Muslims.2 But the differences between Islamic factions, whether over doctrinal tenets or because of the fierce competition between the foreign states that influence some of them (Algeria, Morocco, Saudi Arabia, Iran, etc.), never allowed the CFCM to find a role resembling that of other united religious mouthpieces, such as the Bishops’ Conference or the Representative Council of French Jewish Institutions (CRIF). Overall, the dominant narrative in France has always been to be French first and foremost. Religious identity continues to take a backseat to citizenship in the Republic.

IN THE UK, things happened quite differently. From the beginning of mass migration in the 1950s, British Muslims organized as such and started to establish mosques on British soil. The segregated experience of the Muslim community under the Raj was duplicated in Britain, except this time the majority population was not Hindu, but the white English working class with its beer-on-tap-and-bacon culture. Meanwhile, intra-Muslim sectarian and denominational strife led different groups to create their own enclaves. The Deobandis wanted to have their own places of worship, as did the Barelvis, a Sufi-oriented sect considered heretical by its rivals, with a special reverence for the Prophet Muhammad. The same went for the Ahl-i-Hadith, a puritanical group close to Saudi Wahhabism. When British authorities tried to provide brick-and-mortar mosques to replace makeshift prayer rooms, they faced upheaval. The Deobandis, for instance, refused to pray behind a Barelvi imam who sang the praises of the Prophet in terms the Deobandis saw as close to idolatry, and things degenerated into fistfights as the disparate sects tried to control the pulpit in the so-called cathedral mosques. Though this might seem to echo Sarkozy’s futile attempts to mediate between the different Muslim groups in France, there is one key difference: for British Muslims, religious identity has always come before all others, whatever the infighting between different sects. In France, it was the wide array of available identities—Islamic, Algerian, working class, unionized, leftist, laïque and what have you—that made the concept of Muslim categorization secondary at best.

This secluded British-Muslim religious identity led to a far more introverted social life than was the case for North Africans in France. Though curry may have replaced fish and chips in British stomachs, the practice of seeking a consort in the extended family (biradari in Urdu)—which led fathers to travel yearly to Mirpur or Punjab so as to bring back to Manchester or Bradford suitable, non-Anglophone husbands for their British-born and -educated daughters—perpetuated a cultural isolation.

It is this insular Muslim practice that provoked Salman Rushdie’s The Satanic Verses, a frontal attack on that immigrant seclusion. The book aimed to undermine it with a vitriolic criticism of the religious tenets of Islam. In particular, it mocked the Prophet and his many wives, describing his abode as a brothel. Though the names were changed, and the novel was a work of fiction, Rushdie wanted to rock the foundations of British-Muslim life, and force his coreligionists to reconsider their self-segregation and begin to integrate into British society. But his ambitious project backfired. Far from serving as a liberating cri de coeur, The Satanic Verses only reinforced the grasp of radical mullahs on their communities. The parochial old-timers who knew little English and had no real interaction with British authorities (except when they traded their vote banks for community control) proved incapable of taking up Rushdie’s challenge. And they were gradually replaced by better-groomed, younger preachers, some of whom had links to radicalized international Islamist organizations.

The book burning of The Satanic Verses by the Bradford Council for Mosques in front of the city hall of that derelict Yorkshire Victorian city in 1989 was originally intended to express to the larger public the pain and suffering of Muslims who felt insulted by Rushdie’s blasphemy of the Prophet Muhammad. But it produced quite the opposite effect on TV viewers: the book burners were seen as fanatics performing an auto-da-fé reminiscent of the Spanish Inquisition or Nazi Germany, and they got no sympathy from the press. British perceptions of Islam’s fanatical response were cemented a month later on February 14, when Ayatollah Khomeini sent a valentine to Britain in the form of a fatwa condemning Rushdie, his publishers and his translators to death. The leader of the Islamic Republic was attempting to regain his status as the champion of oppressed Muslim masses worldwide—a status that had been seriously challenged by the victory of U.S.-backed Sunni jihadists in Afghanistan, who would compel the Red Army to pull out of the country on the following day—February 15. On the British political stage, the infamous fatwa meant that all of a sudden, the UK (and the rest of Europe and the world by the same token) had become part of a virtual Dar al-Islam (abode of Islam) where the rules of sharia—or Muslim God–inspired law—would apply, punishing blasphemy (or, for that matter, “insult to the Prophet”) with death.

The Rushdie affair was in a way quintessentially British. It happened in the context of a political scene divided along communalist lines, and it triggered reactions from community leaders and ordinary believers who felt threatened in their imposed and self-imposed seclusion, a situation that made them unable to distance themselves from the defensive attitudes of their peers.

On the other side of the Channel, where men and women of Muslim descent were not organized in this way, and where imams retained far less influence than their opposite numbers in Great Britain, the Rushdie affair mobilized no Islamic outbursts—save for a tiny group of radicals led by two recent converts to Islam, grandchildren of Maurice Thorez, the deceased strongman of the French Communist Party, who took to the streets before journalists who widely outnumbered them.

NEVERTHELESS, 1989 was also a watershed year for Islam in France, and it pinpointed the difficulties of the traditional republican and cultural-integration model. While the French were supposed to be celebrating the two hundredth anniversary of the fall of the Bastille and the triumph of the Enlightenment, and the rest of the world was focused on the end of the Communist era, the French press was obsessed with an entirely different affair. Three teenage female pupils of Muslim descent had entered their classes at a middle school in a northern Paris banlieue wearing hijabs, the Salafist “Muslimwear” that was steadily imposed on Muslims worldwide—through the expansion of Wahhabism and the Muslim Brothers subculture—as the expression of Islam in the public sphere.3 This piece of cloth placed hijab-wearing French public-school students in a cultural cluster and separated them from their classmates on the basis of a proclaimed religious identity.

It seemed the decades-long French philosophy of laïcité had come back to haunt the country. Its detractors saw this policy as insensitive to cultural differences. And this view was not confined to Muslims in France. Americans and Brits alike mocked the country as closed to the other, draped in the rags of its past glory, an obsolete holdout in a globalized world. And the goal of cultural integration was lambasted as “assimilation”—a term with particularly bad connotations, nowhere more so than in some Jewish circles, where it is tantamount to “cultural genocide.” The stakes were high, the debate highly political. Both the French branch of the Tablighi Jamaat—an Indian Islamist movement preaching cultural seclusion from the non-Muslim environment—and the local Muslim Brothers supported the girls, and the latter used the affair to claim that they were the chosen representatives of a “Muslim community” in the making on French soil. They changed their name from the Union of Islamic Organizations in France (UOIF) to the Union of Islamic Organizations of France, to signal the new stage the group professed to have reached in asserting Islamic identity in France. As one of its leaders explained to me, they no longer considered France a land of temporary residence for “Muslims”; many now called it home. Hence, it was no longer part of Dar al-Solh (or “abode of contract”), a foreign territory where Muslims could stay temporarily and where sharia was irrelevant. It had become part of Dar al-Islam, where sharia applied for Muslims who so wished.

If sharia was not implemented by the state, it was nevertheless the right of every French “Muslim” to apply its laws personally. The three hijab-wearing pupils were the first manifestation of the UOIF’s new policy—its bid to become the community leader of the “Muslims of France” and the champion of an exemplary cause.

As much as the Rushdie affair exposed the contradictions of Britain’s relationship with its Muslim citizens, the hijab affair was typically French. It could never have taken place in the UK, where it had long been common practice for schools to welcome the hijab, to excuse Muslim female pupils from sports and swimming classes with their male counterparts, and so on.

The question then for Paris was whether “liberty” should come first, or whether education was to provide a space free from political, religious and similar statements—based on the other tenet of the Republic: “equality.” When the UOIF and their fellow travelers from the multicultural Left—along with the allies they made on that occasion among the Catholic clergy, Protestant pastors and some conservative rabbis—made their claim in the French public sphere, they used the political language of freedom. They cast themselves as the opposition to the authoritarianism of the Jacobin, laïque-fundamentalist, assimilationist state. Some Islamist militants even took to the streets wearing a yellow star alongside their hijab or beard, implying they were persecuted as the Jews had been by the Nazis (a line that proved difficult to sustain and that sowed confusion in the minds of some otherwise anti-Semitic and anti-Israeli radicals). Yet when Muslim youths were instructed to wear the hijab by the Tablighis, Salafis or Muslim Brothers, it was not a matter of freedom but of religious obligation. Notwithstanding such internal contradictions (of which the French press and public debate were largely unaware), the hijab affair poisoned the educational environment. Endless litigation and demonstrations followed, benefiting radicals who portrayed themselves as victims of state repression. Yet in spite of all this apparent distaste for laïcité, in the end there was very little support for the hijab cause, and certainly no mobilization of the improbable “Muslim community” that the UOIF and its ilk wanted to bring to life. The starkest evidence of how little sway these radicals held over the so-called Muslim community is that the Algerian civil war—which subsequently spilled over into France—was fully aflame during this time, and still French Muslims largely ignored the call to jihad.

SUCH WAS the backdrop for 9/11 on each side of the Channel. In France, the trauma of the Algerian civil war—with the casualties caused by Algerian-linked terrorism on French soil, the terrible death toll in Algeria itself, and the political and military defeat of Islamist insurgents in 1997—had three main consequences. First, there was little love lost on the part of French citizens or residents of Muslim descent for the kind of radicalism and terrorist attacks they had both experienced and suffered. In France, 9/11 was viewed as Act II of the same play. Second, the repression of the Islamist rebels in Algeria had destroyed networks and movements that might otherwise have spilled over into France. And third, French security and intelligence forces were trained in vivo to trace and eliminate Islamist terrorist networks. They had a sound, direct and on-the-spot knowledge of such groups and of their international connections, and state policy would not allow foreign radical Islamists to obtain political asylum in France.

In the UK, on the other hand, where Muslim communities were organized and represented by leaders and brokers who had sizable followings, the state had minimal direct interaction with such populations, mirroring the days of the Raj when communalism was a mode of government. As opposed to the French, who had banned foreign Islamist leaders from entering their country, British authorities granted asylum to a vast array of them—including the Egyptian Abu Hamza al-Masri (aka, “Hook”), Abu Qatada al-Filistini from Palestine, Syria’s Abu Musab al-Suri and many others—who acted as important contributors to the production and dissemination of Salafist-jihadist literature, and audio, video and Internet propaganda. All were veteran jihadist fighters from Afghanistan in the 1980s who had supported jihad in Egypt, Algeria, Bosnia and Chechnya in the 1990s. They created an underworld of sorts, labeled “Londonistan” by the Arab press.

Their presence in Britain was rationalized; politicians argued that the former jihadists would abstain from radicalizing local British-Muslim youth. The asylum seekers were Arabs, the British Muslims from the subcontinent, so it looked as if there would be a major cultural gap between them in any case. Moreover, in keeping with long-standing British tradition, cultural identification of Muslim communities with their new homeland was by no means a priority in the multicultural-tinged “cool Britannia” of the Blair years. More than ever, Muslim immigrants retained ties to their countries of origin—something that would prove disastrous as Pakistan experienced a steady Talibanization from the mid-2000s onward, and Britons of Pakistani descent visited the country every year to revive family networks, shop for consorts for their children and partake in the political strife of Pakistan. Worse, an activist minority spent time in radical madrassas of the Deobandi sect, and in the training camps of the Taliban and other jihadist guerillas.

BUT THIS is not to say that all was well in France. The hijab issue remained an irritant, and in the spring of 2003, then-President Jacques Chirac convened a committee of experts, the Stasi Commission (named for its president, French politician Bernard Stasi, and of which I was a member), to examine whether laïcité was threatened, and how to deal with the issue in a society much changed from the Third Republic that mandated separation of church and state almost a century before. The commission recommended that the wearing of ostentatious religious signs (whether it be hijab, cross or yarmulke) be forbidden in schools benefiting from state funds (public or private). The ban was limited to students who were minors. Once in college or university, they were deemed mature enough to dress as they liked.

The hijab prohibition was met with incomprehension. Paris passed the law in the spring of 2004 to take effect in September, a decision that produced an outcry in Islamist and multiculturalist circles worldwide. In France, the UOIF organized demonstrations that were widely covered and hyped on Al Jazeera—where a Muslim Brother was at that point editor in chief. In late August, the “Islamic Army in Iraq” took two French journalists hostage and threatened to kill them unless the “anti-hijab law” was rescinded. Much to the surprise of those who believed the Al Jazeera coverage, the wide majority of French citizens of Muslim descent supported the hijab ban. Many took to the streets and went on the air to express their total rejection of a terrorist group that had hijacked their voice. And the UOIF was compelled to backpedal, its spokeswoman offering on TV to take the place of the hostages so that her hijab would not be tainted by innocent blood. That was the end of the hijab turmoil. Today the hijab is no longer worn in schools, and the UOIF has dropped its efforts to overturn the law (in any case, its campaign had lost steam since 2004).

So France’s policy of laïcité seemed to be vindicated. But a year after the 2004 hijab dispute, the banlieues outside Paris exploded in violence. It was as if everything the French had to say for the success of their cultural-integration model fell short. Upward social mobility was nowhere to be found for many of the migrant youth living in the banlieues—the only contemporary French word that has since made its way into international idiom and needs no translation!4 When young people of migrant descent (some, but not all, Muslims) started burning cars in these infamous neighborhoods in the autumn of 2005, it provided Fox News with vivid coverage (“Paris Is Burning”) filled with “Muslim riots” and “Baghdad-on-the-Seine” nonsense. Meanwhile, pro-war-on-terror pundits ridiculed then-President Chirac and then–Prime Minister Dominique de Villepin for their opposition to the Iraq War, using a chickens-come-home-to-roost logic. Yet all academic studies in the aftermath of the riots amply demonstrated that they had little if anything to do with Islam per se; instead, they were due to a lack of social integration and economic opportunities. The rioters wanted to draw the public’s attention to these issues—a far cry from any urge to establish a radical “Islamistan” in the banlieues. The riots, then, were an appeal for further social integration, something that the same controversy-ridden Stasi Commission had understood well and proposed to address via new urban planning to destroy the ghettos and the institution of Yom Kippur and Eid al-Kabir as school holidays—these and other attempts to respect diversity were summarily ignored.5 Media interest soon moved on to the next story, and there was little public awareness of these findings.

IN BRITAIN, where Tony Blair had planned to invade Iraq since 2002 alongside George W. Bush, the prime minister felt confident that government support of domestic Islamist communalism would grant him immunity from British-Muslim criticism of the “invasion of a Muslim land by infidel armies,” and would not lead to retaliation in the form of jihadi-inspired terrorist action. Alas, this was not to be. Pakistani radical networks lambasted British (and American) policy. So too did al-Qaeda and the Taliban. Scores of British-Muslim activists who had spent time in the Taliban’s schools and camps rallied to the extremist cause. Deputies of radical Islamist groups in the UK stopped all collaboration with British authorities, and as Her Majesty’s security services’ grassroots knowledge of Islamist whereabouts had relied to a large extent on community leaders, there were suddenly a number of blind spots in the general surveillance of radical groups and individuals, particularly in provincial areas removed from London. Agencies discovered belatedly that the Arab luminaries of Londonistan had learned English and were bonding with the subcontinental English-speaking youth from Bradford to East London. This dangerous environment provided the background for the July 7, 2005, attacks. The suicide bombings in London were perpetrated by English-educated British Muslims from Yorkshire. Their prerecorded testament, broadcast by al-Qaeda and introduced by no less than Ayman al-Zawahiri, starred the chief of the group, Mohammad Sidique Khan, declaring in heavily accented working-class Yorkshire English that he was a fighter in the war against the infidels who had invaded Iraq and Palestine. By the end of July 2005, another suicide attack was narrowly avoided. In the summer of 2006, a major plot to bomb transatlantic flights between London and New York with liquid explosives was foiled at the eleventh hour. And in 2007, another plot half-succeeded when a car laden with explosives (which failed to detonate) barreled into the entryway of Glasgow Airport.

Since 2007 and Tony Blair’s departure, there has been a major review of British policy. The government of Gordon Brown has painstakingly tried to fashion a concept of “Britishness” as part of a “deradicalization” policy aimed at better integrating Muslim youths into the wider British community. The shift away from multiculturalism, coupled with the government-commissioned report Preventing Extremism Together, brings policies on both sides of the Channel closer than they have ever been. The issue of social-cum-cultural integration remains a crucible for populations of Muslim descent as they seek to identify politically with their Western country of residence, adoption and, increasingly, birth.

AS THE United States now faces homegrown terrorism, in the form of Nidal Hasan’s Fort Hood massacre and the “underwear bomber” Umar Farouk Abdulmutallab’s near detonation of a plane bound for Detroit, it is certainly worthwhile to analyze Europe’s relationship with its Muslim residents in a less patronizing way than was the case both in the warmongering parlance of the neocons and in President Obama’s naive Cairo speech last year. While the present administration has just granted a long-denied entry visa to the Islamist intellectual Tariq Ramadan, and so seems to be following the Tony Blair model (which counted on Ramadan to pacify the Muslim ranks in Britain after 7/7—that is, until the prime minister and the preacher had a falling out), it might indeed be wise to evaluate the European experience in all its dimensions. The “special relationship” may not be all that is on offer. Old Europe has, after all, been the neighbor of the Muslim world, has colonized some of it and has now integrated part of that world into its very identity. While some predict that, in a few decades, Europe will be but the northern part of the Maghreb, one may equally surmise that North Africa and the Middle East will be far more Europeanized.

Gilles Kepel is a professor and chair of Middle East and Mediterranean Studies at Sciences Po, Paris, and the Philippe Roman Professor in History and International Relations at the London School of Economics.

1 The French legal term Association de malfaiteurs en vue d’une entreprise terroriste (criminal association with a terrorist aim) allows the judiciary to keep terrorism suspects in custody for seventy-two hours before they are charged or freed (as opposed to twenty-four hours in other cases). This increases the chances that suspects will be destabilized enough to give away their networks, and gives the police enough time to take action. Such emergency measures are taken under the control of a judge specially empowered to handle terrorism cases. Judge Jean-Louis Bruguière, one of the most successful French antiterrorism judges of the 1990s and early 2000s, told me that this legal measure was the key to French success, and that it also made any Guantánamo-type decisions unnecessary.

2 On his visit to Saudi Arabia in January 2008, President Sarkozy addressed the Saudi Majlis al-Shura (nonelected Parliament), praising religious figures—including imams—for their role in society, one that he considered unmatched by secular educators and the like. Though it is true that the French state-school system is undergoing a crisis with regard to its once-central role in the cultural and social integration of youth from all walks of life and inherited cultures, advocating its replacement by religious figures met with an uproar in many French circles.

3 Wahhabism is a puritanical understanding of Islam that follows the teachings of Muhammad Ibn Abd al Wahhab, a late-eighteenth-century preacher. Wahhabis aligned their sect with the Saud family, allowing for the creation of the Saudi Arabian state. The movement was marginal in the wider Muslim world until oil wealth fueled its export as a means to fight socialism in the postwar Arab and Muslim countries. “Wahhabis” prefer to call themselves “Salafis” (“following the ancestors,” i.e., strictly observant of pristine Islam). They abhor any kind of worship of a human being. But not all Salafis are Wahhabis. The Society of the Muslim Brothers was founded in Egypt in 1928, with the political aim of establishing a Muslim state abiding by sharia law. In spite of their diverse interpretations of Islam, Wahhabis, Salafis and Muslim Brothers share the same subculture, one that makes the tenets of Islam permeate every dimension of daily social and cultural life.

4 In 1987, when I published a study on Islam in France entitled Les banlieues de l’islam (Paris: Editions du Seuil), I had to translate the title as “The outskirts of Islam” to make it understandable to the English-speaking public, and to explain that such outskirts were not suburbia but rather “inner cities” (UK) or “ghettos” (United States). All that confusion stopped when Anglophone media pundits started using banlieues as a catchword for lambasting French policies of integration, in particular during and after the so-called “Muslim riots” of the fall of 2005. See chapters 4 and 5 of my Beyond Terror and Martyrdom (Cambridge, MA: Harvard University Press, 2008) for more information.

5 Though the French government foolishly rejected the commission’s proposals at the time, it subsequently espoused a number of the Stasi Commission’s additional policy suggestions. By then it was too late to affect the situation.


May 29, 2010 | Posted in: Muslim world, War

25 Best Cities for College Grads

The Class of 2010 is heading into the real world, but where should they live? Urban guru Richard Florida and his team find the best cities for the young and ambitious.

Congratulations, Class of 2010—and welcome to one of the worst job markets of all time. You’ve likely seen the litany of stories echoing the gloom and doom meme. Harvard grads grateful for the chance to wait on tables. MIT computer whizzes pioneering new ways to flip burgers. And then there are the terrifying statistics. Unemployment for people between the ages of 15 and 24 has passed 20 percent. You won’t just be competing with your peers—all 1.6 million of them—but with people your parents’ ages too, who lost their savings in the crash and have had to postpone their retirements for pretty much forever. “With the obvious exception of youngsters born during the Great Depression, no generation in American history faces more daunting obstacles,” writes a dour Joe Queenan in The Wall Street Journal. “Even the pasty-faced Pilgrim toddlers gamboling around Plymouth Rock in 1620 had better prospects.”

[Image gallery: the Top 25 Cities for College Graduates]

Let’s not go overboard. That 20-percent-plus unemployment rate includes high school dropouts and people who didn’t finish college. The unemployment rate for college graduates is actually less than 5 percent. And the unemployment rate in the professional and technical fields where you’re most likely to work—science and engineering, business and management, education and health care—is just under 4 percent. Make no mistake about it, times are tough—but it’s blue-collar workers and blue-collar communities that have borne the full brunt of the crisis.

Most recent college grads will find jobs, even if they have to look a little longer than previous classes did. And that’s not such a bad thing. With all those high-paying corporate entry-level jobs for the taking during the boom years, too many young people went for the bucks and landed in careers that were unsatisfying and unfulfilling.

Now more than ever, it’s important to put serious thought into where you want to live. The place you choose is key to your economic future. Jobs no longer last forever; in fact, the average twentysomething switches jobs every year. Places can provide the vibrant, thick labor market that can get you that next job, and the one after that, and be your hedge against layoffs during this economic downturn. Early career moves are the most important of all, according to Don Peck in the National Journal. He cites a prominent study that finds that “about two-thirds of all lifetime income growth occurs in the first 10 years of a career, when people can switch jobs easily, bidding up their earnings.” Sure, you can move from place to place—and it’s true that twentysomethings are three to four times more likely to move than fiftysomethings—but it’s a lot easier to manage a forward-looking career if you start out in the right place, one with abundant opportunity.

So what do twentysomethings want in a community? To get at this, my team and I analyzed the results of a Gallup survey of some 28,000 Americans in their 20s. Some key things stand out. Jobs are clearly important—but just as clearly, they’re not all-important. When asked what would keep them in their current location, twentysomethings ranked the availability of jobs second.

Twentysomethings understand well that they face not only fewer job options but dwindling corporate commitment—it’s not only harder to find a job, it’s also easier to lose one. So it makes good sense to pick a city where the labor market is thick with job opportunities as a hedge against economic insecurity. What twentysomethings value most, though, is the ability to meet people and make friends. This too makes very good sense: personal networks are about much more than having fun; they’re among the best ways to find a job and move forward in a career.

Twentysomethings also rank the availability of outstanding colleges and universities highly. Many want to go back to school to pursue a graduate or professional degree, and having those options available where you live is a big plus. Of course, young people value amenities, too—from parks and open space to nightlife and culture. It’s less about all-night partying, though: twentysomethings prefer places where they can easily go for a run or bike ride, work out or walk their dog, grab a coffee, take in a concert, see interesting new art, or share a good meal with friends.

With all this in mind, we compiled our rankings of the Best Places for Recent College Grads. These rankings are based on an index of nine statistical indicators for the more than 350 metropolitan areas (that is, core cities and their surrounding suburbs) across the United States. The core measures in the rankings include:

• Presence of twentysomethings (ages 20-24) in the population

• Singles—measured as the share of unmarried people

• Earnings potential—measured as average salary

• Unemployment rate

• College educated workforce—the share of the workforce with a bachelor’s degree or higher.

• Rental housing—having an abundant, available stock of rental housing is key. We measured this as the share of all housing made up of rental units.

• Youth-oriented amenities—like bars, restaurants, cafes, sports facilities and entertainment venues.

• Creative capital: We use this to capture the creative energy of a place. It’s measured as the share of employed artists, musicians, actors, dancers, writers, designers, and entertainers in the workforce.

• Openness: A region’s openness to new and different kinds of people reflects a lack of barriers and a willingness to let newcomers, including young people, have a go. Our measure is the share of gays and lesbians and foreign-born residents in a community.

Affordability: The overall rankings do not take housing costs into account. Generally speaking, new college grads are renters and can easily share apartments to reduce costs. It’s also difficult to get a handle on the full living costs borne by young people—some communities have accessible mass transit; in others, new grads must buy a car (and pay for insurance, maintenance, gas, and parking). So, we decided to break out an additional index to account for affordability. This index includes a variable for rent levels—median contract rent. It weights affordability at 25 percent of the overall index value, and lets the other nine indicators account for the remaining 75 percent. We mark cities that rank in the top 25 on this combined affordability index with an asterisk (*).

The data is the most current available, for 2008, 2009, or 2010 depending on the variables. All nine variables are equally weighted. The technical analysis was conducted by a Martin Prosperity Institute team of Charlotta Mellander, Kevin Stolarick, Patrick Adler, and Ian Swain.
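The weighting arithmetic described above (nine equally weighted indicators for the overall index, plus a 25/75 blend for the affordability variant) can be sketched in a few lines of Python. This is a hypothetical reconstruction, not the Martin Prosperity Institute's actual method: it assumes each indicator is first normalized to z-scores (the article does not say how indicators are normalized) and that lower rent counts as higher affordability.

```python
# Hypothetical sketch of the composite-index arithmetic described above.
# Assumption: indicators are normalized to z-scores before averaging; in
# practice some (e.g., unemployment rate) would also be sign-flipped so
# that a higher score always means "better."

def zscores(values):
    """Convert a list of raw values to z-scores across all metros."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    std = var ** 0.5 or 1.0  # guard against a zero-variance indicator
    return [(v - mean) / std for v in values]

def overall_index(indicator_columns):
    """Equal-weight average of normalized indicators: one score per metro."""
    normalized = [zscores(col) for col in indicator_columns]
    n_metros = len(indicator_columns[0])
    return [sum(col[i] for col in normalized) / len(normalized)
            for i in range(n_metros)]

def affordability_index(indicator_columns, rent_column, rent_weight=0.25):
    """Blend: 25% affordability (lower rent scores higher), 75% overall index."""
    base = overall_index(indicator_columns)
    affordability = [-z for z in zscores(rent_column)]  # cheap rent = high score
    return [rent_weight * a + (1 - rent_weight) * b
            for a, b in zip(affordability, base)]
```

Under this sketch, reordering near the top with small weighting changes (as the authors report) follows naturally: metros with similar blended scores swap ranks easily, while the membership of the top group stays stable.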

College towns dominate the top spots. Ithaca is first, followed by Madison, Wisconsin; Ann Arbor, Michigan; Durham, North Carolina; Austin, Texas; and Boulder, Colorado. That may seem a bit surprising to the legions of new grads who are off to the big city. But Boulder and Austin are two of the country’s leading centers for innovation and high-tech business, with great sports and music scenes to boot. And college towns—from Iowa City, Iowa, to Charlottesville, Virginia; from Lawrence, Kansas, to Lincoln, Nebraska; from Columbia, Missouri, to State College, Pennsylvania—provide terrific “stay-over” locations for new grads who want to maintain their networks, try out their skills or develop new ones. They have high percentages of young, highly educated singles; they provide an affordable alternative to bigger cities while still delivering a high quality of life; and they’ve proven to be among the most resilient communities during the economic downturn.

The list also has its share of big cities. D.C. is the top big city on our list in seventh place; and it’s followed closely by New York City and Boston. San Francisco, San Diego, L.A., Seattle, and San Jose (the hub of Silicon Valley, still hands-down the best place for techies) all make the top 25.

But do remember: There’s no absolute best place for new grads—or anyone else, for that matter. Different strokes for different folks: For every twentysomething who wants to head to the big city, there are others who prefer someplace closer to home or a smaller, more affordable community.

It’s best to think of this list as a general guide to help you orient your choices. When we were building our index we found that small shifts in the datasets we used and how they were weighted would reorder the cities near the top, but the picks in the top 25 remained surprisingly consistent. Ithaca, for example, always made the top 25, but adding the last two variables to the index raised its rank from 14th to first. So college grads should think of this list as a way to orient their own personal list, rather than a winner-take-all competition. That’s the key thing, really—to pick the place that’s best for you—that fits your own career outlook, your current situation, and your life plans. My team at the Martin Prosperity Institute has developed a tool called Place Finder that asks for some of your preferences and generates a custom list of places that might be right for you.

That choice is more important now than ever. While the place you choose to start your career and your life is always important, it has taken on additional importance during the current economic downturn. This is no run-of-the-mill economic cycle recession but a full-blown economic transformation, the kind that comes around only once every generation or two. Great Resets like these give rise to the life-altering “gales of creative destruction” that the great economist Joseph Schumpeter wrote of—to new technologies, new industries, and whole new ways of living. While some cities may fall further and further behind, others—the most innovative, adaptive, open-minded places—may be on the brink of unprecedented prosperity. And you might just be a part of it. Choose wisely.

Richard Florida is Director of the University of Toronto’s Martin Prosperity Institute and author of The Great Reset, published this month by Harper Collins.

Kevin Stolarick developed the data; Charlotta Mellander conducted the statistical analysis. Patrick Adler and Ian Swain assisted with the analysis.


May 29, 2010 | Posted in: 25 Best Cities for College Grads, Off Topic

About New York Business Insurance

New York Business Insurance can be put into one word: service.
Even though there are business insurance companies galore, each one should be dedicated to giving the best service possible:
service designed with your company in mind, to help reduce losses and control your insurance costs.

You should also expect honest, fair and prompt service regarding coverage, losses and claims.
Any business insurance company worth its salt will want to establish a good relationship with its clients.
The motto should be that they do not just work for an insurance company; they work for you. You, the client, are what should matter.
There are many different types of business insurance, and as many different prices and policies as there are insurance companies. You will want to check and compare them; taking some time to look into what business insurance companies have to offer could make all the difference in the world.
Coverage runs from general liability to group health, from professional liability to workers’ compensation.
Your company’s specific needs are examined, and a policy is then tailor-made to fit your budget and your standards, whether or not you have employees, and whether or not you have business-use automobiles.
There is also Crime Insurance for burglary, robbery and theft. Fidelity Bond Insurance protects a business against losses due to fraudulent employee acts.
Depending on how large your company is, there is even Directors Liability Insurance, which covers the costs of legal fees and other court costs if directors are sued as individuals.
No matter how small or how large your company is, there is a business insurance policy that will take care of all your coverage needs.

May 17, 2010 | Posted in: Business