Showing posts with label The Conversation. Show all posts

26 May 2017

What The Manchester Attack Leaks Mean For The UK-US Intelligence-Sharing Relationship

Donald Trump and Theresa May - PA
By Colin Murray, Newcastle University

Just a few hours after the British home secretary, Amber Rudd, issued a stern warning to the US government and intelligence officials about leaking sensitive information, they were at it again.

US news outlets had already published the name of the suspect in the Manchester attack before the UK authorities were prepared to make it public. And now not only had more intelligence information been released about the suspect's family and their movements, but the New York Times had published photographs of bomb fragments and the tattered remains of a backpack.

But while this latest storm over the UK-US relationship and intelligence sharing in the wake of the Manchester attack is far from unique, these leaks are of a different order.

They indicate the febrile state of the administration in Washington. And when the White House gives the appearance of being cavalier with shared intelligence, it is unsurprising when nameless officials ape the commander-in-chief for their own advantage.

Perhaps inevitably, they resulted in the suspension of information sharing – even if this suspension was just for a matter of hours and limited to this single investigation. And in these times of shifting sands, the prime minister, Theresa May, went from defending the Trump administration’s approach to intelligence sharing to confronting the US president over leaks – all in the space of a single week.

But, for all the air of despondency the Manchester investigation leaks have generated, any damage in the UK-US security relationship is likely to be fleeting. Both countries have gained too much from the relationship. And more than anything, the leaks demonstrate just how quickly actionable intelligence flows between them.

A brief history of leaks

The challenge of effective international co-operation and intelligence sharing has long been a subject fraught with controversy. And it is testament to the durability of the relationship that it has weathered many such storms for the best part of a century.

This prize was not easily won, and reliable intelligence sharing has proven very difficult to emulate – even where terrorism is at issue. Well into the 20th century many states would zealously guard intelligence. And when information was shared it would be on an ad hoc basis, for geopolitical advantage.

For one group of countries, intelligence sharing during World War II changed this approach. The so-called “Five Eyes” countries – the US, UK, Canada, Australia and New Zealand – recognised the value of shared intelligence to the allied victory. And, in light of the looming threat of the Cold War, these countries opted to maintain their sharing of signals and communications intelligence under the UKUSA Agreement, a top secret arrangement set up in 1947 for post-war intelligence sharing.

But even between these close “Five Eyes” partners, this arrangement did not stop horse-trading of other sources of information – and tensions inevitably arose. To mitigate these problems the partnership developed the “control principle”. This meant that the country which produced the original intelligence could determine whether it was shared with countries outside the partnership, or even if it was to be made public.

The bartering of secrets

That the “Five Eyes” system was maintained in the aftermath of the Cold War was not the product of mere habit. In an era of diffuse and emergent threats there was even – amid talk of a “new world order” – a concerted effort to extend intelligence sharing. And yet, all too often in the wake of terrorist attacks it emerged that different countries’ security agencies held vital information which, if pieced together, could have averted an atrocity.

So, in response to early instances of al-Qaeda-related terrorism, Article 15 of the International Convention for the Suppression of Terrorist Bombings, adopted in 1997, obliged signatory states to cooperate in the prevention of terrorist attacks by sharing “accurate and verified information” in such instances.

In the immediate aftermath of the 9/11 attacks the United Nations Security Council took things even further – enjoining all UN member states to “increase co-operation”. This measure aimed to transform international intelligence sharing in response to terrorism in the hope of preventing the next 9/11. And more importantly, put an end to the bartering of secrets seen within the “Five Eyes” system.

Close to home

But without a mechanism for enforcing co-operation, barter has continued to predominate within counter-terrorism partnerships forged after 9/11. For example, Saudi Arabia threatened to terminate intelligence sharing with the UK if a Serious Fraud Office investigation into bribery surrounding the arms company BAE was not halted in 2006. And, in a similar vein, security cooperation with Pakistan is only maintained in exchange for foreign aid – making it clear that intelligence remains a valuable commodity.

Even established security partnerships, such as the “Five Eyes” arrangement, have struggled to adapt to this new paradigm – in part because the “control principle” does not sit easily with a legal duty to share intelligence which might prevent a terrorist attack, but also because an increased risk of leaks is part of the price for enhanced co-operation.

So, despite the US and UK’s seemingly “back-to-normal” working relationship, in a world of imperfect intelligence the Manchester investigation leaks risk exacerbating the tendency of intelligence agencies to want to keep as much information as possible close to home.

About Today's Contributor:

Colin Murray, Senior Lecturer in Law, Newcastle University

This article was originally published on The Conversation

23 May 2017

The So-Called Islamic State Group Has Weaponized Children

A girl leaves flowers for victims of an attack at Manchester Arena. REUTERS/Peter Nicholls
By Mia Bloom, Georgia State University

In claiming responsibility for the attack in Manchester at an Ariana Grande concert on May 22, the so-called Islamic State group has sunk to a new low.

We have seen terrorists target venues where young people congregate before – shopping malls, discos and schools. If IS was indeed involved, it has now deliberately targeted young children, tweens, teens and their parents in a horrific attack that, as of this writing, has killed 22 and wounded 59. The attacker used a nail bomb to maximize the carnage.

Through my research I have gained access to the Islamic State’s encrypted online propaganda platform, Telegram, where last night in the aftermath of the attack, IS supporters disseminated images of dead children from Mosul, saying, “The West’s children would not be safe if their (children) were not.”

This echoed a sentiment I heard many years ago when writing my book “Dying to Kill” about suicide attackers. In August 2001, a Jordanian woman named Ahlam al Tamimi researched a Sbarro pizzeria in Jerusalem to select a time in which the maximum number of families were present. In her attack on the restaurant, 15 people were killed, including seven children and a pregnant woman. Palestinians justified the attack, saying: “If our children are not sacrosanct, neither are theirs.”

As shocking as this attack was, it follows a tradition in which terrorists target children or venues specifically to maximize killing the greatest number of young people.

Children in IS propaganda
The IS propaganda machine uses graphic images of dead children to whip up their base and motivate people from around the world to join their so-called caliphate. These images of children are intended to persuade people that moving to the IS strongholds of Raqqa, Syria, or Mosul, Iraq, is the only way to halt Syrian President Bashar al-Assad’s slaughter of children.

During the course of research for my forthcoming book, “Small Arms: Children and Terror,” I have found that the group has also increasingly been using children as terrorist operatives, on the battlefield in mixed commando units they call Inghimasi, as propaganda disseminators, building munitions and, since December 2014, as suicide bombers.

Akram Rasho Khalaf, 10, was captured at the age of 7, trained and sold into servitude by Islamic State militants. AP Photo/Maya Alleruzzo

According to a report on children and armed conflict, “In rural Aleppo, Dayr al-Zawr and rural Raqqa, the U.N. found military training of at least 124 boys between 10 and 15 years of age. The use of children as child executioners was reported and appeared in video footage in Palmyra and specific executions.”

IS has used children as young as four to execute prisoners using a remote control, and recently disseminated a video of a four-year-old shooting a prisoner in the head.

One cannot emphasize enough that there is no childhood in IS. The terrorists do not recognize the innocence of the victims at the Ariana Grande concert. The terrorists likewise do not subscribe to the notion that children have, need or deserve an idyllic period of their life in which they are to be protected and cherished.

In fact, Ali Akhbar Mahdi, a professor of religion at California State University at Northridge, argues that the word “teen” has no equivalent in Middle Eastern languages. Instead, they refer to pre-puberty, pre-youth or pre-adult. In most contexts, childhood is simply understood to be a period of time characterized by the absence of reason (‘aql).

Killing children: New norm
Terrorist targeting of children has been more common than most people realize.

For example, from Sept. 1-3, 2004, Chechen terrorists held School Number One in Beslan, Russia, hostage for three days. There were 1,100 hostages in the school, including 777 children. By the end of the crisis, 384 people were dead, among them the terrorists and more than 350 civilians.

This is not exclusively a jihadi tactic. The Oklahoma City bombing of the Alfred P. Murrah Federal Building destroyed a day care center. Of the 21 children who were inside the day care center on the morning of April 19, the morning of the bombing, 15 died, including all four of the infants by the window.

While IS has opportunistically taken credit for the attack, we do not yet have evidence to determine whether it was a directed or inspired attack. We do know, however, that the terrorist group has manipulated, brainwashed and exploited children for its own purposes and will continue to do so.

The average age of IS suicide bombers and executioners is skewing younger and younger, and the group appears to be normalizing the use of children across its affiliates. For example, the terrorist group Boko Haram has used children against soft targets, civilians and marketplaces.

IS has gone from using children to inspire adults, to manipulating children and their parents to fight alongside adults, to targeting children instead of adults. They do not consider what they have done to be truly evil, although we know it to be.

About Today's Contributor:
Mia Bloom, Professor of Communication, Georgia State University

This article was originally published on The Conversation. 

19 May 2017

Hag, Temptress or Feminist Icon? The Witch In Popular Culture

Child dressed as a witch for Halloween. EPA/Filip Singer
By Chloe Germaine Buckley, Manchester Metropolitan University

You would have thought that Western society might have grown out of the habit of portraying powerful women as witches, but a trope that usually ended badly for women in the Middle Ages is still being used in the 21st century. Those who portrayed Hillary Clinton as a witch during the 2016 presidential campaign, or have given Theresa May a pointy hat and broomstick in Britain’s general election, may not be calling for them to be burned at the stake, but they do call down political destruction on their heads.

Witches have featured in fairy tales and fiction for centuries. In her earliest incarnations, the witch served as a warning. Stories about the witch-as-hag demonised and punished women for attempting to exert power outside the bounds of the domestic sphere. Beyond the fairy tale, women with “occult” knowledge (of folk medicine, for example), or simply poor, social outcasts (such as the infamous Pendle Witches hanged at Lancaster castle in 1612), were the victims of persecution and prosecution in 16th and 17th-century Britain.

Nowadays, though, the witch is often praised as a feminist figure, who pushes boundaries, breaks the rules and punishes patriarchal authority. Buffy the Vampire Slayer’s Willow Rosenberg (Alyson Hannigan) and Disney’s Maleficent (Angelina Jolie, 2014) are two oft-cited examples of the feminist witch.

In preparation for an upcoming academic conference on “Gothic feminism”, I have been researching these contrasting representations of the witch. Which witch (sorry!) does our popular culture currently favour? And can stories about the witch really be reclaimed as feminist parables?

The witch was a recurring feature of horror film in the 1960s and 1970s. British folk horror films such as The Blood on Satan’s Claw (1971) and The Wicker Man (1973) offer deeply ambivalent representations of the witch. In The Blood on Satan’s Claw, the teenage temptress Angel Blake (Linda Hayden) seems to be an anti-authoritarian heroine – the 1960s flower power movement transported to 17th-century England. But in the end she is killed by male authority figures after she oversees the rape and murder of one of her school friends. In contrast, The Wicker Man’s siren, Willow MacGregor (Britt Ekland), gleefully triumphs over the stern Christian policeman, Sergeant Howie (Edward Woodward).

Wildly feminist
The way witches are portrayed on screen has been refashioned many times over the decades. From 1964 to 1972, ABC’s Bewitched turned the witch into the subject of a suburban sitcom as domesticated Samantha (Elizabeth Montgomery) used her magic to serve her try-hard husband. The late 20th century favoured soft focus, “white” witchcraft, epitomised by the popular American television series Charmed (1998-2006). More recently, the witch has taken on an explicitly Gothic guise. Big-budget TV series such as American Horror Story: Coven (2013), Penny Dreadful (2015) and Game of Thrones (2011-) represent witches as glamorous and beautiful, but also suggest that their sexuality is deadly.

In cinema, Robert Eggers’ award-winning feature, The Witch (2016), returned to the folk horror genre in its stark portrayal of a Puritan family struggling to survive in 17th-century New England. The film’s bare aesthetic slips into nightmarish horror as it restages the American folk tale of the witch in the woods to a particularly gruesome conclusion.

The film received a lot of plaudits, particularly from feminist cultural commentators. A recent article on film website Little White Lies praises The Witch as a “feminist horror fantasy” that “celebrate[s] the inherent power of femininity”. Likewise, Wired magazine called the film “wildly feminist”.

Disempowering women
However, there is another side to the witch. Mary Beard, in a recent lecture, Women in Power, argued that stories of monstrous women and witches dating back to antiquity, such as the tale of the Medusa, are parables aimed at disempowering women.

Over and again, such stories seek to reinforce the male right to defeat female (ab)users of power, suggesting that women are not entitled to power in the first place – and there’s been much of that in the way both Clinton and May have been portrayed as witches.

The Witch acknowledges this history in its return to the folk horror tradition. Early in the film, a witch pounds the flesh of a dead baby into a paste. Yet at the end of the film, the teenage heroine, Tomasin, agrees to join the witches who had so gruesomely murdered her baby brother. Even though these hags cause the deaths of the rest of Tomasin’s family, their offer of “some butter” and a “pretty dress” seems far preferable to the harsh strictures of Puritan life.

What freedom and power is there in becoming a witch? Joining the witches is Tomasin’s last, desperate resort and it places her forever on the outside of a patriarchal social system in need of reform by and for its female members. More than this, Tomasin becomes one of the gruesome hags who have murdered her baby brother. In this respect, The Witch echoes old misogynist fairy tales, which often feature actual or attempted infanticide, as much as it revels in the witch’s power to destroy an authoritarian patriarch.

Eggers’ complex depiction is not a roadmap to female empowerment. A glimpsed-at moment of freedom (an aerial broomstick ride) for Tomasin occurs on the outside of acceptable social spaces – deep in the woods and far from civilisation. At the same time, the murderous witches continue to communicate centuries-old patriarchal fears about female power.

As scholars, it’s tempting to see our favourite genres and cultural products as proof texts for our politics – but Gothic horror, in particular, has always refused that role. Its monsters do not act as representatives for either the right or the left of politics, but instead slide troublingly between the poles. Given the current lurch to the right in Western politics – and the rise of anti-feminist sentiments – the ambiguity of the witch is perhaps even something to be wary of rather than to celebrate. Though she seems to be a powerful figure for feminists, we cannot forget the witch’s origins as a figure used to delegitimise powerful women and locate them on the outside of society.

About Today's Contributor:
Chloe Germaine Buckley, Senior lecturer in English, Manchester Metropolitan University

This article was originally published on The Conversation. 

Darkest Taboos: How Fleabag Busted Unrealistic Portrayals Of Women On TV

Fleabag - BBC, CC BY-NC-SA
By Helena Bassil-Mozorow, Glasgow Caledonian University

Cringeworthy moments, eye-watering sex scenes, gleeful swearing, naked vulnerability and vulgarity of every stripe: groundbreaking BBC sitcom Fleabag fully deserved its recent BAFTA award.

Fleabag (2016-) is part of an extraordinary new trend in television that kicked off a few years ago with Netflix prison drama Orange is the New Black (2013-). Both are shockingly stark and deliberately vulgar when it comes to exposing the taboo corners of female psychology, biology and anatomy. Both are realistic to the extent of being naturalistic in terms of visuals, dialogue and narrative.

This is writing by women which promises to show female characters as they really are, and not through society’s obligatory filters that exist to pigeonhole women.

Fleabag’s titular protagonist, played by its writer Phoebe Waller-Bridge and adapted for the screen from her one-woman play Touch, is a twenty-something Londoner struggling to find meaning in life. She is a promiscuous, pornography-watching sex-addict juggling a string of grotesque relationships and random encounters with managing a failing café business.

She is also trying to come to terms with the death of her best friend who committed suicide after her boyfriend cheated on her. Halfway through the first season, we learn he cheated with Fleabag herself.

Defying expectations
Waller-Bridge’s character comes from an upper-middle class family, but defies all expectations that normally come with this kind of background. For example, she is a compulsive liar and a thief. The stealing bit comes from a deep sense of insecurity and the need to attract the attention of her emotionally unavailable father.

Fleabag’s entire life is a series of shameful mishaps, ranging from taking her top off at a bank interview to stealing a statuette of a naked woman, made by her infuriating stepmother (wonderfully played by Broadchurch actress Olivia Colman) who considers herself to be an artist. Fleabag’s unpolished “neglected orphan” image (the opposite of what a young woman is expected to be) is partly the result of her mother’s death from breast cancer.

Traditionally, female protagonists in TV dramas have been “presented” to us rather than speaking for themselves. We can’t hear their real voices as they are obscured by various societal roles and expectations collectively reflected in narratives: passive, objectified sexuality, longing for a partner and a family, looking elegant and groomed, emotional maturity, readiness to provide emotional support, sacrificial motherhood, and so on. They are “clean” characters.

This “cleanliness” is both internal and external – the purity of character and body. A “proper” woman does not steal, or lie to your face, or swear, or talk about inappropriate things at the table. Likewise, she does not sweat or smell, does not have hairy legs, is not seen to have periods, or use the toilet.

Nudity on screen has become so common that it no longer shocks. Yet filmmakers are still reluctant to show a female character who wakes up looking terrible; who has spots or rolls of fat (particularly outside comedic settings). Fleabag offers true naturalism; this is what is truly groundbreaking – not the increasingly dull sex scenes involving toned bodies to which film and TV audiences are treated every day.

Of course, there were the four heroines of Sex and the City who candidly discussed sex and the perils of modern dating, but they were beautifully made up, successful, and fashionable. None of them evoked associations with a “fleabag”. Waller-Bridge’s creation is much closer to Lena Dunham’s series Girls (2012-2017), but still deliberately avoids HBO’s polish. Everything about Fleabag is rough and raw, from the music and camerawork to the POV (point of view) and monologues.

The Sex and the City girls: candid but glossy. Shutterstock

In fact, cinema and TV are generally still operating along the lines of these stereotypes for both female protagonists and secondary characters, making any deviation from the norm look refreshingly gritty. A “proper” woman is therefore so sterile she practically smells of chlorine.

Blundering and failing
It is this sense of blank sterility that Waller-Bridge defies with her depiction of a blundering, failing young woman. Her hilarious asides to the camera, often including candid, uncensored remarks on uncomfortable subjects such as anal sex, masturbation and survivor guilt, show that not only is she not ashamed of her behaviour – she is proud of it.

The hyper-naturalism, which is the hallmark of the series, is the result of this pride. After all, male protagonists in TV and film have been allowed to make mistakes for decades. Men on screen are allowed to be funny, ridiculous, ugly, promiscuous and terrified of settling down. Why can’t women?

When asked what constitutes the “female journey” (that is, the difficulties the female protagonists have to overcome on their path in narratives), the American mythologist and author Joseph Campbell allegedly replied that there was no such thing as a female journey as a woman didn’t have anywhere to go in the first place.

In his books Campbell explored the path of the male hero in world mythology. The path consists of multiple steps, and is full of problems to be dealt with, puzzles to be solved and monsters to be killed. A woman need not bother to activate her agency like a man would: she is already “there”, already perfect. She is born at peace with herself, whereas the man has to endure trials and tribulations to become the true hero of his own story.

Fleabag in a superhero costume
Fleabag is imperfect and unhappy and aching to go on her own journey to fight her demons. Soho Theatre, CC BY

This view implies that a woman does not have to face the journey of finding who she is, blundering and looking for meaning through trial and error, let alone looking stupid in the process. Her chlorine perfection stays unchanged through her life and guarantees happiness – particularly if she finds the right man with whom to start a family.

Fleabag’s rebellious naturalism successfully challenges this vision of the female protagonist (of whom we still have very few, although their number is growing – particularly on TV). Fleabag the woman is imperfect, unhappy, itching to go on her journey and fight all sorts of internal and external monsters: addictions; insecurities; the neglectful father; the dead mother; the chilly sister; the fake pompous stepmother; the weird arsehole guy; the rude bank manager. This is her way of becoming herself, of finding her own voice.

At last there is a trend that frees women from the bland stereotyped portrayals of feminine perfection and the need to conform to good girl expectations. We should be grateful to Fleabag for showing female characters who are not ashamed of being imperfect and real.

About Today's Contributor:
Helena Bassil-Mozorow, Lecturer in Media and Journalism, Glasgow Caledonian University

This article was originally published on The Conversation


17 May 2017

From Nazis to Netflix, The Controversies and Contradictions of Cannes


Students and striking workers occupy the projection hall of the Cannes Film Festival Palace to prevent showing of films in 1968. AP Photo/Raoul Fornezza
By David Scott Diffrient, Colorado State University

On May 17, the 70th edition of the Festival de Cannes kicked off with the opening-night screening of director Arnaud Desplechin’s “Ismael’s Ghosts.” It will wrap up 11 days later, when the Pedro Almodovar-led jury bestows the highly coveted Palme d’Or on one of the 19 international productions in the festival’s main competition.

In between, dozens more motion pictures will flicker to life in theaters along the Croisette, a sun-kissed promenade dotted with luxury hotels that attracts a swarm of paparazzi with the promise of celebrity sightings and scantily clad starlets.

But behind the pageantry, controversy has been brewing. Netflix has two entries premiering during this year’s event. The popular streaming service will then release the films to its millions of subscribers – forgoing the exclusive run in French cinemas requested by the organizers. In turn, they’ve threatened to ban Netflix from submitting any films to future editions of the festival. Telegraph reporter Robert Mendick called this dustup Cannes’ “most explosive.”

If it is, it’s only the latest.

As Lucy Mazdon, one of the few film scholars to have studied this annual event, points out, the Festival de Cannes has long functioned as an expression of France’s national identity. It reinforces the important place that film occupies in the country’s culture, along with its reputation as a purveyor of artistic – rather than strictly commercial – cinema.

But Cannes has sometimes struggled to live up to this ideal, and the competing agendas of art, commerce, international politics and national pride have long roiled the festival.

Anti-fascist origins
In 1938, French diplomat Philippe Erlanger, film critic René Jeanne and Minister of National Education and Fine Arts Jean Zay were disturbed by that year’s Venice Film Festival, when pro-fascist films from Germany and Italy – Leni Riefenstahl’s “Olympia” and Goffredo Alessandrini’s “Luciano Serra, Pilot” – jointly won the top award (the tellingly named Coppa Mussolini).

They were also appalled by the hostile reception given to Jean Renoir’s anti-war masterpiece “The Grand Illusion” one year earlier. (Joseph Goebbels, the Third Reich’s minister of propaganda, who had been a “guest of honor” at the Venice Biennale, had called it “Cinematic Public Enemy Number One.”)

In response, they came up with the idea of a French “counter-festival” that would stand in opposition to Italy’s. Originally branded as the “Festival International du Film,” the organizers hoped the event would outshine its European counterparts, celebrating the art – rather than political value or propagandist content – of cinema.
The Jean-Gabriel Domergue-designed poster for the first film festival in Cannes, which was prematurely cut short after Hitler’s invasion of Poland in 1939. Cannes
However, politics almost immediately came into play. On the night of the inaugural gathering on Sept. 1, 1939 – as guests were arriving at the Casino Municipal, including Hollywood stars Gary Cooper, George Raft, Norma Shearer and Mae West – Nazi Germany invaded Poland. Following a single screening of the RKO production “The Hunchback of Notre Dame,” organizers brought the festival to a sudden halt.

Great Britain and France declared war against Germany two days later. It would take another seven years before Erlanger, Jeanne and Zay’s vision was finally brought to fruition.

Art clashes with commerce
In 1946, the first full-fledged film festival held in post-Liberation France took place, featuring soon-to-be classics like Roberto Rossellini’s anti-fascist neorealist film “Rome, Open City” and Alfred Hitchcock’s psychological thriller “Notorious.”

Even then, the festival was torn between dueling agendas, with European ideals of art cinema rubbing up against popular Hollywood productions that many French audiences clamored for.

The contradictory nature of the Cannes Film Festival has only intensified since.

In 1959, the French Minister of Cultural Affairs André Malraux called for the establishment of an international “film market,” the controversial Marché du Film. Intended to strengthen the commercial appeal of the festival, the Marché brings together industry professionals for the purposes of networking and brokering deals between buyers and sellers. Meet-and-greet opportunities are formalized through the inclusion of daily breakfasts, round-table talks and workshops with industry leaders.

A 1967 photograph of French film director François Truffaut
A 1967 photograph of French film director François Truffaut. Wikimedia Commons, CC BY-SA

Significantly, that initial foray into the business side of cinema took place just as the festival helped launch the “Nouvelle Vague” (French New Wave), a hugely influential, decidedly noncommercial film movement. Led by François Truffaut, whose autobiographical coming-of-age tale “The 400 Blows” earned him a Best Director award that year, French New Wave cinema privileged the personal expression of young filmmakers. Films like “The 400 Blows” and Jean-Luc Godard’s “Breathless” (made one year later, in 1960) also expanded storytelling possibilities through a reflexive foregrounding of the cinematic medium itself (with characters frequently “breaking the fourth wall” and looking directly at the camera). (Ironically, Truffaut had been banned from Cannes one year earlier after he criticized the festival for prioritizing entertainment and spectacle over art and personal expression.)

A decade later, in 1968, student and worker protests swept through Europe. Truffaut and other French filmmakers and intellectuals, including Jean-Luc Godard and Claude Lelouch, called for a premature end to the 21st edition of the festival. The festival, which was supposed to run between May 10 and May 24, was shut down six days early in a show of solidarity with those who were opposed to American cultural imperialism, the Vietnam War and the global spread of capitalism.

Since then, other well-publicized episodes have disrupted the Festival de Cannes, from the discovery of a handmade bomb beneath a stage at the closing ceremony in 1978 to Danish filmmaker Lars von Trier’s explosive (if jesting) claims that he was a Nazi who “understood” Hitler in 2011.

Grappling with Netflix
This year’s edition of the festival is no exception to that history of politicized hullabaloo. Much of the recent commentary surrounding Cannes concerns the current state and future of film exhibition and distribution.

Specifically, the decision of the festival’s artistic director, Thierry Frémaux, to include two Netflix-produced films – South Korean director Bong Joon-ho’s “Okja” and American filmmaker Noah Baumbach’s “The Meyerowitz Stories” – has been criticized.

The move has drawn the ire of the National Federation of French Cinemas (FNCF), an organization that represents the interests of local theater owners who worry international streaming services will threaten not only their own livelihood but also the quality of cinema in the years to come.

Almost immediately after this year’s Cannes program was announced in early April, speculation arose in the pages of U.S. trade magazines as to whether online streaming services and small-screen platforms would be blocked from entering forthcoming film festivals. According to The Hollywood Reporter and Variety, a new rule set to go into effect next year will require any competing film at Cannes to be distributed in French theaters before being made available for online viewing.

Moreover, current French law requires a window of 36 months between theatrical release and streaming availability, a stipulation that Netflix, Amazon Studios and other streaming services aren’t likely to abide by.

The wrenching changes brought by streaming services to the TV and movie industries mark a departure from the political conflicts of years past. But controversy is certainly nothing new on the Côte d'Azur: a long view of its history suggests that strife and contention have distinguished this French cultural event since its very beginnings.

About Today's Contributor:
David Scott Diffrient, Professor of Film and Media Studies, Colorado State University

This article was originally published on The Conversation. 

Are Movies A Good Way To Learn History?


Daniel Day-Lewis won the 2012 Academy Award for his portrayal of Abraham Lincoln. Is Spielberg’s historical drama a good way to learn about the 16th U.S. president? Touchstone Pictures
By Scott Alan Metzger, Pennsylvania State University

Hollywood loves history. At this year’s Academy Awards, three nominees for Best Picture (“Fences,” “Hacksaw Ridge” and “Hidden Figures”) were “historical” to today’s teenagers – set in or about events that occurred before they were born. The Conversation

History movies, like most movies, have a huge audience in the U.S. Even Disney’s notorious 2004 version of “The Alamo” – a box office “bomb” – was seen by millions. That’s far more people than read most best-selling historians’ books.

A lot of these viewers are kids, watching the movies in theaters, at home and even at school. I’ve observed “The Alamo” used by teachers on more than one occasion.

But are motion pictures like these good for learning about history? As a scholar of social studies education and the use of film to teach history, I offer the response that films can support learning – if used to meet specific goals and connected to the proper subject matter.

2016’s ‘Hidden Figures’ was nominated for Best Picture. Will it be used in classrooms some day to teach about this moment in the 1960s?

The allure of history movies
Fact-based or fictional, realistic or fantastic, history movies shape the way people think about the past. In a study of how 15 families discussed historical understanding of the Vietnam War era, kids and parents both spontaneously drew on memories of movies. “Forrest Gump,” in particular, was referenced by both generations.

It’s not surprising that teachers want to draw on this cultural power, showing movies in class to get students more excited about history. In one study of 84 Wisconsin and Connecticut teachers, nearly 93 percent reported that they use some portion of a film at least once a week. While not enough to draw clear conclusions, this study does suggest that history films are likely used quite often in the classroom.

So why do teachers choose to devote class time to showing movies?

People often talk about the stereotype of the busy/lazy/overwhelmed teacher who puts on a movie instead of doing “real” teaching. However, research indicates that teachers actually tend to have good motives when it comes to showing movies in class.

In that study of 84 teachers, most felt that students are more motivated and learn more when a film is used. Case studies also describe other academic goals teachers have for using movies in class, which include understanding historical controversies, visualizing narratives of the past and studying movies as “primary sources” that reflect the time at which they were made.

In a recent study of more than 200 Australian teachers, many described how movies added audio and visual elements to learning and showcased a more personal, empathetic look at historical figures and events – both aspects that the teachers felt resonated with the learning styles and preferences of their pupils.

1994’s ‘Forrest Gump’ is a popular cultural touchpoint for thinking about the Vietnam War. Paramount Pictures

Do students trust movies?
Most young people are savvy enough to know that movies and TV are fictionalized, but that doesn’t mean they know how to keep history and Hollywood separate. After all, movies and TV shows set in a historical period can be extensively researched and often blend fact and fiction.

In a study of two U.S. history classes, high school students interviewed claimed that “Hollywood” films are less trustworthy sources of information. Yet in classroom activities, they treated them like any other legitimate source – perhaps because the teacher adds some unintentional legitimacy simply by choosing the film. The teacher “must see some good history in it,” explained one student. “I don’t think he’s going to show something random,” said another.

A case study by education professor Alan Marcus found that students believed most movies watched in class to be at least somewhat trustworthy – a source of information to gather facts.

The level of trust students have may also depend on their prior knowledge or cultural viewpoints, as in a study of 26 Wisconsin teenagers – half of them white and half Native American. The Native American teens found the 1993 Kevin Costner film “Dances with Wolves” to be slightly more trustworthy than their white peers did. The white students, on the other hand, rated the school textbook as much more trustworthy than the Native American teens did.

The perceived trustworthiness of Kevin Costner’s ‘Dances with Wolves’ may depend on a student’s cultural background. Orion Pictures

Educational challenges
The complicated relationship between fact and fiction is just one of the many challenges educators face when using history movies in their classrooms. It’s not as simple as pressing “play.”

Among the host of practical and academic challenges:
  • Many history movies are R-rated, with material parents may not want shown in class.
  • Some administrators aren’t supportive of spending class time on popular media.
  • Pressure to cover content standards and prepare for testing can leave little time for intensive media projects.
The very structure of the school day, in fact, makes it difficult to fit film viewing into the curriculum – especially if discussion and reviewing strategies are included.

Perhaps the most daunting question is whether movies are actually good for learning history.

In one Australian study, most participating teachers believed film to be useful, but some took the position that film can confuse students with inaccurate portrayals. “Hollywood distorts history, but kids remember what they’ve seen more than the facts,” said one teacher.

A psychological research study found that viewing history films considerably increased factual recall when the film matched historical readings. However, students came away with considerable misinformation when the film conflicted with the readings – because the students remembered the film and not the text. This occurred even when students were generally warned that the history movies were fictional.

With specific warnings about false details, most students were able to remember the accurate information as well as the misinformation. Teachers must set the stage when a movie is introduced, helping students mentally tag which elements are inaccurate.

Zack Snyder’s 2006 epic ‘300’ has some big pieces of misinformation, but the bulk of the narrative elements is more accurate than many people think. Warner Bros.

How to learn history from Hollywood
History movies have potential as learning tools, but that potential isn’t easy to realize.

Teachers need strong subject matter knowledge about the topics portrayed, so that they can frame the movie and its relationship to fact and fiction. Teachers also need to have sound learning goals and awareness of the diverse cultural viewpoints that students bring to the classroom. And they need the time and resources for meaningful discussion or assignments after viewing.

Simply put, history movies – and most other media – by themselves don’t teach.

If a teacher lines up proper film choice, lesson goals, subject matter and class activities using the film, it is possible to really learn about history by way of Hollywood.

About Today's Contributor:
Scott Alan Metzger, Associate Professor of Education, Pennsylvania State University

This article was originally published on The Conversation. 

16 May 2017

UK: Labour's Manifesto Shows It Is The True Party Of Workers' Rights

Jeremy Corbyn launches the Labour manifesto. Owen Humphreys/PA Wire/PA Images

By Gregor Gall, University of Bradford

It cannot be an accident that Jeremy Corbyn launched what may be his one and only general election manifesto in the city of Bradford. One of the forerunners of today’s Corbyn-led Labour Party was the Independent Labour Party (ILP). It was a full-blooded left-wing party, founded in 1893 in Bradford. And Keir Hardie, the ILP’s first leader and founder of the Labour Party, has frequently been cited by Corbyn as one of his inspirations. The Conversation

Both Hardie and the ILP were very strong advocates of workers’ rights, having emerged from the then nascent union movement. Corbyn, a former full-time officer of one of the forerunners of the biggest union in Britain, UNISON, is an equally strong advocate of workers’ rights. That shows in Labour’s general election manifesto, published today.

Keir Hardie
Keir Hardie. US Library of Congress

With the Conservatives trying to muscle in on traditional Labour territory by painting themselves as the party of workers, it’s worth taking a closer look to see which party truly represents workers.

Among the most significant of the pledges in the manifesto on rights at work are:
  • Giving all workers equal rights from day one, whether part-time or full-time
  • Banning zero hours contracts so that every worker gets a guaranteed number of hours each week
  • Ending the use of overseas labour to undercut domestic wages and conditions
  • Repealing the Trade Union Act 2016 and rolling out collective bargaining by sector
  • Guaranteeing unions a right to access workplaces to represent members
  • Raising the minimum wage to the level of the living wage
  • Ending the public sector pay cap
  • Instituting a maximum pay ratio of 20:1 in the public sector and companies bidding for public contracts
  • Banning unpaid internships
  • Abolishing employment tribunal fees
  • Giving self-employed workers the status of workers
  • Setting up a commission to modernise the law around employment status
  • Creating a Ministry of Labour with the resources to enforce workers’ rights
These pledges are essentially a replication of A Manifesto for Labour Law by the Institute of Employment Rights in June 2016, devised in conjunction with labour law academics to promote healthy policy for workers.

Labour’s worker problem
The socialist left has often argued that Labour has failed to inspire the loyalty of workers, and union members especially, by being insufficiently radical. Consequently, the argument goes, there was no compelling reason to vote for Labour. With these pledges – and commitments to bring the water industry, railways, Royal Mail and some energy companies back into public ownership (which should reduce pressure on workers’ wages and conditions) – that charge cannot be levelled this time round.

Some have criticised Corbyn’s Labour for giving in to the allegedly vested and backward interests of unions. As Martin Kettle of the Guardian argued, “union power is not the same as workers’ rights”.

At one level, this is a valid point. With only around a quarter of workers now holding union membership, workers cannot rely on unions any time soon to be able to effectively defend their rights and interests.

But when one recognises that the implementation of workers’ rights has always needed the help of unions – because they are the only sizeable independent organisations with the resources to do so – this point loses its force. Unions inform workers of their rights and help them exercise those rights. Plus, unions have always helped more than just their members, because employers extend the gains of union-negotiated deals to all employees.

Wider significance
But focusing on the union aspect blinds critics to the real significance of Labour’s manifesto: compared with what the Tories are proposing, Labour prioritises collective rights over individual rights, so that workers can act together to advance their interests. Labour’s manifesto recognises that workers are stronger together, echoing a fundamental belief of Karl Marx that the condition of the freedom of the individual is the condition of the freedom of all.

Indeed, without collective rights in law, especially with regard to the right to strike, any collective bargaining can easily end up being merely collective begging.

Collective action is stronger than individual action. Matt Alexander/PA Archive/PA Images

The most obvious case in point concerns the right to sectoral collective bargaining, which Labour has emphasised in its manifesto. In Britain, companies in the same sector compete primarily against each other on the basis of their labour costs. Hence, there is a competitive advantage in cutting wages and conditions as the principal route to profitability.

But by providing a statutory basis to sectoral collective bargaining, all companies in a sector would be compelled to furnish workers with the same minimum terms and conditions. No longer would they compete on labour costs in a “race to the bottom”. And, their attention would turn to improving productivity through investment in technology and training.

With stronger collective rights, applied and enforced with the help of unions, both unions and workers’ rights would be immeasurably strengthened. Time will shortly tell whether Labour’s manifesto will help it regain the support of working class voters. Or whether Theresa May’s pitch to be the workers’ friend will gain sufficient traction.

If Corbyn is successful, it will be a fitting tribute to the heritage of Bradford. It was here that an almighty 19-week strike at the city’s Manningham Mills textile factory by some 5,000 workers over wage cuts in 1891 gave a big spur to the founding of the ILP. It will also have been fitting that Labour launched the manifesto at the University of Bradford given that it started out life in 1832 as the Bradford Mechanics Institute, an organisation designed to help working class people gain the necessary skills for the ever changing world of work.

About Today's Contributor:
Gregor Gall, Professor of Industrial Relations, University of Bradford

This article was originally published on The Conversation.

You Might Also Like