Today, we live in a burnout culture. I have mentioned this phenomenon before in previous posts, and no wonder, because it has such a pervasive, damaging effect on how we work and live our lives in the 21st century. This ‘burnout’ may manifest itself in different ways, e.g. stress, fatigue or anxiety, though what is clear is that the primary offender is often one’s work life.
When trying to juggle a demanding schedule and unrelenting work commitments, the first sacrifice to be made is almost always sleep. The clinical psychologist Vicki Culpin writes in The Business of Sleep that we are currently suffering from a “sleep epidemic”; Denis Campbell, in an article for The Guardian, draws attention to findings from The National Sleep Foundation indicating that 16% of adults in the UK sleep for less than 6 hours a night. Inevitably, people often turn to artificial stimulants to compensate for the consequences of a lack of sleep, with energy drinks, strong coffee and even caffeine tablets presenting themselves as a way to continue functioning throughout the working day. Never mind leisure time: any spare moment throughout the week, or even the weekend, is precious, to be used to catch up on all the things that pile up outside of work, such as doctor’s appointments, laundry, shopping or cleaning. When one has an unhealthy work/life balance, all other areas of life become marginalized, so that life has little purpose outside an office cubicle.
It is abundantly clear that this emphasis on overwork and the idea that salary and career should come first has a hugely detrimental impact on the psyche of workers and society as a whole. A recent study shows that when people feel a degree of power after, say, a promotion or a salary increase, they are less likely to be empathetic towards others – an indication that the philosophy our work culture is based on favors an unhealthy breed of individualism over collective social well-being. Christopher Harvey’s article for GQ calls attention to the urgency and gravity of the issue, highlighting that
Half of all employees do not feel their workplace is an emotionally healthy environment, with 55 per cent of organisations having no formal strategy for handling employee wellbeing. Absenteeism increased 25 per cent over the course of the past year in the UK, highlighting that burnout is set to get worse, not better.
Overwork and burnout have become badges of honor that employees wear with pride, with many people familiar with colleagues’ routine of smug moaning and bragging over how little sleep they’ve had or how many hours of overtime they just had to put in the night before. How much one works has morphed into a way to judge others on their work ethic, with colleagues routinely battling it out to be the most assiduous worker, a phenomenon known as “busy bragging”.
Though this might appear to work in employers’ favor, it is becoming increasingly clear that work-related stress and sleep deprivation, however dedicated employees may be, lead to a less productive and efficient workforce. When companies exert too much pressure on their employees, it becomes harder and harder to retain staff, leading to high staff turnover. A Forbes article calls attention to the fact that
Paradoxically, overwork does not equate to higher levels of productivity but instead only to those of work-related mental and physical health issues such as depression, anxiety and high blood pressure.
Perhaps, to curb the effects of this dangerous trend, we should re-envision what it means to be a working man or woman in 2018. Though it may sound naive and idealistic, this should be the year that we change our workplace culture so that the well-being and mental health of employees come before profit at all costs. In his Nicomachean Ethics, Aristotle sets out his vision of the good life, with happiness the central idea behind a functioning, flourishing society. For Aristotle, all actions have ends, though some are subordinate to others, the ultimate end being that of ‘eudaimonia’ or human flourishing, essentially “doing and living well”. Moreover, while individual happiness is of value, it is in turn subordinate to the happiness and flourishing of a community, a philosophy we would do well to incorporate into our profit-driven modern society. Work is a means to an end – that being the happiness or flourishing of ourselves and our community – rather than an end in itself.
More and more, we are seeing corporate wellness programmes enter the workplace which give staff the opportunity to engage in activities such as mindfulness, yoga and exercise to improve their mental and physical health, though admittedly these are the companies that can afford such expenditure on their staff – often not the case for the majority of businesses. Instead, then, perhaps employers could be encouraged to try to engage with staff more on a personal basis, give them more credit for the work they put in and cease to encourage employees to work until breaking point by removing the individual pressure placed on them. It is becoming abundantly clear that businesses need to start putting people before profit not only because it is the right thing to do but also because a happy, healthy workforce is crucial to a well-functioning economy.
It’s no secret that charities have gained a bad reputation in the wake of the countless scandals that fill newspapers and dominate radio headlines; household names have been dragged through the mud and doubt sown into the minds of donors. Despite these instances of appalling behavior, though, we should treat this as an opportunity to reconsider our attitude towards charity in order to make our ongoing support as meaningful and worthwhile as it can possibly be. In 2016, The Guardian published an article on how trust in charities has been declining slowly but surely, with 33% of people blaming media coverage of recent scandals as the reason for this apparent trend. Figures like these are likely to be even starker a year on, with all of the recent, high-profile scandals that have been rigorously covered by news outlets; Alice Ross of the Financial Times warns that donors, particularly big ones, will be much more wary of donating in future.
Over time, there have been countless examples of misconduct in the charity sector, though these last few years seem to be the worst yet. Oxfam, for example, has been embroiled in a sex scandal dating back to 2011 in Haiti, where aid workers allegedly used sex workers (some potentially underage) while working on the ground. Save The Children, the international NGO promoting children’s rights, has also been in the media firing line after it was revealed that its chief executive – as well as the husband of the murdered MP Jo Cox – harassed female employees, whose concerns were not addressed with the appropriate scrutiny and respect. In 2016, The Wounded Warrior Project, a foundation that aims to help ex-combatants recover after service, was accused of “spending lavishly on itself”, directing funds towards luxuries like expensive hotel rooms and business-class flights rather than its service users.
Scandals such as these happen everywhere and in almost any organization, though while there may be a measure of outrage following exposed tax evasion or fraud, such as that of Starbucks, it is not nearly as long-lasting or damaging as the fallout from a charity scandal. In these cases, we are far less forgiving, which may say something about how we imagine a charity to be. In the minds of many, charities are purely do-gooding institutions that not only make the world a better place but also make us feel better about ourselves. When charities we support are involved in questionable or downright immoral behavior, we feel betrayed and experience a loss of the trust we placed in them when we decided to hand over our credit card details. To add insult to injury, many find it embarrassing when they have publicly donated to a ‘good’ cause only to be told that their contribution came to nothing or, worse still, that it funded bad behavior at the expense of people in need.
Perhaps, then, the reason why charity scandals are so abhorrent in particular is because of the way in which we think about these organizations and about ourselves when we give. Giving to charity is an easy way of assuaging a guilty conscience, an example being the thousands of people who donate to homelessness charities after passing a rough sleeper. The psychology of charity is extremely complex, research featured in the New York Times indicating that there is not one motivating factor but many when we donate, ranging from pure altruism (if that exists) to self-interest. If there’s one thing we might salvage from these charity scandals, it is that they have forced us to consider why and how we give, as well as to whom and how our money is being used. David Shariatmadari writes for the Guardian on how giving to charity has a similar effect on the brain to taking addictive drugs such as cocaine, putting it neatly: “Charity can get you high”. If anything, it is charity scandals that threaten to kill our buzz.
One way of thinking about positive charitable giving might be to think of it as a relationship between two people. If one were experiencing difficult times, the other – if they are a true friend – should try to support them and enable them to overcome or adequately cope with the hardship while allowing them to retain a degree of independence and dignity. Clearly, this is preferable to a relationship of unequals in which one party is dependent on the other, who is committed to giving only just enough, the bare minimum. In Mighty Be Our Powers, her autobiography, Leymah Gbowee, leader of the Liberian women’s movement for peace, has some interesting things to say about giving aid, arguing that organizations must work with the people affected, the service users, as, quite obviously, they know what will work or what won’t and where funds really need to be channeled. In her words:
“Most of the institutions that come in to offer help after disaster don’t have the resources to provide concrete help like that. Donor communities invest billions funding peace talks and disarmament. Then they stop. The most important postwar help is missing…You’d think the international community would be sensible enough to know they should work to change this. But they aren’t.”
As we see so often, charitable causes are reduced to hashtags and one-off donations during their five minutes of fame. In 2012, for example, my Facebook feed was full of support for victims of the Lord’s Resistance Army and cries for Joseph Kony to be brought to justice for his atrocious crimes. Tellingly, the hunt only ended last year, five years after the media storm took off, though you’d hardly know it, as the cause had long ceased to be ‘fashionable’ – a scandal in itself.
Charity scandals, while not all alike, shatter the illusion that these organizations can do no wrong and provide opportunities to reconsider what they need to be about. What is becoming increasingly clear is that these organizations must work with the affected people to be as meaningful as possible and fully realize their potential for doing good. Service users must be able to make informed choices about the help or support they receive and be able to meaningfully contribute to the decision-making process. It is these people who best know the scale of the issue, how outside help fits in and how they might best be supported. Many organizations’ mission statements now include a declaration of the intention to work alongside those affected; the Refugee Council, for example, claims to “work with refugees and people seeking asylum in the UK” and to “offer a helping hand to support and empower them to rebuild their lives”, the key words being “support” and “empower”.
Giving to charity can be a wonderfully positive and fulfilling enterprise though, like everything else, it should be undertaken with serious thought and reflection. There are many organizations doing excellent work whose continuation we should be funding. Really, then, we should do proper research and take time to reflect before we thoughtlessly donate, asking ourselves the difficult questions before we hit the ‘pay’ button and share our ‘generosity’ on our social media feeds.
What with the internet and social media, the society we currently live in nurses a culture of oversharing. We now share everything, including our bodies, tastes, habits and histories; nothing really is taboo. Without a second thought, we let those we know, our ‘friends’ or followers, know what we’re reading, eating, where we’re going and what we’re buying. At face value this may seem harmless, though there is plainly a sinister undercurrent to this seemingly innocuous habit.
More and more, we offer up our personal information to be consumed by others, the essence of the issue lying in the fact that all of our actions have become performative and about marketing, be it a service, a product or, most often, ourselves. Though this fascination with the intimate details of people’s lives is nothing new, this “narcissism epidemic”, as The Guardian refers to it, can be traced to the rise in popularity of reality TV, from shows like The Real World to Big Brother to Keeping Up With The Kardashians, which glamorized the minutiae of the everyday and gave audiences a taste for ever more in-depth access to people’s lives.
Effectively, sharing everything about ourselves on social media creates a ‘cult of the self’ where the ordinary and banal is made exciting with the automatic assumption that others are interested to hear about it. Scrolling through my Facebook feed, I see incredibly personal posts about chronic illnesses, people opening up about their sexuality and generally filling us in on almost everything, even what they had for lunch. Though there are undeniably positive aspects to this honest, direct approach to sharing, my first impression is that it indicates a fundamental insecurity, a fragility whereby we look outwards for affirmation and approval rather than inwards.
This apparent self-confidence is in fact insecurity masquerading as confidence, as we become totally dependent on others for how we see ourselves. Though we might share some good news that we are excited about, such as a pregnancy or a promotion at work, by putting it on social media we are also seeking approval from our audiences. All too often, someone will share a photo of themselves that they feel confident about, only to remove it days later because it hasn’t accumulated the right number of likes to justify its place on their profile. The phenomenon is entirely different from self-love: rather than looking inwards and being content with oneself, one must look outwards to achieve a similar degree of satisfaction.
Earlier this month, Anna Freedman wrote an interesting piece for Dazed and Confused magazine about Kylie Jenner’s decision to delay releasing news of her pregnancy until after the birth of her daughter Stormi. Freedman writes of how the young woman’s decision was “a masterclass in how to publicly strategise the private and intimate phenomenon of motherhood”. Much like the oversharing that has proved so profitable for her family, Kylie Jenner has now shown how “privacy and intimacy can be employed as skillful marketing tactics”. Even privacy is now a marketing tool, though it is important to remember that this ‘privacy’ swiftly came to an end after the birth, when an 11-minute video was shared chronicling her journey through pregnancy to mollify fans who felt they had been kept in the dark. Kylie Jenner is only one of many who have shown that online performance now knows no bounds, promoting the idea that the key to success lies in the ‘share’ button.
Like all things, though, it’s not all bad; undeniably, there are some positives to this direct approach to social media. Things that were once taboo, such as medical or mental health conditions, for example, are more widely discussed. Similarly, sharing platforms can be used to find like-minded people or those that you identify with, the internet often acting as a space for marginalized groups, such as the LGBT community, to come together. We are becoming more and more confident with expressing ourselves and, in this way, wearing our hearts on our sleeve, though while this approach may be direct it is not necessarily honest. Though we might share a lot about ourselves, we share carefully and choose exactly what we want people to know or to see to align with how we want to be perceived.
In sharing intimate details of our everyday lives, it is no surprise that studies show that narcissistic traits are becoming more common and more pronounced, particularly among young people who are the principal users of these sharing platforms.
The graph shows Narcissistic Personality Inventory scores, gleaned from an online self-test, set against the year; it is clear that there has been a definite rise in narcissistic traits in recent years. This trend is only likely to continue in a society that tells us that everything we do is fascinating and that everyone would like to hear about it. Zoe Williams puts it well, writing for The Guardian, when she sums up the belief as “once you are important enough, nothing is mundane”. In sharing everything about our lives, we are hoping – consciously or not – to obtain some kind of approval or reassurance about ourselves, an undoubtedly and hugely unhealthy habit. TIME magazine recently ran an article linking poor mental health to social media usage, particularly of Instagram, which accounts for higher levels of depression and anxiety based on poor self-esteem. When we log in to our social media accounts, we are bombarded with stories of our friends going out or on holiday, news of their successes and photos showing just how attractive they are, inevitably leading us to make negative comparisons about ourselves.
So much of this oversharing is born of the need to market oneself – so telling about the society we live in. Nowadays, the individual is king, a philosophy that breeds the need to market oneself in order to succeed. According to Jean Twenge, co-author of The Narcissism Epidemic, “Economic prosperity does seem to be linked to individualism”, partly explaining the boom in sales of products marketed and advertised over social media and sharing platforms, particularly Facebook, Instagram and Youtube. On sites such as these, people become a highly lucrative brand, the Kardashian sisters evidently a case in point. On Friday, the BBC reported that Kylie Jenner’s decision to tweet that she no longer uses Snapchat regularly “wiped $1.3bn (£1bn) off Snap’s stock market value”, an indication of how interlinked the personal and the financial are in our heavily digitized modern society. Zoe Williams writes insightfully of how the careful curation of our social media accounts creates “a competitive culture in which asserting one’s difference, one’s specialness, is the bare minimum for being market-ready.”
Oversharing and self-branding are now an ingrained part of our everyday lives, bleeding into all aspects of our existence, including our personal relationships and work life; suffice it to say, it is difficult to know how to adopt healthy digital habits that preserve and promote mental well-being for the future. Mark Zuckerberg, CEO of Facebook, recently announced that the site would be working to prioritize “more meaningful social interactions” over paid content like media articles or advertising, as the company “feel[s] a responsibility to make sure [its] services aren’t just fun to use, but also good for people’s well-being”. TIME mentions a report on social media usage conducted by the UK’s Royal Society for Public Health which “recommends the introduction of a pop-up ‘heavy usage’ warning within these apps or website”, a measure that seems to have considerable popular support.
Regrettably, neither of these proposed solutions seems to adequately tackle the multifaceted and vastly complex underlying issue, though this is hardly surprising. Perhaps, along with policy changes and the actions of major corporations and civic bodies, it will really take the will of the people who use these platforms for any meaningful change to take place. Maybe, rather than living our lives as open books, privacy might replace oversharing as the new social media trend, as unlikely as the prospect may seem. With all this uncertainty, it seems that all we can really do is think before we share, and wait and see.
Museums: though we might like to think of them as neutral places where we observe art, history and culture, it is arguably difficult, nay impossible, to do so with impartial eyes. The many ethical problems that present themselves make it clear that the nature of curation is far more ethically complex than we might initially have thought, and might merit our reflection before our next visit.
Glaringly, there is the issue of how certain artifacts have come to be acquired, particularly in light of the colonial past of countries such as Britain, which plundered and looted valuables from all around the world, both intellectually and materially. In 1897, the British led a ‘punitive’ raid on Benin (often described as a massacre), notorious for its violence and brutality. The British took many Nigerian cultural artifacts home with them, where they have remained for decades, though now their right to possession is being contested, and there is the possibility that the artifacts might return to Nigeria.
The Benin bronzes are a case in point in terms of the ethics of museums, namely that they are places fraught with cultural significance and complexity. Surely the way that such artifacts came into British hands cannot be ignored, particularly when their history is steeped in blood, violence and theft, as is the way with so many other valuable and precious objects. In showcasing stolen artifacts, some say that museums are essentially legitimizing the way they were acquired and the profoundly troubling superiority complex that lies behind raids such as these.
This particular objection to museums is such a vast topic, however, that I must move on, being unable to do it proper justice; I would, however, encourage people to do some digging into similar instances of cultural appropriation and theft as it is clear that the history and ethics of museums is intricately woven into the history of the world, both ancient and modern.
On top of this, museums cannot escape cultural hegemony, even if they aim towards neutrality and impartiality. Exhibitions are curated for certain audiences and must make sense for that audience, meaning that exhibitions take on a narrative quality. Curators must decide how the artifacts will be displayed, putting them together in a way that is easy for outsiders to digest. They must choose what information to include or omit, what and what not to display and how to frame and compose the exhibitions. In an article for the Independent, Shazia Awan shows that “cultural imperialism is very much alive and kicking” after a curator for the British Museum explains in a Q&A session how “We aim to be understandable by 16-year-olds. Sometimes Asian names can be confusing – so we have to be careful about using too many” displaying an attitude of what she calls “arrogance and sheer ignorance”.
When we enter a museum, we are not simply seeing a collection of facts grouped together but rather a certain representation of history, of reality. We look at artifacts neatly dated and captioned with names, perhaps with a little background information about the piece. They are displayed in a sterile public building, often thousands of miles away from their country of origin, to be viewed by people who are removed not only culturally but also by time and place.
Though facts may be objective, their curation is not. Without context, exhibitions are prone to over-simplification of the culture or time they represent. Objects are placed and information told in a certain way so that, while factually accurate, they necessarily become a narrative. On top of this, audiences come with preconceived notions about what they are about to see or intend to take away from the experience which has the potential to then be reinforced upon their arrival. Museums as institutions wield a great deal of power as they, like schools, history books and newspapers, define what is to be understood by ‘truth’.
The British anthropologist and curator who joined the expedition to Benin later published Great Benin: Its Customs, Art and Horrors, seeking to present some of the items acquired on the raid and give each a short explanation. Though his work attempts to be anthropological and academic, he cannot escape the racism and imperialism of his time, his preface asserting that “if a city ever deserved its fate, that city was the city of Great Benin”, that is, being looted and burned down by British colonial soldiers for its apparent “squalor”, though thankfully the days when captions such as these would have been considered acceptable are long gone.
In spite of all this, to deliver a brief history of hundreds of international cultures and communities is no easy feat and is indeed a noble endeavor, particularly if you want audiences to understand and learn from what they see. Museums are great places for us to move beyond our narrow scope of experience to achieve a more global perspective. As Anra Kennedy writes in a piece for The Guardian,
They’re places where the extraordinary jostles for space with the everyday – our local community’s everyday or that of distant peoples and past times. They hold evidence of craftsmanship, ingenuity, creativity and imagination, alongside that of cruelty, horror and inhumanity. Just as valuable are their people – curators, academics, scientists, artists, makers, researchers, educators, re-enactors and storytellers.
Precious objects, wherever they are from or displayed, can be given the proper care and attention they need to be preserved and enjoyed for centuries to come. In a museum, artifacts can be placed alongside similar objects rather than in isolation, allowing us to see patterns of cultural exchange throughout history and the proper context within which they should be understood.
Crucially, some exposure to the past and to other cultures, however limited, is better than none at all. By making us aware that life goes on outside the bubble we live in, museums fulfill a crucial role in shaping how we see ourselves in relation to the rest of the world and in helping us to appreciate the unfamiliar. It is by going to exhibitions that we are able to admire Grecian sculptures or the intricacy and beauty of ancient Egyptian tombs. Though they are by nature ethically complex places, museums are crucial to the success of a culturally aware, pluralistic society.
The other day, I was stacking shelves where I work and was shocked by a conversation I overheard between two young girls popping in for a post-workout snack, which went something like this: “I want to buy something healthy – what about these gluten-free dairy-free brownies? Or the vegan wheat-free flapjacks?” “Oh but look at the price.” “Oh yeah *groan* it’s so hard to eat healthy – it’s all so expensive.” The girls left the shop with only a small pot of melon slices each – apparently the only ‘healthy’ food on offer that they could afford. I was left rather confused and disgusted, having been confronted with the inescapable force of a food trend that every day shows itself to be more and more problematic: the ‘clean’ eating phenomenon that seems to have stealthily taken over attitudes towards food in recent years.
The link between poor mental health and ‘clean’ eating is undeniable, particularly when it is exacerbated by the pressures and strains of social media. The subject has been well documented by health organizations, with the eating disorder charity Beat recording a rise over the past few years in the number of calls to its hotline linked to anxiety over overly restrictive food rules. Documentaries such as the BBC’s Clean Eating’s Dirty Secrets and Clean Eating – The Dirty Truth have exposed the fragility of many of the claims being made and the impact they have on those exposed to them. Eating disorders are serious mental conditions that can have devastating consequences – anorexia has the highest mortality rate of any mental illness – and affect all age groups, ethnicities and genders. Recently, the phenomenon of orthorexia, defined in the Oxford Dictionary as “an obsession with eating foods that one considers healthy”, has come to popular attention, though it cannot currently be clinically diagnosed. Social media platforms such as Instagram, Snapchat and Facebook, alongside other sharing platforms like Youtube, act like a petri dish, facilitating the growth of myriad restrictive and absolutist diet trends that often overlap and contradict one another, flying in the face of well-researched evidence on what a truly healthy diet for body and mind might actually look like.
To make matters worse, food becomes a moral issue with a hugely negative impact on the mental health of vulnerable young people. What one eats comes to be placed into two categories, namely foods that are ‘good’ and those that are ‘bad’. This black-and-white thinking generates a lot of anxiety for people who feel that they are failing if they do not meet these exacting standards. As mentioned earlier, the emergence of orthorexia nervosa points to a serious problem with our attitude towards food, orthorexia also being translated as a “fixation on righteous eating”. This could not be more telling: food is not only a way to fuel our bodies on a daily basis but a means to improve our moral character, to become a “righteous” individual by proxy. Eating the ‘wrong’ food triggers self-loathing and acute anxiety, the message behind the moralism being that we are intrinsically unworthy and that it is only by eating the ‘right’ way that we can somehow vindicate ourselves.
It is ironic, then, that this obsession with restrictive diets often proves to be unhealthy, not only mentally but physically. Much of the ‘clean’ eating movement is based on pseudoscience concocted by individuals financially invested in this lucrative lie. Restrictive food rules are dressed up as science and packaged seductively, be it by a beautiful and slim food vlogger, an expensive new cookbook or a new range of pricey products in the supermarket. Often, these companies make dubious claims about what their products can do, some saying that they reverse disease, aid weight loss or guarantee an overall ‘healthy glow’. Many, if not most, of these corporations rely on a kernel of truth which they exaggerate and capitalize upon. A significant number of these ‘clean’ celebrities do not have the appropriate medical qualifications to be touting such advice, taking advantage of the vulnerability and ignorance of their customers to turn a profit. In a piece featured in The Guardian, Bee Wilson insightfully points out how
clean eating confirms how vulnerable and lost millions of us feel about diet – which really means how lost we feel about our own bodies. We are so unmoored that we will put our faith in any master who promises us that we, too, can become pure and good.
The promise of wellness is, however, a mirage.
Crucially, this trend is also having a sinister effect on society as a whole, particularly with regard to class and elitism, food offering yet another way to divide and segregate. These so-called ‘health’ foods are marketed at inordinately high prices in full awareness that their affluent target market is willing to pay extra to opt into the ‘clean’ eating club. Food is an accessory, a statement of class, completely inaccessible to those who cannot afford to participate. Only the wealthy can afford to pay £3.99 for Deliciously Ella’s Original Granola when Tesco’s own is under half the price, or £2 for Rude Health’s organic oat milk when cow’s milk costs just £1.50 for more than triple the amount.
The clean, healthy eating movement is essentially a vanity project for the middle class, food being just one more way to distinguish the haves from the have-nots.
This movement is the perfect money-spinner: where diets and eating trends used to be temporary, this one is permanent and necessitates a complete and sustained lifestyle overhaul. Despite what the advertising industry would like you to believe, it is entirely possible to lead a healthy lifestyle without emptying your savings account.
Similarly, this same movement is contributing to the divorce of food from its social and historical context; once deeply embedded in a collective culture, certain foods turn into trivialized fads in the western world. Matcha green tea, an integral part of the ancient Japanese tea ceremony, has been adopted and can now be consumed as a latte, an ice-cream or even a ‘chocolate matcha butter cup’. Quinoa, once an obscure South American crop to many in the west, has become as basic a grain as pasta. Foods that are integral to a cultural heritage are taken and commercialized as the next ‘it’ food, only to be dropped and forgotten to make way for the next trend. Food is often imbued with cultural significance, so the idea that a middle-class young food vlogger in Shoreditch has just ‘discovered’ the versatility and health benefits of sorghum, when it has been growing in Africa for centuries and is the fifth most popular crop in the world, is not only arrogant but demeaning.
The myths of the ‘clean’ eating movement – essentially ‘fake news’ – could be debunked with proper education about the reality of what constitutes a healthy diet. No, eating only alkaline foods will not reverse cancer. No, you do not need to blow your next paycheck on the most expensive products in the supermarket to be well. No, cutting out gluten when you are not coeliac or even intolerant will not necessarily make you feel ‘energised’, neither will it make you a better person. With proper education, many of the ‘truths’ of the industry would be exposed as lies. Yet this would not solve the other equally if not more important issue of the sense of moral superiority associated with ‘clean’ eating.
Until we stop feeling the need to make ourselves feel better at the expense of others and by appealing to the standards set by the advertising industry, this distinctly unhealthy trend and accompanying mindset is here to stay.
Around six months ago, on 19 June 2017, 48-year-old Darren Osborne allegedly drove a van into a crowd of Muslim worshippers outside a mosque, injuring nine people and killing a 58-year-old man. Though the suspect has not yet been convicted, the tragedy has opened the public’s eyes to the urgency of combating Islamophobic propaganda, and to the need to rethink the way a narrative can be constructed around a community, one that is not only divisive but potentially deadly.
Though Osborne may have been responsible for the tragedy, this should in no way detract from the bravery of those at the scene, in particular the imam Mohammed Mahmoud, who prevented any retaliation by shielding the suspect from a crowd beginning to show signs of shock-induced aggression. Despite being lauded as a perfect example of someone loving their enemies, and even being labelled the ‘hero imam’, Mahmoud rejects this characterization on the basis that it implies he is the exception to the rule. In an interview for The Guardian, he eloquently explains:
“We can’t escape the fact that Muslims are portrayed in an unfavourable light in the media…to conclude or theorise that [Osborne] would have been killed if [I wasn’t] there, that’s based on a narrative that’s put forward that Muslims are savage and don’t respect the law”.
Instead, he calls attention to the fact that he was helped by many others, that aggression is the natural response to such a barbaric act of cruelty. The London mayor Sadiq Khan similarly spoke out about the horrific events, asserting:
“This is a good community. They pull together, they work closely with each other and the actions of Imam Mohammed are what I would expect from a good faith leader and a good Muslim leader.”
Why, then, must it take a disaster such as this for people to recognize the power of common moral principles, of shared dignity and humanity? Why must it take a tragedy for a Muslim man to be called a ‘hero’ for the work he does, and for such an outpouring of support to arrive, when Islamophobia is increasingly felt by British Muslims on a daily basis? The Independent has published figures showing that instances of anti-Muslim hate crime targeting mosques more than doubled between 2016 and 2017, and Sadiq Khan has pointed out that Islamophobic attacks increased fivefold after the London Bridge attack.
Undeniably, the media has a huge role to play in the unfair demonization of Islam, the press being responsible for much of the kind of hateful, extremist content that motivated this act of terror. Osborne’s partner, Sarah Andrews, described in an interview with the BBC how “He seemed brainwashed and totally obsessed with the subject [of Muslims]” prior to the attack. She cites programmes such as the BBC’s Three Girls and the social media accounts of nationalist groups such as Britain First and the English Defence League as contributing to his paranoia; it is fair to say that a clear line can be drawn between hate crime and the way Islam is presented in the media. An article in The Guardian draws attention to graffiti on the Sutton Islamic Centre reading “Terrorise your own country”, ironic when the terror suspect in this instance is British and attacked fellow Brits.
This kind of bad press disproportionately targets and affects those most vulnerable; it is convenient to create a scapegoat for society’s ills, one that can shoulder the blame for everyday hardships. It dehumanizes entire communities and encourages a tribal way of thinking in which those of an unfamiliar race or religion do not ‘belong’, or even pose a threat to the existence of one’s own tribe, an entirely ludicrous and unhealthy way to look at the world.
All of this only highlights the growing need for the government and the police to combat Islamophobia, and to put the same effort into eliminating it as they do into countering extremist Islamic propaganda. Both are terror-related and must be treated with the same degree of urgency and dedication, for both pose a threat to what we might like to imagine a peaceful Britain to be. It seems completely absurd that a school might summon a boy who had simply converted to Islam into a meeting to check whether he was being targeted by Islamic State (as happened at mine), yet the authorities would not investigate a man (Osborne) who had made inflammatory and threatening statements at a pub – “I’m going to kill all the Muslims, Muslims are all terrorists. Your families are all going to be Muslim. I’m going to take it into my own hands” – and had publicly announced his intention to kill members of the Labour party such as Sadiq Khan and Jeremy Corbyn.
Fundamentally, the debate comes down to issues of identity and belonging, with Osborne, like many others, harbouring the belief that those of other ethnicities, cultures and religious faiths don’t ‘belong’ in Britain. At the heart of this lies a dangerous conception of what Britain was and should be, namely a predominantly white and Christian country. Though this Britain may never truly have existed, as the country has for thousands of years been composed of countless different cultures and ethnicities, the dream persists, though it is more accurately described as a nightmare for the hatred and division it feeds and facilitates. I for one would rather belong to a community made up of members who honour justice and mercy, such as Mohammed Mahmoud, than of those who take the law into their own hands and fail to recognize the humanity of others, as Osborne did. Rather than divide us, this tragedy should teach us that we must actively nurture compassion and understanding over hatred and division, or the society we would like to live in may never come to fruition.