Marketers often build cyber assets — websites, Facebook pages, YouTube brand channels and so on — because everyone else seems to be doing so. Not much thought goes into the strategy or the execution. The outcome is a collection of disconnected properties that dilute the brand and confuse customers.
Before you execute, it is crucial to have a cohesive strategy that integrates all offline and online brand assets, so that every element plays a well-coordinated yet distinct role, your brand’s communication and imagery remain intact as customers move across media platforms, and each channel contributes incrementally to brand equity.
This guidebook will help you achieve these outcomes. It will give you a clear understanding of the building blocks that constitute digital marketing, and equip you with the tools, the techniques and the knowledge to develop cohesive market strategies, and prepare and execute effective digital marketing campaigns.
Ordinary people empowered with the social media are interacting and collaborating with increased speed, reach and effectiveness. This has had a profound impact on society, changing the political, economic and cultural landscapes across the globe.
This chapter explores a host of new concepts and developments.
It also examines trends in media consumption, interactive television, and outlines some basic facets about the world wide web.
From a learning standpoint, this chapter imparts an understanding of the impact of the new media on the social, political and market landscape. It highlights key lessons for marketers and outlines the new rules and perspectives, leaving readers with an appreciation of how the marketer’s mind-set must change to succeed in the social age.
With global internet penetration crossing 50% in 2016, more than half the world is connected through cyberspace. From one perspective, this connectivity is an unprecedented leveller, breaking down barriers, freeing up information, and exposing everyone to more opinions and viewpoints on local and global issues.
Yet this great connector has also become a great divider, enabling those who are connected to coalesce into what can be called “social cloisters” — groups that are relatively small, insulated and share similar opinions and views. Fuelled by the phenomenal growth of social media platforms such as Facebook, the ether has become home to hundreds of millions of these cloisters.
People are increasingly spending time within their cloisters, sharing their day-to-day experiences, their thoughts and their feelings, and by doing so, influencing and reinforcing each other’s mindsets.
What distinguishes this meso-level of communication from conventional media is its vulnerability to misinformation. By and large, social networks, unlike newspapers and TV, eschew wider editorial responsibility for the content they distribute. As a result, people are more likely to be fed misleading content, propaganda and outright lies — hence the rise of so-called “fake news”.
Social cloisters have an inherent tendency to be divisive in nature. Members feed on each other’s content, share stories, and hothouse thoughts, feelings and ideas, which become amplified and reinforced by transmission and repetition. This can lead to hyper-partisan behaviours, with members of the cloister embracing sharply polarized views on a variety of subjects while alternative or opposing views are suppressed or underrepresented.
What is becoming clear is that the reinforcement of cloistering by “fake news” and misinformation is having a profound influence on society as a whole. The recently concluded US election is a case in point.
Many reasons have been cited as to why pollsters and election forecasters missed Donald Trump’s stunning victory. Two factors, however, are worth pointing out.
Firstly, one of the implications of the fracturing and cloistering of society, both for governments as well as marketers, is that it is hard to predict the inclinations and behaviours of people, and draw conclusions for the population as a whole. Statistically speaking, not only is a much larger sample required, it is also more difficult in splintered populations to ensure that these samples are truly representative.
Secondly, as we know from qualitative research, it is not easy to unlock people’s minds. Polls tell us only what people claim they will do. While there is usually a strong relationship between these claims and what people actually do, the bias between the two is hard to predict.
In the case of Trump, the bias would have been substantially accentuated, considering his public image and the battering that he received from the media for his numerous indiscretions. His rival, Hillary Clinton, was not the only person who considered it “deplorable” to vote for Trump.
In this context a small but significant proportion of Trump’s supporters would have been reluctant to express their intentions, because saying so might be deemed politically, socially or intellectually unacceptable. Within their social cloister, supported by others of like minds, they feel secure expressing their choices and opinions — but outside of the cloister it is a different matter entirely.
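The polling bias described above can be illustrated with a small simulation. The sketch below is purely hypothetical — the support level and concealment rate are assumptions chosen for illustration, not estimates from any real poll — but it shows how even a modest share of supporters hiding their preference pulls a poll’s estimate systematically below true support, no matter how large the sample.

```python
import random

random.seed(42)

# Hypothetical parameters for illustration only.
TRUE_SUPPORT = 0.50   # assumed actual share of the candidate's supporters
CONCEALMENT = 0.10    # assumed share of supporters who hide their preference

def poll(sample_size: int) -> float:
    """Return the share of respondents who *say* they support the candidate."""
    declared = 0
    for _ in range(sample_size):
        supports = random.random() < TRUE_SUPPORT
        # A concealing supporter answers as if they did not support the candidate.
        if supports and random.random() < CONCEALMENT:
            supports = False
        declared += supports
    return declared / sample_size

measured = poll(100_000)
print(f"true support:  {TRUE_SUPPORT:.1%}")
print(f"poll estimate: {measured:.1%}")  # hovers near 45%, not 50%
```

Note that enlarging the sample narrows random error but does nothing to remove this systematic bias — which is the distinction the text draws between sampling difficulties and the harder problem of what respondents are willing to say.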
So what divided the states of America?
In addition to a number of other factors that fall outside the scope of this text, it is apparent that the intensified cloistering fuelled by social networks had a significant effect. Studies have shown that social networks play an enormous role in distributing news and information (with limited regard to levels of accuracy) and hence in informing and shaping opinion. A 2016 report from Pew Research found that almost half of American adults — 44 per cent — get news from Facebook.
In the wake of the US election several commentators have highlighted the surge in “fake news” online in the final months of campaigning, including the emergence of some operations designed to manufacture and distribute fake news specifically for profit. Within social cloisters, powered by a steady stream of genuine as well as fake information/misinformation, the echo chamber effect would have been in full force, amplifying opinions of its members and disregarding others.
This cloistering process is a product of the new media. While some may see it as a social problem, it is more a social reality. There will be no turning back of the clock.
For politicians and marketers, depending on how they adapt, it can be both an opportunity and a threat. The marketing rules and perspectives that apply to social cloistering demand changes in methods and style — listening, reaching out to and engaging via the new media such that messages penetrate the targeted cloisters.
This is not easy, particularly when confronted by such diversity and polarisation of opinion, yet it is a challenge that must be addressed. Voters and customers cannot be swayed without empathy and understanding — and they certainly will not be won over by being labelled “deplorable”.
“If we are not serious about facts and what’s true and what’s not, and particularly in an age of social media when so many people are getting their information in sound bites and off their phones, if we can’t discriminate between serious arguments and propaganda, then we have problems” — President Barack Obama.
Marketers know that perceptions and beliefs are more important than the truth in influencing people’s minds. While facts do have a strong bearing on what people believe, so does fiction. People ultimately believe what they want to believe.
In this context, the spread of misinformation in cyberspace, planted by individuals or organizations seeking to dishonestly further their personal agendas, is a major concern.
Fake news has impact and reach because it is made to be sensational and demands to be shared. It is, in effect, gaming the system — packaged to appeal to the emotions of netizens.
The production of fake news is not merely unethical, it can also be malicious. The reports may originate from demagogues or mere opportunists and charlatans, and their intent may vary from attracting eyeballs, to selling a product or a political party, to something more sinister like peddling terrorism.
The primary motivation, especially for “news” websites that publish bogus reports, is often financial. There exists an entire industry that manufactures and distributes fake news specifically to profit from advertising. The owners of these sites mostly do not care about what they write; they see it as an easy way to make money.
Ordinary people find it difficult to decipher fact from fiction. This is exacerbated by the fact that bogus reports are usually packaged well, and mega internet platforms lend them credibility. For instance, Google’s top news link for the final results of the 2016 US election went to a bogus site whose fake content included the factually incorrect headline “Trump won both popular and electoral college … ” (see Exhibit 12.2). Similarly, earlier in the year, Facebook trended a fake news story about Megyn Kelly, which claimed that the Fox News anchor was sacked for secretly supporting Hillary Clinton. That such stories can top Facebook’s trending list lends them credibility.
Part of the problem lies in the fact that social media platforms rely on automation to filter fakes out of the colossal volume of content that floods their networks. Human editors previously employed by Facebook to curate its trending news section were replaced by computer programmes. Present-day machine learning based algorithms, however, are simply not smart enough for the job.
Major social networks, on their part, have acknowledged that more needs to be done, and they say that they keep improving their ability to detect misinformation, and to swiftly remove it. However, since it is not something that they are currently on top of, marketers and politicians need mechanisms to deal with it.
In this context the spread of misinformation complicates marketing efforts. Besides competing with bogus content for netizens’ mindshare, marketers must take actions to contain the impact of fraudulent reports that could damage their brands and their reputations. Many marketers, too, are guilty of spreading spurious content of their own.
Misinformation is a growing social problem that needs to be contained. There ought to be consensus that purposeful lying or intentional misrepresentation should not be condoned. The measures needed to stem the problem, however, are debatable. Every person and every government has their own view on freedom of expression.
Given this background, governments around the world are increasingly scrutinizing social networks, in an effort to contain the spread of fake and hateful content. To cite a few examples, after the US election results of 2016, President Barack Obama sharply criticised the bogus news reports, saying they threatened democracy. In December 2015, the German government struck a deal with Google, Facebook and Twitter to remove hate speech, including anti-migrant posts, from their networks. Earlier that year, the French prime minister and European Commission officials met separately with Facebook, Google, Twitter and other companies to demand faster action against online terrorism incitement and hate speech. And in 2009, China, for a variety of reasons including censorship, blocked Facebook, Twitter, YouTube, and Foursquare. Later, in 2010, Google, which had maintained a service that conformed to the country’s censorship policies, shut down its Chinese search engine after a cyberattack that targeted it and some other companies.