Friday, February 19, 2016

The Anatomy of Propaganda – Examining Multiple Tactics of Psychological Manipulation in Media Today

For a long while now, I have intended to write an article on the specific techniques of thought manipulation by way of propaganda. In modern-day America, the use of propaganda is commonplace in virtually every mainstream media source available. We see this manipulation on TV, in movies, magazines, internet publications, video games... Just about every established medium employs propaganda in one form or another.

The questions of who is behind this manipulation and why it takes place may be answered in another article some time in the future, but for the moment, the matter of definition is our first priority. Being personally educated on the matter of informational accuracy is vital in a free and open society, I believe, and the ability to discern between fact and fiction is of key importance as well (as many times, fact and fiction are communicated side by side with little or no obvious distinction). This article is intended to open the eyes of the reader and to provide the tools necessary for discerning the information we encounter.


Propaganda is a favored tool of manipulative governments around the world, and has been for ages. It may surprise some that this method of thought, feeling, and behavioral manipulation has been habitually employed by the American government for decades, but the fact stands. Wikipedia provides a fairly extensive list of historic uses of propaganda in multiple wartime ad campaigns, as well as in numerous situations of foreign tension, throughout American history.


World War I

“The first large-scale use of propaganda by the U.S. government came during World War I. The government enlisted the help of citizens and children to help promote war bonds and stamps to help stimulate the economy. To keep the prices of war supplies down (guns, gunpowder, cannons, steel, etc.), the U.S. government produced posters that encouraged people to reduce waste and grow their own vegetables in "victory gardens". The public skepticism that was generated by the heavy-handed tactics of the Committee on Public Information would lead the postwar government to officially abandon the use of propaganda.”

World War II

“During World War II the U.S. officially had no propaganda, but the Roosevelt government used means to circumvent this official line. One such propaganda tool was the publicly owned but government funded Writers' War Board (WWB). The activities of the WWB were so extensive that it has been called the "greatest propaganda machine in history". Why We Fight is a famous series of US government propaganda films made to justify US involvement in World War II.

In 1944 (lasting until 1948) prominent US policy makers launched a domestic propaganda campaign aimed at convincing the U.S. public to agree to a harsh peace for the German people, for example by removing the common view of the German people and the Nazi party as separate entities. The core in this campaign was the Writers' War Board which was closely associated with the Roosevelt administration.

Another means was the United States Office of War Information that Roosevelt established in June 1942, whose mandate was to promote understanding of the war policies under the director Elmer Davis. It dealt with posters, press, movies, exhibitions, and produced often slanted material conforming to US wartime purposes. Other large and influential non-governmental organizations during the war and immediate post war period were the Society for the Prevention of World War III and the Council on Books in Wartime.”



The Cold War

“During the Cold War, the U.S. government produced vast amounts of propaganda against communism and the Soviet bloc. Much of this propaganda was directed by the Federal Bureau of Investigation under J. Edgar Hoover, who himself wrote the anti-communist tract Masters of Deceit. The FBI's COINTELPRO arm solicited journalists to produce fake news items discrediting communists and affiliated groups, such as H. Bruce Franklin and the Venceremos Organization.”

The Iraq War

“In early 2002, the U.S. Department of Defense launched an information operation, colloquially referred to as the Pentagon military analyst program. The goal of the operation is "to spread the administration's talking points on Iraq by briefing ... retired commanders for network and cable television appearances," where they have been presented as independent analysts. On 22 May 2008, after this program was revealed in the New York Times, the House passed an amendment that would make permanent a domestic propaganda ban that until now has been enacted annually in the military authorization bill.

The Shared Values Initiative was a public relations campaign that was intended to sell a "new" America to Muslims around the world by showing that American Muslims were living happily and freely, without persecution, in post-9/11 America. Funded by the United States Department of State, the campaign created a public relations front group known as Council of American Muslims for Understanding (CAMU). The campaign was divided in phases; the first of which consisted of five mini-documentaries for television, radio, and print with shared values messages for key Muslim countries.”




As we see, the United States government (an organization which boasts the principles of freedom, equality, and liberty) has historically used this manipulative method not only upon the citizens of potential foreign adversaries, but upon its own citizenry. As stated before, this has been going on for nearly a century, largely unbeknownst to the American public. The legal propagandization of the American public, however, officially began in 2012.

Related Articles

The Legalization of Propagandizing the American People
http://www.businessinsider.com/ndaa-legalizes-propaganda-2012-5


The Smith-Mundt Modernization Act of 2012
http://foreignpolicy.com/2013/07/14/u-s-repeals-propaganda-ban-spreads-government-made-news-to-americans/

This topic is an area of interest at Stanford University. Two researchers, Manzaria and Bruck, wrote an extensive essay that provides specific examples of how this propaganda is used in modern society to sway attitudes, beliefs, and behaviors.

"The previous picture and poem is a clear example of propaganda which is a form of persuasion used to influence people's attitudes, beliefs, and behaviors. A working definition of propaganda is the spreading of ideas, information, or rumor for the purpose of helping or injuring an institution, a cause, or a person. While propaganda has been around for almost a thousand years, only recently (last 100 years) with the advent of technologies that allow us to spread information to a mass group has it evolved to a scientific process capable of influencing a whole nation of people. While propaganda is most evident in times of war as in the poster, it is constantly being used as a political and social means in even less obvious ways to influence peoples attitudes. This is currently evident with all the election commercials on TV, where the candidates are using propaganda techniques to elevate themselves above their competitor. Another place propaganda is being exploited is by the use of the media in its portrayal of countries that have nuclear technology.



Modern propaganda uses all the media available to spread its message, including: press, radio, television, film, computers, fax machines, posters, meetings, door-to-door canvassing, handbills, buttons, billboards, speeches, flags, street names, monuments, coins, stamps, books, plays, comic strips, poetry, music, sporting events, cultural events, company reports, libraries, and awards and prizes. It is most likely that some of these media uses are surprising, but that only serves to show how easy it is to not even recognize propaganda as such. For the purpose of our paper we will focus mainly on the usage of the press in its tactics of shaping people's opinions. The press (newspapers and magazines) is important because the most current news and issues are spread every day through them. The Dune effect is a term we coined--after the movie Dune--which explains that those who control and have access to media have access to and potential control of public opinion.

Indeed, propaganda is so powerful because everyone is susceptible to it. This is true as explained by Robert Cialdini, an expert in influence, because people exist in a rapidly moving and complex world. In order to deal with it, we need shortcuts. We cannot be expected to recognize and analyze all the aspects in each person, event, and situation we encounter in even one day. We do not have the time, energy, or capacity to process the information; and instead we must very often use our stereotypes, our rules of thumb, to classify things according to a few key features and then to respond without thinking when one or another of these trigger features are present (Cialdini 6). While this makes people highly susceptible to a propagandist who understands persuasion, in general it is the most efficient form of behaving, and in other cases it is simply necessary. Additionally, propaganda includes the reinforcement of societal myths and stereotypes that are so deeply embedded within a culture that it is often difficult to recognize the message as propaganda.

For example, I just used a persuasive technique that propagandists use all the time by introducing Cialdini as an expert. The heuristic this follows is obedience to authority: when someone is credible, in this case by the title of expert, a person will automatically believe the information to be correct. "Titles are simultaneously the most difficult and the easiest symbols of authority to acquire. To earn a title normally takes years of work and achievement. Yet, it is possible for somebody who has put in none of this effort to adopt the mere label and receive an automatic deference" (Cialdini 181). After all, what really makes Cialdini an expert?"


If there is one thing I appreciate about instruction of this type, it is the way these authors bring attention to the fact that these tactics can be used without the consumer ever realizing they are being propagandized. This also calls attention to the fact that in order to build an immunity to this type of manipulation, one must develop a greater level of self-awareness, and an awareness of the media they frequent. The article continues.

"Since propaganda has become a systematic process it is possible to analyze how the media has used it in shaping our opinions about France having a nuclear bomb verse Pakistan. Propaganda can be broken into ten stages when analyzing it in detail. These stages are: 1) the ideology and purpose of the propaganda campaign, 2) the context in which the propaganda occurs, 3) identification of the propagandist, 4) the structure of the propaganda organization, 5) the target audience, 6) media utilization techniques, 7) special various techniques, 8) audience reaction to various techniques, 9) counterpropaganda, if present, and 10) effects and evaluation (Jowett and O'Donnell 213). 

While it is possible to go into detail about each point, we are mainly concerned with numbers six and seven: what techniques the media uses. There are many techniques and persuasion tactics the media uses to disseminate information. We will specifically focus on three case studies in the France / Pakistan nuclear issue that highlight different tactics the media use. What is important to understand about all the tactics is that no matter which one is being used they all follow the same criteria: it must be seen, understood, remembered, and acted upon. Thus, propaganda can be evaluated according to its ends, and interestingly enough, this is the same criteria that advertisers use every day in ads and commercials when "selling" a product."




Related Article

Propaganda Technique Index
http://www.foothill.edu/bss/people/peterson-david/CT/Module09_00_00.html

This portion of the Stanford article details one specific example of a propaganda campaign directed at influencing the opinions of Americans with regard to the respective nuclear programs of two separate countries--those being France and Pakistan.

"Studying media coverage of Pakistan’s nuclear achievement, it becomes clear that a certain amount of propaganda was used to make Pakistan appear threatening. The fact that Pakistan developed the technology was not what shaped the articles, but rather how this information was presented to the reader. In a sense, the propagandists were looking to turn Pakistan into an enemy of sorts, a country to be feared, instead of embraced.

One method used by propagandists to create an enemy is the technique of social proof. One way in which we process information is by observing what other people who are similar to us are doing, or linking them to social norms. "When we are unsure of ourselves, when the situation is unclear or ambiguous, when uncertainty reigns, we are most likely to look to and accept the actions of others as correct" (Cialdini 106). Since it is almost impossible for the common American to be an expert in nuclear cause and effects, he looks to what others say as a means to form his opinion. This allows him to be persuaded to an ideology not his own. Furthermore, it is possible to rely on past stereotypes as a form of linking one idea to another group.

For example, articles that took such an approach attempted to use a subset of social proof, where one casts the enemy by declaring it to be a friend of an already established enemy. For instance, in order to persuade the American public to think of Pakistan in such terms, media will link Pakistan to historically defined United States enemies such as Libya, Iran, Iraq and the former Soviet Union. This tactic plays on the principle of social proof in which people look for justifications to quickly form their beliefs. Thus, linking to a country America already has shared beliefs about quickly allows one to associate and project the existing beliefs on the new group, which in this case is Pakistan."




These examples are significantly revealing with regard to how persuasive, and yet undetectable, these methods of propaganda can be. Below is an even more extensive list of the tactics of manipulation used against the American public (as well as the citizenry of many other countries) on a daily basis. These methods have been tested over the course of decades and proven effective in swaying public opinion in whatever direction is most advantageous to the propagandists (as well as their employers). The following is an excerpt from a report from Southern Methodist University which details multiple devices used by propagandists.

"Propaganda makes use of a collection of devices and tricks intended to influence your thinking. What follows is a compendium of these techniques synthesized from several sources. Some of them are related and may overlap. Learning to recognize these techniques can go a long way toward immunizing yourself from the effects of propaganda.

    • Ad Hominem Attack: If you can't refute the argument, attack the person presenting the argument. The intent is to discredit said person, as well as to distract you and make you think the argument has been refuted. 

    • Apology: Sometimes a corporation will make a public apology for something it has done. If that something is really bad, the technique may not work. 

    • Appeal to Authority: Some "higher authority" is invoked as evidence in support of a claim. Always be sure to check out that authority. 

    • Appeal to Emotion: A common fallacy. A "sob story" is used to support a claim. The problem is that the sad story doesn't really represent the whole picture. 

    • Appeal to the People: A common fallacy of attempting to support a claim on the basis of popularity. Remember that something that "everybody knows" can be wrong. 

    • Arguing from Ignorance: A common fallacy of claiming that some hypothesis is true based on lack of information. Think of claiming that something seen in the sky is an alien spacecraft because we have no other explanation at hand at the time. If you have no information, all you can say is that you don't know. 

    • Assertion: An Assertion is a simple statement of something as fact, usually with enthusiasm and without regard for whether it is true or not. It is a common feature of modern advertising. An Assertion is usually repeated often for maximum effect. 

    • Astroturf: "Astroturf lobbying" is a term attributed to Senator Lloyd Bentsen (TX). It refers to "grass roots" movements which are actually created and funded by corporate interests. This technique of lobbying can be very effective but is also very expensive. See Sharon Beder's paper in Public Relations Quarterly, Summer 98. Also see the Front Groups entry. 

    • Bad Logic: This will include all logical fallacies. 

    • Bad Science: This refers to research that is biased, poorly done, or containing major flaws. It can also mean a "scientific" claim that is not based on research at all. Appropriate misrepresentation can distort good science into bad science. 


    • Bait and Switch: This is an old technique from both retailing and politics. In retailing it means advertising a neat product at a low price, then saying it is "out of stock" before offering you a more expensive item. In politics it can mean underestimating the cost of some program; it is also called lowballing. The technique is very deceptive and not easy to detect in advance. It is not hard to find government programs that cost far more than the initial estimates that were used to sell them. 

    • Bandwagon: "Everybody is doing this." You've heard that before. The idea here is to convey the notion that if you don't get aboard you will be left out. 

    • Begging the Question: This is simple circular logic. You make a claim, then "support" it with a reason whose meaning is simply a restatement of the claim. 

    • Big Lie: A Big Lie is an outright falsehood presented as fact. The conventional wisdom is that such a lie, repeated often enough, will be accepted as truth. The harder it is to debunk the lie the better. 

    • Buzz: Buzz is a cultural phenomenon used to promote a new product. The idea is to use word-of-mouth campaigns to create "buzz" about the product (or idea) such that others will think that they absolutely must know about it. 

    • Card Stacking: This can also be called Cherry-Picking. The propagandist uses only those facts and details that support their argument. The selected reasons are used to support the conclusion. You will get misled if you do not notice that important details are missing. The worst part of card-stacking is that it can be very difficult to detect if you are not really knowledgeable about the subject. 

    • Cartoons: Cartoons can be used to convey a false impression. Consider a cartoon portraying some politician as a demon of some kind; the visual impression can be influential. 

    • Celebrities: It is helpful to make shrewd use of celebrities, like film stars or athletes. Having one introduce and praise you at a public appearance is good. This will start you off with a favorable impression. Having movie personalities endorse candidates is a good strategy. 

    • Comic Books: A larger form of Cartoons. These can tell a story in cartoon form. The favored characters can be portrayed as superheroes. Comic books are inexpensive to produce, which is one of their advantages. Another is the visual nature of the medium. 

    • Composition: A logical fallacy wherein an assertion is made about some part that is not true about the whole. 

    • Concision: Concision is an unfortunate result of the structure of broadcast journalism. The time segments (between breaks) are short. Complex points simply cannot be made. Only simple and concise statements can be accommodated. This tends to restrict the topics to very conventional subjects. 

    • Controlling the Message: This is a strategy of planning exactly what the public message will be and then sticking to it. "It is critical to develop a set of key message points: simple declarations of fact relevant to the fact pattern. Once they have developed key message points, professionals practice them and keep delivering them succinctly and repeatedly in response to media inquiries." (Ingrid Cummings) See PR students examine crisis situations. The idea is to plan responses in advance and not deviate from them. All people speaking to media follow the same line - no exceptions. 


    • Demonizing the Opposition: This is done by portraying the "others" as something evil, disgusting, etc. Example: Stating that anyone wanting to do X is a bigoted racist. 

    • Disinformation: This technique is simply the release or planting of incorrect information for the specific purpose of deceiving the audience. Disinformation can contain elements of truth, but the payload is the lies. 

    • Divide and Conquer: This tactic is a devious attempt to label the propaganda user as a reasonable and moderate entity between competing groups. The tactic can be extremely sneaky and use a lot of misinformation, distortion and outright lies. See Building Bridges and Splitting Greens from PRWatch.org. 

    • Division: A logical fallacy in which an assertion is made about the whole which is not true for all of the parts. 

    • Doublespeak: This is the use of language and words carefully constructed to conceal the actual meaning. Euphemisms work well here. For example, "enhanced interrogation" actually means torture. 

    • Echo Chamber: The more sources there are for a claim or idea, the better it looks. An Echo Chamber is a loose network of outlets that tend to copy each other's material, all of which (on one topic) is traceable back to a single source. Bogus stories or information echoes through this system, seeming to come from multiple reliable sources. 

    • Either/Or Fallacy: This is the False Dichotomy fallacy. It consists of framing the issue to make it appear that there are only two options. One option is made to look terrible, with the implication that the other option presented is the only choice. 

    • Evading the Issue: Did you ever see a politician who didn't do this? When asked a tough question, the speaker gives an answer to something else. They may really emphasize "peace, justice and the American Way," but the answer does not respond to the question. This is not hard to detect and is very annoying. 

    • Extrapolation: This is simply making spectacular predictions on the basis of very few currently available facts. Such predictions tend to be extremely unreliable. Physicist Niels Bohr is credited with "Prediction is very difficult, especially about the future." 

    • False Analogy: To facilitate explanation, a complex issue may be portrayed as similar to a simple issue that everyone can understand. The trick with this technique is for the simpler issue to really not be a good comparison, but rather be close enough to pass. With clever design, the misleading simpler model will misdirect thought about the complex issue. 


    • False Cause: The order of some sequence or set of events is confused with actual causation. In propaganda the confusion is intentional. See Post Hoc, Ergo Propter Hoc. 

    • Fear: This technique is simple - warn the audience that some disaster will overtake them if they do not do what is suggested. If this succeeds, the audience's attention will be deflected from details or merits of the proposed action and toward what can be done to reduce the fear. When coupled with incomplete information, uncertainty and doubt, the fear technique can be very effective. Fear, uncertainty and doubt (FUD) take advantage of general ignorance. 

    • Forged Documents: Forgeries are an excellent method of planting disinformation. The media will often pick these up and circulate them widely. Governments will use this tactic to create a diversion or justify some action. They can also have other uses; remember the Killian memos of the 2004 Presidential campaign. 

    • Front Groups: These are organizations that purport to represent one agenda while in reality being funded by someone with different ideas. The name of the front group is often Americans for _______. Fill in the blank. It is usually interesting to find out who is bankrolling the group.

      A good example was reported in the Dallas Morning News for 22 April 2011. A local-option election was coming up in Mesquite to allow the sale of beer and wine in stores. The DMN reported:
      There was fear that the large retail chains were strong-arming Mesquite when a group called Save Our Stores invested six figures to back alcohol in the 2007 campaign. Meanwhile, the treasurer for the opposing group Save Our Children was citing family values - only to be exposed as a beer and wine retailer in neighboring Balch Springs who stood to lose Mesquite customers.
    • Glittering Generalities: These are vague, broad statements that will connect with the audience's beliefs and values. They really don't say anything substantive. Slogans make great examples. The vagueness means that the implications, though varying for different people, are always favorable. Think of peace, freedom, justice, family values, etc.

    • Greenwashing: [attempting to promote the idea that a program or policy is environmentally friendly]

    • Guerilla Marketing: There are a lot of ways to get a message, whether commercial, political or other, out into the community. See 100 marketing weapons on gmarketing.com. Many of these methods have the advantage of costing nothing. 


    • Image Manipulation: Today's image manipulation software makes this easy. The tactic is to produce a fake photograph by altering a genuine one, then release it into the wild. If the fake is well-done it can get a lot of mileage (and effect) before the hoax is exposed. Photos can also be staged for effect. Pictures which appear to tell a great story can be actually staged and posed. 

    • Junk Science: This is a label applied to honest scientific and public interest groups, while the term "sound science" is applied to "science" which is biased in the direction desired. 

    • Misinformation: Sometimes a public statement contains information that is not true, although not by design. It was not done deliberately. If propaganda contains untruth it is deliberate. 

    • Name-Calling: This is the use of negative words or labels to create prejudice against some person, group or idea. If you fall for this you have been driven to reach a conclusion without examining the evidence. 

    • Plain Folks: The person speaking will adopt a demeanor that makes them look like "everyman." They will appear to connect with the audience and their point of view. Careful choice of clothing, vocabulary, and mannerisms is necessary to make the identity connection. 

    • Poisoning the Well: The "poisoned well" tactic is really a pre-emptive strike at the opposition. They are labeled as evil, stupid, corrupt, criminal or something else bad. It is not necessary for the derogatory information to be true. Once this is done, anything the target person or group says will be taken less seriously. See the Wikipedia note.
       
    • Policy Laundering: A tactic of excusing unpleasant government actions on the grounds that "the international treaties require it." 

    • Politics of Personal Destruction: [a tactic of demonizing the opposition via personal attack]

    • Political Code Words: Words which, on the surface, look reasonable enough, but call on an unstated assumption to promote some agenda. Example: The idea that the President's job is to "Protect America." Sounds good. If, however, you check the President's oath of office you will find that the President swears to "preserve, protect and defend the Constitution of the United States of America." Very different. 

    • Post Hoc, Ergo Propter Hoc: A common fallacy. It confuses temporal relation with causation. The fallacy is that since B came after A, then A must have caused B. Consider that there may be several possibilities for what caused B and the time relationship could be just coincidence. 


    • Product Placement: How many times have you seen a TV show or movie in which you saw recognizable products? An April 19, 2006 bulletin from Broadcasting and Cable opened with "Two thirds of advertisers employ 'branded entertainment' -- product placement -- with the vast majority of that in commercial TV programming." 

    • Public Service Announcements: [a message communicated by media without charge]

    • Push Poll: This is far less a poll than a propaganda technique. It will use a "question" which actually implies something unfavorable about the subject of the question. A push poll question is often used to spread misinformation about someone or something. Suppose a pollster asked you "Would you be inclined to vote for Senator Fiddle if you knew he had a drinking problem?" Your answer to the question is not important; your ultimate reaction to the drinking problem allegation is. 

    • Quote Mining: This can also be called Quoting out of Context. It is often possible to lift a short quote out of a speech, essay, etc. and make it appear to say the opposite of what the speaker/writer meant. The real meaning is obvious when the quote is seen in its full context, but that context is conveniently omitted. Be wary when you see short quotes, particularly on controversial subjects, that are standing outside of their full context. You don't know what has been omitted. Political campaigns can produce some of the worst examples of quote mining.

    • Repetition: Did you ever see a TV commercial run twice in a VERY short time period? Advertisers know that a message must be repeated many times for it to be absorbed. The same goes for propaganda (see Big Lie above). Pres. Bush (G.W.) is quoted as saying "See, in my line of work you got to keep repeating things over and over again for the truth to sink in, to kind of catapult the propaganda." In propaganda, truth is squishy and hard to find. See The Ostrich Approach, 6th paragraph. 


    • Straw Man: The user of this tactic invents some misleading picture of an opponent's views in order to attack it. The straw man tactic involves misstating an opponent's ideas so that the fake view can be knocked down easily. Since the original idea has been misrepresented and distorted, the audience may think that the original idea has been knocked down when only the fake straw man view has been hit. 

    • Swiftboating: This originated in 2004 with an anti-Kerry campaign that undermined Sen. Kerry. The idea is to concoct a story with just enough truth in it to use as a smear campaign. 

    • Talking Points: A talking point is a simple key message or idea. A number of these can be compiled and used whenever dealing with reporters. The user will stick to the listed messages and focus attention on them. They are to be used to answer any tough question in one form of Evading the Issue. They are also very annoying. 

    • Testimonial: This technique has a well-known someone endorse, recommend or approve of a product, cause or program. Pop celebrities can work well here. Remember that testimonials aren't worth much, particularly if the endorser is not an authority in the field. 

    • Transfer: This is an effort to transfer your approval of something you respect and approve of to another something that the propagandist wants you to approve of. Flag-waving helps. 

    • Vagueness: Watch for this everywhere, even in news reporting. It can be a form of disinformation. "Remember the first rule of disinformation analysis: truth is specific, lie is vague. Always look for palpable details in reporting and if the picture is not in focus, there must be reasons for it." (Greg Sinaisky) See Detecting Disinformation Without Radar

    • Video News Releases: This relatively recent trick involves preparing a message (often an ad) in a video sequence which looks exactly like a news item. TV outlets will often pick these up and use them in news programs because it saves production cost."


http://www.physics.smu.edu/pseudo/Propaganda/alldevices.html


Until I found this gem of an article, I did not actually know that the methodology behind thought manipulation held so many different facets. (It was actually a relief to find, as I had initially thought I would have to compose such an extensive list on my own when there are so many other projects I need to complete.)

It is vital that each of us know these devices, and become aware of their use wherever they may turn up. When any source starts to employ these tricks in tandem, it is a warning sign that the source is either ignorant and unaware of its own actions, or has manipulative intent behind its communication. (It should be noted, though, that no one is perfect, and mistakes may be made from time to time. However, if a source layers these tactics one on top of another, it is most likely untrustworthy.)

It is unfortunate that a country such as the United States--a country which boasts such liberties as freedom of choice--would resort to habitual and systematic manipulation, stifling, and overall limitation of that choice. Such a country seems only to wear a mask of freedom while, under the surface, functioning as something much different. Freedom of choice is not an optional privilege, as much of governance and the corporate world seem to believe. It is the right of every human being, and I believe it is about time we all remembered this foundational principle. The sooner we become aware of and learn these tactics of manipulation, the sooner we can become immune to them, which will, in turn, give those of present and future influence an incentive for actual honesty and sincerity.

On numerous occasions, I have personally witnessed these tactics used both in corporate media and in a number of sources which wear the title of "alternative media." These seemingly alternative sources typically speak on topics of interest to the alternative community. However, their use of the exact same devices of government propaganda listed above reveals them to be just as manipulative as those government and corporate sources.

In this age of independent truth-seeking, it is imperative that each of us learn to recognize these tactics and watch for them in the sources we frequent (and also avoid using such tactics ourselves). If we see them, it is important to raise a flag. Many in these communities seem to be skilled at recognizing the use of the “fear” tactic, and this is positive progress. Let's become just as aware with regard to the rest.








Thanks for reading.



I started DTM because I feel that informing the masses is the most positive and impactful thing I am able to do at this point. I work at my articles as though each one were my job, as I don't quite have the health to keep an actual job right now. Somehow, I get more energized when I know I'm having a positive impact in the lives of others. 

Right now, I rely upon donations and ads to keep my site going. Ideally, we would live in a world free of the need for money of any kind. We will have that world very soon, I believe, but in the meantime, I depend upon this work to sustain me as I do my best to be dependable to you, my readers. I hope “Discerning the Mystery” is a truly positive and progressive experience for you.

Thank you for your support.


