Polls can be inaccurate for a number of reasons.
Samples can be too small in size or unrepresentative of the population.
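To see why sample size matters, here is a minimal Python sketch (all numbers hypothetical) that simulates repeatedly polling an electorate whose true support is exactly 50 percent:

```python
import random

random.seed(1)

def run_poll(n):
    """Simulate polling n random voters from a 50/50 electorate;
    return the percent answering 'yes'."""
    return 100 * sum(random.random() < 0.5 for _ in range(n)) / n

# Run each poll 1,000 times to see how much the results swing
small = [run_poll(50) for _ in range(1000)]
large = [run_poll(2000) for _ in range(1000)]

print(f"n=50:   results ranged from {min(small):.0f}% to {max(small):.0f}%")
print(f"n=2000: results ranged from {min(large):.0f}% to {max(large):.0f}%")
```

The 50-person polls swing far more widely around the true 50 percent than the 2,000-person polls, so a headline number from a tiny sample means very little.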
It's too expensive and time-consuming to survey everyone in a population; thus, pollsters use samples to project the opinions of everyone else.
Thus, a poll designed to represent American public opinion wouldn't be very reliable if its sample was 90 percent Democrats, or consisted only of white males who voted for Hillary.
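The damage an unrepresentative sample does can be shown with a short simulation. Everything here is hypothetical: a population split evenly between two groups, with support for some policy much higher in one group than the other.

```python
import random

random.seed(42)

# Hypothetical population of 100,000: half Democrats, half not.
# Assume Democrats support the policy at 80%, others at 20%,
# so true overall support is 50%.
population = []
for _ in range(100_000):
    is_democrat = random.random() < 0.5
    supports = random.random() < (0.8 if is_democrat else 0.2)
    population.append((supports, is_democrat))

def poll(sample):
    """Return percent support in a sample."""
    return 100 * sum(s for s, _ in sample) / len(sample)

# A representative random sample of 1,000 people
fair_sample = random.sample(population, 1000)

# A biased sample: 90% drawn from Democrats only
democrats = [p for p in population if p[1]]
others = [p for p in population if not p[1]]
biased_sample = random.sample(democrats, 900) + random.sample(others, 100)

print(f"True support:  {poll(population):.1f}%")
print(f"Fair sample:   {poll(fair_sample):.1f}%")
print(f"Biased sample: {poll(biased_sample):.1f}%")
```

The fair sample lands near the true 50 percent; the lopsided sample reports support in the 70s, even though nothing about the questions changed, only who got asked.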
It's rare that fake news reports mention details of the sample or tell the public how the survey was conducted. Viewers and readers usually just take the poll results as fact.
For example, what if I reported a poll that said 96 percent of Americans are pro-choice?
This obviously doesn't reflect American public opinion, but if the source was actually just a survey of readers of the feminist magazine Bitch, the results would be predictable.
A clever or sloppy journalist obscures the source and portrays public opinion in an inaccurate way.
Think about all the polls that are done today and how easily results can become unrepresentative.
Web polls exclude people without web access and those who don't visit that particular site. Polls also exclude those who don't have the time or interest to respond.
Think about TV polls. Fox generally has more conservative viewers; CNN generally has liberal viewers. Thus, each network's poll results may be skewed toward its audience's politics, regardless of the issue. The chances of error or bias are endless.
Polls can ask leading questions.
Questions can be worded in a way that leads a respondent to an answer that may or may not reflect their true feelings.
For example, I could ask, "Do you want to stop Trump from attacking North Korea so the lives of innocent civilians can be spared?" Virtually every American wants to prevent the loss of innocent life, so many respondents may answer yes to this question, even if they think a war against the Communist regime is morally just.
But Democratic operatives summarizing the results may say "...95 percent of respondents answered yes when asked if they wanted to stop Trump."
The questioner can also surround their question with information that slants the answers. For example, "Seventy percent of homeless shelter residents are single mothers and their children. Should the next fiscal budget include an increase in funds to local shelters?" Respondents may believe the money is better spent in other areas, but the emotionally charged appeal points people toward the answer liberals want.
Polls omit answers, leading to either-or answers that don't reflect reality.
Answers to poll questions are often more complicated than a simple yes-no or a small list of choices. For example, a poll may ask, "Do you support a war with North Korea?"
The only choices may be yes or no. But many people may say "Yes, but only if they use nuclear weapons against America or its allies" or "Yes, but only if it is sanctioned by the U.N." Another example is a consumer confidence question that asks, "Do you consider yourself rich or poor?" Many people will want to answer something in between, but that isn't a choice.
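The distortion from forced yes-or-no answers is easy to demonstrate. Here is a deterministic sketch with made-up opinion counts: anyone whose real answer is a conditional "yes" gets recorded as a plain "yes."

```python
# Hypothetical true opinions of 100 respondents, including conditional ones
true_opinions = (["yes"] * 20 +
                 ["yes, only if attacked first"] * 30 +
                 ["yes, only with U.N. approval"] * 15 +
                 ["no"] * 35)

# A yes/no poll collapses every conditional answer into a plain "yes"
forced = ["yes" if o.startswith("yes") else "no" for o in true_opinions]

unconditional_yes = true_opinions.count("yes")
reported_yes = forced.count("yes")
print(f"Unconditional support: {unconditional_yes}%")  # 20%
print(f"Poll reports 'yes':    {reported_yes}%")       # 65%
```

Only 20 of 100 respondents support the war unconditionally, yet the binary poll reports 65 percent "yes," which is what the headline will say.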
Finally, the people recording survey results may be dishonest or sloppy.
Here's some reading on the subject:
http://theweek.com/articles/617109/problem-polls
https://www.theguardian.com/us-news...mic-issues-that-make-voter-surveys-unreliable