Don’t let public input make your decisions worse

Does this sound familiar?

“My colleagues overreact to what a few residents say online or during meetings. How do I help them understand that those voices aren’t representative of the whole community?”

We hear things like this a lot. It’s great to listen, but we need to know which input is helpful and which input can make decisions worse. Otherwise good intentions lead to bad results.

Below are 3 proven ways to help solve this problem – including powerful new data showing how often public input misleads us – plus an infographic you can share.

The new data – whose voices are actually being heard?

Think about the “sentiment” of all the voices you hear in meetings or see online. Maybe the majority of them hate something. Or they all like something. They might even feel like an organized group.

But how does the majority of their input compare to what the majority of your community thinks? Do they hate or like the same things? Are they different?

No one has been able to answer that question with real data… until now.

We looked back through our first 300 surveys for an answer. We measured “true community sentiment” from our scientific community surveys. We compared that to the “public input sentiment” on the same topic – what officials heard and thought before a survey based on public comments, phone calls, emails, social media posts, online surveys and other online engagement.

What we found will either confirm your suspicions or shock you…

About 70% of the time, the sentiment from public input was the opposite of what the whole community wanted. 

Even though public input feels like the voice or pulse of the community, it is definitely not.

And from a decision-making perspective, it can be worse than no input at all.

Here is a summary of the data and tips you need, in one infographic:

(Feel free to download a PDF or share this using the buttons below)

Public input is so misleading that your community would be better served by you flipping a coin or doing the opposite of what you hear.

And what about the other 30% of the time, when public input sentiment is not the opposite of community sentiment?

  • About 25% of the time, a small interest group’s voice was amplified maybe 2 or 3 times, but not to a majority of public input
  • About 5% of the time, the majority of public input sentiment mirrored the majority of community sentiment

So about 1 out of 20 times, public input was a good representation of a community. And 19 out of 20 times… it wasn’t.

As data experts, we know that this sample isn’t an exact representation of reality either. But whether the 70% error rate is truly 75% or 65% (or even halved to 35%), this is a huge problem and source of mistakes for every local government.

How big a problem? The data from the scientific surveys in that sample helped prevent over $300 million of mistakes.

Why is public input so unrepresentative?

Maybe we shouldn’t be surprised that public input is so unrepresentative.

When asked, experienced government professionals guess numbers pretty close to the mismatch rates above.

We might refer to it as “public” input, but that’s wishful thinking. No one really hears from random or regular members of the public. Our input always comes from whoever has the most interest in a topic, plus some usual voices online and at meetings.

You may be interested in what the whole community thinks. But the people who provide input self-select based on their interests, not yours.

They follow an issue they care about. They give their input at much higher rates than everyone else on that topic. Then they recruit their like-minded friends to participate, who pile onto the same side of an issue. By dominating input like this, the interested residents hope to mislead you about the importance of an issue or why it should be decided for their benefit.

So of course you end up with a skewed perception of what “the public” thinks – especially on hot topics.

Traditional public input channels such as meetings, emails and online surveys/polls aren’t designed to get representative samples. Instead they attract input from people with special interests in a topic, which means biased or “unscientific” data. You can think of public input like petitions. You learn what the people who respond think, but everyone else could think the opposite.

Why is unrepresentative input so dangerous?

The bad news… is that public input should mostly be ignored – because it makes decisions worse.

The really bad news? It’s really hard to ignore it.

We’re naturally wired to pay attention to what people say right in front of us. And listening is the right, good thing to do.

But one manager described the problem nicely: “The thorn in my side is when someone says something on Nextdoor – and the council thinks that’s what the whole community thinks.”

As an official, your heart tells you to listen and react to what a single person says – even if your brain wants to remind you they are just one self-selected and unrepresentative voice.

So we all struggle with this – research shows 97% of local governments have had their decisions “influenced by a few loud voices”.

The slippery slope to unhappy residents and low trust

Bad data puts every local government on a slippery slope. Once you get the misleading public input, it’s really hard to stop the slide from bad data to bad decisions and mad residents. The full slope looks like this:

Step One – Public input comes in opposite to what your community wants (70% of the time)

Step Two – That input influences your decisions (97% of governments have been influenced)

Step Three – These well-intentioned decisions make the majority of your community mad (widespread mistrust)

If you’ve ever wondered why community trust and respect are low despite good intentions, the mystery is solved.

In fact, the best-intentioned officials might listen the hardest and be the most responsive. But this can actually lead to the most decisions that disappoint the most residents – all while trying to do things right. It’s an unfair game for officials and residents alike.

So what can you do differently? Three tips for you:

Tip #1 Share this public input data, plus a visualization trick

Let your colleagues know that 70% of input is misleading (or download the infographic to give them). They might be surprised.

You can highlight that they could serve the community twice as well by doing the opposite of whatever the loud voices want.

If that isn’t enough, you can share this proven visualization…

Imagine that 10,000 people pitch in $5 each for pizza, and your job is to order the toppings for everyone. If 90% of your public input comes from people who want anchovy pizza, do you order anchovies on 90% of the pizzas?

Of course not! You can easily picture thousands of people yelling at you – “Don’t be an idiot! You know that’s not what we all want! Why are you catering to these unrepresentative voices?”

Feel free to picture your whole community opening pizza boxes to find that 9 out of 10 are covered in anchovies. You’d see so many angry faces and hear so much disgusted gossip, you’d be sick with embarrassment, as you should be.

“How could someone mess up this order so badly by catering to the noisy instead of the many?”

Of course this simple visualization applies to all public input and all public decisions.

So when you feel the influence of the public input you get online or at meetings, picture all the other residents in your community giving you stern and disappointed faces, and cursing at you inside their homes.

Those unseen faces are every bit as real as the ones in front of you.

One last trick for visualizing the problems with public input is to see real, eye-opening examples from the misleading 70%. If a picture is worth a thousand words, the 1-minute case study videos below might be worth a million words each, so don’t miss them.

Tip #2 – Use the “true/new/for you” listening checklist

Every time someone gives input online or in person, you should listen… but then quickly decide if it’s useful or not. Just ask these 3 questions as a checklist for each piece of input:

  1. Is it true? – it’s not an opinion or unverified claim
  2. Is it new? – you didn’t already know it
  3. Is it for you? – it’s about something your agency does

If the answer to every question is YES then you have some information that might be useful: 

  • A statement of fact
  • That you didn’t know
  • About something your agency does

But if any single answer is NO, it’s not useful input because it is:

  • A false or misleading observation
  • Something you already knew
  • Input about something your agency doesn’t do

Pretty simple, right? It might take some extra work to confirm if certain claims are actually true or not, but that’s just part of doing a good job.

What if one person says they hate something or like something? The only fact there is that one person hates or likes something. Every other person in the whole community could hold the opposite view.

And if ten people (or 100 people) say it? Again, all that tells you is that those ten (or 100) people want something – just as when it’s one person. Without representative data from the whole community, you can’t know what everyone else thinks, so you need to ignore it to avoid an anchovy mishap.

Now what if someone claims that “most people” want something? (“Everyone loves anchovies!”) That’s just an opinion about a possible fact, not a statement of actual fact. So ignore it. Again, they don’t know what the rest of the community thinks and neither do you.

However… if anyone gives a specific factual complaint or reason for not liking something, listen up! Those lead to important fixes and improvements. That is the “true/new/for you” gold that you hope to hear.

Tip #3 – Only use scientific data for community preferences

Tip #1 reminded you to ignore the noise and Tip #2 showed you how to listen for useful individual facts, feedback, and ideas.

But you still need to know what your whole community wants and values. That’s why you wanted public input in the first place!

Unfortunately there aren’t any shortcuts to reliable and representative data. If you want to know community-wide percentages or ratings or rankings you need a scientific community survey.

Fortunately, that’s exactly what the public wants you to do, especially for decisions that can create or destroy millions of dollars of value for your residents. They are counting on you to serve everyone – not to be influenced into mistakes by unrepresentative voices. That’s why it has always been worth spending $20,000 to $40,000 per scientific survey to get a valid representation of your whole community.

What are the smartest governments doing?

You now know how to ignore the noise and listen for “true/new/for you” facts that help you identify problems and improve services.

You also know why you should do more scientific community surveys – and not just satisfaction data every few years that sits on a shelf.

You may not know that there is a new way to get scientific survey data to inform and support important decisions in real time – 90% faster and easier than traditional scientific surveys.

The smartest governments are using FlashVote surveys to get statistically valid community input in 48 hours on any issue.

FlashVote uses the same scientific panel methodology that Gallup, Pew and other national pollsters have switched to in recent years for better data. But our patent-pending system is built from scratch for local jurisdictions. This is why we’re recognized as having the highest quality data available for community input, especially on hot topics. And the pricing makes it a no brainer for every jurisdiction with a population over 5,000 people.

FlashVote customers connect with a larger and more inclusive range of residents than any other engagement approach. So they can deliver maximum value for the most residents, while enjoying the peace of mind that comes from avoiding mistakes and regret.

See how other governments solved problems with better input

Governments like yours are already using FlashVote data to solve the same problems you have, so you can browse lots of examples here.

You’ll see timely topics like ARPA funds, policing or COVID, and recurring issues like budgets, planning or communications. You’ll even see hot topics like housing affordability, backyard chickens or flushable wipes. Whatever your next topic is, the team at FlashVote has seen it and solved it.

Contact us below to see surveys on the topics you care about, or to learn more. We usually respond within a few minutes.

About FlashVote

Instead of giving you more work and complicated software to learn and run, FlashVote operates as a concierge service. We free up your time by doing all the work for you and just giving you the answers you need. Whatever you want to know, you get custom questions crafted and edited by experts for the exact decision support you need. Then you kick back while data gets collected through email, text and phone call communications. 48 hours later, you get all your answers on an interactive results page that is elegant and actionable.

The process is so fast that customers routinely go from needing a survey to having answers in the same week.

Our founding mission at FlashVote is to help every government make great decisions with great data. So we’ve invested years and millions to make surveys super affordable too. You can get a whole year of FlashVote scientific surveys for a fraction of what others pay for one survey elsewhere.

Share these videos to bring bad data to life

These mini case study videos will definitely grab your colleagues’ attention and reinforce the data and tips above.

In 60 to 90 seconds each you can watch how real decisions played out in other jurisdictions, with real public input on familiar topics, from the same communication channels that you have.

  1. Public input in general (trash)
  2. Emails (trails)
  3. Complaints (parks)
  4. Online surveys (community center)
  5. Online engagement (backyard chickens)
  6. Input from regulars: Usually but not always wrong
  7. BONUS 3 minute video: The true story of two elected officials

You can click the links above to go directly to a video below. Or scroll down through this whole section. Enjoy!

1 – Traditional public input can be very misleading

The first time we found bad data was our first survey. Watch how a proposed change to trash service was almost a huge mistake.

2 – Emails are not representative of your community

Watch what happens when a local government keeps track of all the emails they receive on an issue.

3 – Complaints are not representative either

Watch how one city learned that a rash of complaints represented the opposite of how the whole community felt.

4 – Online survey data isn’t reliable

Want to know why unscientific online surveys are “dangerous data”? Watch this community center example.

5 – Online engagement goes bad quickly

Watch how long it takes for online engagement to attract the residents interested in a hot topic.

6 – You can’t always ignore the noise either

Public input is usually unrepresentative, but not always, as we saw in this case study of meeting regulars.

7 – BONUS: Why elected officials need much better input

You can appreciate the problem from the perspective of two elected officials in this “based on a true story” video.

Hope the data, tips and videos here were helpful to you and your colleagues.

Contact us if you’d like to serve your community better too. We’d love to hear from you!

Join all the counties, cities, towns and villages who are enjoying the benefits of FlashVote in 20+ states so far. See why we’re winning awards and growing fast.

  • Have questions?
  • Ready to see more?
  • Need pricing?