
WEBINAR: Data Governance with Polco

Best Use Cases for Municipalities
Apr 18, 2019, 1:30 PM – 2:30 PM
Recording

Click the link below to watch a recording of the webinar. 

Data Governance with Polco

Q&A With Presenters

Presenters Nick Mastronardi and Michelle Kobayashi answer questions posed during the webinar that we unfortunately did not have time to address live.

How can one move elected officials away from the anecdotal information from their groupies or frequent fliers at their meetings?  

Response from Nick 

Ask for input broadly, make it easy to provide, and make sure it is referenceable so it can stand up to critique. There are digital and non-digital ways to do this. National Research Center (NRC) does an amazing job through its National Community Survey of ensuring a scientific survey of all resident types on a variety of issues and providing peer-city performance benchmarking analysis. For shorter policy polls and resident dialogue on a faster and more frequent basis, Polco provides a similar service online that can work with a multitude of digital communication channels for outreach and then automatically cross-references, organizes, and tabulates results. We have a lot of fun examples from cities that have heard from the frequent fliers and could then confidently stand up to them and say, "Thank you for your input; we have also heard from several hundred or several thousand other residents who feel differently, and on this issue we are going to move forward with __x__." It's great to see that those frequent fliers don't always get to exercise disproportionate influence.

 

Addition from Michelle 

I also would suggest that you educate your council on the difference between anecdotal data and data which are more systematically/scientifically collected.  There are many good examples online of how people have been misled by anecdotal information.  You may even have examples from your own community’s past.   

One experiment we conducted early on was to administer a survey at council meetings and town meetings while also conducting a survey with a more representative sample of residents. We compared the results to show the council how the opinions of the “frequent fliers” differed from those of the wider population. For several policy issues, the direction was similar (e.g., overall support from both groups), but we found those participating in the scientific survey were much more moderate in their views (e.g., much more likely to “somewhat support” or be “somewhat willing”). However, in a couple of cases, the direction of the vote changed.

We also have found that once a council starts to use stronger data sources, it will begin to expect them. As Nick mentioned, once a city learns how to use more representative data to counter folks with single-issue interests, it often becomes a convert.

 

How do we ensure that all voices are heard and that no one group, such as millennials, dominates the dialogue? Older folks may not be as eager to participate.

From all our engagements we see a very similar pattern: lots of participation out of the gate from folks in their 40s through 60s. As city officials see sufficient representation from these age brackets in real time, they can start to focus on reaching older and younger residents. We actually find it comparably easy to get input from residents in their 70s as well as their 30s, and the good news is that if you are going to reach those in their 30s, and especially those in their 20s, online is the way to do so. We don't often see over-participation by people in their 20s and 30s, but if we do, that's an easy problem to fix by reweighting results so that the demographics of the sample line up with the demographics of the community. This is an often-used technique that NRC has applied successfully in its scientific surveys for over 25 years.
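For readers curious what that reweighting looks like in practice, here is a minimal post-stratification sketch in Python. It is not Polco's or NRC's actual code; the age brackets, community shares, and column names are all hypothetical.

    import pandas as pd

    # Hypothetical respondent-level results: one row per response.
    responses = pd.DataFrame({
        "age_bracket": ["20s", "30s", "30s", "40s", "40s", "50s", "60s", "70s"],
        "supports_policy": [1, 1, 0, 1, 1, 0, 0, 1],
    })

    # Community age distribution (made-up shares, e.g. from Census figures).
    community_share = {"20s": 0.15, "30s": 0.20, "40s": 0.20,
                       "50s": 0.20, "60s": 0.15, "70s": 0.10}

    # Weight each respondent so the sample's age mix matches the community's.
    sample_share = responses["age_bracket"].value_counts(normalize=True)
    responses["weight"] = responses["age_bracket"].map(
        lambda b: community_share[b] / sample_share[b])

    # Compare unweighted support with demographically reweighted support.
    unweighted = responses["supports_policy"].mean()
    weighted = ((responses["supports_policy"] * responses["weight"]).sum()
                / responses["weight"].sum())
    print(f"Unweighted: {unweighted:.1%}  Reweighted: {weighted:.1%}")

In practice, weighting is often done across several demographic dimensions at once (age, gender, geography), but the idea is the same: give under-represented groups proportionally more influence on the totals and over-represented groups less.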

 

Have the Mayor and communications team been able to analyze the comments and feedback coming from constituents?

I think this is a great question. This is the secret treasure chest in engagement. There is a lot of great data in the comments, especially when viewed by precinct or broad demographic category. 

 

How do the comments and feedback qualify the statistical results from the survey/diagnostics? 

Response from Nick  

If a survey or poll is asking the wrong question, or if there's an alternative not included, the structure of the comments quickly captures that. We have in many circumstances seen people say, "I voted yes, but would really like to see Y," or, "I voted no, but would be in favor if I better understood the revenue or cost implications." Most recently we saw this in Bar Harbor, ME, as the town was considering closing a public office one day a week to save money; when asked whether they supported the policy, several residents offered other cost-neutral solutions they preferred.

Response from Michelle  

We often collect data in both systematic and anecdotal ways. We place more weight on the more scientific, quantitative data when making decisions, but we can use the anecdotal, more qualitative data to add depth to the results.

 

Example 1: 

About eight in ten residents surveyed supported the City devoting additional resources to wildfire prevention. Residents attending the town meeting also supported the City taking greater action in the area of wildfire protection and expressed a strong interest in the City adopting policies requiring homeowners to mitigate wildfire hazards on their property.

Example 2:  

Approximately three-fourths of residents surveyed supported the City partnering with the County to build a housing first building to house 25 homeless residents.  About 10% of respondents were strongly opposed.   

Residents attending the town meeting were less supportive of the building.  Many town meeting participants were neighbors of the proposed building sites and expressed concerns about safety and potential decreases to their property values.   

 

What is required from citizens? Is there a sign up or sign in requirement? Does that barrier limit participation? 

To respond, residents must provide their name, email, zip code, and a password, or they can simply log in with Google or Facebook. If, when we check against the city voter file or verification list, we see there are two John Smiths in the same zip code, only then do we ask for a little more information to disambiguate. We never share individual data with city officials or with third parties, because we want to foster trust and preserve privacy to promote continued civic participation.

 

What would the cost associated with a program like this be? 

It depends on the size of the city, but roughly a few thousand dollars a year covers all the verified polls and surveys a city would like to run, regardless of the number of respondents or responses. While this is not expensive relative to traditional means of conducting verified polls, NLC members also receive an additional 20% off, as Polco is a Savings and Solutions member.

 

Question for Michelle: What aspects do you make sure to include in your panel discussions in order to make them most effective? How do we get fair representation at round tables or panels?

Here is a slide I have used to discuss successful panels.  In terms of panel management: 

  1. Try to recruit a diverse group of participants (e.g., don’t recruit only from a council meeting). There are a lot of ways to recruit:

  • Send postcards to a random sample of households or to all households in your community

  • Send invites to every resident with an email on file (rec center users, library patrons, web page sign-ups, etc.). I now see some cities requiring their utilities to add text to their contracts with residents allowing them to share email addresses with the City.

  • Recruit at diverse community meetings and events (e.g. farmers markets, high school football games, senior centers, etc.) 

  • Talk to non-profits and faith-based communities serving hard-to-reach residents and ask them to help

  • Use social media (Facebook, Twitter, Nextdoor, etc.)

 

The bigger the list, the better.  Once you have an email list, you can match names to a voter list through Polco or send out a demographic survey.  You then have the means to create a panel that best fits your community.   
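As a rough illustration of that panel-building step, here is a hypothetical Python sketch of drawing a demographically proportional panel from a recruit list. The field names, recruit list, and target shares are made up, and this is not a prescribed Polco or NRC workflow.

    import pandas as pd

    # Hypothetical recruit list after matching emails to a voter file or a
    # short demographic survey.
    recruits = pd.DataFrame({
        "email": [f"resident{i}@example.com" for i in range(600)],
        "age_bracket": ["20s", "30s", "40s", "50s", "60s", "70s"] * 100,
    })

    # Target panel composition drawn from community demographics (made up).
    panel_size = 120
    targets = {"20s": 0.15, "30s": 0.20, "40s": 0.20,
               "50s": 0.20, "60s": 0.15, "70s": 0.10}

    # Sample each age bracket in proportion to its share of the community.
    panel = pd.concat(
        recruits[recruits["age_bracket"] == bracket]
        .sample(n=round(panel_size * share), random_state=0)
        for bracket, share in targets.items()
    )
    print(panel["age_bracket"].value_counts())

The same idea extends to other dimensions (gender, geography, housing tenure), and, as noted below, it is worth drawing a larger panel than the target size to allow for attrition.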

You will have attrition, so recruit more participants than you need, understanding that you will lose folks over time.

The number and diversity of the panel will increase if you first ask for a more limited time period for participation – such as 1 year.  Many may choose to stay on the panel, but they will feel better about joining if the commitment up front is not too large.   

Make sure you reward your panel by giving them information and feedback. Send them the tallied responses or allow them real-time access to the data. Also, an annual letter from the Mayor or council thanking panelists and explaining how the data were used is a great way to keep people on board.

I believe the bullets under Surveys are mostly self-explanatory, except for “preferred mode and times.” Folks building panels have found it is a good practice to ask people when they want to engage – Tuesday mornings, Thursday nights, etc. Identifying the times people are more likely to respond will increase participation.

 

 

Across the country, elected officials are exploring ways to better engage their residents. Technology and improving communication strategies offer a unique platform for promoting and adopting data-driven governance. And doing it is easy.

Hear from Mayor Timothy Hanna of Appleton, Wisconsin, about how his community collected constructive, verified input from residents on projects and issues including traffic, roundabouts, and parks. Also hear from data experts from Polco and National Research Center, Inc. (NRC) to learn more about strategies and technology for gathering feedback on proposed plans and using that data to make decisions that best reflect community wishes. Our speakers explore performance measurement and how civic engagement can assist with performance management.